Friday, 30 April 2021

Computational Blurring As Resolution For AI Autonomous Car Roving Eye 

Self-driving cars are able to capture the surroundings on the path to the destination. To protect privacy, sensors could be calibrated to ensure anything in the background is blurry. (Credit: Getty Images)  

By Lance Eliot, the AI Trends Insider 

Have you ever taken a picture and realized that a person in the snapshot appears blurry due to your camera being out of focus? I’m sure you have. 

In some cases, the blur happens by accident: you should have set the focus but failed to do so. It was merely an oversight. If you discovered the issue right away, hopefully you were able to quickly take another photo and delete the flawed one. Problem solved.

There are also situations involving the use of a blur for intentional purposes. 

Perhaps you have a few friends who want their picture taken. Behind them are people partying and making quite a scene. You don’t want the background to draw attention away from the foreground, namely the gaggle of your best friends. So, you set the focus to make the background blurry and keep the foreground nice and sharp.

Suppose that you indeed take such a picture and keep it around on your smartphone. A few weeks later, someone asks you if George or Samantha were in attendance at the event. You are pretty sure they were, though your memory is a bit hazy (potentially due to the numerous margaritas that you had).   

Wait a second, they might have been in the background of that photo you took of your dearest friends. You go ahead and pull up the picture to check. Unfortunately, the blur is overwhelming and there is no easy means to discern who else was captured in the snapshot.

Why all this discussion about blurs in images?   

Blurring techniques can potentially offer a level of privacy for those who might be captured in an image or a video. This might entail blurring the face of someone. It could involve blurring their entire body and whatever motions they made. The blur obscures information and makes it difficult to ascertain what was in the image or video. That can be a good thing if you are aiming to provide privacy to those caught on tape.
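To make that concrete, here is a minimal sketch of face blurring, assuming OpenCV and its bundled Haar cascade detector (the detector choice and kernel size here are purely illustrative, not what any particular system uses):

```python
import cv2

def blur_faces(frame):
    # Load OpenCV's stock frontal-face detector (an illustrative choice;
    # a production system would use a far more robust detector).
    cascade = cv2.CascadeClassifier(
        cv2.data.haarcascades + "haarcascade_frontalface_default.xml")
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    # Replace each detected face region with a heavily blurred version.
    for (x, y, w, h) in cascade.detectMultiScale(gray, scaleFactor=1.1,
                                                 minNeighbors=5):
        frame[y:y+h, x:x+w] = cv2.GaussianBlur(frame[y:y+h, x:x+w], (51, 51), 0)
    return frame
```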

As with most things in life, there is also a downside. In the example above, the blurred portion of the photo was unhelpful in answering the question about whether George or Samantha was at the party. There are likely lots of instances where blur renders potentially useful information only marginally handy, or entirely unusable.

Shifting gears (I promise to come back to the blurs in a moment), let’s talk about cars. 

The future of cars consists of self-driving cars. These are cars that have an AI driving system at the wheel of the vehicle. The driving actions are undertaken by the AI. No human driver is making use of the driving controls. 

An important element of self-driving cars is the use of various sensors to detect the driving scene. These sensory devices are the veritable eyes and ears of the AI driving system. For most self-driving cars, the types of sensors encompass video cameras, radar units, LIDAR units, ultrasonic devices, and the like. The sensors are mounted on the vehicle and are used to figure out where the roadway is, where other cars are, where pedestrians are standing, and so on. 

This seems relatively innocuous and there isn’t much attention being given to the plethora of sensors that self-driving cars contain. No big deal, it seems, since the sensors are logically required to sense the world that surrounds the vehicle. Without those sensory devices, the AI driving system would be blind to what is happening around the car.   

Here’s the rub. Those sensors can capture a lot more than you might at first imagine.

Imagine that you put a video camera on the top of your conventional car. You turned it on and set it to record continuously. You then drive from your home to the local grocery store. What happens during that rather routine drive? Your video camera is capturing all the activity that you perchance come across.  

For example, after backing out of your driveway, you drive down the block to the corner. Turns out that your neighbors next door were in their front yard. They were tossing a baseball back and forth with their children. That activity is now captured onto your video camera recording.

I believe you get the gist of the matter.   

With merely one video camera mounted on your conventional car, it will be collecting videos about the daily lives of anyone that you happen to drive past. Let’s up the ante. Suppose that all of your neighbors put video cameras on the rooftops of their cars too. They will also now be recording anything that they encounter while on any driving journey.   

Welcome to the emerging world of self-driving cars.   

Those heralded self-driving cars are going to be capturing imagery and other data about whatever they detect, wherever they go, all the time that they are underway. Keep in mind the assumption that eventually we will have zillions of self-driving cars on our roadways, and very few conventional cars.

Returning to your neighborhood and the idea of video cameras mounted on a few neighborhood conventional cars, ratchet this up to assume that all cars that come down your street will have a full suite of state-of-the-art sensors (because they are self-driving cars). Those advanced vehicles will be amassing a lot of information about the comings and goings on your block.   

If self-driving cars were empty and simply roaming the community to be available for any ride requests, they would be capturing daily activity.   

Whenever someone takes a ride in a self-driving car, it will capture the surroundings that are along the path to the stated destination. By riding in a self-driving car from your home to the local store, the self-driving car will record video and other data about whatever was taking place during that time period. 

Please sit down for this next shocker.   

I don’t want to get you on the edge of your seat, but imagine that this massive amount of data was collated and assembled to try to piece together the daily goings-on in a city or town. In theory, you could pull together the data from all the self-driving cars and pretty much recreate a semblance of where people were, when they were there, what they did while there (assuming they were outside or otherwise visible), etc.

I’ve referred to this as the roving eye of the coming era of self-driving cars. 

You could say that this roving eye will be a marvelous addition to our society. There are numerous positive uses. For example, suppose you want to scope out the latest real estate in an area where you are considering buying a home. It is conceivable that the data from self-driving cars could be used to see exactly what the homes look like, nearly up-to-the-minute.

This can also be used for crime-fighting. A burglar tries to break into someone’s house. The crook scoots away before being caught. Turns out that there were self-driving cars that happened to be along that street during the time period of the criminal activity. The sensory data is examined, and the identity of the thief is figured out.   

There are some notable downsides too. 

Do you want just anyone to know where you were last Monday or Tuesday? Presumably, an inspection of data from self-driving cars might show that you were in front of your house, mowing the lawn, on Monday morning. You then left your house and walked down the street to visit a friend at another house. You stayed there for about two hours. And so on.

Some people are worried about privacy intrusion from video cameras that are mounted on telephone poles or that are used by people as they carry their smartphones. Those are peanuts in comparison to the magnitude of video and other sensory data capture that self-driving cars will undertake. The more we adopt and utilize self-driving cars, the greater the amount of observing of our daily lives that will occur. It’s as simple as that.

Are we doomed to come under the crush of a Big Brother dystopian world by accepting self-driving cars as our preferred mode of transportation?   

Sadly, not many are considering this issue, and it won’t visibly arise until there are enough self-driving cars that the kind of overlapping and semi-continuous recording rises to a level high enough to be noticed. Until then, we will be sowing the seeds of a future that will catch us by “surprise” about what we have done to ourselves over time.

Shucks, you might be thinking, if this is a looming problem, perhaps something ought to be done, sooner rather than later. There must be some means to keep from digging a hole that appears to be a quite disturbing abyss.   

Aha, allow me to bring up an old friend of sorts, namely the blur. 

The earlier discussion about the blurring of images was in fact the “answer” before I had presented you with the question at hand.   

Similar to how a blurring effect was able to mask whether George or Samantha was at the wild party, the same kind of notion and capacity could be used for dealing with the data that the roving eye detects and collects. 

Here is an intriguing question to ponder: Will the advent of AI-based true self-driving cars and their roving eye be potentially made more societally palatable via the use of blurring? 

Let’s unpack the matter and see.   

For my framework about AI autonomous cars, see the link here: https://aitrends.com/ai-insider/framework-ai-self-driving-driverless-cars-big-picture/   

Why this is a moonshot effort, see my explanation here: https://aitrends.com/ai-insider/self-driving-car-mother-ai-projects-moonshot/ 

For more about the levels as a type of Richter scale, see my discussion here: https://aitrends.com/ai-insider/richter-scale-levels-self-driving-cars/   

For the argument about bifurcating the levels, see my explanation here: https://aitrends.com/ai-insider/reframing-ai-levels-for-self-driving-cars-bifurcation-of-autonomy/ 

Understanding The Levels Of Self-Driving Cars   

As a clarification, true self-driving cars are ones where the AI drives the car entirely on its own and there isn’t any human assistance during the driving task.   

These driverless vehicles are considered Level 4 and Level 5, while a car that requires a human driver to co-share the driving effort is usually considered at Level 2 or Level 3. The cars that co-share the driving task are described as being semi-autonomous, and typically contain a variety of automated add-ons that are referred to as ADAS (Advanced Driver-Assistance Systems).

There is not yet a true self-driving car at Level 5; we don’t yet know whether this will be possible to achieve, nor how long it will take to get there.

Meanwhile, the Level 4 efforts are gradually trying to get some traction by undergoing very narrow and selective public roadway trials, though there is controversy over whether this testing should be allowed per se (we are all life-or-death guinea pigs in an experiment taking place on our highways and byways, some contend). 

Since semi-autonomous cars require a human driver, the adoption of those types of cars won’t be markedly different than driving conventional vehicles, so there’s not much new per se to cover about them on this topic (though, as you’ll see in a moment, the points next made are generally applicable).

For semi-autonomous cars, it is important that the public be forewarned about a disturbing aspect that’s been arising lately: despite those human drivers who keep posting videos of themselves falling asleep at the wheel of a Level 2 or Level 3 car, we all need to avoid being misled into believing that the driver can take their attention away from the driving task while driving a semi-autonomous car.

You are the responsible party for the driving actions of the vehicle, regardless of how much automation might be tossed into a Level 2 or Level 3 vehicle.

For why remote piloting or operating of self-driving cars is generally eschewed, see my explanation here: https://aitrends.com/ai-insider/remote-piloting-is-a-self-driving-car-crutch/   

To be wary of fake news about self-driving cars, see my tips here: https://aitrends.com/ai-insider/ai-fake-news-about-self-driving-cars/ 

The ethical implications of AI driving systems are significant, see my indication here: https://aitrends.com/selfdrivingcars/ethically-ambiguous-self-driving-cars/   

Be aware of the pitfalls of normalization of deviance when it comes to self-driving cars, here’s my call to arms: https://aitrends.com/ai-insider/normalization-of-deviance-endangers-ai-self-driving-cars/   

Self-Driving Cars And Roving Eye Blurring   

For Level 4 and Level 5 true self-driving vehicles, there won’t be a human driver involved in the driving task. All occupants will be passengers; the AI is doing the driving.   

One aspect to immediately discuss entails the fact that today’s AI is not sentient. 

In other words, the AI is altogether a collective of computer-based programming and algorithms, and most assuredly not able to reason in the same manner that humans can. I mention this because many headlines boldly proclaim or imply that AI has turned the corner and become equal to human intelligence. As if that weren’t bad enough, the outsized headlines seek to further amp up the matter by contending that AI is reaching superhuman capabilities (a moniker that is especially misleading and inappropriate).

Why this emphasis about the AI not being sentient? Because I want to underscore that when discussing the role of the AI driving system, I am not ascribing human qualities to the AI.   

Please be aware that there is an ongoing and dangerous tendency these days to anthropomorphize AI. In essence, people are assigning human-like sentience to today’s AI, despite the undeniable and inarguable fact that no such AI exists as yet.   

With that clarification, you can envision that the AI driving system doesn’t natively “know” that the sensors are capturing a lot of information that might be considered intrusive.

Those are facets that would need to be programmatically devised by the automaker or self-driving tech firm that makes the AI driving system. If they aren’t considering those facets, there won’t be anything somehow innately in the AI driving system that will “realize” that society doesn’t want that kind of pell-mell collecting of our daily activities.   

With that important context, let’s dig into how this might work.   

Recall that when discussing the act of taking a picture, one point made was that the background could be out of focus and thus blurry, and likewise the foreground could be out of focus and blurry.   

There’s not much debate that the foreground of any sensory detection by a self-driving car is going to be crucial for the driving of the vehicle. As such, the foreground is ostensibly going to have to be kept in focus.   

The more open-ended question is whether the background needs to be kept in focus too. In other words, suppose that the sensors were calibrated to ensure that anything in the background was blurry. This might help to overcome the otherwise wanton capturing of daily activities that are not particularly crucial to the driving of the vehicle.
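As a toy sketch of that calibration idea, assuming a per-pixel depth map is available (say, from LIDAR or stereo vision; the cutoff distance and kernel size are invented for illustration):

```python
import cv2
import numpy as np

def blur_background(frame, depth_m, cutoff_m=30.0):
    # depth_m: hypothetical per-pixel depth map in meters, same H x W as frame.
    blurred = cv2.GaussianBlur(frame, (31, 31), 0)
    far = (depth_m > cutoff_m)[..., np.newaxis]  # broadcast mask over color channels
    # Keep the near field (the driving-relevant foreground) sharp,
    # and substitute the blurred version everywhere beyond the cutoff.
    return np.where(far, blurred, frame)
```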

Of course, you can readily argue that this dividing line between the foreground and the background is altogether untenable.   

Suppose that a dog is running around in someone’s front yard. We probably would want the self-driving car to detect that a dog is up ahead, and though currently inside a yard, the dog might decide to dart into the street once the self-driving car comes along.   

If there is a purposeful blurring when the sensors are trying to detect the driving scene, it could be that vital clues about the surroundings would no longer be readily usable. This in turn could mean that the AI driving systems will not be able to drive as safely as we would hope for.   

Some would assert that it makes absolutely no sense to intentionally undercut the capabilities of the sensors. Indeed, those proponents would undoubtedly contend that we need even stronger sensors with increasingly piercing capabilities, able to perform detection far beyond what humans can do.

Under that rather strident thinking, let’s momentarily set aside the notion of trying to prevent the sensors from capturing whatever they can potentially detect. Assume that the sensors are going to be allowed to detect as much as they can.

The sensory data flows into the onboard processors of the self-driving car. At that juncture, the data is mathematically examined for purposes of driving the car. The AI driving system tries to computationally interpret the data to figure out the driving scene. Upon doing so, the AI driving system figures out the driving action to undertake and emits commands to the vehicle accordingly. 

You could suggest that the data from the sensors could now be discarded since it has been used for its primary purpose. In that way of thinking, there is no need to worry about what is contained in the data. Just dump it, after it has been used for the driving act. Ergo, the data presumably cannot be used for any nefarious purposes since it isn’t sitting around anymore. The moment the sensory data has been analyzed for driving purposes, make sure it gets deleted. That is the end of the road for the sensory data.

Well, that presents a couple of challenges.   

First, it means that you can’t potentially use the data for the other upside possibilities that were mentioned earlier. The data won’t be around, and therefore it can’t be used to figure out the latest aspects of real estate or to catch those despicable criminals. Some would argue that you are possibly tossing out the baby with the bathwater (an old-time expression).

Secondly, you cannot necessarily guarantee that the data will be deleted. Once you’ve let the data into the onboard systems, this is like letting the horse out of the barn, or the cat out of the bag. Sure, you might believe that the data is going to be deleted, and the system might be programmed accordingly. Nonetheless, the data can potentially be kept, despite the otherwise desired notion of deleting it.   

In that viewpoint, either you are not going to capture the data at all, or you are going to capture it and need to do something with it.

This is where intentional blurring comes into play.

One approach consists of taking the data after it has been examined for driving purposes and then blurring it so that those elements generally considered irrelevant to the driving act can no longer be readily discerned. You don’t necessarily delete those aspects; you blur them.

A difficult question arises about the right timing for doing the blurring.

If you do so while the data is fresh and just brought into the onboard systems, this means that you need the computational resources on-board to do this type of blurring action while the car is underway. Some would argue that whatever computational processing you’ve got ought to go entirely towards the driving act. Do not usurp those precious processing cycles from the life-or-death matters of driving the vehicle.   

Okay, from that perspective, we might have some background process that does the blurring when the self-driving car is parked and not underway. Or basically whenever there is spare processing time available.   
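As a rough sketch of that deferral, imagine frames being queued during the drive and scrubbed by a low-priority worker whenever cycles free up. Every name below is illustrative, not from any real AV stack, and a simple Gaussian blur stands in for the selective blurring discussed above:

```python
import queue
import threading

import cv2

frames_to_scrub = queue.Queue()
scrubbed_archive = []  # stand-in for whatever storage the fleet uses

def scrub_worker():
    while True:
        frame = frames_to_scrub.get()  # blocks until a frame is queued
        # Stand-in for selective blurring of driving-irrelevant regions:
        scrubbed_archive.append(cv2.GaussianBlur(frame, (31, 31), 0))
        frames_to_scrub.task_done()

# Daemon thread so the scrubber never blocks the driving-critical work.
threading.Thread(target=scrub_worker, daemon=True).start()
```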

Another notion is that you could do the blurring once the data has been uploaded into the cloud. You see, it turns out that self-driving cars are going to be using OTA (Over-The-Air) electronic communications to connect with the cloud of the fleet operator or automaker of the self-driving car. This would be used to readily push down crucial updates to the AI driving system. It can also be used to upload data from the self-driving car and into the cloud.   

Thus, some would say that you shouldn’t use any processing on-board the self-driving car for the blurring and instead let it happen in the cloud. The self-driving car would upload whatever data it has collected. This data would be in its rawest form. The cloud processing by the fleet operator or automaker would be programmed to then blur the data. 

Sorry to report that this chain of where the data goes and what its status is at each step opens up a bit of a Pandora’s box.

For example, suppose that the raw data in its entirety is being kept on board the self-driving car. Once it gets loaded up into the cloud, perhaps at that juncture it is deleted from the onboard processors (a copy now exists in the cloud). Unfortunately, this does mean that for some length of time, the data is sitting there in the vehicle, in all its glory. There is nothing blurred as yet. This leaves open the chance that the data could be somehow siphoned or copied and now be made available with everything it has to show. 

That’s why some vehemently argue that the data ought to be blurred at the soonest possible opportunity. 

Of course, there are other matters intertwined. The data is likely in an unencrypted format upon first flowing into the onboard systems. Some would urge that the data be encrypted right away. In that manner, you don’t necessarily need to worry about the blurring, since anyone that could surreptitiously get the data won’t have anything useful due to the encryption. 
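For instance, here’s a minimal encrypt-on-ingest sketch using the Python cryptography package’s Fernet recipe; the key handling is deliberately simplified, and a real fleet would presumably use hardware-backed keys and authenticated streaming:

```python
from cryptography.fernet import Fernet

key = Fernet.generate_key()  # in practice, provisioned per vehicle
cipher = Fernet(key)

def ingest(raw_sensor_bytes: bytes) -> bytes:
    # Only ciphertext is ever written to onboard storage.
    return cipher.encrypt(raw_sensor_bytes)

token = ingest(b"raw lidar sweep ...")
assert cipher.decrypt(token) == b"raw lidar sweep ..."
```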

This brings up that there are two camps typically at loggerheads here. One camp says that the full and unblurred data should never be allowed to leave the car. In that sense, it should not be allowed to be uploaded to the cloud. Only once it has been blurred, and possibly encrypted too, can it be uploaded. The other camp says that it is fine to upload the whole shebang, and as long as it is encrypted, you just blur it after getting into the cloud. 

For more details about ODDs, see my indication at this link here: https://www.aitrends.com/ai-insider/amalgamating-of-operational-design-domains-odds-for-ai-self-driving-cars/ 

On the topic of off-road self-driving cars, here’s my details elicitation: https://www.aitrends.com/ai-insider/off-roading-as-a-challenging-use-case-for-ai-autonomous-cars/ 

I’ve urged that there must be a Chief Safety Officer at self-driving car makers, here’s the scoop: https://www.aitrends.com/ai-insider/chief-safety-officers-needed-in-ai-the-case-of-ai-self-driving-cars/ 

Expect that lawsuits are going to gradually become a significant part of the self-driving car industry, see my explanatory details here: https://aitrends.com/selfdrivingcars/self-driving-car-lawsuits-bonanza-ahead/   

Conclusion   

There are a lot more monkey wrenches that can be thrown into this thorny matter. 

Let’s suppose that the data does get blurred. We are presuming this implies there is no longer a qualm about someone being able to detect that your neighbors were playing catch with their kids in their front yard.

Sometimes, that which can be blurred can, later on, be unblurred.  

This means that the blurring might be undone. If the images or video are allowed to be copied, you could use all sorts of unblurring techniques to try to turn the blurred aspects into something discernible. The result might not be the original pristine state, but it could have sufficient definition that it once again intrudes on privacy.
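To see why, consider classic deconvolution: if an attacker knows or can guess the blur kernel, off-the-shelf tools can claw back a surprising amount of detail. Here’s a toy illustration using scikit-image’s Wiener filter, with a made-up kernel and image:

```python
import numpy as np
from scipy.signal import convolve2d
from skimage.restoration import wiener

psf = np.ones((5, 5)) / 25.0                    # the (guessed) blur kernel
image = np.random.rand(64, 64)                  # stand-in for an original region
blurred = convolve2d(image, psf, mode="same")   # the "privacy" blur
restored = wiener(blurred, psf, balance=0.1)    # partial recovery, not pristine
```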

The cat and mouse gambit of the blurring algorithms is an ongoing battle. Someone comes up with a newer and better blurring routine. Someone else then comes out with a new and improved unblurring approach. Round and round it goes. 

At this time, few are worrying about the roving eye of self-driving cars. 

Over time, perhaps my exhortations will only become a blur, though I am really hoping they become unblurred in time for appropriate thought and action to be taken about this mesmerizing and rather clear-cut dilemma.   

There really is no blur about it. 

Copyright 2021 Dr. Lance Eliot  

http://ai-selfdriving-cars.libsyn.com/website 


Apple AirTags use UWB wireless tech. Here's how ultra wideband makes your life easier – CNET

Apple UWB car patent

Apple has patented the use of UWB, or ultra wideband, to recognize when you’re approaching your car, unlock its doors and govern when you can turn it on.

Apple via US PTO; Stephen Shankland/CNET

You’ve heard of Wi-Fi, Bluetooth and 5G. Now it’s time to learn another wireless communications term: ultra wideband, or UWB. The technology lets you pinpoint the exact location of phones, key fobs, wallets and tracking tags, helping you find lost dogs or automatically unlock your car. And now it’s the foundation of Apple’s $29 AirTag trackers, announced in April and currently on sale.

Smartphone leaders Apple and Samsung have built UWB into their high-end models, including the iPhone 11, iPhone 12, Galaxy Note 20 Ultra, Galaxy S21 Plus and S21 Ultra. The Apple Watch Series 6 also has UWB built in. The same U1 chip Apple uses there also powers the AirTags, which means you can use a newer iPhone for “precision finding” that tells you the direction and distance to your AirTag-equipped keychain or purse as long as you’re within range.

UWB calculates locations to within less than a half inch by measuring how long it takes super-short radio pulses to travel between devices. It’ll come later to Samsung’s new SmartTags, which use Bluetooth to start, and carmakers including Audi, BMW and Ford are also hot for UWB.

Right now UWB’s uses are limited, but as it matures and spreads to more devices, UWB could lead to a world where just carrying your phone or wearing your watch helps log you into your laptop as you approach it or lock your house when you leave.

“Being able to determine precisely where you are in an environment is increasingly important,” said ABI Research analyst Andrew Zignani, who expects shipments of UWB-enabled devices to surge from 150 million in 2020 to 1 billion in 2025. “Once a technology becomes embedded in a smartphone, that opens up very significant opportunities for wireless technology.”

Here’s a look at UWB and its uses.

What’s UWB good for?

Satellite-based GPS is useful for finding yourself on a map but struggles with anything much more precise, and it falters indoors. UWB doesn’t have those handicaps.

UWB could switch your TV from your child’s Netflix profile to yours. Your smart speaker could give calendar alerts only for the people in the room. Your laptop could wake up when you enter the home office.

Imagine this scenario: You leave the office and as you near your car, receivers in its doors recognize your phone and unlock the vehicle for you. When you get out of the car at home, the receivers recognize you’re no longer in the vehicle and lock the doors.

With UWB, your home could recognize that you’re returning at night and illuminate your walkway. The technology could then automatically unlock your front door and turn on your home sound system, which follows you from room to room. “I’m walking in a sound and light cocoon in my house,” said Lars Reger, chief technology officer of NXP Semiconductors, a UWB proponent whose chips are widely used in cars.

Samsung promises UWB technology for precisely tracking your location will automatically unlock car doors with digital keys in your smartphone.

Screenshot by Stephen Shankland/CNET

Bluetooth-based location sensing takes at least two seconds to get an accurate fix on your location, but UWB is a thousand times faster, Reger said.

UWB will add more than convenience, supporters say. Conventional key fobs have security problems in regard to remotely unlocking cars: criminals can use relay attacks that mimic car and key communications to steal a vehicle. UWB has cryptographic protections against that sort of problem.

This same ability to track your movements has downsides, particularly if you don’t like the idea of the government following your movements or coffee shops flooding your phone with coupons as you walk by. But with today’s privacy push, it’s likely phone makers won’t let devices track your phone without your permission.

How is Apple supporting UWB?

iPhones since the iPhone 11 family have included Apple’s UWB chip, the U1. It joins a handful of other processors Apple has developed, including the A series that powers iPhones and iPads, the M1 at the heart of new Macs, iPad Pros and iMacs, and the T series that handles Touch ID and other security duties on Macs.

Apple unveiled its AirTags on Tuesday.

Credit: Apple/Screenshot by CNET

AirTags really bring the technology alive, though. UWB communicates with an iPhone 11 or 12 so a big arrow leads you to the tag. When UWB isn’t in range, a Bluetooth connection means AirTags tap into Apple’s Find My system, which lets other people’s devices discover your AirTag’s location and share it privately with you.

“The new Apple-designed U1 chip uses ultra wideband technology for spatial awareness — allowing iPhone 11 Pro to precisely locate other U1-equipped Apple devices. It’s like adding another sense to [the] iPhone,” Apple said of the U1 chip when it arrived. “With U1 and iOS 13, you can point your iPhone toward someone else’s, and AirDrop will prioritize that device so you can share files faster. And that’s just the beginning.”

Apple only promises UWB links between its own devices for now. But UWB standardization should open up a world of other connections, and software tweaks should let Apple adapt as UWB standards mature.

Apple’s years of UWB work are evident in several patents. That includes patents for shaping UWB pulses for more accuracy in distance measurements, using a phone, watch or key fob location to enter and start a car, calculating your path toward a car so your car can send your phone a request for biometric authentication, and letting Bluetooth and UWB cooperate to grant you access to your car.

Apple hopes UWB will help you find your dog, control your thermostat and unlock your front door.

Apple via US PTO

How is Samsung supporting UWB?

At its Galaxy S21 launch event in January, Samsung touted UWB as a wireless technology that’ll bring new convenience to your life. That includes unlocking your house or car as you walk up to it.

“With Digital Key, you’ll be able to open the door of your house with your mobile device,” said Kevin Chung of Samsung’s direct-to-consumer center during the launch event. “You’ll be able to unlock your car door with your phone. The door will unlock when you reach it — no sooner, no later.”

You’ll be able to send digital keys to friends or family members, and Samsung’s AR finder app will point the direction to your car in a crowded parking lot. Samsung announced digital key partnerships with BMW, Audi, Ford and Hyundai’s Genesis Motor.

Samsung SmartTags also use UWB.

Who else is interested in UWB?

Other companies involved with UWB include consumer electronics giants Samsung and Sony; chipmakers Decawave, Qualcomm, NXP and STMicroelectronics; carmakers Volkswagen, Hyundai, and Jaguar Land Rover; and car electronics powerhouse Bosch. Another notable player is Tile, which has sold tracking tags for years to help you find things like keychains and wallets.

Confusingly, those companies have banded together into two industry groups: the UWB Alliance, formed in December 2018, and the FiRa Consortium (short for “fine ranging”), formed in August 2019. Samsung joined FiRa; Apple isn’t listed as a member of either.

On top of that, there’s the Car Connectivity Consortium, which is working on digital key technology. The three groups have figured out who’s doing what to avoid stepping on each other’s toes, said Tim Harrington, executive director of the UWB Alliance.

FiRa is working on standards to ensure UWB devices work together properly, while the UWB Alliance is trying to minimize UWB problems from the expansion of Wi-Fi into the 6GHz radio band that UWB also uses. For example, there are brief pauses in Wi-Fi signals sent in the 6GHz band, and UWB transmissions could sneak into those gaps, Harrington said.

How does UWB work?

The idea behind UWB has been around for decades — indeed, the University of Southern California established an ultra wideband laboratory called UltRa in 1996. Some of the concepts date back to radio pioneer Guglielmo Marconi, Harrington says.

UWB devices send lots of very short, low-power pulses of energy across an unusually wide spectrum of radio airwaves. UWB’s frequency range spans at least 500MHz, compared with Wi-Fi channels about a tenth as wide. UWB’s low-power signals cause little interference with other radio transmissions.

UWB sends up to 1 billion pulses per second — that’s 1 per nanosecond. By sending pulses in patterns, UWB encodes information. It takes between 32 and 128 pulses to encode a single bit of data, Harrington said, but given how fast the bits arrive, that enables data rates of 7 to 27 megabits per second.
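As a quick sanity check on those figures (an illustration, not a spec), the arithmetic works out as follows:

```python
pulses_per_second = 1e9
for pulses_per_bit in (32, 128):
    mbps = pulses_per_second / pulses_per_bit / 1e6
    print(f"{pulses_per_bit} pulses/bit -> ~{mbps:.0f} Mbit/s")
# 32 pulses/bit -> ~31 Mbit/s; 128 pulses/bit -> ~8 Mbit/s,
# roughly bracketing the quoted 7-to-27 Mbps range.
```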

Apple marketing chief Phil Schiller touted the company’s U1 chip for UWB in the iPhone 11.

Screenshot and illustration by Stephen Shankland/CNET

The IEEE (Institute of Electrical and Electronic Engineers) developed a UWB standard called 802.15.4 more than 15 years ago, but it didn’t catch on for its original intended use, sending data fast.

So location sensing made UWB a hot topic again?

Companies like Spark Microsystems use UWB for data transfer, but most tech giants like it for measuring location precisely. Even though 802.15.4 flopped when first created years ago, UWB’s renaissance is occurring because its super-short radio pulses let computers calculate distances very precisely.

Now UWB development is active again, for example with the 802.15.4z standard that bolsters security for key fobs and payments and improves location accuracy to less than a centimeter. Fixing today’s relay attack problems, where someone with radio technology essentially copies and pastes radio communications of key fobs or smartphone unlocking systems, was a top priority for 802.15.4z. “With the precise timing you get off UWB and the ability to know exactly where you are, you can cut the man in the middle [relay] attack completely,” Harrington said.

Another area of active development is improving how you can use your phone to make payments at a payment terminal.

Radio waves travel about 30 centimeters (1 foot) in a billionth of a second, but with short pulses, devices can calculate distances very exactly by measuring the “time of flight” of a radio signal to another device that responds with its own signal. With multiple antennas positioned in different spots, UWB radios can calculate the direction to another device, not just the distance.
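In code, the core two-way ranging arithmetic is tiny; the timing values below are invented for illustration:

```python
C = 299_792_458.0  # speed of light in m/s

def distance_m(round_trip_s: float, reply_delay_s: float) -> float:
    # One-way distance from a two-way exchange, net of the responder's
    # fixed reply delay (real UWB ranging measures both precisely).
    return C * (round_trip_s - reply_delay_s) / 2.0

print(distance_m(round_trip_s=153e-9, reply_delay_s=120e-9))  # ~4.9 meters
```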

UWB dovetails nicely with the internet of things, the networking of doorbells, speakers, lightbulbs and other devices.

It’s already used for location sensing. NFL players have UWB transmitters in each shoulder pad, part of broadcast technology used for instant replay animations. A football’s location is updated 2,000 times per second, according to Harrington.

Boeing uses UWB tags to track more than 10,000 tools, carts and other items on its vast factory floors.

UWB uses very little power. A sensor that sends a pulse once every second is expected to work for seven years off a single coin battery. 

Verizon has something called 5G Ultra Wideband. Is that the same thing?

No. Verizon uses the same words, but it’s merely a branding label.

“5G Ultra Wideband is our brand name for our 5G service,” said spokesman Kevin King. “It’s not a technology.”


More US agencies potentially hacked, this time with Pulse Secure exploits

More US agencies potentially hacked, this time with Pulse Secure exploits

Getty Images

At least five US federal agencies may have experienced cyberattacks that targeted recently discovered security flaws that give hackers free rein over vulnerable networks, the US Cybersecurity and Infrastructure Security Agency said on Friday.

The vulnerabilities in Pulse Connect Secure, a VPN that employees use to remotely connect to large networks, include one that hackers had been actively exploiting before it was known to Ivanti, the maker of the product. The flaw, which Ivanti disclosed last week, carries a severity rating of 10 out of a possible 10. The authentication bypass vulnerability allows untrusted users to remotely execute malicious code on Pulse Secure hardware, and from there, to gain control of other parts of the network where it’s installed.

Federal agencies, critical infrastructure, and more

Security firm FireEye said in a report published on the same day as the Ivanti disclosure that hackers linked to China spent months exploiting the critical vulnerability to spy on US defense contractors and financial institutions around the world. Ivanti confirmed in a separate post that the zero-day vulnerability, tracked as CVE-2021-22893, was under active exploit.

In March, following the disclosure of several other vulnerabilities that have since been patched, Ivanti released the Pulse Connect Secure Integrity Tool, which streamlines the process of checking whether vulnerable Pulse Secure devices have been compromised. Following last week’s disclosure that CVE-2021-22893 was under active exploit, CISA mandated that all federal agencies run the tool.

“CISA is aware of at least five federal civilian agencies who have run the Pulse Connect Secure Integrity Tool and identified indications of potential unauthorized access,” Matt Hartman, deputy executive assistant director at CISA, wrote in an emailed statement. “We are working with each agency to validate whether an intrusion has occurred and will offer incident response support accordingly.”

CISA said it’s aware of compromises of federal agencies, critical infrastructure entities, and private sector organizations dating back to June 2020.

They just keep coming

The targeting of the five agencies is the latest in a string of large-scale cyberattacks to hit sensitive government and business organizations in recent months. In December, researchers uncovered an operation that infected the software build and distribution system of network management tools maker SolarWinds. The hackers used their control to push backdoored updates to about 18,000 customers. Nine government agencies and fewer than 100 private organizations—including Microsoft, antivirus maker Malwarebytes, and Mimecast—received follow-on attacks.

In March, hackers exploiting a newly discovered vulnerability in Microsoft Exchange compromised an estimated 30,000 Exchange servers in the US and as many as 100,000 worldwide. Microsoft said that Hafnium, its name for a group operating in China, was behind the attacks. In the days that followed, hackers not affiliated with Hafnium began infecting the already-compromised servers to install a new strain of ransomware.

Two other serious breaches have also occurred: one against the maker of the Codecov software developer tool and the other against the seller of Passwordstate, a password manager used by large organizations to store credentials for firewalls, VPNs, and other network-connected devices. Both breaches are serious because the hackers can use them to compromise the large number of customers of the companies’ products.

Ivanti said it’s helping to investigate and respond to exploits, which the company said have been “discovered on a very limited number of customer systems.”

“The Pulse team took swift action to provide mitigations directly to the limited number of impacted customers that remediates the risk to their system, and we plan to issue a software update within the next few days,” a spokesperson added.


Oppo Find X3 Neo smartphone in review: Focus on the camera

[Benchmark charts: Oppo Find X3 Neo (Snapdragon 865, Adreno 650) compared with the Xiaomi Mi 11 and OnePlus 9 (both Snapdragon 888, Adreno 660), the Samsung Galaxy S20 FE 5G, and Snapdragon 865 / smartphone-class averages; the individual benchmark names did not survive extraction.]