Saturday 31 October 2020

Is Wall Street losing its tech enthusiasm?

This is The TechCrunch Exchange, a newsletter that goes out on Saturdays, based on the column of the same name. You can sign up for the email here.

Over the past few months the IPO market made it plain that some public investors were willing to pay more for growth-focused technology shares than private investors. We saw this in both strong tech IPO pricing — the value set on companies as they debut — and in resulting first-day valuations, which were often higher.

One way to consider how far public valuations rose for tech startups, especially those with a software core in 2020, is to ask yourself how often you heard about a down IPO this year. Maybe a single time? At most? (You can catch up on 2020 IPO performance here, if you need to.)

IPO enthusiasm exposed a gap between what many venture capitalists and private investors were paying for tech shares, and what the public market was doing with its own valuation calculations. Insurtech startup Hippo’s $150 million private round from July is a good example. The company was valued at $1.5 billion in the round, a healthy uptick from its preceding private valuation. But if we valued it like the then-newly-public Lemonade, a related company, at the time, Hippo was priced inexpensively.

This week, however, the concept of private investors being more conservative than public investors in certain cases (some eight-figure private rounds happened this year at valuations that were even more bullish than public investor treatment of IPOs, to be clear) took a ding as most big tech companies lost ground, SaaS stocks sold off, and other tech firms struggled to keep up with investor enthusiasm.

Not only tech companies took a beating, but as I write to you on this Friday afternoon, the American stock markets were on a path for their worst week since March, CNBC reported, “led by major tech shares.”

A change in the wind? Perhaps. 

Notable is that it was just in September that VCs seemed resigned to having startup valuations pulled higher by public markets’ endless optimism for related companies. Canaan’s Maha Ibrahim told me during Disrupt 2020 that it was a time when VCs had to “play the game” and pay up for startups, so long as companies were being “rewarded in the public markets for high growth the way that Snowflake” was at the time. A16z’s David Ulevitch concurred.

Perhaps that dynamic is changing as stocks dip. If so, startup valuations could decline en masse, along with the more exotic areas of startup-related finance. The SPAC boom, for example, may wane. Chatting with Hippo’s CEO Assaf Wand this week, he posited that SPACs were a market-response to the public-private valuation gap, an accelerant-cum-bridge to help startups get public while demand was hot for their equity.

Without the same red-hot demand for growth and risk, SPACs could cool. So, too, could private valuations that the hottest startups have taken for granted. Whether what we’re feeling in the wind this week is a hiccup or tipping point is not clear. But the public market’s fever for tech equities may have broken at a somewhat awkward time for Airbnb, Coinbase, DoorDash and other not-quite-yet-IPOs.

Market Notes

It started to snow this week where I live, putting a somewhat sad cap on an otherwise turbulent week. Still! There’s lots from our world to get into. Here’s our week’s market notes:

  • Remember when we dug into how quickly startups grew in Q3? Another company that I’ve covered before, Drift, wrote in. The Boston-based marketing software company reported to The Exchange that it grew more than 50% in Q3 compared to the year-ago quarter, with its CEO adding that June and Q3 were the strongest month and three-month periods in its history.
  • The fintech boom continued with DriveWealth raising nearly $57 million this week, with the startup being yet another API-driven play. That a company sitting in-between two key startup trends of the year is doing well is not surprising. DriveWealth helps other fintech companies provide users access to the American equities markets. Alpaca, which also recently raised, is working along similar lines.

This week featured two IPOs that we cared about. MediaAlpha’s debut, giving the advertising-and-insurtech company a $19 per-share IPO price, quickly exploded out of the gate. Today the company is worth nearly $38 per share. Why? On its IPO day MediaAlpha CEO Steve Yi said that he had chosen the current moment because public markets had garnered an appreciation for insurtech. His share price growth seems to concur.

Until we look at Root, to some degree. Root, a neo-insurance provider focused on the automotive space, priced at $27 when it debuted this week, $2 above the top-end of its range. The company is now worth less than $24 per share. So, whatever wave MediaAlpha caught appears to have missed Root. 

I honestly don’t know what to make of the difference in the two debuts, but please email in if you do know (you can just reply to this email, and I’ll get your note).

Regardless, I chatted with Root CEO Alex Timm after his company went public. The executive said that Root had laid down plans to go public a year ago, and that it can’t control market noise around the time of its debut. Timm stressed the amount of capital that Root added to its coffers — north of $1 billion — is a win. I asked how the company intended to not fuck up its newly swollen accounts, to which Timm said that his company was going to stay “laser focused” on its core automotive insurance opportunity.

Oh, and Root is based in Ohio. I asked what its debut might mean for Midwest startups. Timm was positive, saying that the IPO could highlight that there are a lot of smart folks and GDP in the middle of the country, even if venture capital tallies for the region remain underdeveloped.

  • I know that by now you are tired of earnings, but Five9 did something that other companies struggled to accomplish, namely, beat expectations and bolstered its forward guidance. Its shares soared. The Exchange got on the phone with the call center software company to chat about its latest acquisition and earnings. How did it crush expectations as it did? By selling a product that its market needed when COVID-19 hit, the accelerating digital transformation more broadly, and rising e-commerce spend, which is driving more customer support work onto phone lines, it said. A lot of stuff at once, in other words. 
  • Five9 took on a bunch of convertible debt earlier this year, despite making gobs of adjusted profit. I asked its CEO Rowan Trollope how he was going to go about investing cash to take advantage of market tailwinds, while not overspending. He said that the company takes very regular looks at revenue performance, helping it tailor new spend nimbly. It’s apparently working.
  • What else? Peek this week at big, important rounds from SimilarWeb, PrimaryBid and EightFold, a company that I have known for some time. Oh, and I covered The Wanderlust Group’s Series B and Teampay’s Series A extension, which were good fun.

Various and Sundry

  • What’s going on in the world of venture debt as VC gets back to form? We dug in.
  • For the Europhiles amongst us, here’s what’s up with the continent’s VC receipts.
  • Here are 10 favorites from recent Techstars demo days.
  • And here’s some mathmagic about Databricks, after it was rumored to have an H1 2021 IPO target.
  • We’re way out of space this week, but I have some fun stuff in the tank for later, including a Capital G investor’s take on RPA, a call with the CEO of Zapier about no-code/low-code growth and notes from a chat about developer ecosystems with Dell Capital. More on all of that when the news calms down.

Stay safe, and vote.

Alex

Source

The post Is Wall Street losing its tech enthusiasm? appeared first on abangtech.



source https://abangtech.com/is-wall-street-losing-its-tech-enthusiasm/

Huawei P Smart 2021 in for review

The latest arrival to our office is the Huawei P Smart 2021 smartphone, also sold at some markets as Huawei Y7a. It is an entry-level smartphone that lacks Google services, but brings other promising features like a 48MP camera, a massive battery with fast charging support, and a 6.67” screen with Full HD+ resolution.

Huawei P Smart 2021 in for review

The retail box of the phone consists of a Huawei SuperCharger that lets you top the phone at up to 22.5W. The technology was initially introduced with the company flagships and it is good to see it trickle to the lower echelons. We have to go through our usual battery tests to see how the power cell performs, but we are hopeful.

On the inside, we have a Kirin 710A chipset which is rather unimpressive, but at least the platform is coupled with 4GB RAM. Our unit brings 128GB internal storage and it can be expanded further through the regular microSD slot – rather than the less popular NM slots Huawei is putting in its latest flagship smartphones.

Huawei P Smart 2021 in for review

The quad camera on the back is quite impressive for an entry-level device – a big 48MP f/1.8 main shooter and an 8MP ultrawide cam. The two 2MP modules are mostly there for marketing purposes (officially macro cam and a depth sensor), but that’s still a very solid setup. The cameras are lined vertically and look rather nice. We can’t talk about performance before our tests have concluded, but this is a camera setup seen in phones that cost twice as the P Smart 2021.

Huawei P Smart 2021 in for review

Price-wise, the phone was initially launched at around $200 in Malaysia (under the Y7a moniker) before popping up in Germany at €229. The full review is already underway, so once it is done, we’ll be able to tell whether it is worth it.

Source

The post Huawei P Smart 2021 in for review appeared first on abangtech.



source https://abangtech.com/huawei-p-smart-2021-in-for-review/

How to make sure your ‘AI for good’ project actually does good

Artificial intelligence has been front and center in recent months. The global pandemic has pushed governments and private companies worldwide to propose AI solutions for everything from analyzing cough sounds to deploying disinfecting robots in hospitals. These efforts are part of a wider trend that has been picking up momentum: the deployment of projects by companies, governments, universities, and research institutes aiming to use AI for societal good. The goal of most of these programs is to deploy cutting-edge AI technologies to solve critical issues such as poverty, hunger, crime, and climate change, under the “AI for good” umbrella.

But what makes an AI project good? Is it the “goodness” of the domain of application, be it health, education, or environment? Is it the problem being solved (e.g. predicting natural disasters or detecting cancer earlier)? Is it the potential positive impact on society, and if so, how is that quantified? Or is it simply the good intentions of the person behind the project? The lack of a clear definition of AI for good opens the door to misunderstandings and misinterpretations, along with great chaos.

AI has the potential to help us address some of humanity’s biggest challenges like poverty and climate change. However, as any technological tool, it is agnostic to the context of application, the intended end-user, and the specificity of the data. And for that reason, it can ultimately end up having both beneficial and detrimental consequences.

In this post, I’ll outline what can go right and what can go wrong in AI for good projects and will suggest some best practices for designing and deploying AI for good projects.

Success stories

AI has been used to generate lasting positive impact in a variety of applications in recent years. For example, Statistics for Social Good out of Stanford University has been a beacon of interdisciplinary work at the nexus of data science and social good. In the last few years, it has piloted a variety of projects in different domains, from matching nonprofits with donors and volunteers to investigating inequities in palliative care. Its bottom-up approach, which connects potential problem partners with data analysts, helps these organizations find solutions to their most pressing problems. The Statistics for Social Good team covers a lot of ground with limited manpower. It documents all of its findings on its website, curates datasets, and runs outreach initiatives both locally and abroad.

Another positive example is the Computational Sustainability Network, a research group applying computational techniques to sustainability challenges such as conservation, poverty mitigation, and renewable energy. This group adopts a complementary approach for matching computational problem classes like optimization and spatiotemporal prediction with sustainability challenges such as bird preservation, electricity usage disaggregation and marine disease monitoring. This top-down approach works well given that members of the network are experts in these techniques and so are well-suited to deploy and fine-tune solutions to the specific problems at hand. For over a decade, members of CompSustNet have been creating connections between the world of sustainability and that of computing, facilitating data sharing and building trust. Their interdisciplinary approach to sustainability exemplifies the kind of positive impacts AI techniques can have when applied mindfully and coherently to specific real-world problems.

Even more recent examples include the use of AI in the fight against COVID-19. In fact, a plethora of AI approaches have emerged to address various aspects of the pandemic, from molecular modeling of potential vaccines to tracking misinformation on social media — I helped write a survey article about these in recent months. Some of these tools, while built with good intentions, had inadvertent consequences. However, others produced positive lasting impacts, especially several solutions created in partnership with hospitals and health providers. For instance, a group of researchers at the University of Cambridge developed the COVID-19 Capacity Planning and Analysis System tool to help hospitals with resource and critical care capacity planning. The system, whose deployment across hospitals was coordinated with the U.K.’s National Health Service, can analyze information gathered in hospitals about patients to determine which of them require ventilation and intensive care. The collected data was percolated up to the regional level, enabling cross-referencing and resource allocation between the different hospitals and health centers. Since the system is used at all levels of care, the compiled patient information could not only help save lives but also influence policy-making and government decisions.

Unintended consequences

Despite the best intentions of the project instigators, applications of AI towards social good can sometimes have unexpected (and sometimes dire) repercussions. A prime example is the now-infamous COMPAS (Correctional Offender Management Profiling for Alternative Sanctions) project, which various justice systems in the United States deployed. The aim of the system was to help judges assess risk of inmate recidivism and to lighten the load on the overflowing incarceration system. Yet, the tool’s risk of recidivism score was calculated along with factors not necessarily tied to criminal behaviour, such as substance abuse and stability. After an in-depth ProPublica investigation of the tool in 2016 revealed the software’s undeniable bias against blacks, usage of the system was stonewalled. COMPAS’s shortcomings should serve as a cautionary tale for black-box algorithmic decision-making in the criminal justice system and other areas of government, and efforts must be made to not repeat these mistakes in the future.

More recently, another well-intentioned AI tool for predictive scoring spurred much debate with regard to the U.K. A-level exams. Students must complete these exams in their final year of school in order to be accepted to universities, but they were cancelled this year due to the ongoing COVID-19 pandemic. The government therefore endeavored to use machine learning to predict how the students would have done on their exams had they taken them, and these estimates were then going to be used to make university admission decisions. Two inputs were used for this prediction: any given student’s grades during the 2020 year, and the historical record of grades in the school the student attended. This meant that a high-achieving student in a top-tier school would have an excellent prediction score, whereas a high-achieving student in a more average institution would get a lower score, despite both students having equivalent grades. As a result, two times as many students from private schools received top grades compared to public schools, and over 39% of students were downgraded from the cumulative average they had achieved in the months of the school year before the automatic assessment. After weeks of protests and threats of legal action by parents of students across the country, the government backed down and announced that it would use the average grade proposed by teachers instead. Nonetheless, this automatic assessment serves as a stern reminder of the existing inequalities within the education system, which were amplified through algorithmic decision-making.

While the the goals of COMPAS and the UK government were not ill-intentioned, they highlight the fact that AI  projects do not always have the intended outcome. In the best case, these misfires can still validate our perception of AI as a tool for positive impact even if they haven’t solved any concrete problems. In the worst case, they experiment on vulnerable populations and result in harm.

Improving AI for good

Best practices in AI for good fall into two general categories — asking the right questions and including the right people.

1. Asking the right questions

Before jumping head-first into a project intending to apply AI for good, there are a few questions you should ask. The first one is: What is the problem, exactly? It is impossible to solve the real problem at hand, whether it be poverty, climate change, or overcrowded correctional facilities. So projects inevitably involve solving what is, in fact, a proxy problem: detecting poverty from satellite imagery, identifying extreme weather events, producing a recidivism risk score. There is also often a lack of adequate data for the proxy problem, so you rely on surrogate data, such as average GDP per census block, extreme climate events over the last decade, or historical data regarding inmates committing crimes when on parole. But what happens when the GDP does not tell the whole story about income, when climate events are progressively becoming more extreme and unpredictable, or when police data is biased? You end up with AI solutions that optimize the wrong metric, make erroneous assumptions, and have unintended negative consequences.

It is also crucial to reflect upon whether AI is the appropriate solution. More often than not, AI solutions are too complex, too expensive, and too technologically demanding to be deployed in many environments. It is therefore of paramount importance to take into account the context and constraints of deployment, the intended audience, and even more straightforward things like whether or not there is a reliable energy grid present at the time of deployment. Things that we take for granted in our own lives and surroundings can be very challenging in other regions and geographies.

Finally, given the current ubiquity and accessibility of machine learning and deep learning approaches, you may take for granted that they are the best solution for any problem, no matter its nature and complexity. While deep neural networks are undoubtedly powerful in certain use cases and given a large amount of high-quality data relevant to the task, these factors are rarely the norm in AI-for-good projects. Instead, teams should prioritize simpler and more straightforward approaches, such as random forests or Bayesian networks, before jumping to a neural network with millions of parameters. Simpler approaches also have the added value of being more easily interpretable than deep learning, which is a useful characteristic in real-world contexts where the end users are often not AI specialists.

Generally speaking, here are some questions you should answer before developing an AI-for-good project:

  • Who will define the problem to be solved?
  • Is AI the right solution for the problem?
  • Where will the data come from?
  • What metrics will be used for measuring progress?
  • Who will use the solution?
  • Who will maintain the technology?
  • Who will make the ultimate decision based on the model’s predictions?
  • Who or what will be held accountable if the AI has unintended consequences?

While there is no guaranteed right answer to any of the questions above, they are a good sanity check before deploying such a complex and impactful technology as AI when vulnerable people and precarious situations are involved. In addition, AI researchers must be transparent about the nature and limitations of the data they are using. AI requires large amounts of data, and ingrained in that data are the inherent inequities and imperfections that exist within our society and social structures. These can disproportionately impact any system trained on the data leading to applications that amplify existing biases and marginalization. It is therefore critical to analyze all aspects of the data and ask the questions listed above, from the very start of your research.

When you are promoting a project, be clear about its scope and limitations; don’t just focus on the potential benefits it can deliver. As with any AI project, it is important to be transparent about the approach you are using, the reasoning behind this approach, and the advantages and disadvantages of the final model. External assessments should be carried out at different stages of the project to identify potential issues before they percolate through the project. These should cover aspects such as ethics and bias, but also potential human rights violations, and the feasibility of the proposed solution.

2. Including the right people

AI solutions are not deployed in a vacuum or in a research laboratory but involve real people who should be given a voice and ownership of the AI that is being deployed to “help’” them — and not just at the deployment phase of the project. In fact, it is vital to include non-governmental organizations (NGOs) and charities, since they have the real-world knowledge of the problem at different levels and a clear idea of the solutions they require. They can also help deploy AI solutions so they have the biggest impact — populations trust organizations such as the Red Cross, sometimes more than local governments. NGOs can also give precious feedback about how the AI is performing and propose improvements. This is essential, as AI-for-good solutions should include and empower local stakeholders who are close to the problem and to the populations affected by it. This should be done at all stages of the research and development process, from problem scoping to deployment. The two examples of successful AI-for-good initiatives I cited above (CompSusNet and Stats for Social Good) do just that, by including people from diverse, interdisciplinary backgrounds and engaging them in a meaningful way around impactful projects.

In order to have inclusive and global AI, we need to engage new voices, cultures, and ideas. Traditionally, the dominant discourse of AI is rooted in Western hubs like Silicon Valley and continental Europe. However, AI-for-good projects are often deployed in other geographical areas and target populations in developing countries. Limiting the creation of AI projects to outside perspectives does not provide a clear picture about the problems and challenges faced in these regions. So it is important to engage with local actors and stakeholders. Also, AI-for-good projects are rarely a one-shot deal; you will need domain knowledge to ensure they are functioning properly in the long term. You will also need to commit time and effort toward the regular maintenance and upkeep of technology supporting your AI-for-good project.

Projects aiming to use AI to make a positive impact on the world are often received with enthusiasm, but they should also be subject to extra scrutiny. The strategies I’ve presented in this post merely serve as a guiding framework. Much work still needs to be done as we move forward with AI-for-good projects, but we have reached a point in AI innovation where we are increasingly having these discussions and reflecting on the relationship between AI and societal needs and benefits. If these discussions turn into actionable results, AI will finally live up to its potential to be a positive force in our society.

Thank you to Brigitte Tousignant for her help in editing this article.

Sasha Luccioni is a postdoctoral researcher at MILA, a Montreal-based research institute focused on artificial intelligence for social good.


How startups are scaling communication: The pandemic is making startups take a close look at ramping up their communication solutions. Learn how


Source

The post How to make sure your ‘AI for good’ project actually does good appeared first on abangtech.



source https://abangtech.com/how-to-make-sure-your-ai-for-good-project-actually-does-good/

Hands-on with iOS 14.2 RC top changes and features [Video]

Yesterday Apple issued the iOS 14.2 Release Candidate (GM) to developers, signifying that a public-facing release is not too far behind. iOS 14.2 RC includes over 100 new emoji characters, features eight beautiful new wallpapers, Shazam music recognition CC toggle, a redesigned AirPlay 2 interface, and much more. This iOS 14.2 release also fixes the annoying “A new iOS update is now available” message that appears upon each unlock for those on the previous beta. Watch our hands-on video as we explore iOS 14.2 top changes and features.

What’s new in iOS 14.2 RC?

Note: It looks as if Apple is replacing the term “GM Seed” for near-final versions of its software with “Release Candidate,” so we will refer to this as the iOS 14.2 (RC) Release Candidate.

  • A fix for the annoying “A new iOS update is now available. Please update from the iOS 14 beta” message that occurred with each unlock.
  • Eight beautiful new wallpapers in light and dark versions
  • Hundreds of new emoji characters
  • A fix for HDR videos thumbnails exported from Final Cut Pro X
  • Redesigned AirPlay 2 interface
  • Updated now playing controls and AirPlay 2 interface on the Lock screen
  • Redesigned AirPlay 2 controls in Control Center
  • New animation lets you see if other AirPlay 2 devices are active on your network
  • Source icon indicator for music, podcasts, etc.
  • Updated AirPlay pop-over

Video: iOS 14.2 RC top changes and features

Subscribe to 9to5mac on YouTube for more videos

  • Shazam music recognition CC toggle
  • Watch app with new Apple Watch Solo Loop app icon
  • ‘Reduce Loud Sounds’ is now renamed ‘Headphone Safety’ in Settings
  • Apple Card users now have ‘Yearly activity’ tab in the Wallet app
  • Ask Siri to stop playing music on HomePod
  • Intercom support in Home app and via Siri

There are many takeaways from the iOS 14.2 Release Candidate. If you’re coming from the previous iOS 14.2 beta, you’ll be happy to know that the annoying “A new iOS update is now available. Please update from the iOS 14 beta” that occurred every time you unlocked your iPhone has been fixed with this update. That alone is enough to warrant updating in my opinion.

One of the biggest new features found in iOS 14.2 is the revamped AirPlay 2 interface in Control Center. You’ll now find much bigger album artwork in Control Center for now-playing media. You’ll also find an icon in the bottom right-hand corner of the album artwork denoting the source of the media.

In addition to these changes, both Control Center and the Lock screen will now present suggested media when nothing is playing. On the Lock screen, you’ll need to have headphones connected before suggested media is presented.

There’s also a handy new AirPlay 2 pop-over for accessing other AirPlay 2-compatible devices on your network, including a brand new animation that occurs when media is currently playing on those devices. This pop-over is accessible from anywhere AirPlay 2 devices can be selected, such as the Lock screen, Control Center, Music app, etc.

There is also a brand new Shazam toggle that can be added to Control Center. This toggle allows you to inconspicuously learn details about songs playing in your environment without needing to invoke Siri.

Arguably the two biggest new features in iOS 14.2 are the eight new wallpapers, and the 100+ new emoji characters. The new wallpapers stand out, because they are full screen wallpapers that are less abstract than Apple wallpapers in the recent past. There are also darker versions of the eight new wallpapers included for when users switch to Dark mode.

The new emojis include all sorts of new additions, such as the ninja, disguise face, boomerang, and many more. I recommend using the iOS 14 emoji search feature to find new favorites.

iOS 14.2 also brings about the new Intercom functionality demonstrated alongside the unveiling of the HomePod mini at Apple’s iPhone 12 event. You’ll find a new Intercom button in the upper-right hand corner of the Home app that can be used to communicate with HomePods in the various locations of your house. Users can also use Siri to invoke the new Intercom feature as a means to communicate with household members.

9to5mac’s take

iOS 14.2 is a huge release that brings forth all of the features listed here, and many additional changes, bug fixes, and improvements. What’s your favorite new addition in iOS 14.2? Sound off in the comments with your thoughts.

FTC: We use income earning auto affiliate links. More.


Check out 9to5Mac on YouTube for more Apple news:

Source

The post Hands-on with iOS 14.2 RC top changes and features [Video] appeared first on abangtech.



source https://abangtech.com/hands-on-with-ios-14-2-rc-top-changes-and-features-video/

Today in Apple history: iTunes video takes world by storm

October 31: Today in Apple history: iTunes video takes world by stormOctober 31, 2005: Less than three weeks after launching video downloads with iTunes 6, Apple reveals that it has already sold more than 1 million music videos.

Apple’s dive into the online digital video market — with 2,000 music videos, Pixar short films and a selection of hit TV shows for $1.99 — looks like the logical next step after selling individual songs on iTunes. Passing the 1 million download benchmark so quickly suggests the plan is a roaring success.

This post contains affiliate links. Cult of Mac may earn a commission when you use our links to buy items. Read our reviews policy.

As with so much of what Apple has done over the years, the move to start selling TV shows and other video content came at the perfect time. YouTube was still in its infancy (hence people actually paid money for music videos). However, increased high-speed internet penetration finally made video streaming possible for ordinary users.

I got my first broadband connection around this time. Previously, anything more than downloading a short video clip proved totally unfeasible. Getting broadband was like being blasted into the future.

iTunes video: A logical step

Apple has a long history of letting users watch videos on their machines. As far back as the 1980s, Apple experimented with demos, like Steve Perlman’s QuickScan, that allowed video playback to run on a Mac. In 1991, Mac users became able to use QuickTime, which served as the standardized video tool for computer users for a long time.

By 2005, Apple was working toward a future in which it would produce mobile devices able to support video. This was crucial if downloading music videos and other material was going to live up to its potential — just as the iPod had been key to iTunes music downloads.

iPod Classic with video

In October 2005, Apple launched the fifth-gen iPod Classic, with a larger-than-ever screen. This introduced video playback to the music player for the first time. According to the biography Becoming Steve Jobs, the Apple chief pitched new Disney CEO Bob Iger on the idea of opening up his shows to digital distribution by showing him the new video iPod.

“Would you consider putting your TV shows on this?” Jobs asked. Iger answered in the affirmative without missing a beat.

Ultimately, Apple secured deals to sell downloads of hit shoes like Desperate Housewives, Lost and Grey’s Anatomy. The music video offerings merged this new focus on video with Apple’s existing deals with record labels. Videos by artists like Michael Jackson, Fatboy Slim and Kanye West helped push iTunes past the 1 million downloads mark.

All about context

Today, when top YouTubers score millions of views within days and music videos hit the “Billion View Club” faster than ever, the idea that Apple would put out a press release to crow about notching 1 million music video downloads seems astonishingly quaint. However, at the time it was big news. This early success also laid the foundation for Apple expanding into a whole new area of business.

Here in 2020, Apple goes beyond merely distributing video. With Apple TV+, the company is now producing its own TV shows and movies. The streaming service features a plethora of shows commissioned by Cupertino, all for $4.99 a month. Whether Apple TV+ ultimately becomes a Netflix beater or not, it all started for Apple back in 2005.

Do you remember the first music video or TV show you downloaded using iTunes? Leave your comments below.

Source

The post Today in Apple history: iTunes video takes world by storm appeared first on abangtech.



source https://abangtech.com/today-in-apple-history-itunes-video-takes-world-by-storm/

Human Capital: Uber Eats hit with claims of ‘reverse racism’

Nellie Peshkov, formerly Reddit’s VP of People and Culture, is now Chief People Officer. Her appointment to the C-suite is part of the much-needed, growing trend of tech companies elevating employees focused on diversity and inclusion to the highest leadership ranks.

Uber Eats hit with claims of “reverse racism”

Uber said it has received more than 8,500 demands for arbitration as a result of it ditching delivery fees for Black-owned restaurants via Uber Eats.

Uber Eats made this change in June, following racial justice protests around the police killing of George Floyd, an unarmed Black man. Uber Eats said it wanted to make it easier for customers to support Black-owned businesses in the U.S. and Canada. To qualify, the restaurant must be a small or medium-sized business and, therefore, not part of a franchise. In contrast, delivery fees are still in place for other restaurants.

In one of these claims, viewed by TechCrunch, a customer says Uber Eats violates the Unruh civil Rights Act by “charging discriminatory delivery fees based on race (of the business owner).” That claim seeks $12,000 as well as a permanent injunction that would prevent Uber from continuing to offer free delivery from Black-owned restaurants.

Uber driver claims rating system is racially biased
Uber is no stranger to lawsuits, so this one shouldn’t come as a surprise. Uber is now facing a lawsuit regarding its customer ratings and how the company deactivates drivers whose ratings fall below a certain threshold. The suit alleges the system “constitues race discrimination, as it is widely recognized that customer evaluations of workers are frequently racially biased.”

In a statement to NPR, Uber called the suit “flimsy” and said “ridesharing has greatly reduced bias for both drivers and riders, who now have fairer, more equitable access to work and transportation than ever before.”

Yes on Prop 22 gets another $3.75 million influx of cash
DoorDash put in an additional $3.75 million into the Yes on 22 campaign, according to a late contribution filing. Proposition 22 is the California ballot measure that aims to keep gig workers classified as independent contractors.

The latest influx of cash brought Yes on 22’s total contributions north of $200 million. As of October 14, the campaign had raised $189 million. But thanks to a number of late contributions, the total put toward Yes on 22 comes out to about $202,955,106.38, or, $203 million.

Prop 22 hit the most-funded California ballot measure long ago, but it’s now surpassed the $200 million mark.

TechCrunch Sessions: Justice is back

I am pleased to announce TechCrunch Sessions: Justice is officially happening again! Save the date for March 3, 2021.

We’ll explore inclusive hiring, access to funding for Black, Latinx and Indigenous people, and workplace tools to foster inclusion and belonging. We’ll also examine the experiences of gig workers and formerly incarcerated people who are often left out of Silicon Valley’s wealth cycle. Rounding out the program will be a discussion about the role of venture capital in creating a more inclusive tech ecosystem. We’ll discuss all of that and more at TC Sessions: Justice.

Source

The post Human Capital: Uber Eats hit with claims of ‘reverse racism’ appeared first on abangtech.



source https://abangtech.com/human-capital-uber-eats-hit-with-claims-of-reverse-racism/

Apple Slips to Fourth Place for Smartphone Market Share, Overtaken by Xiaomi – MacRumors

Apple has shipped 10.6 percent fewer iPhones year-on-year in the third quarter of 2020, meaning that it has been overtaken by Xiaomi for the first time, according to new data shared by IDC.

The report details how Apple is now ranked as the fourth-largest smartphone manufacturer by market share. This is the first time that Apple has ranked fourth, with Xiaomi, Huawei, and Samsung exceeding Apple’s 11.8 percent share.

In total, Apple is believed to have shipped 41.2 million devices in the third quarter, which is five million less than the same time last year.

The drop was expected on the back of the delay in launching the iPhone 12 lineup, which usually appears in the third quarter. Irrespective of the belated arrival of the ‌iPhone 12‌, the iPhone 11 and iPhone SE contributed to the majority of Apple’s volume and performed “exceptionally well.”

Going forwards, IDC expects Apple to grow in coming quarters due to strong early demand for the ‌iPhone 12‌ and solid trade-in offers from major carriers, particularly within the United States.

Samsung reclaimed the top position with a market share of 22.7 percent, shipping over 80 million smartphones. Huawei followed with a 14.7 percent share, and this was a significant reduction of 40 percent year-on-year. vivo also returned to the top five with a market share of 8.9 percent.

Xiaomi overtook Apple for the first time with a market share of 13.1 percent, achieving a 42 percent growth. The rise is supposedly due to strong gains in India and China.

Overall, the global smartphone market declined by only 1.3 percent year-on-year in the third quarter of 2020. The results were stronger than IDC’s previous forecast of a nine percent year-over-year decline. An important trend was the strength of shipments in India, which is the second-largest market globally, and other emerging markets, such as Brazil, Indonesia, and Russia.

Source

The post Apple Slips to Fourth Place for Smartphone Market Share, Overtaken by Xiaomi – MacRumors appeared first on abangtech.



source https://abangtech.com/apple-slips-to-fourth-place-for-smartphone-market-share-overtaken-by-xiaomi-macrumors/

Google Pixel 5 smartphone review: Powerful mid-range with Android 11

Source

The post Google Pixel 5 smartphone review: Powerful mid-range with Android 11 appeared first on abangtech.



source https://abangtech.com/google-pixel-5-smartphone-review-powerful-mid-range-with-android-11/
Google Pixel 5
Qualcomm Snapdragon 765G, Adreno 620, 8192

2979 Points ∼86%

Google Pixel 4
Qualcomm Snapdragon 855, Adreno 640, 6144

3044 Points ∼88% +2%

Google Pixel 4 XL
Qualcomm Snapdragon 855, Adreno 640, 6144

3452 Points ∼100% +16%

OnePlus Nord
Qualcomm Snapdragon 765G, Adreno 620, 12288

2867 Points ∼83% -4%

Vivo X50 Pro
Qualcomm Snapdragon 765G, Adreno 620, 8192

2893 Points ∼84% -3%

Xiaomi Mi 10 Lite 5G
Qualcomm Snapdragon 765G, Adreno 620, 6144

2893 Points ∼84% -3%

ZTE Axon 11 5G
Qualcomm Snapdragon 765G, Adreno 620, 6144

2609 Points ∼76% -12%

Average Qualcomm Snapdragon 765G
  (2145 – 2979, n=14)

2718 Points ∼79% -9%

Average of class Smartphone
  (1740 – 4061, n=183)

2666 Points ∼77% -11%

Google Pixel 5
Qualcomm Snapdragon 765G, Adreno 620, 8192

2364 Points ∼42%

Google Pixel 4
Qualcomm Snapdragon 855, Adreno 640, 6144

5228 Points ∼92% +121%

Google Pixel 4 XL
Qualcomm Snapdragon 855, Adreno 640, 6144

5685 Points ∼100% +140%

OnePlus Nord
Qualcomm Snapdragon 765G, Adreno 620, 12288

3185 Points ∼56% +35%

Vivo X50 Pro
Qualcomm Snapdragon 765G, Adreno 620, 8192

3149 Points ∼55% +33%

Xiaomi Mi 10 Lite 5G
Qualcomm Snapdragon 765G, Adreno 620, 6144

3198 Points ∼56% +35%

ZTE Axon 11 5G
Qualcomm Snapdragon 765G, Adreno 620, 6144

3192 Points ∼56% +35%

Average Qualcomm Snapdragon 765G
  (2145 – 3198, n=14)

3033 Points ∼53% +28%

Average of class Smartphone
  (203 – 11259, n=183)

3096 Points ∼54% +31%

Google Pixel 5
Qualcomm Snapdragon 765G, Adreno 620, 8192

2478 Points ∼50%

Google Pixel 4
Qualcomm Snapdragon 855, Adreno 640, 6144

4509 Points ∼91% +82%

Google Pixel 4 XL
Qualcomm Snapdragon 855, Adreno 640, 6144

4970 Points ∼100% +101%

OnePlus Nord
Qualcomm Snapdragon 765G, Adreno 620, 12288

3108 Points ∼63% +25%

Vivo X50 Pro
Qualcomm Snapdragon 765G, Adreno 620, 8192

3087 Points ∼62% +25%

Xiaomi Mi 10 Lite 5G
Qualcomm Snapdragon 765G, Adreno 620, 6144

3130 Points ∼63% +26%

ZTE Axon 11 5G
Qualcomm Snapdragon 765G, Adreno 620, 6144

3041 Points ∼61% +23%

Average Qualcomm Snapdragon 765G
  (2287 – 3130, n=14)

2941 Points ∼59% +19%

Average of class Smartphone
  (253 – 6977, n=183)

2742 Points ∼55% +11%

Google Pixel 5
Qualcomm Snapdragon 765G, Adreno 620, 8192

3530 Points ∼70%

Apple iPhone 11
Apple A13 Bionic, A13 Bionic GPU, 4096

3219 Points ∼64% -9%

Google Pixel 4
Qualcomm Snapdragon 855, Adreno 640, 6144

4112 Points ∼82% +16%

Google Pixel 4 XL
Qualcomm Snapdragon 855, Adreno 640, 6144

5024 Points ∼100% +42%

OnePlus Nord
Qualcomm Snapdragon 765G, Adreno 620, 12288

3544 Points ∼71% 0%

Vivo X50 Pro
Qualcomm Snapdragon 765G, Adreno 620, 8192

3765 Points ∼75% +7%

Xiaomi Mi 10 Lite 5G
Qualcomm Snapdragon 765G, Adreno 620, 6144

3495 Points ∼70% -1%

ZTE Axon 11 5G
Qualcomm Snapdragon 765G, Adreno 620, 6144

3225 Points ∼64% -9%

Average Qualcomm Snapdragon 765G
  (1898 – 3765, n=14)

3248 Points ∼65% -8%

Average of class Smartphone
  (573 – 5780, n=536)

2246 Points ∼45% -36%

Google Pixel 5
Qualcomm Snapdragon 765G, Adreno 620, 8192

2834 Points ∼35%

Apple iPhone 11
Apple A13 Bionic, A13 Bionic GPU, 4096

8119 Points ∼100% +186%

Google Pixel 4
Qualcomm Snapdragon 855, Adreno 640, 6144

6383 Points ∼79% +125%

Google Pixel 4 XL
Qualcomm Snapdragon 855, Adreno 640, 6144

6950 Points ∼86% +145%

OnePlus Nord
Qualcomm Snapdragon 765G, Adreno 620, 12288

3556 Points ∼44% +25%

Vivo X50 Pro
Qualcomm Snapdragon 765G, Adreno 620, 8192

3556 Points ∼44% +25%

Xiaomi Mi 10 Lite 5G
Qualcomm Snapdragon 765G, Adreno 620, 6144

3589 Points ∼44% +27%

ZTE Axon 11 5G
Qualcomm Snapdragon 765G, Adreno 620, 6144

3579 Points ∼44% +26%

Average Qualcomm Snapdragon 765G
  (2814 – 3592, n=14)

3448 Points ∼42% +22%

Average of class Smartphone
  (75 – 12146, n=536)

2208 Points ∼27% -22%

Google Pixel 5
Qualcomm Snapdragon 765G, Adreno 620, 8192

2964 Points ∼46%

Apple iPhone 11
Apple A13 Bionic, A13 Bionic GPU, 4096

6067 Points ∼95% +105%

Google Pixel 4
Qualcomm Snapdragon 855, Adreno 640, 6144

5685 Points ∼89% +92%

Google Pixel 4 XL
Qualcomm Snapdragon 855, Adreno 640, 6144

6404 Points ∼100% +116%

OnePlus Nord
Qualcomm Snapdragon 765G, Adreno 620, 12288

3553 Points ∼55% +20%

Vivo X50 Pro
Qualcomm Snapdragon 765G, Adreno 620, 8192

3597 Points ∼56% +21%

Xiaomi Mi 10 Lite 5G
Qualcomm Snapdragon 765G, Adreno 620, 6144

3605 Points ∼56% +22%

ZTE Axon 11 5G
Qualcomm Snapdragon 765G, Adreno 620, 6144

3494 Points ∼55% +18%

Average Qualcomm Snapdragon 765G
  (2934 – 3605, n=14)

3381 Points ∼53% +14%

Average of class Smartphone
  (93 – 9643, n=537)

2052 Points ∼32% -31%

Google Pixel 5
Qualcomm Snapdragon 765G, Adreno 620, 8192

3544 Points ∼76%

Apple iPhone 11
Apple A13 Bionic, A13 Bionic GPU, 4096

3411 Points ∼73% -4%

Google Pixel 4
Qualcomm Snapdragon 855, Adreno 640, 6144

4072 Points ∼88% +15%

Google Pixel 4 XL
Qualcomm Snapdragon 855, Adreno 640, 6144

4652 Points ∼100% +31%

OnePlus Nord
Qualcomm Snapdragon 765G, Adreno 620, 12288

3555 Points ∼76% 0%

Vivo X50 Pro
Qualcomm Snapdragon 765G, Adreno 620, 8192

3651 Points ∼78% +3%

Xiaomi Mi 10 Lite 5G
Qualcomm Snapdragon 765G, Adreno 620, 6144

3492 Points ∼75% -1%

ZTE Axon 11 5G
Qualcomm Snapdragon 765G, Adreno 620, 6144

3238 Points ∼70% -9%

Average Qualcomm Snapdragon 765G
  (1790 – 3651, n=14)

3232 Points ∼69% -9%

Average of class Smartphone
  (375 – 5765, n=568)

2162 Points ∼46% -39%

Google Pixel 5
Qualcomm Snapdragon 765G, Adreno 620, 8192

4036 Points ∼23%

Apple iPhone 11
Apple A13 Bionic, A13 Bionic GPU, 4096

17853 Points ∼100% +342%

Google Pixel 4
Qualcomm Snapdragon 855, Adreno 640, 6144

9217 Points ∼52% +128%

Google Pixel 4 XL
Qualcomm Snapdragon 855, Adreno 640, 6144

9995 Points ∼56% +148%

OnePlus Nord
Qualcomm Snapdragon 765G, Adreno 620, 12288

5357 Points ∼30% +33%

Vivo X50 Pro
Qualcomm Snapdragon 765G, Adreno 620, 8192

5265 Points ∼29% +30%

Xiaomi Mi 10 Lite 5G
Qualcomm Snapdragon 765G, Adreno 620, 6144

5335 Points ∼30% +32%

ZTE Axon 11 5G
Qualcomm Snapdragon 765G, Adreno 620, 6144

5416 Points ∼30% +34%

Average Qualcomm Snapdragon 765G
  (4036 – 5437, n=14)

5144 Points ∼29% +27%

Average of class Smartphone
  (70 – 20511, n=568)

2987 Points ∼17% -26%

Google Pixel 5
Qualcomm Snapdragon 765G, Adreno 620, 8192

3915 Points ∼43%

Apple iPhone 11
Apple A13 Bionic, A13 Bionic GPU, 4096

9199 Points ∼100% +135%

Google Pixel 4
Qualcomm Snapdragon 855, Adreno 640, 6144

7196 Points ∼78% +84%

Google Pixel 4 XL
Qualcomm Snapdragon 855, Adreno 640, 6144

7963 Points ∼87% +103%

OnePlus Nord
Qualcomm Snapdragon 765G, Adreno 620, 12288

4815 Points ∼52% +23%

Vivo X50 Pro
Qualcomm Snapdragon 765G, Adreno 620, 8192

4819 Points ∼52% +23%

Xiaomi Mi 10 Lite 5G
Qualcomm Snapdragon 765G, Adreno 620, 6144

4893 Points ∼53% +25%

ZTE Axon 11 5G
Qualcomm Snapdragon 765G, Adreno 620, 6144

4712 Points ∼51% +20%

Average Qualcomm Snapdragon 765G
  (3678 – 4893, n=14)

4512 Points ∼49% +15%

Average of class Smartphone
  (88 – 11895, n=568)

2489 Points ∼27% -36%

Google Pixel 5
Qualcomm Snapdragon 765G, Adreno 620, 8192

3547 Points ∼77%

Apple iPhone 11
Apple A13 Bionic, A13 Bionic GPU, 4096

2429 Points ∼53% -32%

Google Pixel 4
Qualcomm Snapdragon 855, Adreno 640, 6144

4199 Points ∼91% +18%

Google Pixel 4 XL
Qualcomm Snapdragon 855, Adreno 640, 6144

4612 Points ∼100% +30%

OnePlus Nord
Qualcomm Snapdragon 765G, Adreno 620, 12288

3503 Points ∼76% -1%

Vivo X50 Pro
Qualcomm Snapdragon 765G, Adreno 620, 8192

3462 Points ∼75% -2%

Xiaomi Mi 10 Lite 5G
Qualcomm Snapdragon 765G, Adreno 620, 6144

3425 Points ∼74% -3%

ZTE Axon 11 5G
Qualcomm Snapdragon 765G, Adreno 620, 6144

3133 Points ∼68% -12%

Average Qualcomm Snapdragon 765G
  (2964 – 3556, n=14)

3324 Points ∼72% -6%

Average of class Smartphone
  (435 – 5262, n=618)

2124 Points ∼46% -40%

Google Pixel 5
Qualcomm Snapdragon 765G, Adreno 620, 8192

2106 Points ∼34%

Apple iPhone 11
Apple A13 Bionic, A13 Bionic GPU, 4096

5726 Points ∼92% +172%

Google Pixel 4
Qualcomm Snapdragon 855, Adreno 640, 6144

6214 Points ∼100% +195%

Google Pixel 4 XL
Qualcomm Snapdragon 855, Adreno 640, 6144

6163 Points ∼99% +193%

OnePlus Nord
Qualcomm Snapdragon 765G, Adreno 620, 12288

3284 Points ∼53% +56%

Vivo X50 Pro
Qualcomm Snapdragon 765G, Adreno 620, 8192

3342 Points ∼54% +59%

Xiaomi Mi 10 Lite 5G
Qualcomm Snapdragon 765G, Adreno 620, 6144

3321 Points ∼53% +58%

ZTE Axon 11 5G
Qualcomm Snapdragon 765G, Adreno 620, 6144

3302 Points ∼53% +57%

Average Qualcomm Snapdragon 765G
  (2048 – 3342, n=14)

3109 Points ∼50% +48%

Average of class Smartphone
  (53 – 11573, n=618)

1866 Points ∼30% -11%

Google Pixel 5
Qualcomm Snapdragon 765G, Adreno 620, 8192

2315 Points ∼40%

Apple iPhone 11
Apple A13 Bionic, A13 Bionic GPU, 4096

4400 Points ∼77% +90%

Google Pixel 4
Qualcomm Snapdragon 855, Adreno 640, 6144

5615 Points ∼98% +143%

Google Pixel 4 XL
Qualcomm Snapdragon 855, Adreno 640, 6144

5734 Points ∼100% +148%

OnePlus Nord
Qualcomm Snapdragon 765G, Adreno 620, 12288

3330 Points ∼58% +44%

Vivo X50 Pro
Qualcomm Snapdragon 765G, Adreno 620, 8192

3329 Points ∼58% +44%

Xiaomi Mi 10 Lite 5G
Qualcomm Snapdragon 765G, Adreno 620, 6144

3346 Points ∼58% +45%

ZTE Axon 11 5G
Qualcomm Snapdragon 765G, Adreno 620, 6144

3263 Points ∼57% +41%

Average Qualcomm Snapdragon 765G
  (2261 – 3346, n=14)

3133 Points ∼55% +35%

Average of class Smartphone
  (68 – 9138, n=619)

1768 Points ∼31% -24%

Google Pixel 5
Qualcomm Snapdragon 765G, Adreno 620, 8192

3529 Points ∼76%

Apple iPhone 11
Apple A13 Bionic, A13 Bionic GPU, 4096

Points ∼0% -100%

Google Pixel 4
Qualcomm Snapdragon 855, Adreno 640, 6144

4540 Points ∼98% +29%

Google Pixel 4 XL
Qualcomm Snapdragon 855, Adreno 640, 6144

4618 Points ∼100% +31%

OnePlus Nord
Qualcomm Snapdragon 765G, Adreno 620, 12288

3476 Points ∼75% -2%

Vivo X50 Pro
Qualcomm Snapdragon 765G, Adreno 620, 8192

3432 Points ∼74% -3%

Xiaomi Mi 10 Lite 5G
Qualcomm Snapdragon 765G, Adreno 620, 6144

3452 Points ∼75% -2%

ZTE Axon 11 5G
Qualcomm Snapdragon 765G, Adreno 620, 6144

3223 Points ∼70% -9%

Average Qualcomm Snapdragon 765G
  (1689 – 3529, n=14)

3200 Points ∼69% -9%

Average of class Smartphone
  (293 – 5274, n=659)

1992 Points ∼43% -44%

Google Pixel 5
Qualcomm Snapdragon 765G, Adreno 620, 8192

2844 Points ∼31%

Apple iPhone 11
Apple A13 Bionic, A13 Bionic GPU, 4096

Points ∼0% -100%

Google Pixel 4
Qualcomm Snapdragon 855, Adreno 640, 6144

8765 Points ∼96% +208%

Google Pixel 4 XL
Qualcomm Snapdragon 855, Adreno 640, 6144

9141 Points ∼100% +221%

OnePlus Nord
Qualcomm Snapdragon 765G, Adreno 620, 12288

5171 Points ∼57% +82%

Vivo X50 Pro
Qualcomm Snapdragon 765G, Adreno 620, 8192

5176 Points ∼57% +82%

Xiaomi Mi 10 Lite 5G
Qualcomm Snapdragon 765G, Adreno 620, 6144

5832 Points ∼64% +105%

ZTE Axon 11 5G
Qualcomm Snapdragon 765G, Adreno 620, 6144

5167 Points ∼57% +82%

Average Qualcomm Snapdragon 765G
  (2844 – 5832, n=14)

4825 Points ∼53% +70%

Average of class Smartphone
  (43 – 16670, n=658)

2454 Points ∼27% -14%

Google Pixel 5
Qualcomm Snapdragon 765G, Adreno 620, 8192

2972 Points ∼40%

Apple iPhone 11
Apple A13 Bionic, A13 Bionic GPU, 4096

Points ∼0% -100%

Google Pixel 4
Qualcomm Snapdragon 855, Adreno 640, 6144

7263 Points ∼97% +144%

Google Pixel 4 XL
Qualcomm Snapdragon 855, Adreno 640, 6144

7507 Points ∼100% +153%

OnePlus Nord
Qualcomm Snapdragon 765G, Adreno 620, 12288

4665 Points ∼62% +57%

Vivo X50 Pro
Qualcomm Snapdragon 765G, Adreno 620, 8192

4652 Points ∼62% +57%

Xiaomi Mi 10 Lite 5G
Qualcomm Snapdragon 765G, Adreno 620, 6144

4693 Points ∼63% +58%

ZTE Axon 11 5G
Qualcomm Snapdragon 765G, Adreno 620, 6144

4556 Points ∼61% +53%

Average Qualcomm Snapdragon 765G
  (2972 – 4693, n=14)

4256 Points ∼57% +43%

Average of class Smartphone
  (55 – 11256, n=661)

2100 Points ∼28% -29%

Google Pixel 5
Qualcomm Snapdragon 765G, Adreno 620, 8192

28331 Points ∼65%

Apple iPhone 11
Apple A13 Bionic, A13 Bionic GPU, 4096

33864 Points ∼77% +20%

Google Pixel 4
Qualcomm Snapdragon 855, Adreno 640, 6144

37641 Points ∼86% +33%

Google Pixel 4 XL
Qualcomm Snapdragon 855, Adreno 640, 6144

43773 Points ∼100% +55%

OnePlus Nord
Qualcomm Snapdragon 765G, Adreno 620, 12288

21016 Points ∼48% -26%

Vivo X50 Pro
Qualcomm Snapdragon 765G, Adreno 620, 8192

20631 Points ∼47% -27%

Xiaomi Mi 10 Lite 5G
Qualcomm Snapdragon 765G, Adreno 620, 6144

20429 Points ∼47% -28%

ZTE Axon 11 5G
Qualcomm Snapdragon 765G, Adreno 620, 6144

14891 Points ∼34% -47%

Average Qualcomm Snapdragon 765G
  (14891 – 28331, n=14)

20608 Points ∼47% -27%

Average of class Smartphone
  (735 – 59268, n=804)

15713 Points ∼36% -45%

Google Pixel 5
Qualcomm Snapdragon 765G, Adreno 620, 8192

60281 Points ∼29%

Apple iPhone 11
Apple A13 Bionic, A13 Bionic GPU, 4096

209204 Points ∼100% +247%

Google Pixel 4
Qualcomm Snapdragon 855, Adreno 640, 6144

102168 Points ∼49% +69%

Google Pixel 4 XL
Qualcomm Snapdragon 855, Adreno 640, 6144

105430 Points ∼50% +75%

OnePlus Nord
Qualcomm Snapdragon 765G, Adreno 620, 12288

69285 Points ∼33% +15%

Vivo X50 Pro
Qualcomm Snapdragon 765G, Adreno 620, 8192

68808 Points ∼33% +14%

Xiaomi Mi 10 Lite 5G
Qualcomm Snapdragon 765G, Adreno 620, 6144

69506 Points ∼33% +15%

ZTE Axon 11 5G
Qualcomm Snapdragon 765G, Adreno 620, 6144

68840 Points ∼33% +14%

Average Qualcomm Snapdragon 765G
  (56690 – 69645, n=14)

67110 Points ∼32% +11%

Average of class Smartphone
  (536 – 221179, n=802)

27850 Points ∼13% -54%

Google Pixel 5
Qualcomm Snapdragon 765G, Adreno 620, 8192

48201 Points ∼50%

Apple iPhone 11
Apple A13 Bionic, A13 Bionic GPU, 4096

97276 Points ∼100% +102%

Google Pixel 4
Qualcomm Snapdragon 855, Adreno 640, 6144

73984 Points ∼76% +53%

Google Pixel 4 XL
Qualcomm Snapdragon 855, Adreno 640, 6144

80296 Points ∼83% +67%

OnePlus Nord
Qualcomm Snapdragon 765G, Adreno 620, 12288

45872 Points ∼47% -5%

Vivo X50 Pro
Qualcomm Snapdragon 765G, Adreno 620, 8192

44918 Points ∼46% -7%

Xiaomi Mi 10 Lite 5G
Qualcomm Snapdragon 765G, Adreno 620, 6144

45304 Points ∼47% -6%

ZTE Axon 11 5G
Qualcomm Snapdragon 765G, Adreno 620, 6144

38137 Points ∼39% -21%

Average Qualcomm Snapdragon 765G
  (38137 – 48201, n=14)

44131 Points ∼45% -8%

Average of class Smartphone
  (662 – 117606, n=802)

21534 Points ∼22% -55%

Google Pixel 5
Qualcomm Snapdragon 765G, Adreno 620, 8192

8.8 fps ∼15%

Apple iPhone 11
Apple A13 Bionic, A13 Bionic GPU, 4096

60 fps ∼100% +582%

Google Pixel 4
Qualcomm Snapdragon 855, Adreno 640, 6144

24 fps ∼40% +173%

Google Pixel 4 XL
Qualcomm Snapdragon 855, Adreno 640, 6144

11 fps ∼18% +25%

OnePlus Nord
Qualcomm Snapdragon 765G, Adreno 620, 12288

13 fps ∼22% +48%

Vivo X50 Pro
Qualcomm Snapdragon 765G, Adreno 620, 8192

13 fps ∼22% +48%

Xiaomi Mi 10 Lite 5G
Qualcomm Snapdragon 765G, Adreno 620, 6144

13 fps ∼22% +48%

ZTE Axon 11 5G
Qualcomm Snapdragon 765G, Adreno 620, 6144

13 fps ∼22% +48%

Average Qualcomm Snapdragon 765G
  (8.8 – 13, n=14)

12 fps ∼20% +36%

Average of class Smartphone
  (0.61 – 60, n=329)

11.7 fps ∼20% +33%

Google Pixel 5
Qualcomm Snapdragon 765G, Adreno 620, 8192

5.3 fps ∼16%

Apple iPhone 11
Apple A13 Bionic, A13 Bionic GPU, 4096

33 fps ∼100% +523%

Google Pixel 4
Qualcomm Snapdragon 855, Adreno 640, 6144

13 fps ∼39% +145%

Google Pixel 4 XL
Qualcomm Snapdragon 855, Adreno 640, 6144

12 fps ∼36% +126%

OnePlus Nord
Qualcomm Snapdragon 765G, Adreno 620, 12288

8.5 fps ∼26% +60%

Vivo X50 Pro
Qualcomm Snapdragon 765G, Adreno 620, 8192

8.3 fps ∼25% +57%

Xiaomi Mi 10 Lite 5G
Qualcomm Snapdragon 765G, Adreno 620, 6144

8.5 fps ∼26% +60%

ZTE Axon 11 5G
Qualcomm Snapdragon 765G, Adreno 620, 6144

8.6 fps ∼26% +62%

Average Qualcomm Snapdragon 765G
  (5.3 – 13, n=14)

8.37 fps ∼25% +58%

Average of class Smartphone
  (0.21 – 101, n=327)

8.24 fps ∼25% +55%

Google Pixel 5
Qualcomm Snapdragon 765G, Adreno 620, 8192

13 fps ∼22%

Apple iPhone 11
Apple A13 Bionic, A13 Bionic GPU, 4096

60 fps ∼100% +362%

Google Pixel 4
Qualcomm Snapdragon 855, Adreno 640, 6144

33 fps ∼55% +154%

Google Pixel 4 XL
Qualcomm Snapdragon 855, Adreno 640, 6144

17 fps ∼28% +31%

OnePlus Nord
Qualcomm Snapdragon 765G, Adreno 620, 12288

21 fps ∼35% +62%

Vivo X50 Pro
Qualcomm Snapdragon 765G, Adreno 620, 8192

21 fps ∼35% +62%

Xiaomi Mi 10 Lite 5G
Qualcomm Snapdragon 765G, Adreno 620, 6144

21 fps ∼35% +62%

ZTE Axon 11 5G
Qualcomm Snapdragon 765G, Adreno 620, 6144

21 fps ∼35% +62%

Average Qualcomm Snapdragon 765G
  (13 – 21, n=14)

19.8 fps ∼33% +52%

Average of class Smartphone
  (1.4 – 60, n=333)

17.4 fps ∼29% +34%

Google Pixel 5
Qualcomm Snapdragon 765G, Adreno 620, 8192

13 fps ∼15%

Apple iPhone 11
Apple A13 Bionic, A13 Bionic GPU, 4096

87 fps ∼100% +569%

Google Pixel 4
Qualcomm Snapdragon 855, Adreno 640, 6144

32 fps ∼37% +146%

Google Pixel 4 XL
Qualcomm Snapdragon 855, Adreno 640, 6144

33 fps ∼38% +154%

OnePlus Nord
Qualcomm Snapdragon 765G, Adreno 620, 12288

24 fps ∼28% +85%

Vivo X50 Pro
Qualcomm Snapdragon 765G, Adreno 620, 8192

23 fps ∼26% +77%

Xiaomi Mi 10 Lite 5G
Qualcomm Snapdragon 765G, Adreno 620, 6144

24 fps ∼28% +85%

ZTE Axon 11 5G
Qualcomm Snapdragon 765G, Adreno 620, 6144

24 fps ∼28% +85%

Average Qualcomm Snapdragon 765G
  (13 – 24, n=14)

22.1 fps ∼25% +70%

Average of class Smartphone
  (0.6 – 257, n=332)

20 fps ∼23% +54%

Google Pixel 5
Qualcomm Snapdragon 765G, Adreno 620, 8192

12 fps ∼16%

Apple iPhone 11
Apple A13 Bionic, A13 Bionic GPU, 4096

73 fps ∼100% +508%

Google Pixel 4
Qualcomm Snapdragon 855, Adreno 640, 6144

34 fps ∼47% +183%

Google Pixel 4 XL
Qualcomm Snapdragon 855, Adreno 640, 6144

32 fps ∼44% +167%

OnePlus Nord
Qualcomm Snapdragon 765G, Adreno 620, 12288

21 fps ∼29% +75%

Vivo X50 Pro
Qualcomm Snapdragon 765G, Adreno 620, 8192

20 fps ∼27% +67%

Xiaomi Mi 10 Lite 5G
Qualcomm Snapdragon 765G, Adreno 620, 6144

21 fps ∼29% +75%

ZTE Axon 11 5G
Qualcomm Snapdragon 765G, Adreno 620, 6144

21 fps ∼29% +75%

Average Qualcomm Snapdragon 765G
  (12 – 21, n=12)

19.4 fps ∼27% +62%

Average of class Smartphone
  (0.6 – 75, n=497)

14.9 fps ∼20% +24%

Google Pixel 5
Qualcomm Snapdragon 765G, Adreno 620, 8192

13 fps ∼22%

Apple iPhone 11
Apple A13 Bionic, A13 Bionic GPU, 4096

60 fps ∼100% +362%

Google Pixel 4
Qualcomm Snapdragon 855, Adreno 640, 6144

30 fps ∼50% +131%

Google Pixel 4 XL
Qualcomm Snapdragon 855, Adreno 640, 6144

16 fps ∼27% +23%

OnePlus Nord
Qualcomm Snapdragon 765G, Adreno 620, 12288

19 fps ∼32% +46%

Vivo X50 Pro
Qualcomm Snapdragon 765G, Adreno 620, 8192

18 fps ∼30% +38%

Xiaomi Mi 10 Lite 5G
Qualcomm Snapdragon 765G, Adreno 620, 6144

18 fps ∼30% +38%

ZTE Axon 11 5G
Qualcomm Snapdragon 765G, Adreno 620, 6144

18 fps ∼30% +38%

Average Qualcomm Snapdragon 765G
  (11 – 19, n=12)

17.4 fps ∼29% +34%

Average of class Smartphone
  (1.1 – 60, n=501)

13.2 fps ∼22% +2%

Google Pixel 5
Qualcomm Snapdragon 765G, Adreno 620, 8192

1428 Points ∼89%

Apple iPhone 11
Apple A13 Bionic, A13 Bionic GPU, 4096

Points ∼0% -100%

Google Pixel 4
Qualcomm Snapdragon 855, Adreno 640, 6144

1601 Points ∼100% +12%

Google Pixel 4 XL
Qualcomm Snapdragon 855, Adreno 640, 6144

1502 Points ∼94% +5%

OnePlus Nord
Qualcomm Snapdragon 765G, Adreno 620, 12288

1441 Points ∼90% +1%

Vivo X50 Pro
Qualcomm Snapdragon 765G, Adreno 620, 8192

1442 Points ∼90% +1%

Xiaomi Mi 10 Lite 5G
Qualcomm Snapdragon 765G, Adreno 620, 6144

1462 Points ∼91% +2%

ZTE Axon 11 5G
Qualcomm Snapdragon 765G, Adreno 620, 6144

1362 Points ∼85% -5%

Average Qualcomm Snapdragon 765G
  (1139 – 1462, n=12)

1360 Points ∼85% -5%

Average of class Smartphone
  (7 – 1745, n=741)

829 Points ∼52% -42%

Google Pixel 5
Qualcomm Snapdragon 765G, Adreno 620, 8192

2560 Points ∼27%

Apple iPhone 11
Apple A13 Bionic, A13 Bionic GPU, 4096

Points ∼0% -100%

Google Pixel 4
Qualcomm Snapdragon 855, Adreno 640, 6144

9307 Points ∼99% +264%

Google Pixel 4 XL
Qualcomm Snapdragon 855, Adreno 640, 6144

9356 Points ∼100% +265%

OnePlus Nord
Qualcomm Snapdragon 765G, Adreno 620, 12288

5203 Points ∼56% +103%

Vivo X50 Pro
Qualcomm Snapdragon 765G, Adreno 620, 8192

4639 Points ∼50% +81%

Xiaomi Mi 10 Lite 5G
Qualcomm Snapdragon 765G, Adreno 620, 6144

5201 Points ∼56% +103%

ZTE Axon 11 5G
Qualcomm Snapdragon 765G, Adreno 620, 6144

5247 Points ∼56% +105%

Average Qualcomm Snapdragon 765G
  (2525 – 5247, n=12)

4698 Points ∼50% +84%

Average of class Smartphone
  (18 – 16996, n=741)

2561 Points ∼27% 0%

Google Pixel 5
Qualcomm Snapdragon 765G, Adreno 620, 8192

5381 Points ∼83%

Apple iPhone 11
Apple A13 Bionic, A13 Bionic GPU, 4096

Points ∼0% -100%

Google Pixel 4
Qualcomm Snapdragon 855, Adreno 640, 6144

6458 Points ∼100% +20%

Google Pixel 4 XL
Qualcomm Snapdragon 855, Adreno 640, 6144

6387 Points ∼99% +19%

OnePlus Nord
Qualcomm Snapdragon 765G, Adreno 620, 12288

5296 Points ∼82% -2%

Vivo X50 Pro
Qualcomm Snapdragon 765G, Adreno 620, 8192

4637 Points ∼72% -14%

Xiaomi Mi 10 Lite 5G
Qualcomm Snapdragon 765G, Adreno 620, 6144

5683 Points ∼88% +6%

ZTE Axon 11 5G
Qualcomm Snapdragon 765G, Adreno 620, 6144

5464 Points ∼85% +2%

Average Qualcomm Snapdragon 765G
  (4556 – 5683, n=12)

5112 Points ∼79% -5%

Average of class Smartphone
  (21 – 8874, n=741)

1919 Points ∼30% -64%

Google Pixel 5
Qualcomm Snapdragon 765G, Adreno 620, 8192

5960 Points ∼71%

Apple iPhone 11
Apple A13 Bionic, A13 Bionic GPU, 4096

Points ∼0% -100%

Google Pixel 4
Qualcomm Snapdragon 855, Adreno 640, 6144

8058 Points ∼96% +35%

Google Pixel 4 XL
Qualcomm Snapdragon 855, Adreno 640, 6144

8417 Points ∼100% +41%

OnePlus Nord
Qualcomm Snapdragon 765G, Adreno 620, 12288

6801 Points ∼81% +14%

Vivo X50 Pro
Qualcomm Snapdragon 765G, Adreno 620, 8192

6936 Points ∼82% +16%

Xiaomi Mi 10 Lite 5G
Qualcomm Snapdragon 765G, Adreno 620, 6144

6069 Points ∼72% +2%

ZTE Axon 11 5G
Qualcomm Snapdragon 765G, Adreno 620, 6144

6691 Points ∼79% +12%

Average Qualcomm Snapdragon 765G
  (5411 – 6936, n=12)

6459 Points ∼77% +8%

Average of class Smartphone
  (369 – 14189, n=741)

3516 Points ∼42% -41%

Google Pixel 5
Qualcomm Snapdragon 765G, Adreno 620, 8192

3290 Points ∼62%

Apple iPhone 11
Apple A13 Bionic, A13 Bionic GPU, 4096

Points ∼0% -100%

Google Pixel 4
Qualcomm Snapdragon 855, Adreno 640, 6144

5277 Points ∼100% +60%

Google Pixel 4 XL
Qualcomm Snapdragon 855, Adreno 640, 6144

5243 Points ∼99% +59%

OnePlus Nord
Qualcomm Snapdragon 765G, Adreno 620, 12288

4054 Points ∼77% +23%

Vivo X50 Pro
Qualcomm Snapdragon 765G, Adreno 620, 8192

3857 Points ∼73% +17%

Xiaomi Mi 10 Lite 5G
Qualcomm Snapdragon 765G, Adreno 620, 6144

4094 Points ∼78% +24%

ZTE Axon 11 5G
Qualcomm Snapdragon 765G, Adreno 620, 6144

4020 Points ∼76% +22%

Average Qualcomm Snapdragon 765G
  (3237 – 4094, n=12)

3792 Points ∼72% +15%

Average of class Smartphone
  (1 – 6273, n=741)

1801 Points ∼34% -45%