The iPhone 13 family has arrived. It has a bunch of members. Apple’s iPhone 13 Pro and Pro Max are positioned as video shooters fit for Spielberg. Heck, maybe they are. The iPhone 13 and 13 Mini have bigger batteries than their predecessors, plus the impressive sensor-shift camera stabilisation previously reserved for the most expensive last-gen iPhone.
But one thing ties all the iPhones together: the notch. Every year we wonder if Apple might finally get rid of this almost half-decade-old relic and let the Super Retina XDR OLED screen spread out, as the OLEDs of the best Samsung phones do.
A recent survey by 9to5Mac suggests Apple fans are roughly evenly split on whether Apple should keep the notch or trade it in for a punch hole, joining everyone else. Some of the site’s readers say the notch gives iPhones a distinctive appearance. Others say Apple should ditch phone-unlocking Face ID altogether in this new masked-up world — not that the UK seems all that masked-up in September 2021.
However, a change in the iPhone 13 shows Apple does know the notch is an unfortunate compromise. It’s a mole pitched as a beauty mark, because Apple made it 20 per cent smaller this time around.
You need to look at the iPhone 12 and iPhone 13 side-by-side to see the change. Only then does the new notch look noticeably less wide than the old one. But it is also slightly deeper, eating a little further into the screen and limiting the usable area for apps and games.
This is not a mon dieu moment. The “20 per cent decrease” in notch size is a distraction. We’re not heading any closer to the tiny punch holes of Android phones. We’re no closer to Apple experimenting publicly with the, admittedly flawed, under-screen camera tech of phones such as the Samsung Galaxy Z Fold 3.
The problem here is Apple cannot replace the notch with a punch hole anyway because there is a lot more than just a camera in this notch. Previous-generation iPhones with notches have housed a) the selfie camera; b) ambient light sensor; c) speaker; d) microphone; e) proximity sensor; f) flood illuminator; g) dot projector; and h) infrared camera. We’ve covered a good chunk of the alphabet there.
Even if Apple shifted the speaker and microphone into the upper screen surround, you’d need not one punch hole but enough little black shapes cut out of the screen to make up a kids’ block toy set.
Face ID needs these components in order to function. The flood illuminator shines infrared light, invisible to the human eye, onto your face. The dot projector casts thousands of more pronounced IR dots onto it. And the infrared camera sees the pattern these dots create, a contour map of your face that the iPhone learns and can then recognise, whether you’re in a well-lit space or a dark room.
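From a developer’s perspective all of that sensor hardware is invisible: an app simply asks the system for a yes-or-no answer. Here is a minimal sketch using Apple’s LocalAuthentication framework; the function name and prompt text are invented for illustration, not anything Apple ships.

```swift
import Foundation
import LocalAuthentication

// Minimal sketch: how a third-party app asks for the Face ID hardware
// described above. The function name and prompt string are hypothetical;
// the LocalAuthentication calls are the real public API. The app never
// sees the IR dot pattern or depth map, only a yes/no from the system.
func unlockWithBiometrics(completion: @escaping (Bool) -> Void) {
    let context = LAContext()
    var error: NSError?

    // Check the device has working biometrics (Face ID or Touch ID).
    guard context.canEvaluatePolicy(.deviceOwnerAuthenticationWithBiometrics,
                                    error: &error) else {
        completion(false)
        return
    }

    // Triggers the Face ID scan (or Touch ID prompt) and returns the result.
    context.evaluatePolicy(.deviceOwnerAuthenticationWithBiometrics,
                           localizedReason: "Unlock your notes") { success, _ in
        DispatchQueue.main.async {
            completion(success)
        }
    }
}
```

Notably, the same policy call covers Touch ID on devices that still have it, which is part of why the face-versus-fingerprint question matters more to Apple’s industrial designers than to app developers.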
The notch and Face ID are a co-dependent pair in a world where we might question whether either should exist. Face ID is great, but is it really justified?
Apple cannot really argue rival fingerprint scanning systems are insufficient for security or a good user experience either. Fingerprint pads, under-screen or otherwise, are now wonderfully fast. Apple uses them in its other devices, including the iPhone SE and new iPad mini. It popularised the bloody things in the first place with 2013’s iPhone 5S.
It may argue its Face ID system is more secure than fingerprint methods. The opinions of security experts are mixed at best on this, and if Apple’s plan was to sell it as a business-critical feature, you’d imagine Face ID would also be present in MacBooks and iMacs as well as iPhones.
Microsoft seems much more interested in this area, actually. Windows 10’s Hello secure login feature supports Intel RealSense 3D cameras. Laptops such as the Surface Laptop 4 have an IR camera that uses a technique more rudimentary than — but ultimately similar to — Face ID.
There’s more to the notch than security and a quick way to unlock your phone. And several of the core features Apple talked about at the iPhone 13 launch point to some M. Night Shyamalan-worthy twist in the company’s roadmap we can’t quite make out at this point.
It all hinges on AR, augmented reality, but not in the frequently groan-inducing way we perceive much of that area of tech today. Apple is searching for the uses of augmented reality-enabling hardware that will make us wonder how we ever lived without them.
Apple has not found it yet, but its obsession is clear. Much of its spotlight on the Apple A15 Bionic processor focused on its neural network capabilities, something that appeals directly to a vanishingly small number of potential iPhone 13 buyers. The average person’s reaction to the prospect of a 16-core neural engine is “ummm, ok?”
However, Apple is getting closer. The iPhone 13 series’ Cinematic camera mode is an implementation of AR hardware, assuming it makes some use of the iPhone 13 Pro’s LiDAR sensor in creating depth maps. This feature lets you synthetically create and alter background blur in videos after you shoot them, emulating the look of a dedicated video camera or a mirrorless/DSLR camera with a wide-aperture lens. It requires a detailed depth map made on the fly, which is what LiDAR can provide.
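Apple hasn’t said how Cinematic mode is actually built, so the sketch below only illustrates the raw ingredient: it uses ARKit’s public sceneDepth API to read the per-frame depth map a LiDAR-equipped iPhone can produce. The class name and the blur step are hypothetical stand-ins, not Apple’s pipeline.

```swift
import ARKit

// Illustrative sketch (not Apple's Cinematic mode): reads the LiDAR-derived
// depth map ARKit exposes on supported iPhones. A Cinematic-style effect
// would blur each colour pixel more the further its depth value sits from
// the chosen focus distance.
final class DepthMapReader: NSObject, ARSessionDelegate {
    let session = ARSession()

    func start() {
        // sceneDepth is only offered on LiDAR-equipped devices.
        guard ARWorldTrackingConfiguration.supportsFrameSemantics(.sceneDepth) else { return }
        let config = ARWorldTrackingConfiguration()
        config.frameSemantics.insert(.sceneDepth)
        session.delegate = self
        session.run(config)
    }

    // Called roughly once per camera frame while the session runs.
    func session(_ session: ARSession, didUpdate frame: ARFrame) {
        guard let sceneDepth = frame.sceneDepth else { return }
        // depthMap is a CVPixelBuffer of per-pixel distances in metres,
        // at a lower resolution than the colour image.
        let depthMap: CVPixelBuffer = sceneDepth.depthMap
        _ = depthMap // a real app would feed this into its blur/render pass
    }
}
```

Whether Cinematic mode leans on this data, on disparity between the paired rear cameras, on machine learning, or on some mix of the three is exactly the sort of detail Apple keeps to itself.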
Apple expanded Cinematic mode out of the slightly cheesy, and often flaky, background-blur video modes of Androids as old as 2018's Huawei Mate 20 Pro, and built it up to seem serious and legitimate. It went as far as hiring Oscar-winning director Kathryn Bigelow and Emmy-winning cinematographer Greig Fraser to endorse Cinematic mode. Apple is saying this isn’t AR tech fluff, it’s real and important.
This AR tech obsession is nothing new, either. Remember 2018’s Memoji? That’s AR. Did you see Apple’s 2017 ARKit demo, one of the few times Apple actually seemed vaguely excited about mobile gaming? That is definitely AR.
We see a progression here. Apple is peeling away the faddy layers of augmented reality in the belief or hope there’s something amazing at its core, past the games that look great on paper but aren’t much fun in reality and, well, Memojis.
However, Apple can’t get there without the notch and LiDAR, or some replacement for them. They are the tools with which it mines for advancements that will secure the growth of future generations of iPhones. And it helps that rivals have done no more than dabble in this area. Remember when depth-calculating “Time of Flight” cameras in numerous Android phones caught a lot of attention in 2019, then did almost nothing useful and were largely forgotten about?
Google’s fickle interest in AR means this is Apple’s patch, for now. But you sometimes end up wishing Apple were a bit more like Google.
Not in its approach to privacy, or for the world-devouring ad- and algorithm-filled maw you find when you lift Google’s friendly-looking lid. But a bunch of Google’s most important software features were written about openly, in some detail, on its AI Blog long before they appeared in any phones. Apple’s development is a hidden thing.
We could guess Apple might have plans in its other areas of interest. Perhaps it wants to use AR camera tech and machine learning to analyse that curious rash on your stomach, that lump on your neck, as part of HealthKit? It’s like WebMD, but the end result is an appointment with a doctor rather than anxious posts on Reddit and Mumsnet. Perhaps the notch and LiDAR could help dispense physiotherapy remotely — with or without the guidance of a medical professional?
Perhaps the Apple Car really is just parked around the corner, and for years the AR trinkets we’ve seen are the byproducts of all the work Apple engineers have put into intelligent cameras for autonomous vehicles?
But we should not adopt the Elon Musk fan’s mindset and assume a company or its figurehead always knows the best route forward. Here is an eye-opener for you. Did you know Steve Jobs was reluctant even to allow third-party apps on the iPhone and had to be convinced otherwise by his execs? Or that he wasn’t even keen on the idea of making a phone at all originally?
Success in Big Tech comes from a million broken prototypes, exploring countless dead-end ideas, timing, luck and a lot of money. Apple is playing the AR long game, with the hope that the way we even talk about it in 2021 will soon seem as stilted and archaic as saying the “https://www” part at the beginning of web addresses. Maybe it will even pay off.
This article was originally published by WIRED UK