
Apple secretly adds AR-powered FaceTime eye correction in iOS 13


While we knew iOS 13 was going to pack plenty of useful additions beyond headline features like Dark Mode, we didn’t expect Apple to add AR-powered eye correction to FaceTime video calls. But that appears to be exactly what it has done in the latest iOS 13 public beta.

First spotted by app designer Mike Rundle (and signal-boosted by surprised folks on Reddit’s Apple subreddit), the iOS 13 public beta now includes an option for “FaceTime Attention Correction.” According to the feature’s tooltip, turning it on will increase the accuracy of your eye contact with the camera during FaceTime video calls. What does that mean? AR black magic trickery, basically.

Haven’t tested this yet, but if Apple uses some dark magic to move my gaze to seem like I’m staring at the camera and not at the screen I will be flabbergasted. (New in beta 3!) pic.twitter.com/jzavLl1zts

— Mike Rundle (@flyosity) July 2, 2019

It all comes down to a minor but irritating flaw that FaceTime — and, admittedly, every other video-calling app — suffers from. When you look at your screen to see the person you’re talking to, you’re not looking at the camera. And because you’re not looking at the camera, you don’t appear to be looking at the person you’re calling — a weird disconnect in which nobody on the call seems to be looking directly at anyone else.

Apple’s new setting changes that, making subtle alterations to your video stream so that you appear to be looking directly at the person on the other end of the call. People were quick to try it out, and they immediately noticed that the setting is actually fairly effective.

So how does it work? It’s a combination of Apple’s ARKit augmented reality software and the TrueDepth camera built into the latest iPhones. FaceTime uses the TrueDepth camera to grab a depth map of your face — much like Face ID does — and then runs that data through ARKit, generating a subtly adjusted version of your eyes and nose that appears to be focused on the camera. Thanks to the processing power of the most recent iPhones, this happens in real time, making the process seamless. In a video, Dave Schukin shows how it’s done.

How iOS 13 FaceTime Attention Correction works: it simply uses ARKit to grab a depth map/position of your face, and adjusts the eyes accordingly.

Notice the warping of the line across both the eyes and nose. pic.twitter.com/U7PMa4oNGN

— Dave Schukin (@schukin) July 3, 2019
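Apple hasn’t documented how the correction works under the hood, but ARKit’s face-tracking API (which requires the TrueDepth camera) already exposes the kind of data Schukin describes. The snippet below is a minimal Swift sketch of reading that data, namely the face anchor’s gaze point and per-eye transforms. It is not Apple’s actual FaceTime pipeline: the GazeEstimator class is a hypothetical name, and the real feature presumably feeds similar data into its own warp of the eye and nose region.

import ARKit

// A rough sketch (not Apple's implementation) of how ARKit's face tracking,
// driven by the TrueDepth camera, exposes gaze data that an attention-
// correction pass could build on. "GazeEstimator" is a hypothetical name.
final class GazeEstimator: NSObject, ARSessionDelegate {
    private let session = ARSession()

    func start() {
        // Face tracking is only supported on devices with a TrueDepth camera.
        guard ARFaceTrackingConfiguration.isSupported else { return }
        session.delegate = self
        session.run(ARFaceTrackingConfiguration())
    }

    func session(_ session: ARSession, didUpdate anchors: [ARAnchor]) {
        guard let face = anchors.compactMap({ $0 as? ARFaceAnchor }).first else { return }

        // Where the user is looking, in face-anchor coordinates (meters).
        // The x/y offset from zero is roughly how far the gaze misses the
        // camera axis, i.e. what a correction pass would compensate for.
        let gaze = face.lookAtPoint

        // Per-eye transforms relative to the face anchor; a renderer could use
        // these (plus the depth map) to subtly re-aim the eye region of the frame.
        let leftEyePosition = face.leftEyeTransform.columns.3
        let rightEyePosition = face.rightEyeTransform.columns.3

        print("gaze offset:", gaze.x, gaze.y, "eyes:", leftEyePosition, rightEyePosition)
    }
}

In practice, the corrected frame would come from warping the camera image around those eye and nose positions, which is exactly the distortion Schukin’s straight-line test makes visible.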

As ever, there’s a catch to this amazing new feature: it’s only available on the most recent iPhones, so only iPhone XS and XS Max owners can currently try it. Despite packing the same hardware, the iPhone X misses out. But with Apple being Apple, don’t be surprised if the feature reaches the iPhone X in the full release of iOS 13, or shortly afterward. At the moment it’s also unknown whether it will come to macOS and iPadOS — but we’d be surprised if it didn’t.
