via The Verge:
Normally, video calls tend to make it look like both participants are peering off to one side or the other, since they’re looking at the person on their display, rather than directly into the front-facing camera. However, the new “FaceTime Attention Correction” feature appears to use some kind of image manipulation to correct this, and results in realistic-looking fake eye contact between the FaceTime users.
It looks like this will be limited to the iPhone XS and XS Max, I'm guessing because of the Face ID camera hardware in those phones, as well as Apple's claim that their chips have dedicated machine learning hardware onboard.
This tweet is a good example of how this looks:
How iOS 13 FaceTime Attention Correction works: it simply uses ARKit to grab a depth map/position of your face, and adjusts the eyes accordingly. Notice the warping of the line across both the eyes and nose. pic.twitter.com/U7PMa4oNGN — Dave Schukin 🤘 (@schukin) July 3, 2019
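Apple hasn't published how the feature actually works, but the general idea the tweet describes — locate the eyes, then warp the surrounding pixels so the gaze appears to point at the camera — can be sketched in a toy form. This is purely an illustrative numpy sketch under my own assumptions, not Apple's algorithm: `warp_eye_region` and all of its parameters are hypothetical, and the taper-to-zero weighting is just one simple way to get the kind of line-bending visible in the tweet's demo.

```python
import numpy as np

def warp_eye_region(image, eye_center, radius, dy):
    """Toy gaze-correction warp (hypothetical, not Apple's method).

    Shifts the pixels inside a circular region around `eye_center`
    vertically by up to `dy` pixels, with the shift tapering to zero
    at the region's edge. The taper is what makes a straight line
    drawn across the eyes appear to bend, as in the tweet's demo.
    """
    h, w = image.shape[:2]
    out = image.copy()
    cy, cx = eye_center
    ys, xs = np.mgrid[0:h, 0:w]
    dist = np.sqrt((ys - cy) ** 2 + (xs - cx) ** 2)
    # Full shift at the eye center, fading to no shift at the edge.
    weight = np.clip(1.0 - dist / radius, 0.0, 1.0)
    src_y = np.clip((ys + weight * dy).round().astype(int), 0, h - 1)
    out[ys, xs] = image[src_y, xs]
    return out

# Demo: a straight horizontal line bends upward near the "eye".
img = np.zeros((30, 50))
img[10, :] = 1.0
warped = warp_eye_region(img, eye_center=(10, 10), radius=5, dy=3)
```

Far from the eye center the line stays on row 10; near the center it shifts up, which is exactly the kind of local warping the overlaid line in the video exposes.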
I'm honestly not really sure how I feel about this feature. It "solves" a problem that has always existed with video calls: you aren't making eye contact with the other person, because you're looking at their face on your screen rather than staring directly into the camera. On the other hand, this type of image manipulation feels sort of creepy.