Apple is set to introduce a rear-facing 3D camera to the 2020 iPhone.
The camera combines a laser, a sensor, and software: the laser emits light, and the system measures the distance between the device and the objects and surfaces in front of it. This rear camera will enable advanced photo and video effects in addition to augmented reality experiences.
Apple's engineers have been working on this rear camera for around two years. Though it was on the list of possible additions to the 2020 iPhone, it wasn't confirmed until now, and this will be the first time iPhone users see a rear camera/sensor of this kind in the device. Apple will purchase the laser for the new 3D camera from the San Jose-based company Lumentum, the same company that supplies lasers for the iPhone's front-facing 3D camera. However, Apple isn't the first company to add a rear-facing depth camera to its devices. Samsung already added the feature to its Galaxy Note 10+, Galaxy S20+ and Galaxy S20 Ultra phones, as have other Android device makers. But Apple can still stand out through how it develops the technology behind the camera/sensor feature.

iPhones already have depth cameras on the front, used primarily for Face ID security and for fun camera effects. The iPhone 11 Pro and the iPhone 11 Pro Max have three camera lenses on the rear: a 12-megapixel wide-angle, a 12-megapixel 2x telephoto, and a 12-megapixel ultra-wide-angle lens. While these three lenses provide breadth for scene photos, the 3D camera would add depth.
How the depth camera works
The primary technology behind the depth camera system is a rear-facing vertical-cavity surface-emitting laser (VCSEL), which continuously emits light waves and measures the time each ray takes to bounce off objects in the environment and return to the sensor. Light returning from nearby objects has a shorter "time of flight," while light returning from distant objects has a longer one. The light emitted by the front-facing depth camera in an iPhone can travel only a few feet, which is fine for a person standing nearby, but the Lumentum laser to be used in the rear-facing camera system would have a longer range.
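The time-of-flight principle described above can be sketched in a few lines of Python. The 20-nanosecond round-trip time below is a hypothetical value for illustration; Apple has not published specifications for the sensor.

```python
# Sketch of the time-of-flight principle: distance is recovered from
# the round-trip travel time of an emitted light pulse.
SPEED_OF_LIGHT = 299_792_458.0  # metres per second

def distance_from_time_of_flight(round_trip_seconds: float) -> float:
    """Distance to an object given the round-trip time of a light pulse.

    The pulse travels to the object and back, so the one-way distance
    is half the total path length.
    """
    return SPEED_OF_LIGHT * round_trip_seconds / 2.0

# A pulse returning after roughly 20 nanoseconds corresponds to an
# object about 3 metres away -- the kind of extended range a
# rear-facing sensor would need beyond the front camera's few feet.
print(round(distance_from_time_of_flight(20e-9), 2))
```

The factor of two is the key detail: a sensor that forgets the light travels out *and* back will report objects at twice their true distance.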
The 3D camera on the Samsung phones powers two features. One is Live Focus, which lets users blur out the background in still images and video, putting added emphasis on the person or object in the foreground. The second is Quick Measure, which lets the user estimate the width, height, area, and volume of any object within the camera frame. Apple might add one or two more features, such as AR functionality, to improve on the rear-facing 3D camera.
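The geometry behind a measuring feature like Quick Measure can be illustrated with a simple pinhole-camera sketch. Neither Samsung nor Apple has published its algorithms, so the function below and its 60-degree field-of-view default are assumptions for illustration only.

```python
import math

def object_width(depth_m: float, span_fraction: float,
                 fov_degrees: float = 60.0) -> float:
    """Estimate an object's real-world width from its measured depth.

    Uses the pinhole-camera model: the frame's half-width corresponds
    to tan(fov / 2), so an object occupying a fraction of the frame at
    a known depth has width 2 * depth * tan(fov / 2) * fraction.

    depth_m: distance to the object, as reported by the depth sensor.
    span_fraction: fraction of the image width the object occupies (0..1).
    fov_degrees: horizontal field of view of the lens (assumed value).
    """
    half_frame = math.tan(math.radians(fov_degrees) / 2.0)
    return 2.0 * depth_m * half_frame * span_fraction

# An object 2 m away that fills half the frame of a 60-degree lens
# works out to roughly 1.15 m wide.
print(round(object_width(2.0, 0.5), 2))
```

This is why depth data matters: the camera alone sees only the object's angular size, and the time-of-flight distance is what converts that angle into metres.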
The 2020 iPhone will be Apple's first 5G iPhone. Rear 3D sensing will also draw more attention to photo effects, which should drive considerable demand for the new iPhone among smartphone users. Apple's 3D mapping might be used in combination with the phone's other photo software. Apple is also developing an AR app for iOS 14 that will let users point their iPhone at items in Apple Stores and Starbucks and instantly see digital information displayed on their screens. The rear-facing 3D cameras would also let users create digital content that can readily be shared on social media.