LiDAR in Apple Devices: Enhancing 3D Imaging

November 13, 2023 | Updated February 9, 2026 | 2 min read

Close-up of the rear camera module on an Apple device, featuring dual lenses and a LiDAR sensor.
Tech giant Apple began including LiDAR sensors in its mobile devices with the 2020 iPad Pro, and today all of its top-end smartphones carry the feature. In fact, you could argue that the majority of Apple’s iPhone and iPad products have 3D imaging capabilities if you include Face ID. But there are some key differences between the rear-mounted LiDAR on the iPhone and the front-facing Face ID assembly.

From an article in Tech HQ

The Face ID module projects a patterned point cloud of more than 30,000 infrared dots onto surfaces roughly an arm’s length (25 – 50 cm) from the device. If the target surface were perfectly flat, the pattern received by the infrared camera within the Face ID hardware assembly would match the optically generated pattern emitted from the iPhone or iPad.

Any variation in the surface reflecting those infrared dots back to the camera shifts the points either closer together or farther apart. These deviations can be used to infer the shape and physical characteristics of the object being imaged – in this case, the user’s face.
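To make the triangulation idea concrete, here is a minimal Python sketch of how a single dot’s sideways shift (disparity) between the emitted pattern and the camera image maps to depth. The focal length, baseline, and pixel values are illustrative assumptions, not Apple’s actual optics.

```python
def depth_from_disparity(focal_px, baseline_m, disparity_px):
    """Triangulated depth for one projected dot.

    focal_px:     camera focal length, in pixels (assumed value)
    baseline_m:   projector-to-camera separation, in metres (assumed)
    disparity_px: shift of the dot from its flat-reference position
    """
    if disparity_px <= 0:
        raise ValueError("dot must shift along the projector-camera baseline")
    return focal_px * baseline_m / disparity_px

# Illustrative numbers only: a 600 px focal length, 25 mm baseline,
# and a 50 px dot shift imply a surface 0.3 m away.
print(depth_from_disparity(600, 0.025, 50))  # prints 0.3
```

Dots reflected by nearer surfaces shift more than dots reflected by farther ones, which is why the same inverse relationship lets one projected pattern recover a whole depth map.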

In contrast, the LiDAR sensor found on the rear of Apple’s products emits fewer infrared marker points, comprising a regular grid of 24 x 24 dots. However, each dot is brighter than the Face ID projected points, which gives the LiDAR on the back of handsets and tablets a much greater working range of up to 5 m.

And not only can the system determine shape information from distortions in the grid seen when the infrared dots are projected onto the scene in front of the sensor, it can also capture depth information based on time-of-flight: laser pulses emitted by the LiDAR chip return sooner from nearby surfaces than from distant ones, so the round-trip time of each pulse reveals how far away the reflecting surface is.
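The time-of-flight relationship itself is simply distance = (speed of light × round-trip time) / 2, since the pulse travels out and back. A minimal sketch with illustrative numbers, not anything specific to Apple’s hardware:

```python
C = 299_792_458.0  # speed of light in vacuum, m/s

def distance_from_round_trip(t_seconds):
    """Distance implied by a laser pulse's round-trip time of flight."""
    return C * t_seconds / 2.0

# A surface 3 m away returns a pulse after roughly 20 nanoseconds:
round_trip = 2 * 3.0 / C          # ~2.0e-8 s
print(distance_from_round_trip(round_trip))  # ~3.0 m
```

The tiny times involved (nanoseconds over the sensor’s 5 m range) are why time-of-flight hardware needs very fast detectors rather than ordinary camera pixels.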

Patents such as US-20200256669 show that Apple uses a sparse array of single-photon avalanche diodes (SPADs) to perform a kind of optical stocktake on the whereabouts of the infrared range-finding LiDAR emissions. And the iPhone maker’s TrueDepth Face ID design also features innovations that allow the hardware to perceive distance more accurately, albeit on different length scales.
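One common way a SPAD-based receiver pins down a return time is to histogram many photon-arrival timestamps and take the peak bin: signal photons from the laser pulse cluster at one delay, while ambient light and dark counts scatter across bins. The sketch below illustrates that general idea with synthetic data; the 1 ns bin width, function name, and numbers are assumptions for illustration, not details taken from the patent.

```python
from collections import Counter

C = 299_792_458.0  # speed of light, m/s
BIN_S = 1e-9       # 1 ns timing bins (illustrative resolution)

def distance_from_spad_hits(arrival_times_s):
    """Estimate range from noisy SPAD trigger timestamps.

    Histogram the timestamps into fixed bins and treat the most
    populated bin's centre as the pulse's round-trip time.
    """
    bins = Counter(int(t / BIN_S) for t in arrival_times_s)
    peak_bin, _count = bins.most_common(1)[0]
    round_trip = (peak_bin + 0.5) * BIN_S
    return C * round_trip / 2.0

# Synthetic data: four signal photons clustered near 20 ns,
# plus two stray hits from ambient light.
hits = [20.1e-9, 20.3e-9, 20.6e-9, 5.2e-9, 33.8e-9, 20.4e-9]
print(distance_from_spad_hits(hits))  # ~3.07 m
```

Accumulating photons over many pulses like this is what lets single-photon detectors produce a stable range estimate despite each individual detection being noisy.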

For the complete article CLICK HERE.


About The Author

Gene Roe - founder of Lidar News


