Apple’s LiDAR Integration in iPhone 13 Pro Explained

February 4, 2023
|
Updated February 9, 2026
|

2 min read

Close-up of the iPhone 13 Pro camera module featuring LiDAR sensor.
For the past six months I’ve been staring at the back of my iPhone 13 Pro, wondering what possessed Apple to build a Light Detection and Ranging (LiDAR) camera into its flagship smartphone.

From an article in The Register by Mark Pesce.

It’s not as though you need a time-of-flight depth camera, sensitive enough to chart the reflection time of individual photons, to create a great portrait. That’s like swatting a fly with a flamethrower – fun, but ridiculous overkill. There are more than enough cameras on the back of my mobile to be able to map the depth of a scene – that’s how Google does it on its Pixel phones. So what is Apple’s intention here? Why go to all this trouble?
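The arithmetic behind that photon-level sensitivity is simple to sketch. The snippet below is an illustration of the general time-of-flight principle, not anything specific to Apple’s sensor: distance is half the pulse’s round-trip time multiplied by the speed of light.

```python
# Time-of-flight ranging in a nutshell: the sensor emits a light pulse
# and times how long its reflection takes to come back. Distance is
# half the round trip multiplied by the speed of light.

SPEED_OF_LIGHT = 299_792_458.0  # metres per second

def tof_distance(round_trip_seconds: float) -> float:
    """Distance to the target in metres, from a pulse's round-trip time."""
    return SPEED_OF_LIGHT * round_trip_seconds / 2.0

# A reflection arriving 20 nanoseconds after the pulse left puts the
# target about 3 metres away -- indoor-room scale.
print(round(tof_distance(20e-9), 2))
```

Note the timescales involved: resolving centimetres of depth means resolving tens of picoseconds of delay, which is why these sensors were historically exotic and expensive.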

The answer lies beyond the iPhone, and points to what comes next.

In the earliest days of virtual reality, thirty years ago, the biggest barrier to entry was compute capacity necessary to render real-time three-dimensional graphics. Back in 1992, systems capable of real-time 3D looked like supercomputers and cost hundreds of thousands of dollars.

For the computer to see the world, it must be able to capture the world. This has always been hard and expensive. It requires supercomputer-class capabilities, and sensors that cost tens of thousands of dollars … Wait a minute. This is sounding oddly familiar, isn’t it?

Until just two years ago, LiDAR systems cost hundreds to thousands of dollars. Then Apple added a LiDAR camera to the back of its iPad Pro and iPhone 12 Pro. Suddenly a technology that had been rare and expensive became cheap and almost commonplace. The component cost of LiDAR dropped by two orders of magnitude – from hundreds of dollars per unit to a few dollars apiece.

Apple needed to do this because the company’s much-rumored AR spectacles will necessarily sport several LiDAR cameras, feeding their M1-class SoC with a continuous stream of depth data so that the mixed reality environment managed by the device maps neatly and precisely onto the real world. As far as Apple is concerned, the LiDAR on my iPhone doesn’t need to do much beyond drive component costs down for its next generation of hardware devices.

The complete article on what possessed Apple is available at The Register.


About The Author

Gene Roe - founder of Lidar News


