Step by step, Tim Cook and his team are putting all the pieces in place to introduce The Next Big Thing in 2023 - Apple Glasses, the device that is meant to take us from the smartphone era to one where wearables are the key to our existence in the information age. The latest chess move was revealed today in the form of the iPad Pro 2020's new secret weapon: a ToF sensor, which Apple calls the LiDAR Scanner.

But what the heck is LiDAR, why does Apple call its ToF LiDAR, and why is it so important to the future of Apple AR Glasses - and the future of computers themselves?

What is LiDAR?

How does LiDAR - Light Detection And Ranging - work?

In short, a laser - usually infrared - fires a pattern across whatever is in front of it, illuminating the scene. The sensor then measures the time it takes for each fired photon to return to the emitter.

By measuring the time it takes to return - hence its name, the time of flight of a fired photon - you can calculate the distance between the camera and any objects hit by the photons.
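
Expressed in code, the arithmetic is simple. Here is a minimal sketch in Swift (the function is mine, purely for illustration); the only subtlety is dividing by two, because the measured time covers the round trip:

```swift
// Time-of-flight distance: the photon travels out and back,
// so the one-way distance is half the round trip.
let speedOfLight = 299_792_458.0  // meters per second

func distance(fromRoundTripTime t: Double) -> Double {
    // t: measured time of flight in seconds
    return speedOfLight * t / 2.0
}

// A photon that returns after ~33 nanoseconds hit something about 5 m away.
print(distance(fromRoundTripTime: 33.3e-9))  // ~4.99 m
```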

Furthermore, algorithms can analyze how the projected pattern deforms across surfaces, which gives a better understanding of the objects LiDAR measures and thus provides useful data to AI for object identification.

There are other complications - nearby light sources and concave surfaces, for example, can confuse a time-of-flight sensor - but these can be corrected with additional algorithms.
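
As a purely illustrative example - not any vendor's actual algorithm - one simple correction is to throw out depth readings that disagree sharply with their neighbors, for instance by snapping them to the local median:

```swift
// Illustrative sketch: reject implausible depth readings along one
// scan row by comparing each value to the median of its neighborhood.
func filterOutliers(_ depths: [Float], window: Int = 5,
                    maxDeviation: Float = 0.5) -> [Float] {
    var result = depths
    let half = window / 2
    for i in depths.indices {
        let lo = max(0, i - half)
        let hi = min(depths.count - 1, i + half)
        let median = Array(depths[lo...hi]).sorted()[(hi - lo) / 2]
        // A reading far from its local median is likely noise:
        // replace it rather than trust it.
        if abs(depths[i] - median) > maxDeviation {
            result[i] = median
        }
    }
    return result
}
```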

With the near-instantaneous distance measurements delivered by the ToF sensor on the iPad - or on the Samsung Galaxy S20 Ultra - the processor can derive precise 3D geometric data about the world around it.
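
As a rough illustration of what that geometric data looks like, here is a minimal Swift sketch - hypothetical types and names, not Apple's or Samsung's API - that turns a single per-pixel depth reading into a 3D point using the standard pinhole camera model:

```swift
import simd

// Pinhole camera intrinsics: focal lengths (fx, fy) and
// principal point (cx, cy), all in pixels.
struct CameraIntrinsics {
    let fx: Float, fy: Float, cx: Float, cy: Float
}

// Unproject one depth reading into camera space (meters).
func unproject(pixelX: Float, pixelY: Float, depth: Float,
               intrinsics k: CameraIntrinsics) -> SIMD3<Float> {
    let x = (pixelX - k.cx) * depth / k.fx
    let y = (pixelY - k.cy) * depth / k.fy
    return SIMD3<Float>(x, y, depth)
}
```

Run over every pixel of the depth map, this produces the point cloud the processor works with.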

Why is Apple calling this sensor LiDAR instead of ToF?

Apple has chosen to call the iPad's 3D sensor a "LiDAR sensor" for reasons I don't fully understand. The fact is that it is the same form of ToF sensor that Samsung and other manufacturers ship in their devices under the ToF name. Those sensors are mostly made by Sony, which has 95% of the market share for this type of technology. In fact, Apple is also allegedly using the same ToF sensors from Sony.

Perhaps Apple, always a master marketer, did this to differentiate itself from everyone else, knowing that, one, everyone else uses the term ToF and, two, not many people have any idea what ToF means. It is one of those obscure technical terms that gets thrown around in spec sheets.

At least people have heard of LiDAR for some time. Many people know it as the sensor technology that helps autonomous cars recognize the world around them and navigate safely. In fact, Apple has been working with LiDAR for some time on its Titan project, the infamous Apple Car that never saw the light of day.

Apple also claims on its website that NASA is using the same technology on the 2020 Mars rover, and that's true - Curiosity uses it too. Just not the exact sensor Apple uses, but far more advanced ones - several of them, in fact - and for more than terrain navigation.

Does it work well?

During Apple's virtual press preview for the new iPad Pro, Tom's Guide editor-in-chief Mark Spoonauer got a look at some of the AR apps that will be available and optimized for the iPad Pro's LiDAR scanner, including a game called Hot Lava that brings (you guessed it) hot lava right into your living room, along with a very realistic-looking main character.

The Complete Anatomy app was also impressive, as it can show you in real time which muscles the person in front of you is using when moving. And IKEA Place looked much better, as Apple's A12Z Bionic chip makes setting up your virtual room instantaneous. Presumably the power of Apple Glasses will rest in your iPhone or iPad, but the glasses themselves will have a LiDAR scanner on board.
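
For developers, the LiDAR scanner surfaces through ARKit 3.5's scene reconstruction, which is what apps like these opt into. A minimal sketch of the setup, assuming a LiDAR-equipped device:

```swift
import ARKit

// Scene reconstruction asks ARKit to build a live triangle mesh
// of the surroundings, delivered as ARMeshAnchor objects.
let configuration = ARWorldTrackingConfiguration()
if ARWorldTrackingConfiguration.supportsSceneReconstruction(.mesh) {
    configuration.sceneReconstruction = .mesh
}

// `session` would be the ARSession of your ARView or ARSCNView:
// session.run(configuration)
```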

Why LiDAR is crucial for Apple Glasses

Apple Glasses will need this 3D sensing technology operating at 110% of its capabilities when they come out in 2023. As the iPad Pro will show, truly compelling augmented reality applications need immediate and continuous knowledge of their surroundings to convincingly anchor virtual images to the real world.

Knowing what the world around you looks like in 3D at all times allows you to accurately position 3D objects in the world, apply textures to real-world objects, and place virtual objects behind physical ones. If you want your new fantastical, mixed-reality world to feel real to your eyes, you need all of these capabilities working constantly at a high frame rate.
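
Occlusion in particular becomes almost free once the LiDAR mesh exists. In RealityKit, for example, it is a one-line opt-in on LiDAR devices - a minimal sketch:

```swift
import ARKit
import RealityKit

// With scene understanding enabled, RealityKit uses the reconstructed
// mesh to clip virtual content behind real-world geometry.
let arView = ARView(frame: .zero)
arView.environment.sceneUnderstanding.options.insert(.occlusion)
```

This is how connected reality can look on Apple Glasses - albeit a dystopian reality.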
