Apple is bullish on lidar, a new technology shipping with the iPhone 12 family. (The Pro Max arrives in a few weeks.)
Take a closer look at one of the new iPhone 12 Pro models and you'll see a small black dot near the camera lenses, about the same size as the flash. That's the lidar sensor, and it's a new type of depth sensing that could make a difference in a number of interesting ways.
If Apple has its way, lidar is a term you'll start hearing a lot now, so let's break down what we know, what Apple intends to use it for, and where the technology could go next.
Lidar stands for light detection and ranging, and it has been around for a while. It works by firing lasers at objects and measuring how long the light takes to bounce back to the source; that travel time, or "time of flight," of the light pulse yields the distance.
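The distance math behind time of flight is simple: the pulse travels out and back, so the one-way distance is the speed of light times half the round-trip time. A minimal sketch (the sample timing value is illustrative, not a real sensor reading):

```python
# Time-of-flight ranging: a light pulse travels out and back,
# so distance = speed_of_light * round_trip_time / 2.
C = 299_792_458  # speed of light in m/s

def tof_distance_m(round_trip_seconds: float) -> float:
    """Convert a measured round-trip flight time to a one-way distance."""
    return C * round_trip_seconds / 2

# A pulse that returns after ~33.4 nanoseconds bounced off an object
# about 5 meters away -- the maximum range Apple quotes for its sensor.
print(round(tof_distance_m(33.36e-9), 2))  # -> 5.0
```

Note how tiny those intervals are: at everyday distances, flight times are measured in nanoseconds, which is why this takes dedicated sensor hardware.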
How does lidar work to sense depth?
Lidar is a type of time-of-flight camera. Some other smartphones measure depth with a single light pulse, whereas a smartphone with this type of lidar sends out a spray of infrared dots in waves of light pulses and measures each one with its sensor, creating a field of points that maps out distances. From that, the device can "mesh" the dimensions of a space and the objects within it. The light pulses are invisible to the human eye, but you could see them with a night-vision camera.
Isn’t this the same as Face ID on iPhone?
The idea is the same, but the range is longer. Face ID's TrueDepth camera also shoots out an array of infrared lasers, but it only works up to a few feet away. The rear lidar sensors on the iPad Pro and iPhone 12 Pro work at a range of up to 5 meters.
Lidar is already in many different technologies.
Lidar is a technology that's sprouting up everywhere. Augmented reality headsets use similar techniques to map out a space before layering 3D virtual objects into it. But it also has a pretty long history.
Microsoft's old depth-sensing Xbox accessory, the Kinect, was a camera with infrared depth scanning, too. PrimeSense, the company that helped create the Kinect's technology, was later acquired by Apple. Now we have Apple's face-scanning TrueDepth camera and rear lidar sensor.
The iPhone 12 Pro's camera can work better with lidar.
Time-of-flight cameras on smartphones tend to be used to improve focus accuracy and speed, and the iPhone 12 Pro does the same. Apple promises up to 6x faster focus in low-light conditions. The lidar depth sensing is also used to improve night portrait mode effects.
Better focus is a plus, and the iPhone 12 Pro could also add more 3D depth data to images. Although that element hasn't been laid out yet, Apple's front-facing, depth-sensing TrueDepth camera has been used in a similar way by apps.
It will also greatly improve augmented reality.
With lidar, your iPhone 12 Pro can launch AR apps much faster and build a quick map of your room to add more detail. AR apps can use lidar to hide virtual objects behind real ones (a technique called occlusion) and to place them within more complex spatial maps, such as on a table or chair.
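At its core, occlusion is a per-pixel depth comparison: the virtual object is only drawn where it sits closer to the camera than the real surface lidar measured. A toy sketch of that test, with made-up depth values:

```python
# Per-pixel occlusion test: draw the virtual object's pixel only where
# it is closer to the camera than the real surface measured by lidar.
# Depth values below are illustrative, in meters.
real_depth    = [2.0, 2.0, 0.5, 0.5]   # lidar-measured scene depth per pixel
virtual_depth = [1.0, 1.0, 1.0, 1.0]   # virtual object placed 1 m away

visible = [v < r for v, r in zip(virtual_depth, real_depth)]
print(visible)  # [True, True, False, False]
```

The last two pixels are hidden because something real (say, a chair at 0.5 m) sits in front of the virtual object, which is exactly the effect that makes AR scenes feel anchored in a room.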
But there's a longer tail of potential beyond that. Many companies are dreaming of headsets that will mix virtual and real objects: AR glasses that rely on advanced 3D maps of the world to layer virtual objects onto it.
These 3D maps are currently being built with special scanners and equipment, much like the world-scanning cars behind Google Maps. But it's possible that people's own devices will eventually help crowdsource that information or add fresh, on-the-spot data. Again, AR headsets like Magic Leap and HoloLens already pre-scan your environment before layering items into it, and Apple's lidar-equipped AR technology works the same way. In that sense, the iPhone 12 Pro and iPad Pro are like AR headsets without the headset part, and they could ultimately pave the way for Apple to make its own glasses.
3D scanning can be a killer app.
Lidar can be used to mesh out 3D objects and rooms and layer photographic imagery on top, a technique called photogrammetry. That could be the next wave of capture technology for practical uses such as journalism or social media. The ability to capture 3D data and share that information with others could turn these lidar-equipped phones and tablets into 3D content-capture tools. Lidar can also be used without the camera element to acquire measurements of objects and spaces.
Apple wasn’t the first to explore this technology in mobile phones.
Google had the same idea in mind when it created Tango, an early AR platform. Its advanced camera array also had infrared sensors and could map rooms, generating 3D scans and depth maps for AR and for measuring indoor spaces. Phones with Google's Tango were short-lived, replaced by computer vision algorithms that did estimated depth sensing on cameras without needing the same hardware. But Apple's iPhone 12 Pro looks like a much more advanced successor.