Apple is making a big deal of lidar on the iPhone 12 Pro and above: What it is and why it matters

The lidar sensor on the iPhone 12 Pro (the black circle in the lower right corner of the camera unit) opens up AR possibilities.

Apple

Apple is bullish on lidar, a brand-new technology in the iPhone 12 family, specifically the iPhone 12 Pro and iPhone 12 Pro Max. (The iPhone 12 Pro is on sale now; the iPhone 12 Pro Max follows in a few weeks.)

Take a close look at one of the new iPhone 12 Pro models, or the latest iPad Pro, and you'll see a small black dot near the camera lenses, about the same size as the flash. That's the lidar sensor, and it's a new type of depth sensing that could make a difference in a number of interesting ways.

If Apple has its way, lidar is a term you'll start hearing a lot, so let's break down what we know, what Apple is going to use it for and where the technology could go next.

What does lidar mean?

Lidar stands for light detection and ranging, and it's been around for a while. It uses lasers to ping off objects and return to the source of the laser, measuring distance by timing the travel, or flight, of the light pulse.
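
The underlying math is simple enough to sketch: a pulse travels out and back, so the one-way distance is half the round-trip time multiplied by the speed of light. The tiny Swift function below just illustrates that idea; it isn't Apple's implementation.

```swift
import Foundation

/// Speed of light in meters per second.
let speedOfLight = 299_792_458.0

/// Rough time-of-flight estimate: the pulse travels out and back,
/// so the one-way distance is half the round trip.
func distance(forRoundTripTime seconds: Double) -> Double {
    return speedOfLight * seconds / 2.0
}

// A round trip of about 33 nanoseconds corresponds to roughly 5 meters.
print(distance(forRoundTripTime: 33e-9))  // ≈ 4.95 m
```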

How does lidar work to sense depth?

Lidar is a type of time-of-flight camera. Some other smartphones measure depth with a single light pulse, whereas a smartphone with this type of lidar technology sends out waves of light pulses in a spray of infrared dots and measures each one with its sensor, creating a field of points that map out distances and can "mesh" the dimensions of a space and the objects in it. The light pulses are invisible to the human eye, but you could see them with a night vision camera.
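
To give a rough sense of how that point field comes together, each depth reading can be turned into a 3D point with standard pinhole-camera math. The sketch below assumes ARKit-style inputs (a per-pixel depth such as ARFrame.sceneDepth provides on lidar devices, plus the camera intrinsics), but the unprojection itself is generic.

```swift
import simd

/// Unproject one depth-map pixel into a 3D point in the camera's
/// coordinate space using the pinhole camera model.
/// - Parameters:
///   - u, v: pixel coordinates in the depth map
///   - z: measured depth in meters at that pixel
///   - K: 3x3 camera intrinsics (focal lengths and principal point)
func unproject(u: Float, v: Float, depth z: Float,
               intrinsics K: simd_float3x3) -> SIMD3<Float> {
    let fx = K[0][0], fy = K[1][1]   // focal lengths in pixels
    let cx = K[2][0], cy = K[2][1]   // principal point
    let x = (u - cx) * z / fx
    let y = (v - cy) * z / fy
    return SIMD3<Float>(x, y, z)
}
```

Running that over every pixel of a depth map produces the point cloud that apps then "mesh" into surfaces.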

The iPad Pro released in the spring also has lidar.

Scott Stein / CNET

Isn’t this the same as Face ID on iPhone?

It does, but with a shorter range. The idea is the same: Apple's TrueDepth camera, which enables Face ID, also shoots out an array of infrared lasers, but it only works up to a few feet away. The rear lidar sensors on the iPad Pro and iPhone 12 Pro work at a range of up to 5 meters.

Lidar is already in many different technologies.

Lidar is a technology that's sprouting up everywhere. It's used in self-driving cars and driver-assistance systems. It's used in robotics and drones. Augmented reality headsets like the HoloLens 2 use similar tech to map room space before layering 3D virtual objects into it. Lidar also has a pretty long history.

Microsoft's old depth-sensing Xbox accessory, the Kinect, was also a camera with infrared depth scanning. In fact, PrimeSense, the company that helped create the Kinect's technology, was acquired by Apple in 2013. Now we have Apple's TrueDepth face-scanning camera and rear lidar sensor.

Remember the Kinect?

Sarah Tew / CNET

The iPhone 12 Pro's camera works better with lidar.

Time-of-flight cameras on smartphones tend to be used to improve focus accuracy and speed, and the iPhone 12 Pro does the same. Apple promises low-light focus that's up to six times faster. The lidar depth sensing is also used to improve night portrait mode effects.

Better focus is a plus, and there's also a chance the iPhone 12 Pro could add more 3D photo data to images. That element hasn't been rolled out yet, but Apple's front-facing, depth-sensing TrueDepth camera has been used in a similar way by apps.

Snapchat is already using the iPhone 12 Pro's lidar to enable AR lenses.

Snapchat

It will also greatly improve augmented reality.

With lidar, the iPhone 12 Pro can start AR apps much more quickly and build a fast map of a room to add extra detail. A lot of Apple's AR updates in iOS 14 use lidar to hide virtual objects behind real ones (called occlusion) and to place virtual objects within more complicated room mappings, like on a table or chair.
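
For developers, switching this on looks roughly like the sketch below. The API names (ARWorldTrackingConfiguration, scene reconstruction, RealityKit's scene-understanding occlusion option) are the ones Apple exposes for lidar devices, though how an actual app wires them up will vary.

```swift
import ARKit
import RealityKit

// Minimal sketch: enable lidar scene reconstruction and let RealityKit
// use the resulting room mesh to occlude virtual objects.
func configureLidarAR(on arView: ARView) {
    let configuration = ARWorldTrackingConfiguration()

    // The room "mesh" is only supported on devices with the lidar scanner.
    if ARWorldTrackingConfiguration.supportsSceneReconstruction(.mesh) {
        configuration.sceneReconstruction = .mesh
    }

    // Hide virtual objects behind real-world geometry (occlusion).
    arView.environment.sceneUnderstanding.options.insert(.occlusion)

    arView.session.run(configuration)
}
```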

But there's extra potential beyond that, with a longer tail. Plenty of companies are dreaming up headsets that will blend virtual objects with real ones: the AR glasses being worked on by Facebook, Qualcomm, Snapchat, Microsoft, Magic Leap and most likely Apple, among others, will rely on advanced 3D maps of the world on which to layer virtual objects.

Those 3D maps are currently being built with special scanners and equipment, almost like a world-scanning version of those Google Maps cars. But there's also a possibility that people's own devices could eventually help crowdsource that information, or add extra on-the-fly data. Again, AR headsets like the Magic Leap and HoloLens already pre-scan an environment before layering items into it, and Apple's lidar-equipped AR tech works the same way. In that sense, the iPhone 12 Pro and iPad Pro are like AR headsets without the headset part, and they could pave the way for Apple to eventually make its own glasses.

3D room scanning in Occipital's Canvas app, enabled by the depth-sensing lidar on the iPad Pro. Expect the same for the iPhone 12 Pro.

Occipital

3D scanning can be a killer app.

Lidar can be used to mesh out 3D objects and rooms and layer photo imagery on top, a technique called photogrammetry. That could be the next wave of capture tech for practical uses like home improvement, or even social media and journalism. The ability to capture 3D data and share that info with others could turn these lidar-equipped phones and tablets into 3D content-capture tools. Lidar could also be used without the camera element to acquire measurements for objects and spaces.
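
As a sketch of that measurement use case, ARKit's raycasting (which the lidar mesh makes far more accurate on these devices) can return real-world positions for two points the user taps, and the distance between them is just the length of the difference vector. This is illustrative, not a complete measuring app.

```swift
import ARKit
import RealityKit
import simd

// Illustrative sketch: measure the real-world distance in meters between
// two screen points, using raycasts against estimated planes.
func measuredDistance(in arView: ARView,
                      from firstTap: CGPoint,
                      to secondTap: CGPoint) -> Float? {
    guard
        let first = arView.raycast(from: firstTap,
                                   allowing: .estimatedPlane,
                                   alignment: .any).first,
        let second = arView.raycast(from: secondTap,
                                    allowing: .estimatedPlane,
                                    alignment: .any).first
    else { return nil }

    // The translation column of each worldTransform is the hit position.
    let p1 = SIMD3<Float>(first.worldTransform.columns.3.x,
                          first.worldTransform.columns.3.y,
                          first.worldTransform.columns.3.z)
    let p2 = SIMD3<Float>(second.worldTransform.columns.3.x,
                          second.worldTransform.columns.3.y,
                          second.worldTransform.columns.3.z)
    return simd_distance(p1, p2)
}
```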

Remember Google Tango? It had depth sensing, too.

Josh Miller / CNET

Apple wasn’t the first to explore this technology in mobile phones.

Google had the same idea in mind when it created Project Tango, an early AR platform that appeared on only two phones. The advanced camera array also had infrared sensors and could map out rooms, creating 3D scans and depth maps for AR and for measuring indoor spaces. Google's Tango-equipped phones were short-lived, replaced by computer vision algorithms that do estimated depth sensing on cameras without needing the same hardware. But Apple's iPhone 12 Pro looks like a much more advanced successor.


Now playing: iPhone 12, iPhone 12 Mini, Pro and Pro Max explained (video, 9:16)
