Sumit Negi

Apple’s LiDAR Scanner for AR Platform - Future and Applications


3D image sensors acquire Z-direction (depth) information in addition to imaging in the X and Y directions, enabling 3D sensing. 3D sensing makes it possible to detect things that traditional 2D images cannot, such as inspecting volume or shape and separating overlapping objects.


A ToF sensor realizes 3D sensing by calculating distance from the time it takes emitted light to reflect off the scene and return to the sensor. ToF can be divided into two categories: direct ToF and indirect ToF. Direct ToF (dToF) directly measures the elapsed time until the reflection is detected. Indirect ToF (iToF) measures distance by collecting the reflected light and discerning the phase shift between the emitted and reflected light.
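As a rough illustration of the difference (a minimal sketch, not tied to any particular sensor), the two approaches reduce to two simple relations: dToF converts a measured round-trip time directly into distance, while iToF converts a measured phase shift at a known modulation frequency into distance.

```swift
import Foundation

// Speed of light in meters per second.
let c = 299_792_458.0

// Direct ToF: distance from the measured round-trip time of a light pulse.
// The factor of 2 accounts for the pulse travelling to the target and back.
func dtofDistance(roundTripTime t: Double) -> Double {
    return c * t / 2.0
}

// Indirect ToF: distance inferred from the phase shift (in radians) between
// the emitted and reflected modulated light, for a given modulation frequency.
// The result is unambiguous only up to c / (2 * f), after which the phase wraps.
func itofDistance(phaseShift phi: Double, modulationFrequency f: Double) -> Double {
    return (c * phi) / (4.0 * Double.pi * f)
}

// Example: a 6.67 ns round trip corresponds to roughly 1 m.
print(dtofDistance(roundTripTime: 6.67e-9))
// Example: a phase shift of pi/2 at 100 MHz corresponds to roughly 0.37 m.
print(itofDistance(phaseShift: Double.pi / 2, modulationFrequency: 100e6))
```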



Apple first implemented 3D sensing in the iPhone X in 2017. The iPhone X and later models use structured-light 3D sensing for facial recognition. iPhone models supporting Face ID house a variety of sensors, including the TrueDepth camera system, which comprises an infrared camera, a flood illuminator, a regular camera and a dot projector. The flood illuminator shines infrared light at the face, allowing the system to detect whoever is in front of the iPhone even in low-light situations or if the person is wearing glasses (or a hat). The dot projector then shines more than 30,000 pinpoints of light onto the face, building a depth map that can be read by the infrared camera. More recently, Apple introduced dToF technology into its latest devices, the iPad Pro (11-inch) tablet and the iPhone 12 Pro and Pro Max, by adding a LiDAR Scanner. With this direct time-of-flight sensor, Apple is laying the groundwork for augmented reality. Many smartphone vendors such as Samsung, Huawei and LG have already been using iToF for taking better pictures (ToF can blur backgrounds in photos), but not dToF.
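Face ID's actual depth pipeline is proprietary, but the dot-projector approach is a form of structured light, where depth follows from triangulation once a projected dot is matched between the projector pattern and the infrared camera image. The sketch below shows only that generic relation; the focal length, baseline and disparity values are purely illustrative assumptions.

```swift
import Foundation

// Generic structured-light depth from triangulation (not Apple's actual
// Face ID pipeline): once a projected dot is matched between the projector
// pattern and the infrared camera image, its depth follows from the disparity.
//   depth = (focalLength * baseline) / disparity
// where focalLength and disparity are in pixels and baseline is in meters.
func structuredLightDepth(focalLengthPixels f: Double,
                          baselineMeters b: Double,
                          disparityPixels d: Double) -> Double? {
    guard d > 0 else { return nil }   // no match / zero disparity -> no depth
    return (f * b) / d
}

// Hypothetical numbers for illustration only:
// f = 600 px, baseline = 2 cm, disparity = 12 px  ->  depth = 1.0 m
print(structuredLightDepth(focalLengthPixels: 600, baselineMeters: 0.02, disparityPixels: 12) ?? 0)
```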


Apple limited structured-light 3D sensing to Face ID and introduced dToF technology in the rear cameras of its latest devices for 3D sensing. The reason is that although the structured-light method provides a high degree of depth accuracy, the complex post-processing required to calculate depth from pattern matching is time-consuming. The advantage of the dToF technique, on the other hand, is that it allows for simpler post-processing. Another benefit of dToF is that it works over longer ranges than Apple's grid-based Face ID system, which only works from 10 to 20 inches away from the phone; if the subject is too far away, the dots of the grid are too spaced out to provide good resolution. However, measuring the time of flight from the small number of photons returned in a single measurement requires highly sensitive photodetectors, such as single-photon avalanche diodes (SPADs), which have a relatively large form factor.
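Apple has not published the details of its dToF processing, but SPAD-based dToF sensors typically accumulate photon arrival times from many laser pulses into a histogram and take the peak bin as the round-trip time. The sketch below illustrates that general idea only, under those assumptions, with made-up timestamps.

```swift
import Foundation

// Minimal sketch of dToF post-processing with a SPAD-style detector
// (illustrative only, not Apple's actual pipeline): photon arrival times
// from many laser pulses are accumulated into a histogram, and the peak
// bin gives the round-trip time, hence the distance.
func estimateDistance(photonArrivalTimes: [Double],
                      binWidth: Double,
                      maxTime: Double) -> Double? {
    let binCount = Int(maxTime / binWidth)
    var histogram = [Int](repeating: 0, count: binCount)
    for t in photonArrivalTimes where t < maxTime {
        histogram[Int(t / binWidth)] += 1
    }
    guard let peak = histogram.indices.max(by: { histogram[$0] < histogram[$1] }),
          histogram[peak] > 0 else { return nil }
    let roundTripTime = (Double(peak) + 0.5) * binWidth   // use the bin centre
    return 299_792_458.0 * roundTripTime / 2.0
}

// Noisy photon timestamps clustered around a ~13.3 ns round trip (~2 m target),
// plus a few background counts.
let arrivals = [13.2e-9, 13.4e-9, 13.3e-9, 13.35e-9, 2.0e-9, 21.0e-9, 13.25e-9]
print(estimateDistance(photonArrivalTimes: arrivals, binWidth: 0.5e-9, maxTime: 30e-9) ?? 0)
```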


Seminal Patents - Apple, Inc.


The US Patent & Trademark Office published patent applications US20200256669A1 and US20200256993A1 (filed on August 6, 2019 and published on August 13, 2020) from Apple that relate to a next-generation depth-sensing camera system based on a time-of-flight camera. According to one embodiment of the invention, a depth-sensing apparatus is described that includes a radiation source designed to emit a first plurality of light pulses toward a target scene. An array of a second plurality of sensing elements, the second plurality exceeding the first, is configured to output signals indicating the respective times of incidence of photons on the sensing elements. The target scene is imaged onto the array of sensing elements using light-collection optics. Processing and control circuitry is coupled to receive the signals from the array; in response to the signals, it searches over the sensing elements to identify the respective regions of the array on which the light pulses reflected from the target scene are incident, and it processes the signals from the identified regions to determine the respective times of arrival of the reflected pulses. The applications include a schematic side view of a depth mapping system, in accordance with an embodiment of the invention.
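The core idea described in the applications is that the emitted spots are far fewer than the sensing elements, so the circuitry first locates the regions of the array that actually receive reflections and only then computes arrival times for those regions. The sketch below illustrates the region-search step in simplified form; the block size, threshold and toy count map are illustrative assumptions, not values taken from the patents.

```swift
import Foundation

// Illustrative sketch of the region-search idea (not the patents' claims
// verbatim): sum photon counts over small blocks of the SPAD array and keep
// only the blocks that actually received a reflected pulse, so that later
// time-of-arrival processing is restricted to those regions.
func findActiveRegions(counts: [[Int]], blockSize: Int, threshold: Int) -> [(Int, Int)] {
    var active: [(Int, Int)] = []
    let rows = counts.count, cols = counts.first?.count ?? 0
    for r in stride(from: 0, to: rows, by: blockSize) {
        for c in stride(from: 0, to: cols, by: blockSize) {
            var sum = 0
            for i in r..<min(r + blockSize, rows) {
                for j in c..<min(c + blockSize, cols) {
                    sum += counts[i][j]
                }
            }
            if sum >= threshold { active.append((r, c)) }
        }
    }
    return active
}

// Toy 4x4 photon-count map: only the top-left 2x2 block saw a reflection.
let counts = [[9, 7, 0, 0],
              [8, 6, 0, 1],
              [0, 0, 0, 0],
              [1, 0, 0, 0]]
print(findActiveRegions(counts: counts, blockSize: 2, threshold: 10))  // [(0, 0)]
```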

LiDAR Scanner Module in iPad Pro (11-inch)


The LiDAR scanner in Apple devices comprises an emitter, a vertical-cavity surface-emitting laser (VCSEL) from Lumentum, and a receiver, a near-infrared (NIR) CMOS image sensor that performs the direct time-of-flight measurement.


The LiDAR scanner module includes a new generation of near-infrared (NIR) CMOS image sensor (CIS) with a single-photon avalanche diode (SPAD) array from Sony. The sensor features 10 µm pixels and a resolution of about 30 kilopixels. The in-pixel connection between the NIR CIS and the logic wafer is made using hybrid direct-bonding interconnect technology, the first time Sony has used 3D stacking for its ToF sensors.


The LiDAR uses a vertical-cavity surface-emitting laser (VCSEL) from Lumentum. In the laser, multiple electrodes are connected separately to the emitter array, and a new design with mesa contacts is used to improve wafer probe testing. A wafer-level chip-scale packaged (WLCSP), five-side-molded driver integrated circuit generates the pulses and controls the VCSEL power and beam shape. To produce the dot pattern, a new Himax diffractive optical element (DOE) is assembled on top of the VCSEL.

Apple’s LiDAR Scanner – Applications


Apple has long promoted the use of augmented reality on the iPad and iPhone. Numerous AR apps are available on Apple's App Store, many of them aimed at students.

The LiDAR sensor on the iPad Pro, meanwhile, opens up future applications for developers. The system makes use of ARKit, a toolkit that allows developers to create powerful augmented reality apps. Stakeholders can use ARKit to virtually transport themselves to the field and collaborate in real time, seeing what workers on-site see through the iPad's cameras, combined with AR and the detailed depth information created by the LiDAR Scanner. Apple's LiDAR Scanner is a powerful and precise 3D scanning and measuring device (especially when combined with the Measure app) that allows for 3D mapping, measuring and classification of interior spaces, as well as object identification and classification. It is a response to the increasing demand for mobile scanning in short-range/indoor and complex environments, as well as confined outdoor industrial sites.

One recent example is TeamViewer, which announced an update to its AR-based TeamViewer Pilot that takes advantage of the new iPad Pro's LiDAR Scanner. Experts and technicians can use TeamViewer Pilot together with Apple's LiDAR Scanner on industrial sites to identify and present an accurate understanding of the environment. The LiDAR Scanner in the iPad Pro gives TeamViewer Pilot an understanding of the physical environment, allowing it to identify physical objects and correctly occlude annotations placed behind them. This gives both the remote expert and the person in the field a much better understanding of the actual situation, greatly improving first-time fix rates.
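For developers, opting in to these LiDAR-backed capabilities goes through ARKit's configuration API. The sketch below (assuming an existing RealityKit ARView and a LiDAR-equipped device) shows the typical pattern of enabling scene reconstruction and per-frame scene depth; it is a minimal illustration rather than a complete app.

```swift
import ARKit
import RealityKit

// Minimal ARKit configuration sketch: opt in to the LiDAR-backed capabilities
// mentioned above (scene reconstruction and per-frame depth). Assumes an
// existing RealityKit ARView on a device with a LiDAR Scanner.
func startLiDARSession(on arView: ARView) {
    let configuration = ARWorldTrackingConfiguration()

    // Build a mesh of the surroundings (requires the LiDAR Scanner).
    if ARWorldTrackingConfiguration.supportsSceneReconstruction(.mesh) {
        configuration.sceneReconstruction = .mesh
    }

    // Ask ARKit for per-frame depth maps produced with the LiDAR Scanner.
    if ARWorldTrackingConfiguration.supportsFrameSemantics(.sceneDepth) {
        configuration.frameSemantics.insert(.sceneDepth)
    }

    arView.session.run(configuration)
}
```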


Apps available in the App Store that utilize the LiDAR Scanner:

  • Complete Anatomy

  • Snapchat

  • iScape

  • Warby Parker

  • JigSPace

  • DSLR Camera

  • Hot Lava

  • IKEA Place

  • AR Quick Look

  • Plantale

Some may argue that there are already AR apps that appear to work well on smart devices without a LiDAR scanner. The truth is that, without a LiDAR scanner, smart devices do not address the problem of depth. When a hand is placed in front of an AR object (the skeleton) in Figure 1, a device without LiDAR does not detect it and simply overlays the AR object on top of the hand. There is no sense of depth: the hand disappears, because the device cannot work out what should be in front of and behind the AR object. In contrast, in Figure 2, when the hand is placed in front of the AR object (the doll), a device with LiDAR detects it. The LiDAR scanner thus solves the depth problem by determining what should be in front of and behind the AR object.
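On a LiDAR-equipped device, the occlusion behaviour shown in Figure 2 below is typically obtained by enabling RealityKit's scene-understanding occlusion option. The sketch below assumes an ARView that is already running a world-tracking session; it is illustrative, not the specific implementation of any of the apps mentioned above.

```swift
import ARKit
import RealityKit

// Sketch of enabling occlusion with RealityKit on a LiDAR-equipped device:
// scene-understanding occlusion lets real objects (such as a hand) hide
// virtual content placed behind them.
func enableOcclusion(on arView: ARView) {
    guard ARWorldTrackingConfiguration.supportsSceneReconstruction(.mesh) else {
        // Without the LiDAR Scanner, RealityKit cannot occlude virtual content
        // with arbitrary real-world geometry (the behaviour seen in Figure 1).
        return
    }
    arView.environment.sceneUnderstanding.options.insert(.occlusion)
}
```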

[Figure 1: Without LiDAR, the hand placed in front of the AR object (skeleton) is not detected and the AR object is overlaid on top of it.]

[Figure 2: With LiDAR, the hand placed in front of the AR object (doll) is detected and correctly occludes it.]

Current Limitation of LiDAR in Apple


While the LiDAR Scanner makes Apple's devices useful for sensing 3D spaces, it is not yet accurate enough to feed a 3D printer, according to numerous studies by LiDAR experts and app developers. vGIS, a leading developer of AR/MR solutions, investigated Apple's LiDAR Scanner and its use in spatial tracking and found that the Scanner offers improvements in surface scanning and object detection but is still unable to track position accurately. According to the findings, the LiDAR Scanner is best suited to indoor environments. Making use of the raw depth data from the LiDAR Scanner still depends heavily on the innovations of application developers, especially in the AR domain (for example Measure, Halide and Shapr3D).


Market Insights


The Time-of-Flight (ToF) sensor market is expected to grow at a CAGR of 20.0 percent, from $2.8 billion in 2020 to $6.9 billion by 2025. Rising demand for ToF sensors from the automotive industry, along with the growing adoption of 3D cameras in smartphones and the increasing use of such smartphones, is driving the growth of this market. The growing adoption of 3D machine vision systems in industries such as aerospace and defence, consumer electronics, and healthcare, together with the growing deployment of Industry 4.0, is expected to push the ToF sensor market forward. Rising demand for 3D-enabled devices and the growing number of smartphone users are the key factors behind the growth of the ToF sensor market in consumer electronics.
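For reference, the quoted growth rate can be checked directly from the two market figures; the short snippet below simply recomputes the compound annual growth rate.

```swift
import Foundation

// Compound annual growth rate (CAGR) implied by the quoted market figures:
// USD 2.8 billion in 2020 growing to USD 6.9 billion in 2025 (5 years).
let startValue = 2.8, endValue = 6.9, years = 5.0
let cagr = pow(endValue / startValue, 1.0 / years) - 1.0
print(String(format: "CAGR = %.1f%%", cagr * 100))   // ~19.8%, i.e. roughly 20%
```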


Conclusion


Now that the Scanner is readily available to customers and surveyors, developers have a strategic role to play in adapting and integrating the technology to engage users and to elevate it on industrial sites and in indoor mapping and complex, confined environments. The multi-functional hand-held tool is cheap and lightweight, and it covers a small area in almost no time. This also applies to architects and industrial designers, who must be able to measure and model spaces accurately: the Scanner can be integrated with Shapr3D, a design modelling tool, to create a CAD model of an indoor environment in real time on a mobile device. To push augmented reality further, Apple is expected to launch a mixed-reality headset and augmented reality glasses in the near future. It has been predicted that Apple's MR/AR product roadmap includes three phases: a helmet-type product by 2022, glasses by 2025, and contact lenses by 2030-2040.


Sumit is a research analyst at Copperpod. He has a Bachelor's degree in Electronics and Communication Engineering. His interest areas are Microcontrollers, IoT, Semiconductors, Displays, Wireless Communications and Memory Devices.


References

  1. https://www.apple.com/augmented-reality/

  2. https://www.eetasia.com/look-inside-ipad-pro-11s-lidar-scanner/

  3. https://www.systemplus.fr/reverse-costing-reports/apple-ipad-pro-11s-lidar-module/

  4. https://www.pocket-lint.com/tablets/news/apple/151476-what-is-lidar-ipad-why-arkit-measure

  5. https://www.sony-semicon.co.jp/e/products/IS/industry/technology/ToF_technology.html

  6. https://www.youtube.com/watch?v=fS3J4V_BgP0

  7. https://indianexpress.com/article/technology/tech-news-technology/what-is-lidar-and-why-is-it-in-apples-new-ipad-pro-2020-6323223/

  8. https://www.computerworld.com/article/3235140/apples-face-id-the-iphone-xs-facial-recognition-tech-explained.html

  9. https://www.geospatialworld.net/blogs/apples-lidar-scanner/#:~:text=%E2%80%93%20Apple's%20LiDAR%20Scanner%20provides%20for,spaces%2C%20object%20identification%20and%20classification.

  10. https://www.bloomberg.com/press-releases/2020-02-11/time-of-flight-tof-sensor-market-worth-6-9-billion-by-2025-exclusive-report-by-marketsandmarkets


Copperpod provides Reverse Engineering services. Copperpod analyzes existing hardware and software systems and processes owned by the seller to provide you a clear and detailed view of the seller's architecture, growth plans and the investment that such growth will require.
