Blickfeld Cube 1 LiDAR sensor mounted on a tripod

Myths about LiDAR Sensor Debunked – Part 2

Florian Petit, Blickfeld founder

LiDAR sensors have proven to be a key technology for many sensor-based applications, but they have also attracted many unwarranted misconceptions and myths. Here’s the second part of the blog series, which addresses and debunks some more common myths about LiDAR. You can read the first part here.

1. Myth: LiDAR is the same as a camera in terms of privacy

Unlike cameras, LiDAR doesn’t record any color information. It only captures the 3D distance data to create the point cloud, thus generating an anonymized picture of the entire scene while protecting privacy.

As the usage of sensors has risen over the years, so have privacy concerns. For instance, smart city projects worldwide have stirred controversy over data collection and use, so much so that the European Union has even been considering a ban on facial recognition technology in public spaces until the authorities can study and regulate its use.

These concerns primarily revolve around sensors being able to capture and store pictures of people and then using that data for identification through facial recognition. While the debate on the merits of surveillance and identification applications is a challenging societal question, LiDAR technology is primed to play a pivotal part in alleviating these privacy concerns.

LiDAR can track pedestrians, vehicles, and other objects with a great degree of reliability and accuracy using algorithms that analyze the point cloud data. But it doesn’t record any color information and only captures the 3D distance data to create the point cloud. Thus, it generates an anonymized picture of the entire scene, making the sensor invaluable in privacy-sensitive applications like perimeter security, crowd management, and people counting.
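To make this concrete, here is a minimal sketch of how anonymized point cloud data could be turned into a people count: it clusters bare x/y/z coordinates with DBSCAN from scikit-learn, using synthetic points as a stand-in for a real scan. The thresholds, the synthetic data, and the library choice are illustrative assumptions, not Blickfeld's actual perception pipeline.

```python
# Illustrative sketch: counting objects in an anonymized LiDAR point cloud.
# The point cloud contains only x/y/z coordinates in meters -- no color,
# no texture, no identifying features.
import numpy as np
from sklearn.cluster import DBSCAN

rng = np.random.default_rng(0)
# Synthetic stand-in for a real scan: two person-sized clusters plus ground clutter.
person_a = rng.normal(loc=[2.0, 1.0, 1.0], scale=0.2, size=(200, 3))
person_b = rng.normal(loc=[5.0, -1.5, 1.0], scale=0.2, size=(200, 3))
clutter = rng.uniform(low=[-10, -10, 0], high=[10, 10, 0.1], size=(300, 3))
points = np.vstack([person_a, person_b, clutter])

# Drop ground returns, then group the remaining points into object clusters.
above_ground = points[points[:, 2] > 0.3]
labels = DBSCAN(eps=0.5, min_samples=20).fit_predict(above_ground)
num_objects = len(set(labels)) - (1 if -1 in labels else 0)
print(f"Detected {num_objects} objects -- counted without capturing any identity.")
```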

LiDAR detecting two people hugging while keeping their identities entirely anonymous

You can read more about how LiDAR sensors capture and process data while keeping people completely anonymous in this blog post.

2. Myth: iPhone LiDARs and large-scale LiDARs have similar functionalities

The sensing principle, scanning range, and resolution of the iPhone LiDAR mean that it cannot offer the same functionality as a conventional scanning LiDAR.

iPhone LiDAR vs. conventional LiDAR

The recent unveiling of the iPhone 12 Pro and iPad Pro made headlines as Apple announced the inclusion of a LiDAR in its latest devices. The iPhone LiDAR creates small-scale 3D depth maps of objects, people, and surroundings by sending out light pulses as a spray of infrared dots. The sensor measures how long each dot takes to return, yielding a distance for every point, and the resulting ‘field of points’ is turned into a mesh of dimensions. If the working principle sounds familiar, it’s because this is essentially an update of the TrueDepth camera Face ID technology used in the past.

But are these LiDARs on par with conventional LiDAR technology? The iPhone LiDAR employs flash illumination rather than scanning: the entire field of view is illuminated at once using a single pulsed, widely diverging laser beam. This is in stark contrast to conventional scanning LiDARs, which use collimated laser beams to illuminate a single point in the field of view at a time. Although conventional flash LiDARs certainly exist as well, for the purpose of this article we will compare the iPhone LiDAR with a conventional scanning LiDAR.

The most striking difference between a conventional scanning LiDAR and the iPhone LiDAR is range: LiDAR usage beyond consumer products requires far higher performance in terms of range and resolution than the iPhone offers. Blickfeld’s Cube 1, for instance, has a range of up to 250 m, whereas the iPhone LiDAR can only measure and analyze up to about 5-10 m.
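For a sense of the physics behind these range figures, a pulsed time-of-flight LiDAR infers distance from the round-trip time of each laser pulse. The short sketch below shows the basic arithmetic; it illustrates the general principle only and is not code from any specific sensor.

```python
# Basic time-of-flight arithmetic: distance = (speed of light * round-trip time) / 2.
SPEED_OF_LIGHT = 299_792_458  # m/s

def round_trip_time(distance_m: float) -> float:
    """Time for a laser pulse to reach a target and return, in seconds."""
    return 2 * distance_m / SPEED_OF_LIGHT

for target in (10, 250):  # an iPhone-scale target vs. a 250 m scanning-LiDAR target
    print(f"{target:>4} m target -> round trip of {round_trip_time(target) * 1e9:.0f} ns")

# A 250 m return arrives after roughly 1.7 microseconds, so the sensor's timing
# electronics must resolve nanosecond-scale differences to measure range precisely.
```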

The large-scale commercialization of LiDAR through the iPhone has undoubtedly put the technology in the limelight and increased consumer familiarity. And just as with cameras, it will push the entire semiconductor ecosystem to build a more robust infrastructure for optics and electronics, which other LiDAR applications can then leverage. In itself, the iPhone LiDAR certainly improves camera focus speed and accuracy in low-light conditions. But for now, it cannot be employed in large-scale applications such as autonomous vehicles or HD map generation.

As for resolution, a typical scanning LiDAR like the Cube 1 can scan more than 500 scan lines per second, generating hundreds of thousands of data points and consequently a very dense point cloud. In comparison, the iPhone LiDAR can reportedly only measure up to 500 data points per frame and therefore offers a much lower resolution.
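To put those figures side by side, the small calculation below compares points per frame. The frame rate and the exact points-per-second value are illustrative assumptions chosen only to show the order-of-magnitude gap, not published specifications.

```python
# Illustrative point-density comparison (numbers are assumptions for the sake of example).
scanning_points_per_second = 200_000  # "hundreds of thousands" of returns per second (assumed)
assumed_frame_rate_hz = 10            # hypothetical frames per second

scanning_points_per_frame = scanning_points_per_second / assumed_frame_rate_hz
iphone_points_per_frame = 500         # reported figure cited in the text

print(f"Scanning LiDAR: ~{scanning_points_per_frame:,.0f} points per frame (assumed)")
print(f"iPhone LiDAR:    {iphone_points_per_frame} points per frame (reported)")
print(f"Density ratio:  ~{scanning_points_per_frame / iphone_points_per_frame:.0f}x")
```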

3. Myth: LiDAR isn’t safe for the human eye

One of the common myths about LiDAR is that it is not safe for the human eye. On the contrary, LiDAR products are manufactured in accordance with the Class 1 eye-safety standard (IEC 60825-1:2014), which ensures eye safety.

Eye safety for LiDAR depends on a combination of factors, not just the laser’s wavelength. For instance, the safety rating of a LiDAR depends heavily on the laser’s peak power, which in turn directly affects the sensor’s range at a particular wavelength. In general, the eye is more sensitive to 905 nm lasers, so this type of laser is operated at low peak power to remain within the eye-safe region.

In contrast, LiDARs operating in the 1550 nm wavelength range can safely employ higher peak powers and achieve longer ranges than 905 nm lasers while remaining within the eye-safe region. This is because the eye’s cornea, lens, and aqueous and vitreous humors effectively absorb wavelengths greater than 1400 nm, alleviating the risk of retinal damage at these longer wavelengths.

Importantly, these eye-safe combinations of peak power and wavelength are defined by the Class 1 eye-safety standard (IEC 60825-1:2014), which is binding on every manufacturer of lasers in the 180 nm to 1 mm wavelength range and thus guarantees safe operation. By following these regulations, every LiDAR can be eye-safe!
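The trade-off between peak power, pulse energy, and average power can be made more tangible with the basic pulse relationships below. All numbers are invented for illustration; actual eye-safe limits come from the exposure tables of IEC 60825-1 and depend on wavelength, pulse duration, beam geometry, and repetition rate.

```python
# Illustrative pulsed-laser relationships (rectangular-pulse approximation).
# All values below are made up for illustration and are not taken from any
# specific product or from the IEC 60825-1 exposure tables.
peak_power_w = 10.0           # assumed peak power of a single pulse
pulse_duration_s = 5e-9       # assumed 5 ns pulse
repetition_rate_hz = 100_000  # assumed 100 kHz pulse repetition rate

pulse_energy_j = peak_power_w * pulse_duration_s        # energy per pulse
average_power_w = pulse_energy_j * repetition_rate_hz   # power averaged over time

print(f"Energy per pulse: {pulse_energy_j * 1e9:.0f} nJ")
print(f"Average power:    {average_power_w * 1e3:.1f} mW")
# A 905 nm design must keep these figures low enough to protect the retina, while
# a 1550 nm design can tolerate higher values because the eye absorbs that light
# before it reaches the retina, which is why 1550 nm systems can reach further.
```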

There have been many online discussions about the possibility of multiple LiDARs sending out waves at the same wavelength and phase, e.g., at a road intersection. Could they combine to create a higher-energy laser that is no longer eye-safe? Theoretically, these lasers can superimpose constructively and increase in amplitude, meaning the peak power (amplitude) of the pulses can increase and possibly exceed the eye-safe region.

As disconcerting as that might sound, it is virtually impossible in the real world. This is because the other LiDAR sensors would have to emit a pulse with a perfectly aligned combination of factors, such as pulse duration, divergence angle, and exposure direction relative to the human eye’s position, for such a high-energy beam to be generated. This makes it highly unlikely that two or more LiDAR waves will ever overlap at the same point in space and time.
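For readers who want the underlying relation: two beams only add in amplitude when they overlap coherently and in phase, which is exactly the perfectly aligned case described above. A general sketch of the two cases, not drawn from the IEC standard:

```latex
% Worst case: coherent, in-phase overlap of two beams -- amplitudes add
I_{\text{coherent}} = \left(\sqrt{I_1} + \sqrt{I_2}\right)^2 = I_1 + I_2 + 2\sqrt{I_1 I_2}

% Typical case: independent sources with random phases -- only intensities add
I_{\text{incoherent}} = I_1 + I_2
```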

4. Myth: LiDARs have very limited applications

From fleet management to agriculture and security to smart city applications, the sky’s the limit for LiDAR applications!

Although the integration of LiDAR into iPhones has already started to challenge this notion, the myth persists that LiDAR is a niche technology with only a handful of applications. This can be attributed to the fact that, in the mainstream press and in discussions about the sensor, LiDAR is generally associated only with autonomous driving. While it is true that autonomous driving is inconceivable without LiDAR sensors, they also have many other applications touching all walks of life.

Applications of LiDAR

For instance, LiDARs are used in agriculture, where the sensors enable automated and autonomous maneuvering of farming equipment and vehicles, environmental detection, and tracking of activities like sowing and fertilizing.

LiDARs can also act as a crucial part of a security and safety ecosystem, where they support and supplement other security technologies in use cases like perimeter security, regulating entrances and checkpoints, and enabling social distancing.

Other LiDAR applications range from fleet management to smart city use cases like reducing urban congestion, people counting, and crowd management.

These were some of the common misconceptions and myths about LiDARs and their applications. There is no denying the importance of LiDAR sensors in shaping the future of technology. And as the world marches towards automation, LiDAR and its applications will undoubtedly become a more regular part of the discourse.
