Wednesday 9 May 2018

sensor - LIDAR burnout; ways to check for damaging infrared lasers before shooting besides looking for posted warnings?


The BBC News article "Driverless car laser ruined camera" describes a situation where a particularly powerful infrared laser, from the LIDAR unit of a prototype car at the CES show, damaged the sensor of a photographer's camera.


Question: Besides looking for a sign that says "Caution, no photography, infrared lasers in use", are there any ways to check for the presence of infrared lasers before shooting?


For cameras with through-the-lens displays (an electronic viewfinder or live view), one can view the scene as the sensor sees it, and so detect damage after the fact if it is bad enough.



However, are there any ways to detect the presence of unannounced infrared laser beams that could damage a camera, or possibly a way to use the camera itself to do this in a less risky way?


At the end of the article (quoted below) a "fiber laser" is mentioned, and these often operate at longer wavelengths (1300 to 1600 nm) than most semiconductor lasers (typically 800 to 950 nm). The problem at the longer wavelengths is that silicon may produce no signal at all, so you wouldn't see the "purple dot" from the IR light, only the sensor damage after the fact. (I've asked separately: What wavelengths are most commonly used in laser-scanners and LIDAR systems?)
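Silicon's blindness beyond roughly 1100 nm follows from its band gap: a photon with less energy than the gap frees no photoelectrons, so the sensor records nothing even while the light heats it. A quick sanity check of the cutoff wavelength, using standard physical constants (my own back-of-envelope calculation, not from the article):

    # Cutoff wavelength for silicon photodetection: lambda = h*c / E_gap
    h = 6.626e-34        # Planck constant, J s
    c = 2.998e8          # speed of light, m/s
    eV = 1.602e-19       # joules per electron-volt
    E_gap = 1.12 * eV    # silicon band gap at room temperature

    cutoff = h * c / E_gap
    print(f"Silicon cutoff wavelength: {cutoff * 1e9:.0f} nm")  # ~1107 nm
    # 800-950 nm semiconductor lidar: above-gap, shows up as a magenta dot.
    # 1300-1600 nm fiber lasers: below-gap, no dot -- only damage after the fact.

So a 1550 nm fiber laser sits comfortably past the cutoff: the silicon can be damaged by the absorbed power without ever registering a signal.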


When shooting toward bright sunlight, there is already general awareness of the risk and plenty of advice to be cautious.


Eye-level infrared laser beams, however, are something new and different: they are invisible, so one doesn't necessarily know one is photographing a laser until the dot shows up in the photo.
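At the shorter (silicon-visible) wavelengths, this suggests one semi-practical check: take a single brief test frame from a distance and inspect it for the telltale magenta bloom before shooting in earnest. A minimal sketch of the inspection step in Python with Pillow and NumPy; the filename and the thresholds are my own illustrative choices:

    from PIL import Image
    import numpy as np

    img = np.asarray(Image.open("test.jpg").convert("RGB"), dtype=np.float32)
    r, g, b = img[..., 0], img[..., 1], img[..., 2]

    # Near-IR leaking past the hot mirror tends to register on a Bayer
    # sensor as bright magenta: strong red and blue, comparatively weak green.
    suspect = (r > 200) & (b > 200) & (g < 0.6 * np.minimum(r, b))

    if suspect.any():
        ys, xs = np.nonzero(suspect)
        print(f"Possible IR hot spot: {int(suspect.sum())} pixels, "
              f"first at (x={xs[0]}, y={ys[0]})")
    else:
        print("No obvious IR hot spot in this frame.")

This is a compromise rather than a safe detector, of course: the sensor is still exposed once, just briefly and deliberately rather than mid-shoot.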


If I understand correctly, these lidar systems use wavelengths that are absorbed in the front of the eye and so never pass through the eye's lens to be focused to a small spot on the retina. An IR-blocking filter on the front of the camera lens can mitigate the problem, but the IR-blocking filter on the sensor sits near the focus, where it absorbs power that has already been concentrated into a small spot, so it can melt and fail.
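The asymmetry between the two filter positions is simple geometry: the lens gathers whatever laser power crosses its entrance pupil and concentrates it into a spot perhaps a few pixels wide. A back-of-envelope sketch, with both numbers my own illustrative choices:

    aperture_d = 0.025   # 25 mm entrance pupil, metres (illustrative)
    spot_d = 10e-6       # ~10 micron focused spot, metres (illustrative)

    # Irradiance scales with area, so the gain is the diameter ratio squared.
    gain = (aperture_d / spot_d) ** 2
    print(f"Irradiance at focus vs. front of lens: ~{gain:,.0f}x")  # ~6,250,000x

A filter in front of the lens sees the unconcentrated beam and survives; the same filter material sitting near the focal plane takes a roughly million-fold higher irradiance and can melt.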


[Image: The lidar system on the top of the demonstration car. Jit Ray Chowdhury/BBC]




[Image: The purple dots and lines on this photo of the Stratosphere hotel in Las Vegas show the damage... Jit Ray Chowdhury/BBC]



The article goes on to explain:



Lidar works in a similar way to radar and sonar, using lasers rather than radio or sound waves, explained Zeina Nazer, a postgraduate researcher at the University of Southampton specialising in driverless car technology.


"Powerful lasers can damage cameras," she said.


"Camera sensors are, in general, more susceptible to damage than the human eye from lasers. Consumers are usually warned never to point a camera directly at laser emitters during a laser show."



Ms Nazer added that for cameras to be immune to high-power laser beams, they need an optical filter that cuts out infrared, which is invisible to humans. However, such a filter can affect night photography, when infrared sensitivity can be an advantage.


"AEye is known for its lidar units with much longer range than their competitors, ranging 1km compared to 200m or 300m," she said.


"In my opinion, AEye should not use their powerful fibre laser during shows."




