As a concept, Lidar has been around for decades. However, recent years have seen a sharp uptick in interest in the technology, as sensors have become smaller and more sophisticated and an increasing range of uses for Lidar products has been identified.
Lidar stands for Light Detection and Ranging. It is similar to radar, in that its primary uses are in surveillance and detection, but it uses light generated by lasers rather than the radio waves used by radar. The term Lidar is often used interchangeably with Ladar, which stands for Laser Detection and Ranging, though the two concepts are technically different, says Joe Buck, chief scientist at Coherent Technologies, part of the Advanced Technology Centre at Lockheed Martin’s space systems division: “When you’re looking at something that would be considered a soft target, for instance particulates or aerosols in the air, the community tends to use Lidar when referring to detecting those objects,” Mr. Buck told Armada. “Then if you’re looking at hard targets, which are solid objects such as a car or a tree … then we tend to use Ladar.” For more details regarding the science of Lidar, please see the How It Works box accompanying this article.
How It Works: Lidar
Lidar works by illuminating a target with light, which can be visible, ultraviolet or near-infrared. Put simply, light is shone onto a mirror in a series of pulses. The mirror is then rotated, sweeping the pulses of light around a specific area, such as the inside of a room. The light pulses hit an object and are reflected back to the Lidar. The Lidar then measures the time difference between the transmission of the light pulse and its reflection, based on the constant speed of light, which is roughly 299,792 kilometres-per-second (161,875 nautical miles-per-second). By measuring this time difference, it is possible to discern the distance of a particular part of an object from the Lidar, and hence build an image of the object based on its position relative to the Lidar.
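The time-of-flight calculation described above can be sketched in a few lines of code. This is a minimal illustration of the principle, not taken from any particular Lidar product:

```python
# Time-of-flight ranging: derive distance from the delay between an
# emitted light pulse and its reflection. The pulse travels to the
# target and back, so the one-way distance is half the total path.

SPEED_OF_LIGHT_M_PER_S = 299_792_458  # speed of light in a vacuum

def range_from_echo(delay_seconds: float) -> float:
    """Return the distance (metres) to a target given the round-trip
    delay between pulse transmission and reception."""
    return SPEED_OF_LIGHT_M_PER_S * delay_seconds / 2.0

# A pulse that returns after one microsecond came from roughly 150 m away.
print(round(range_from_echo(1e-6), 1))  # → 149.9
```

Repeating this measurement while the mirror sweeps the beam yields the cloud of ranged points from which the image of the scene is built.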
Lidar has been an area of research for many decades, Mr. Buck continued, from right back when the laser was first developed in the early 1960s. However, this interest has accelerated since the beginning of this century, thanks largely to technological advances. Mr. Buck gave the example of synthetic aperture imaging: The larger a telescope is, the higher the resolution that can be obtained when looking at an object. If you need a particularly high resolution, it may require a telescope that is much larger than is practical for the application. Synthetic aperture imaging overcomes this problem, through the use of a moving platform and the processing of the signals to create an effective aperture much larger than the physical aperture. Synthetic Aperture Radar (SAR) has been in use for many decades, Mr. Buck said. However, it took until the early 2000s before there were practical demonstrations of optical synthetic aperture imaging, despite the fact that lasers were widely used during that time: “Really what happened is it took that long for the optical sources to develop to the point where they had sufficient stability over a broad tuning range … Improvements on the materials, sources, and detectors (used in Lidar) are continuing. It’s not just that you have the ability to do these measurements now, you have the ability to do them in small packages, making the systems practical from a Size, Weight and Power (SWAP) standpoint.”
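The aperture-versus-resolution trade Mr. Buck describes can be made concrete with the standard diffraction-limit approximation, in which cross-range resolution at a given range scales as wavelength divided by aperture. The wavelength and distances below are illustrative assumptions, not figures from Lockheed Martin:

```python
def cross_range_resolution(wavelength_m: float, aperture_m: float,
                           range_m: float) -> float:
    """Diffraction-limited cross-range resolution (metres): it gets
    finer as the aperture grows, which is why synthesising a large
    effective aperture from a moving platform pays off."""
    return wavelength_m * range_m / aperture_m

# A 1.55-micron laser viewing a target 10 km away:
real = cross_range_resolution(1.55e-6, 0.1, 10_000)       # 10 cm telescope
synthetic = cross_range_resolution(1.55e-6, 5.0, 10_000)  # 5 m synthetic aperture
print(f"{real * 100:.1f} cm vs {synthetic * 100:.2f} cm")  # → 15.5 cm vs 0.31 cm
```

The same arithmetic explains why SAR works at radio wavelengths with modest antennas, and why optical synthetic aperture imaging had to wait for sources stable enough to keep the synthesised aperture coherent.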
It is also becoming easier and more practical to collect Lidar data (information gathered by Lidar). Traditionally, it was collected from sensors on aeroplanes, says Nick Rosengarten, product manager of the Geospatial Exploitation Products Group at BAE Systems. However, today the sensors can be placed in land vehicles as well, or even in backpacks, meaning humans can collect the data: “This opens up a whole range of possibilities, in that data can now be collected indoors as well as outdoors,” Mr. Rosengarten explained. Lidar is “really an amazing data set,” because it provides a huge amount of detail about the surface of the Earth, posits Matt Morris, director of integrated solutions for Textron Systems’ geospatial solutions division. It provides a far closer and more nuanced picture than Digital Terrain Elevation Data (DTED), which provides details regarding the height of the Earth’s surface at specific points, he continues: “Probably one of the most powerful use cases I’ve heard from our military customers is that when they deploy they need to know whether they’re going to need to … get up on a rooftop or climb a fence,” he told Armada. “DTED data won’t allow you to see that. You won’t even see the buildings.”
Mr. Morris noted that even some traditional, high-resolution elevation data would not allow you to see these details. Lidar will, because of its ‘post spacing’, a term that describes the distance between positions that can accurately be shown in a dataset. With Lidar, post spacing can be down to centimetres: “so you can know exactly what the height of a building rooftop or the height of a wall or the height of a tree is. That really helps three-dimensional (3D) situational awareness.” Moreover, the cost of Lidar sensors has decreased, Mr. Morris noted, as has the size of the apparatus, making the technology far more accessible: “A decade ago the (Lidar) sensor packages … were very large and extremely expensive. They had really high power requirements. But as they developed, improved the technology, the platforms have gotten significantly smaller, the power requirements have gone way down, and the quality of the data they’re producing has gone way up.”
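The effect of post spacing is easy to demonstrate with a toy height transect. The numbers below are invented purely to show why a wall that is obvious at half-metre spacing disappears entirely at a coarse, DTED-like sample interval:

```python
# A 30 m transect of terrain heights at 0.5 m post spacing: 60 posts,
# with a 3 m wall occupying one metre of ground in the middle.
fine = [0.0] * 60
fine[20:22] = [3.0, 3.0]  # the wall appears in two adjacent posts

# Resample the same terrain at one post per 30 m (roughly DTED-like):
# only the first post survives, and the wall vanishes from the data.
coarse = fine[::60]

print(max(fine), max(coarse))  # → 3.0 0.0
```

In the fine-spaced data the 3 m obstacle is plainly visible; in the coarse sample the transect looks perfectly flat, which is exactly the rooftop-and-fence problem the military customers describe.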
Mr. Morris said that the major use of Lidar he has seen on the military side is in 3D mission planning and rehearsal. For example, for flight simulations, his company’s Lidar Analyst product enables users to take in large volumes of data and “rapidly generate those 3D models, and then they can do very accurate mission planning.” The same was also true for ground operations, Mr. Morris explained: “They’ll use it for (planning) ingress and egress routes, and because of the high resolution of the data, they’re able to do very accurate line-of-sight analysis,” he said.
As well as Lidar Analyst, Textron also produces RemoteView, an imagery analysis product with customers in the US defence and intelligence domains. RemoteView can use numerous data sources in addition to Lidar. BAE Systems also provides geospatial analysis software, with its flagship product being SOCET GXP, which provides a multitude of capabilities, including Lidar exploitation, Mr. Rosengarten said. In addition, the company has a technology called GXP Xplorer, which is a data management application. These technologies have a number of possible military applications. For example, Mr. Rosengarten pointed to a helicopter landing zone tool within SOCET GXP, which can take “Lidar data and provide users areas on the ground that would be sufficient for landing a helicopter”. For example, it will tell them if there are vertical obstructions in the way, such as trees: “People can use this to identify areas that would be the best place to use as an evacuation point, say during a humanitarian crisis.” Mr. Rosengarten also highlighted the potential for ‘mosaicing’, when numerous Lidar datasets are collected from a particular area and stitched together. This was possible, he said, because of “the increased metadata accuracy of the Lidar sensors combined with software programmes such as BAE Systems’ SOCET GXP application that can turn the metadata into geospatially-accurate ground locations. This can be accomplished with Lidar data regardless of how it is collected.”
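The vertical-obstruction screen behind such a landing zone tool can be sketched very simply. This is a hedged illustration of the general idea, not BAE Systems’ algorithm, and the threshold and heights are invented:

```python
# Screen a candidate landing zone using Lidar returns: reject the zone
# if any return stands too far above the estimated local ground level
# (trees, poles and other vertical obstructions).

def zone_is_clear(point_heights_m, ground_m, max_obstruction_m=0.5):
    """True when no Lidar return in the zone rises more than
    max_obstruction_m above the estimated ground level."""
    return all(h - ground_m <= max_obstruction_m for h in point_heights_m)

open_field = [0.1, 0.0, 0.3, 0.2]   # low scrub only
with_tree = [0.1, 0.0, 7.5, 0.2]    # one return from a tree canopy
print(zone_is_clear(open_field, 0.0), zone_is_clear(with_tree, 0.0))  # → True False
```

A production tool would also consider slope, zone dimensions and surface type, but the centimetre-level post spacing of Lidar is what makes even this simple check trustworthy.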
Mr. Buck, meanwhile, pointed to the potential military applications of Lockheed Martin’s WindTracer technology. WindTracer is a commercial technology that uses Lidar to measure wind shear at airports. The same type of process could be used in the military arena, in precision airdrops, for example, he continued: “You need to drop supplies from a reasonably high altitude, so you put them on these pallets and you drop them down with a parachute. Now where does it land? You can try and predict where it’s going, but the problem is that as you’re dropping from altitude, the wind shear is moving in different directions at different altitudes,” he explained. “So how do you predict where that supply pallet is going to land? If you can measure the wind and optimise the trajectory, you can place the supplies with very high accuracy.”
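The airdrop problem Mr. Buck describes can be captured in a toy model: a pallet descending at a constant rate drifts with whatever wind blows in each altitude band, so the landing point depends on the whole measured wind profile. The descent rate and wind figures below are illustrative assumptions only:

```python
# Toy drift model for a precision airdrop: sum the horizontal drift
# accumulated in each altitude band, assuming the pallet instantly
# matches each band's wind while descending at a constant rate.

def drift_distance(descent_rate_m_s, wind_by_layer):
    """Horizontal drift (metres) given (layer_thickness_m, wind_speed_m_s)
    bands, e.g. from a Lidar wind-profile measurement."""
    total = 0.0
    for thickness_m, wind_m_s in wind_by_layer:
        time_in_layer = thickness_m / descent_rate_m_s
        total += wind_m_s * time_in_layer
    return total

# A 3,000 m drop at 8 m/s descent, winds measured in three 1,000 m bands
# (sign indicates direction along one axis):
profile = [(1000, 12.0), (1000, 5.0), (1000, -3.0)]
print(round(drift_distance(8.0, profile)))  # → 1750
```

With the wind profile measured rather than guessed, the release point can be offset by exactly this predicted drift, which is the essence of placing supplies “with very high accuracy”.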
Lidar is also being used in unmanned vehicles. For example, the Unmanned Ground Vehicle (UGV) manufacturer Roboteam has created a tool called Top Layer, a 3D mapping and autonomous navigation technology that utilises Lidar. Top Layer uses Lidar in two main ways, said Shahar Abuhazira, Roboteam’s chief executive officer. First, it enables the mapping of enclosed environments in real time. Sometimes video is insufficient in subterranean environments: for example, it may be too dark, or the view may be obscured by dust or smoke, Mr. Abuhazira added. A Lidar capability allows you to “move from a situation where you have zero orientation and understanding of the surroundings … now it maps the room, it maps the tunnel. Immediately you can understand the surroundings even though you don’t see anything and even though you don’t know where you are.”
The second application of Lidar is autonomy, Mr. Abuhazira said: assisting an operator in controlling more than one system at any given time: “One operator can control one UGV, but there are two other UGVs that are just tracking the controlled vehicle and following it automatically,” he explained. Similarly, a soldier could enter a room, and the UGV could simply follow them, meaning they would not need to put down their weapon, for example, in order to control the UGV: “It makes the operation simple and intuitive.” Roboteam’s larger Probot UGV also deploys Lidar to help travel over large distances: “You can’t expect the operator just to press a button for three days … you can use the Lidar sensor just to follow the forces, or follow the vehicle, or even better just to drive automatically from point to point, and the Lidar sensor will avoid obstacles.” Mr. Abuhazira expects there to be major breakthroughs in this area in the future. For example, he said that users aspire to a situation in which a human and a UGV interact just like two soldiers would: “You don’t control each other. You look at each other, you are calling each other, you watch each other and you just act the way that you should,” he said: “I think that there is an expectation that, in a way, we will get to this level between people and systems. It will be more effective. And I believe that Lidar sensors are taking us to this direction.”
Mr. Abuhazira also expects Lidar sensors to improve operations in the dangerous subterranean environment. Lidar sensors give an additional dimension, he said, providing a map of the tunnel. Mr. Abuhazira continued that sometimes, because a tunnel is small and dark, a user may not even realise that they are driving the UGV in the wrong direction: “A Lidar sensor acts like a real-time GPS (Global Positioning System) and makes it like a videogame,” he said: “You can see your system in the tunnel, you know (where you are) moving in real time.”
It should be noted that Lidar sensors are another source of data, and should not be seen as a direct replacement for radar, for example. Mr. Buck said there were large wavelength differences between the two, which bring both advantages and disadvantages. Often the best solution is using both, he said, using aerosol wind measurement as an example. The shorter wavelength of optical sensors provides improved direction finding compared to the longer wavelength of an RF (Radio Frequency) sensor as used by radar. However, the atmospheric transmission properties are very different for the two sensors: “Radar has the ability to get through certain types of clouds that Lidar would have more difficulty with. But then in other types of fog, one might be a little bit better than the other.”
Mr. Rosengarten said that combining Lidar with other sources like panchromatic data (where imagery is built using a wide range of light wavelengths) would give a complete picture of an area. A good example is identifying a helicopter landing zone, he said. Lidar may look at an area and say that it has a slope of zero, without taking into account that it could actually be looking at a lake. This type of information could be obtained by adding other data sources, he explained. Mr. Rosengarten expects the industry to eventually look at the merging of technologies, bringing together the various different sources of visual and light data: “It’s finding ways to bring all that data under one umbrella … It’s not just using Lidar data in a silo to solve an intelligence problem.”
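The lake example illustrates the fusion point neatly: a flat slope derived from Lidar alone would pass open water as a landing site, so the slope test is combined with a water mask derived from imagery. The data, threshold and function below are invented for illustration:

```python
# Fuse two data sources when screening landing cells: slope from Lidar
# elevation data, and a water mask from panchromatic/multispectral
# imagery. A cell qualifies only if it passes both tests.

def suitable_landing_cells(slopes_deg, water_mask, max_slope_deg=7.0):
    """Indices of cells that are both gentle enough and not water."""
    return [i for i, (slope, wet) in enumerate(zip(slopes_deg, water_mask))
            if slope <= max_slope_deg and not wet]

slopes = [2.0, 0.0, 15.0, 4.0]        # degrees, from Lidar
water = [False, True, False, False]   # True where imagery shows water
print(suitable_landing_cells(slopes, water))  # → [0, 3]
```

Cell 1 has a perfect slope of zero but is rejected as water, which is precisely the failure mode Lidar alone cannot catch, and the sort of cross-source reasoning the “one umbrella” approach is meant to enable.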