It may not fly, but the robot developed at Tel Aviv University maps novel surroundings like bats do – by measuring sound-wave reflections
Bats can map novel surroundings and navigate using echolocation – emitting a sound signal towards objects and interpreting the echoes that return from them. Until now, only a few attempts have been made to build an autonomous robot that imitates these abilities, but scientists from Tel Aviv University have now developed an autonomous bat-like robot that navigates on the ground by mapping new environments using sound reflections alone. The robot delineates the borders of objects it encounters and even classifies them, which helps it find its way.
The increasing use of robots calls for new abilities in autonomous machines, such as obstacle avoidance, object recognition and route planning. One of the most challenging tasks is mapping novel surroundings while navigating through them. Bats do this routinely by emitting sounds at specific frequencies towards objects in their surroundings and extracting information from the echoes reflected off them. Previous attempts to imitate the way bats map novel surroundings were limited to identifying general landscape features, without capturing the spatial properties of the area.
The current study, conducted in Yossi Yovel's lab at Tel Aviv University, presents two advances over previous work. First, the robot, nicknamed Robat, moves around independently, without human control. Second, it maps its surroundings in two dimensions, whereas previous robots could only determine their own location within the environment. As an autonomous robot, the Robat had to delineate the borders of the objects it encountered in the new surroundings so that it could find an obstacle-free path – just like a bat flying through an orchard or a patch of bushes it encounters for the first time.
Mapping by ear
Like a bat, the Robat emitted sounds and received the returning signals using ear-like devices. An object's distance from the Robat was calculated from the time that passed between signal emission and the echo's return to the Robat's "ears", while the difference in arrival time between the two ears indicated the object's direction relative to the Robat. This allowed the Robat to create a spatial map of the objects in its environment, which it used to plan its next move. The ability was tested in two different greenhouses at the Tel Aviv University Botanical Garden.
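To make the arithmetic behind this concrete, here is a minimal Python sketch of how range and direction can be recovered from echo timing alone. It is not the Robat's actual code: the microphone spacing, function names and the far-field bearing approximation are assumptions for illustration only.

```python
import math

SPEED_OF_SOUND = 343.0   # m/s in air at roughly 20 °C
EAR_SEPARATION = 0.1     # m; hypothetical spacing between the two "ears" (microphones)

def object_distance(t_emit, t_echo):
    """Range from round-trip time of flight: the sound travels out and back."""
    return SPEED_OF_SOUND * (t_echo - t_emit) / 2.0

def object_bearing(t_left, t_right):
    """Rough bearing from the interaural time difference (ITD).

    A positive angle means the object lies to the right (the echo reaches
    the right ear first). Uses the far-field approximation
    sin(theta) = ITD * c / d, clamped to a valid sine value.
    """
    itd = t_left - t_right
    s = max(-1.0, min(1.0, itd * SPEED_OF_SOUND / EAR_SEPARATION))
    return math.degrees(math.asin(s))

# Example: an echo returns 6 ms after emission and reaches the right ear 0.1 ms earlier.
print(object_distance(0.0, 0.006))   # about 1.03 m away
print(object_bearing(0.0001, 0.0))   # about 20 degrees to the right
```

Together, one range and one bearing place an object on a two-dimensional map around the robot, which is exactly the kind of spatial snapshot described above.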
The Robat successfully navigated through both greenhouses, bypassing obstacles in its path. It mapped the borders of the objects it encountered fairly accurately, and even distinguished plants from non-plants with 70% accuracy, using an artificial neural network running on board. It used this ability to pass through the plants – much as a bat identifies important landmarks along its flight route, especially those that can help it find food, such as plants rich in fruit or insects, bats' primary food.
An "artificial bat"?
Well, not quite. First, the Robat does not fly; it moves on the ground using wheels and maps its surroundings in two dimensions rather than three, as a bat does. The Robat is also much slower than a bat, having to stop every 30 seconds or so to pick up echoes, due to the system's mechanical limitations, especially those of its stabilizing device (gimbal), which is particularly slow. Since the Robat can only emit a sound signal that covers a small area, it emits signals in three directions: straight ahead, 60 degrees to the right and 60 degrees to the left. Together these signals cover a relatively wide field, similar to that of the signal a bat emits in nature. This made the Robat's task easier, but it also limited its motion, since the robot had to stop and process the signals from each point separately.
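The stop-and-scan routine can be pictured as stitching three narrow snapshots into one wider view each time the robot pauses. The sketch below, in Python, shows one way such a merge could work; the data structure, pose convention and function name are assumptions for illustration, not a description of the Robat's actual software.

```python
import math

# Emission directions from the article: straight ahead, 60° right, 60° left.
SCAN_DIRECTIONS_DEG = [0.0, -60.0, 60.0]

def scan_to_points(detections, robot_pose):
    """Merge the three narrow directional scans into one wider 2D snapshot.

    detections: {direction_deg: [(range_m, bearing_within_beam_deg), ...]}
    robot_pose: (x, y, heading_deg) of the robot when the scan was taken.
    Returns a list of (x, y) obstacle points in the map frame.
    """
    x0, y0, heading = robot_pose
    points = []
    for direction, hits in detections.items():
        for rng, bearing in hits:
            angle = math.radians(heading + direction + bearing)
            points.append((x0 + rng * math.cos(angle),
                           y0 + rng * math.sin(angle)))
    return points

# Example: the robot stops, scans in three directions, then adds the points to its map.
scan = {0.0: [(1.2, 5.0)], -60.0: [(0.8, -3.0)], 60.0: []}
print(scan_to_points(scan, robot_pose=(0.0, 0.0, 90.0)))
```

Because each stop yields only a few such points, covering a whole greenhouse requires many pauses, which is why the three-beam workaround slows the robot down even as it widens its view.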
Furthermore, the Robat is not equipped with the external ears that bats and other animals possess – ears that detect the direction from which a sound arrives and encode it in electrical signals, allowing these animals to locate the source of the sound in three dimensions. The Robat, in contrast, relies solely on temporal information – the time between signal emission and the echo's return, and the difference in arrival time between its two ears – which allows it to locate objects in two dimensions.
Improvements to the Robat are probably forthcoming, such as parallel information processing, in contrast to the current serial processing, in which only one type of information is handled at any given moment. Yet despite its limitations, the Robat is a technological advance that may prove useful in certain cases – for instance, a robot that cleans an environment full of physical obstacles while bypassing them, or navigation systems for dark or unfamiliar environments that do not rely on optical means.
Watch the Robat navigating through the garden in the following video. Video credit: Itamar Eliakim.
Translated by Elee Shimshoni