Nissan engineers have looked to the animal kingdom for inspiration in developing new technologies that will shape the future of mobility. One of Nissan's long-term objectives is to minimize accidents.
The goal is to bring that figure toward zero over the years. Toru Futami, director of technology and advanced research, argues that studying the behavior of animals that move in groups helps engineers understand how vehicles can interact with one another to create a safer and more efficient driving environment.
"In our ongoing effort to develop anti-collision systems for the next generation of cars, we need to draw inspiration from Mother Nature to find the most appropriate answer. At the moment, our research focuses on the behavior patterns of fish."
The research team created Eporo (Episode 0 Robot) using Laser Range Finder (LRF) technology, inspired by the compound eyes of bees, whose field of vision covers more than 300 degrees, together with other advanced technologies. Six Eporo units communicate with each other to control their positions. The goal is twofold: to avoid collisions and to be able to travel side by side or in single file, in the same way fish do when they move underwater in schools.
"Current traffic laws assume that cars stay within their lanes and obey road signs under the driver's control, but if all cars were autonomous, the need for lanes and even signs could disappear. We spoke earlier about fish, and fish follow these three rules: don't stray too far, don't get too close, and don't hit others. A school of fish has no leader, yet its members manage to swim very close to one another."
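The three rules Futami describes closely resemble the classic flocking ("boids") model from computer graphics. The following is a minimal sketch of that idea using simple 2-D point agents; the parameters and update rule are illustrative assumptions, not Nissan's actual control code:

```python
import math

def step(positions, velocities, near=1.0, dt=0.1, max_speed=2.0):
    """One flocking update for 2-D point agents (illustrative only).
    Rule 1 (don't stray too far):  steer toward the group's centroid.
    Rule 2 (don't get too close):  push away from neighbours within `near`.
    Rule 3 (don't hit others):     cap speed so agents cannot overshoot."""
    new_vel = []
    for i, (px, py) in enumerate(positions):
        others = [p for j, p in enumerate(positions) if j != i]
        # cohesion: gentle pull toward the centroid of the other agents
        cx = sum(p[0] for p in others) / len(others)
        cy = sum(p[1] for p in others) / len(others)
        ax, ay = (cx - px) * 0.1, (cy - py) * 0.1
        # separation: push away from any neighbour closer than `near`
        for ox, oy in others:
            d = math.hypot(px - ox, py - oy)
            if 0 < d < near:
                ax += (px - ox) / d
                ay += (py - oy) / d
        vx = velocities[i][0] + ax * dt
        vy = velocities[i][1] + ay * dt
        # hard speed cap as a crude collision-avoidance guarantee
        speed = math.hypot(vx, vy)
        if speed > max_speed:
            vx, vy = vx / speed * max_speed, vy / speed * max_speed
        new_vel.append((vx, vy))
    positions[:] = [(p[0] + v[0] * dt, p[1] + v[1] * dt)
                    for p, v in zip(positions, new_vel)]
    velocities[:] = new_vel
```

With only these local rules and no leader, the agents converge into a tight, collision-free cluster, which is the behaviour the quote attributes to a school of fish.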
Futami adds that the robots can also communicate with one another at an intersection, deciding among themselves which may pass and which must wait, thereby eliminating the need for traffic signals.
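One simple way such signal-free coordination could work is for each robot to broadcast its estimated arrival time and let everyone apply the same deterministic ordering rule. The scheme below (earliest arrival first, ties broken by ID) is a hypothetical illustration, not Nissan's actual protocol:

```python
def crossing_order(broadcasts):
    """Decide who crosses an intersection first without traffic signals.
    broadcasts: dict mapping robot id -> estimated arrival time (seconds).
    Every robot runs this same rule on the same data, so all of them
    reach the same ordering independently (hypothetical scheme)."""
    return [rid for rid, t in sorted(broadcasts.items(),
                                     key=lambda kv: (kv[1], kv[0]))]
```

For example, `crossing_order({"B": 2.0, "A": 1.5, "C": 1.5})` yields `["A", "C", "B"]`: the two earliest arrivals go first, with the tie resolved alphabetically.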
Before developing Eporo, Nissan created the Biomimetic Car Robot Drive unit, or BR23C, which mimics bees' remarkable ability to avoid collisions. It is a joint project with the Research Center for Advanced Science and Technology at the renowned University of Tokyo.
Inspired by the compound eye of the bee, which can see more than 300 degrees, the Laser Range Finder (LRF) detects obstacles within a 180-degree arc up to two meters away. The BR23C calculates the distance to the obstacle, then sends a signal to a microprocessor, which interprets this information and moves or repositions the robot to avoid a collision.
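The sense-and-avoid loop described here can be sketched in a few lines. The sensor model and the turn policy below are assumptions for illustration, not Nissan's implementation:

```python
FOV_DEG = 180      # LRF field of view described in the article
MAX_RANGE = 2.0    # detection range in metres

def avoid(heading_deg, scan):
    """Return a new heading given LRF returns (hypothetical policy).
    scan: list of (angle_deg, distance_m) pairs, with angles measured
    relative to the robot's current heading (-90 .. +90)."""
    obstacles = [(a, d) for a, d in scan
                 if abs(a) <= FOV_DEG / 2 and d <= MAX_RANGE]
    if not obstacles:
        return heading_deg  # path is clear, keep going
    # react to the nearest obstacle, steering to the opposite side
    angle, dist = min(obstacles, key=lambda ad: ad[1])
    turn = -90 if angle >= 0 else 90
    # the closer the obstacle, the sharper the turn
    return heading_deg + turn * (1 - dist / MAX_RANGE)
```

Here `avoid(0, [(10, 0.5)])` swerves left (negative heading change) because the obstacle sits slightly to the right and is close, while an empty scan leaves the heading unchanged.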
"It detects an obstacle in a fraction of a second," explains Toshiyuki Andou, manager of the Nissan Mobility Laboratory and principal engineer on the project. "The robot imitates the movements of a bee and immediately changes direction to avoid a collision."