Itamar Shenhar, Omer Licht, Yuval Salant

FlyEye – Assisted Driving Using a Drone and Deep Learning

Updated: Apr 17, 2021

Assisted Driving Systems (ADS) are becoming more and more prevalent in our day-to-day lives. Today, almost all cars integrate some form of ADS, whether it is a parking sensor or a full collision alert system such as Mobileye. The goal of this project is to create an ADS that resembles Mobileye, using a drone that hovers above the car as the only sensor.

In addition, we offer a mobile application as an interface between the driver and the ADS. It presents the driver with an aerial view of the vehicle in real time and alerts them, both visually and audibly, when a potential collision hazard is detected.
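
As a rough illustration of the alerting logic, the sketch below flags a potential hazard when another detected object's bounding box comes within a fixed pixel distance of our own vehicle in the aerial frame. The threshold, helper names, and hazard criterion are assumptions for illustration only and are not FlyEye's actual rule.

```python
# Simplified sketch of a collision-hazard check (illustrative only).
# It flags any detected object whose bounding-box center comes within a
# threshold distance (in pixels) of the driver's own vehicle in the frame.

def box_center(box):
    """Center (x, y) of a bounding box given as (x1, y1, x2, y2)."""
    x1, y1, x2, y2 = box
    return ((x1 + x2) / 2.0, (y1 + y2) / 2.0)

def is_hazard(own_box, other_box, threshold_px=80.0):
    """Return True if another object is closer than threshold_px to our car."""
    (ox, oy), (tx, ty) = box_center(own_box), box_center(other_box)
    distance = ((ox - tx) ** 2 + (oy - ty) ** 2) ** 0.5
    return distance < threshold_px

# Example: the app would trigger a visual and audible alert on True.
print(is_hazard((100, 100, 160, 220), (120, 170, 180, 290)))  # True, boxes are close
```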


Using a drone as a sensor has many potential advantages: a drone is mobile, so it offers a dynamic and virtually unlimited field of view. With a drone, the ADS can detect traffic jams or collisions ahead in real time without relying on potentially distorted user data. In addition, a drone requires no car-specific setup, so the ADS can be added to or removed from the vehicle instantly.

Our ADS uses state-of-the-art deep learning algorithms to detect and classify the objects in the drone's view. These algorithms are lightweight enough to run in real time on an ordinary laptop.
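
To illustrate the kind of lightweight, real-time detection loop described above, here is a minimal sketch that runs a pretrained MobileNet-based detector from torchvision on frames of a drone video. The specific model, weights, confidence threshold, and video source are assumptions for illustration; the project's actual pipeline may differ.

```python
# Minimal sketch: running a lightweight pretrained detector on drone frames.
# Illustrative only; FlyEye's actual model, weights, and video feed may differ.
import cv2
import torch
import torchvision

# Lightweight detector that can run in near real time on an ordinary laptop.
model = torchvision.models.detection.fasterrcnn_mobilenet_v3_large_320_fpn(
    pretrained=True
)
model.eval()

cap = cv2.VideoCapture("drone_stream.mp4")  # placeholder for the drone's video feed
while cap.isOpened():
    ok, frame = cap.read()
    if not ok:
        break
    # Convert the BGR frame to a normalized CHW float tensor.
    rgb = cv2.cvtColor(frame, cv2.COLOR_BGR2RGB)
    tensor = torch.from_numpy(rgb).permute(2, 0, 1).float() / 255.0
    with torch.no_grad():
        detections = model([tensor])[0]
    # Keep confident detections (cars, pedestrians, etc.) and draw their boxes.
    for box, score in zip(detections["boxes"], detections["scores"]):
        if score > 0.5:
            x1, y1, x2, y2 = box.int().tolist()
            cv2.rectangle(frame, (x1, y1), (x2, y2), (0, 255, 0), 2)
    cv2.imshow("FlyEye detections", frame)
    if cv2.waitKey(1) & 0xFF == ord("q"):
        break
cap.release()
```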



Project’s book: project_book

Link to the project’s GitHub repository: FlyEye’s Github Repo

Future work: This project focuses on the collision detection part of the ADS and does not include a system for making the drone hover above the car. In future versions, GPS coordinates of the vehicle sent by the mobile app could be used by a more advanced drone model to track the car’s position.
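
As a purely hypothetical sketch of that future direction, the mobile app could periodically report the car's GPS position to a drone controller. The endpoint and payload format below are invented for illustration and are not part of the current project.

```python
# Hypothetical sketch of the future-work idea: the mobile app periodically
# sends the vehicle's GPS position to a drone controller that follows it.
# The endpoint and payload schema here are invented for illustration.
import json
import time
import urllib.request

def send_position(lat, lon, endpoint="http://drone-controller.local/position"):
    """POST the car's current GPS coordinates to a (hypothetical) drone controller."""
    payload = json.dumps({"lat": lat, "lon": lon, "timestamp": time.time()}).encode()
    request = urllib.request.Request(
        endpoint, data=payload, headers={"Content-Type": "application/json"}
    )
    urllib.request.urlopen(request)
```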

Contact information:

Itamar Shenhar: itamar8910@gmail.com

Omer Licht: https://github.com/olicht

Yuval Salant: yusal1234@gmail.com
