While UAVs provide a platform to efficiently collect remotely sensed image data (for example, in the aftermath of a natural disaster), the data must be processed and analysed in order to extract meaningful information. An experienced remote sensing analyst normally carries out this image analysis through visual interpretation, a manual approach that is highly labour-intensive and time-consuming.
Automating or partially automating the interpretation with Artificial Intelligence (AI) techniques such as machine learning would greatly reduce the time required to carry out the analysis and help put better, faster information into the hands of Emergency Coordinators.
WFP’s Rapid UAVs for Data Analysis in Emergencies (RUDA) pilot will integrate machine learning software with data collection and analysis workflows in order to tag and analyse image data collected by drones. The pilot will be implemented in five countries prone to natural disasters, aligning with the locations chosen for the coordination model.
In April 2018, RUDA successfully built a software platform that can store and manage thousands of drone-captured images. These images will be manually labelled with key information, and in doing so will help train an AI system to assess each photo. The trained system will help identify homes destroyed by natural disasters or spot failed harvests. So far, more than 50% of the photos have been tagged.
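The workflow described above, where manually tagged images become training examples for a classifier that then tags new images automatically, can be sketched in miniature. This is a minimal illustrative example only: the toy feature vectors, the tag names, and the simple nearest-centroid model are all assumptions for the sake of demonstration, not RUDA's actual software or data.

```python
from statistics import mean

# Each "image" is reduced to a toy two-number feature vector
# (standing in for real image features such as texture or colour statistics),
# paired with a tag applied manually by an analyst.
tagged_images = [
    ([0.9, 0.1], "destroyed"),
    ([0.8, 0.2], "destroyed"),
    ([0.1, 0.9], "intact"),
    ([0.2, 0.8], "intact"),
]

def train_centroids(examples):
    """Compute one mean feature vector (centroid) per tag from labelled examples."""
    by_tag = {}
    for features, tag in examples:
        by_tag.setdefault(tag, []).append(features)
    return {tag: [mean(dim) for dim in zip(*vecs)] for tag, vecs in by_tag.items()}

def classify(centroids, features):
    """Assign the tag whose centroid is closest (squared Euclidean distance)."""
    def dist(centroid):
        return sum((a - b) ** 2 for a, b in zip(features, centroid))
    return min(centroids, key=lambda tag: dist(centroids[tag]))

# Train on the manually tagged images, then tag a new, unseen image.
model = train_centroids(tagged_images)
print(classify(model, [0.85, 0.15]))  # prints "destroyed"
```

Real systems for this task would use far richer models (typically convolutional neural networks trained on the raw imagery), but the principle is the same: the more images analysts tag by hand, the better the automated tagger becomes.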