Artificial Intelligence has given computers vision. Now it is giving conservationists the "ears" they need. Recognizing endangered species by their calls in the wild is a critical element of conservation efforts. Unlike sight, sound can be used to track animals even when they are camouflaged, regardless of the direction they are in. Sound also travels long distances in the wild, which means acoustic tracking can work even in areas that are difficult to access. Tracking an endangered species by sight, on the other hand, is far more difficult because the animals are so rare, and the difficulty grows with the nature of the habitat, such as underwater environments. Conservationists therefore rely significantly on identification by call to determine the location and population of an endangered species.
However, this means an enormous amount of manual work for conservationists. Recorders placed around the known habitat of an endangered species generate hundreds of hours of audio data, which conservationists analyze manually to identify the animals' locations. This is a slow and cumbersome process, prone to errors and delays. Most of the time, what conservationists hear is silence, interspersed with sudden outbursts of animal and other sounds. They need to be constantly alert to locate and parse these sounds for the presence of an endangered species in the wild.
Wildlife Conservation Society (WCS) India and Accenture Labs worked collaboratively to demonstrate the role technology can play in not only easing this task for conservationists but also allowing them to focus on their core job: protecting animals in the wild. WCS collects data by deploying multiple sound recorders in the forest, and the recordings are collated at a base station. The recorded files are analyzed by the Nature Sound Analyzer, which annotates the start and end points of animal sounds in each file; these sounds are then reconfirmed by WCS researchers. The analysis helps identify where endangered animals are located (based on the location of the recorder) and when they were last present at that location (based on the timestamp of the identified call). This information on their habitat is a critical input to conservation efforts, such as demarcating forest areas as protected zones or organizing a response from forest rangers and park management against illegal poaching and trade in endangered species. The primary technologies used are Digital Signal Processing (DSP) and AI.
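The article does not describe the Nature Sound Analyzer's internals, but the annotation step it performs (marking start and end points of sounds in a long, mostly silent recording) can be illustrated with a minimal energy-thresholding sketch. The function name, frame length, and decibel threshold below are illustrative assumptions, not details of the actual system:

```python
import numpy as np

def annotate_sound_events(signal, sample_rate, frame_ms=50, threshold_db=-30.0):
    """Return (start_sec, end_sec) spans where frame energy exceeds a threshold.

    Frames quieter than `threshold_db` (relative to full scale) are treated
    as silence; contiguous louder frames are merged into one annotated event.
    This is a simplified stand-in for the annotation the article describes.
    """
    frame_len = int(sample_rate * frame_ms / 1000)
    n_frames = len(signal) // frame_len
    events, start = [], None
    for i in range(n_frames):
        frame = signal[i * frame_len:(i + 1) * frame_len]
        rms = np.sqrt(np.mean(frame ** 2))
        level_db = 20 * np.log10(rms + 1e-12)  # avoid log(0) on pure silence
        if level_db > threshold_db and start is None:
            start = i * frame_len / sample_rate          # event begins
        elif level_db <= threshold_db and start is not None:
            events.append((start, i * frame_len / sample_rate))  # event ends
            start = None
    if start is not None:  # event still running at end of file
        events.append((start, n_frames * frame_len / sample_rate))
    return events

# Synthetic example: 1 s of silence, 1 s of a 1 kHz tone, 1 s of silence.
sr = 16000
t = np.arange(sr) / sr
clip = np.concatenate([np.zeros(sr), 0.5 * np.sin(2 * np.pi * 1000 * t), np.zeros(sr)])
print(annotate_sound_events(clip, sr))  # [(1.0, 2.0)]
```

A real system would use a trained classifier rather than a fixed threshold, but the output shape (time-stamped spans for researchers to reconfirm) is the same.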
“Our work is an example of a true human + machine response, each doing what it does best,” said Sanjay Podder, Managing Director, Accenture Labs. “Overall, the solution exemplifies the role that AI can play in scaling up conservation efforts across species. As the solution matures, a potential application is recognizing animal sounds in real time and transmitting the information to the base station and central office for an immediate response.”
The benefits of the project are two-fold. On the human side, it reduces hours of analysis to a matter of seconds or minutes; the faster the data is analyzed and transmitted, the faster the response to immediate threats to the species and their habitat. On the technology side, it demonstrates the importance of AI in converting sounds from the wild into critical insights that help conservationists calibrate and prioritize their response. The starting point for the solution was to train the AI model on “silence” as a separate class, allowing it to be filtered out. The next part of the equation was to recognize sounds in the specific frequency range of the endangered animal's call while filtering out background noise. The methodology is general enough to be applied across a variety of endangered animals.
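The frequency-range step described above can be sketched as a simple band-pass filter: everything outside the species' expected call band is discarded before classification. The FFT-masking approach, the function name, and the 1.5–3 kHz "call band" below are illustrative assumptions, not the project's actual parameters:

```python
import numpy as np

def bandpass_fft(signal, sample_rate, low_hz, high_hz):
    """Crude FFT band-pass: zero out spectral components outside
    [low_hz, high_hz], keeping only the species' expected call band."""
    spectrum = np.fft.rfft(signal)
    freqs = np.fft.rfftfreq(len(signal), d=1.0 / sample_rate)
    spectrum[(freqs < low_hz) | (freqs > high_hz)] = 0  # suppress out-of-band noise
    return np.fft.irfft(spectrum, n=len(signal))

# Synthetic example: a 300 Hz background hum mixed with a 2 kHz "call".
# Filtering to a hypothetical 1.5-3 kHz call band removes the hum.
sr = 8000
t = np.arange(sr) / sr
mixed = np.sin(2 * np.pi * 300 * t) + 0.5 * np.sin(2 * np.pi * 2000 * t)
filtered = bandpass_fft(mixed, sr, 1500, 3000)
```

In practice a production pipeline would use a proper filter design (or learn the band implicitly from spectrograms), but the principle is the same: restrict attention to the call's frequency range so the classifier is not distracted by wind, rain, and other background noise.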
In the words of Sanjay Podder, “Accenture’s Tech4Good program solves some of the most intractable social development challenges, with a focus on inclusion, diversity and protection of the environment. When the Wildlife Conservation Society (WCS) reached out to us, we quickly responded to the request and worked in close collaboration with them to figure out how exponential technologies such as AI can be leveraged to identify the location of an endangered species while significantly reducing the human effort involved in it, thus letting the experts focus on their key goal of conservation.”
The ease with which the solution worked in the field during testing was a big surprise, particularly because the data available for building it was limited. The key insight was that the preparatory work, done in close collaboration with conservation experts, helped build a robust solution. Now that the system is built, and will be further refined, it can be applied in other contexts, including tracking marine animals such as the dugong. WCS and Accenture Labs are already in discussions to jointly build such a solution.