Full-Stack Bioacoustics: Field Kit to AI to Action (Workshop report) (2210.07685v1)

Published 14 Oct 2022 in cs.SD and eess.AS

Abstract: Acoustic data (sound recordings) are a vital source of evidence for detecting, counting, and distinguishing wildlife. This domain of "bioacoustics" has grown in the past decade due to massive advances in signal processing and machine learning, in recording devices, and in data processing and storage capacity. Numerous research papers describe the use of Raspberry Pi or similar devices for acoustic monitoring, and others describe automatic classification of animal sounds by machine learning. But for most ecologists, zoologists, and conservationists, the pieces of the puzzle do not come together: the domain is fragmented. In this Lorentz workshop we bridge this gap by bringing together leading exponents of open hardware and open-source software for bioacoustic monitoring and machine learning, as well as ecologists and other field researchers. We share skills while also building a vision for the future development of "bioacoustic AI". This report contains an overview of the workshop aims and structure, as well as reports from the six groups.
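To make the "field kit to AI" gap concrete, here is a minimal sketch of the kind of pipeline the workshop aims to join up: a field recording is summarised as a log-mel feature vector and scored by a pre-trained classifier. This is not taken from the report; the filenames recording.wav and species_clf.pkl are hypothetical placeholders, and the use of librosa with a pickled scikit-learn model is an assumption for illustration (real deployments typically use dedicated tools and deep networks).

```python
# Illustrative sketch only: turn a field recording into a feature vector
# and score it with a previously trained classifier.
import pickle

import numpy as np
import librosa

# Load a mono field recording at a fixed sample rate.
# "recording.wav" is a hypothetical placeholder file.
audio, sr = librosa.load("recording.wav", sr=22050, mono=True)

# Summarise the clip as a log-mel spectrogram averaged over time,
# a common lightweight feature for animal-sound classification.
mel = librosa.feature.melspectrogram(y=audio, sr=sr, n_mels=64)
log_mel = librosa.power_to_db(mel, ref=np.max)
features = log_mel.mean(axis=1).reshape(1, -1)  # shape (1, 64)

# Score with a pre-trained classifier (assumed scikit-learn model);
# training such a model is the "AI" piece the workshop addresses.
with open("species_clf.pkl", "rb") as f:
    clf = pickle.load(f)
print("Predicted species label:", clf.predict(features)[0])
```

In practice this same loop would run on an open-hardware recorder or a Raspberry Pi-class device in the field, which is exactly the hardware-to-software handoff the workshop describes as fragmented.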
