Audio Analytics-based Human Trafficking Detection Framework for Autonomous Vehicles

(arXiv:2209.04071)
Published Sep 9, 2022 in cs.AI, cs.SD, and eess.AS

Abstract

Human trafficking is a universal problem that persists despite numerous global efforts to combat it. Individuals of any age, race, ethnicity, sex, gender identity, sexual orientation, nationality, immigration status, cultural background, religion, socioeconomic class, or education level can be victims of human trafficking. With advances in technology and the introduction of autonomous vehicles (AVs), human traffickers will adopt new ways to transport victims, which could accelerate the growth of organized human trafficking networks and make the detection of trafficking in persons more challenging for law enforcement agencies. The objective of this study is to develop an innovative audio analytics-based human trafficking detection framework for autonomous vehicles. The primary contributions of this study are to: (i) define four non-trivial, feasible, and realistic human trafficking scenarios for AVs; (ii) create a new and comprehensive audio dataset related to human trafficking with five classes, i.e., crying, screaming, car door banging, car noise, and conversation; and (iii) develop a deep 1-D Convolutional Neural Network (CNN) architecture for classifying audio data related to human trafficking. We also conducted a case study using the new audio dataset and evaluated the audio classification performance of the deep 1-D CNN. Our analyses reveal that the deep 1-D CNN can distinguish sounds coming from a human trafficking victim from non-human-trafficking sounds with an accuracy of 95%, which demonstrates the efficacy of our framework.
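To illustrate the kind of model the abstract describes, below is a minimal sketch of a 1-D CNN audio classifier over the five classes named in the paper. This is not the authors' exact architecture: the layer sizes, kernel widths, input length, and the use of PyTorch are all illustrative assumptions.

```python
# Minimal sketch (assumed, not the authors' exact model): a 1-D CNN that maps a
# fixed-length audio clip to one of the five classes listed in the abstract.
import torch
import torch.nn as nn

CLASSES = ["crying", "screaming", "car_door_banging", "car_noise", "conversation"]

class Audio1DCNN(nn.Module):
    def __init__(self, num_classes=len(CLASSES)):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv1d(1, 16, kernel_size=9, stride=2, padding=4), nn.ReLU(),
            nn.MaxPool1d(4),
            nn.Conv1d(16, 32, kernel_size=9, stride=2, padding=4), nn.ReLU(),
            nn.MaxPool1d(4),
            nn.Conv1d(32, 64, kernel_size=9, stride=2, padding=4), nn.ReLU(),
            nn.AdaptiveAvgPool1d(1),       # global pooling -> fixed-size feature vector
        )
        self.classifier = nn.Linear(64, num_classes)

    def forward(self, x):                  # x: (batch, 1, samples)
        x = self.features(x).squeeze(-1)   # (batch, 64)
        return self.classifier(x)          # raw logits over the five classes

if __name__ == "__main__":
    model = Audio1DCNN()
    waveform = torch.randn(8, 1, 16000)    # e.g., 1 s of 16 kHz audio per clip (assumed)
    logits = model(waveform)
    print(logits.shape)                    # torch.Size([8, 5])
```

In practice such a classifier would be trained with cross-entropy loss on labeled clips from the dataset described above; the framework's trafficking/non-trafficking decision would then follow from which class the model predicts.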
