
The Effect of Training Dataset Size on Discriminative and Diffusion-Based Speech Enhancement Systems (2406.06160v2)

Published 10 Jun 2024 in eess.AS

Abstract: The performance of deep neural network-based speech enhancement systems typically increases with the training dataset size. However, studies that investigated the effect of training dataset size on speech enhancement performance did not consider recent approaches, such as diffusion-based generative models. Diffusion models are typically trained with massive datasets for image generation tasks, but whether this is also required for speech enhancement is unknown. Moreover, studies that investigated the effect of training dataset size did not control for the data diversity. It is thus unclear whether the performance improvement was due to the increased dataset size or diversity. Therefore, we systematically investigate the effect of training dataset size on the performance of popular state-of-the-art discriminative and diffusion-based speech enhancement systems in matched conditions. We control for the data diversity by using a fixed set of speech utterances, noise segments and binaural room impulse responses to generate datasets of different sizes. We find that the diffusion-based systems perform the best relative to the discriminative systems in terms of objective metrics with datasets of 10 h or less. However, their objective metrics performance does not improve when increasing the training dataset size as much as the discriminative systems, and they are outperformed by the discriminative systems with datasets of 100 h or more.
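The abstract's diversity-control idea (fixing the pools of speech utterances, noise segments, and binaural room impulse responses, and varying only the number of generated mixtures) can be sketched as follows. This is a minimal illustration of that sampling scheme, not the authors' actual data pipeline; all names (`make_dataset`, the pool contents) are hypothetical.

```python
import random

def make_dataset(speech_pool, noise_pool, brir_pool, n_mixtures, seed=0):
    """Draw n_mixtures random (speech, noise, brir) combinations from
    fixed pools. Larger datasets reuse the same underlying material, so
    dataset size grows while data diversity stays constant."""
    rng = random.Random(seed)
    return [
        (rng.choice(speech_pool), rng.choice(noise_pool), rng.choice(brir_pool))
        for _ in range(n_mixtures)
    ]

# Datasets of different sizes built from the same fixed pools
# (placeholder identifiers standing in for actual audio files).
speech = [f"utt_{i}" for i in range(100)]
noise = [f"noise_{i}" for i in range(20)]
brirs = [f"brir_{i}" for i in range(10)]

small = make_dataset(speech, noise, brirs, n_mixtures=1_000)
large = make_dataset(speech, noise, brirs, n_mixtures=10_000)
```

Because every mixture in both datasets is drawn from the same three pools, any performance difference between models trained on `small` and `large` can be attributed to dataset size rather than diversity, which is the confound the paper sets out to control.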

Citations (1)
