Gradient Algorithms for Complex Non-Gaussian Independent Component/Vector Extraction, Question of Convergence (1803.10108v2)

Published 27 Mar 2018 in eess.SP, cs.IT, and math.IT

Abstract: We revisit the problem of extracting one independent component from an instantaneous linear mixture of signals. The mixing matrix is parameterized by two vectors: one column of the mixing matrix and one row of the de-mixing matrix. The separation is based on the non-Gaussianity of the source of interest, while the other background signals are assumed to be Gaussian. Three gradient-based estimation algorithms are derived using the maximum likelihood principle and are compared with the Natural Gradient algorithm for Independent Component Analysis and with One-unit FastICA based on negentropy maximization. The ideas and algorithms are also generalized to the extraction of a vector component when the extraction proceeds jointly from a set of instantaneous mixtures. Throughout the paper, we address the size of the region of convergence within which the algorithms guarantee the extraction of the desired source, and we show how that size is influenced by the ratio of the powers of the sources in the mixture. Simulations comparing several algorithms confirm this observation: the algorithms exhibit different convergence behavior depending on whether the source of interest is dominant or weak. Our proposed modifications of the gradient methods, which take the dominance or weakness of the source into account, show improved global convergence properties.
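To make the One-unit FastICA baseline mentioned in the abstract concrete, here is a minimal sketch in numpy. This is a generic textbook implementation of the negentropy-based fixed-point iteration with the tanh contrast, not the paper's proposed gradient algorithms; the toy setup (one Laplacian source of interest mixed with Gaussian background signals, the sample size, and the random mixing matrix) is my own assumption, chosen to match the model assumed in the paper.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical toy mixture: one non-Gaussian source of interest plus
# Gaussian background signals, as the extraction model assumes.
n_samples = 20000
s = rng.laplace(size=n_samples)           # non-Gaussian source of interest
bg = rng.normal(size=(3, n_samples))      # Gaussian background sources
S = np.vstack([s, bg])
A = rng.normal(size=(4, 4))               # random mixing matrix
X = A @ S                                 # observed instantaneous mixture

# Whiten the observations (zero mean, identity covariance).
X = X - X.mean(axis=1, keepdims=True)
d, E = np.linalg.eigh(X @ X.T / n_samples)
Z = E @ np.diag(d ** -0.5) @ E.T @ X

# One-unit FastICA fixed-point iteration maximizing negentropy with the
# tanh contrast: w <- E[z g(w^T z)] - E[g'(w^T z)] w, then renormalize.
w = rng.normal(size=4)
w /= np.linalg.norm(w)
for _ in range(200):
    y = w @ Z
    g = np.tanh(y)
    g_prime = 1.0 - g ** 2
    w_new = Z @ g / n_samples - g_prime.mean() * w
    w_new /= np.linalg.norm(w_new)
    converged = abs(abs(w_new @ w) - 1.0) < 1e-10
    w = w_new
    if converged:
        break

# Because only one source is non-Gaussian, the extracted component should
# align (up to sign and scale) with the source of interest.
y = w @ Z
corr = abs(np.corrcoef(y, s)[0, 1])
print(f"|correlation| with true source: {corr:.3f}")
```

Since the Gaussian background sources have zero negentropy, the contrast has its maxima only at the non-Gaussian source, which is why a single unit suffices here; the paper's interest is in how the analogous region of convergence behaves for gradient-based maximum-likelihood updates when the source of interest is weak relative to the background.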

Citations (70)