In search of the most efficient and memory-saving visualization of high dimensional data (2303.05455v1)

Published 27 Feb 2023 in cs.LG and cs.HC

Abstract: Interactive exploration of large, multidimensional datasets plays a very important role in various scientific fields. It makes it possible not only to identify important structural features and forms, such as clusters of vertices and their connection patterns, but also to evaluate their interrelationships in terms of position, distance, shape and connection density. We argue that the visualization of multidimensional data is well approximated by the problem of two-dimensional embedding of undirected nearest-neighbor graphs. The size of complex networks is a major challenge for today's computer systems and still requires more efficient data embedding algorithms. Existing reduction methods are too slow and do not allow interactive manipulation. We show that high-quality embeddings are produced with minimal time and memory complexity. We present very efficient IVHD algorithms (CPU and GPU) and compare them with the latest and most popular dimensionality reduction methods. We show that the memory and time requirements are dramatically lower than for base codes. At the cost of a slight degradation in embedding quality, IVHD preserves the main structural properties of the data well with a much lower time budget. We also present a meta-algorithm that allows the use of any unsupervised data embedding method in a supervised manner.
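
The abstract frames high-dimensional visualization as two-dimensional embedding of an undirected nearest-neighbor graph. As a rough illustration of that framing only (not the authors' IVHD CPU/GPU implementation, whose specific force-directed scheme is described in the paper), the sketch below builds a k-NN graph with scikit-learn and embeds it in 2-D by minimizing a simple attractive/repulsive objective with gradient descent. The function name `knn_graph_embedding_2d`, the parameter values, and the particular repulsion scheme are illustrative assumptions, not details taken from the paper.

```python
# Illustrative sketch (not the authors' IVHD code): visualize high-dimensional
# data as a 2-D embedding of an undirected k-nearest-neighbor graph.
# The plain gradient-descent loop below is a generic stand-in for IVHD's
# far more efficient embedding algorithm.
import numpy as np
from sklearn.neighbors import NearestNeighbors


def knn_graph_embedding_2d(X, k=5, n_iter=200, lr=0.05, seed=0):
    """Embed rows of X (n_samples x n_features) in 2-D so that graph
    neighbors are pulled together and random non-neighbors pushed apart."""
    rng = np.random.default_rng(seed)
    n = X.shape[0]

    # Build the undirected k-NN graph as an edge list of neighbor pairs.
    nbrs = NearestNeighbors(n_neighbors=k + 1).fit(X)
    _, idx = nbrs.kneighbors(X)
    edges = {(min(i, j), max(i, j)) for i in range(n) for j in idx[i, 1:]}
    edges = np.array(sorted(edges))

    # Random 2-D initialization of the embedding.
    Y = rng.normal(scale=0.1, size=(n, 2))

    for _ in range(n_iter):
        grad = np.zeros_like(Y)

        # Attractive term: pull connected (neighbor) pairs toward each other.
        diff = Y[edges[:, 0]] - Y[edges[:, 1]]
        np.add.at(grad, edges[:, 0], diff)
        np.add.at(grad, edges[:, 1], -diff)

        # Repulsive term: push each point away from one randomly sampled
        # point (occasionally a true neighbor or itself, which is harmless
        # for this sketch).
        rand = rng.integers(0, n, size=n)
        rdiff = Y - Y[rand]
        rdist = np.linalg.norm(rdiff, axis=1, keepdims=True) + 1e-9
        grad -= rdiff / (rdist ** 2)

        Y -= lr * grad

    return Y


if __name__ == "__main__":
    # Three synthetic 50-dimensional clusters; the embedding should separate them.
    X = np.vstack([np.random.randn(100, 50) + c for c in (0.0, 5.0, 10.0)])
    Y = knn_graph_embedding_2d(X, k=5)
    print(Y.shape)  # (300, 2)
```

The attractive/repulsive split mirrors the general structure of neighbor-graph embedding methods; IVHD's contribution, per the abstract, is doing this with drastically lower time and memory budgets than existing baselines.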
