
TINC: Tree-structured Implicit Neural Compression

(arXiv:2211.06689)
Published Nov 12, 2022 in cs.CV

Abstract

Implicit neural representation (INR) can describe target scenes with high fidelity using a small number of parameters, and is emerging as a promising data compression technique. However, limited spectrum coverage is intrinsic to INR, and removing redundancy in diverse, complex data is non-trivial. Preliminary studies can exploit only either the global or the local correlation in the target data, and thus achieve limited performance. In this paper, we propose Tree-structured Implicit Neural Compression (TINC), which builds compact representations of local regions and extracts the shared features of these local representations in a hierarchical manner. Specifically, we use Multi-Layer Perceptrons (MLPs) to fit the partitioned local regions, and these MLPs are organized in a tree structure to share parameters according to spatial distance. The parameter sharing scheme not only ensures continuity between adjacent regions, but also jointly removes local and non-local redundancy. Extensive experiments show that TINC improves the compression fidelity of INR and demonstrates impressive compression capability compared with commercial tools and other deep learning based methods. Moreover, the approach is highly flexible and can be tailored to different data and parameter settings. The source code can be found at https://github.com/RichealYoung/TINC .

Figure: the TINC approach partitions the target data, represents blocks with MLPs, and shares parameters among similar blocks.

Overview

  • The paper introduces Tree-structured Implicit Neural Compression (TINC), a new method that leverages the representation capabilities of Implicit Neural Representations (INRs) enhanced through a hierarchical framework for efficient data compression.

  • TINC utilizes Multi-Layer Perceptrons (MLPs) organized in a tree structure to divide and compress data into local regions, sharing parameters hierarchically based on spatial proximity to maintain continuity and reduce redundancy.

  • Experimental results on medical and biological datasets demonstrate TINC's superior performance in terms of PSNR, SSIM, and binary accuracy compared to traditional and other machine learning-based compression methods.


At the core of this work is an innovative data compression method named Tree-structured Implicit Neural Compression (TINC). The approach leverages the powerful representation capabilities of Implicit Neural Representations (INRs) and enhances them through a hierarchical framework. The result is a system that can compress large and complex data more efficiently than both traditional compression algorithms and other machine learning-based techniques.

Key Concepts

Implicit Neural Representations (INRs)

INRs have gained popularity in fields such as scene rendering and shape estimation. They use compact neural networks to describe data precisely, often with far fewer parameters than traditional representations. However, their compression capability hits a ceiling on large or complex data due to limited spectrum coverage: a single MLP is biased toward low-frequency components and cannot capture all the high-frequency detail spread across a large volume.
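As a concrete illustration, here is a minimal coordinate-based INR in PyTorch. This is a generic sketch rather than the paper's architecture: the width, depth, and GELU activation are illustrative assumptions.

```python
import torch
import torch.nn as nn

class CoordinateMLP(nn.Module):
    """A generic coordinate-based INR: maps a 3D coordinate to intensity.

    The trained network itself is the compressed representation, so the
    compression ratio is roughly original_bytes / parameter_bytes.
    """
    def __init__(self, hidden: int = 64, depth: int = 4):
        super().__init__()
        layers = [nn.Linear(3, hidden), nn.GELU()]
        for _ in range(depth - 1):
            layers += [nn.Linear(hidden, hidden), nn.GELU()]
        layers.append(nn.Linear(hidden, 1))
        self.net = nn.Sequential(*layers)

    def forward(self, coords: torch.Tensor) -> torch.Tensor:
        # coords: (N, 3), normalized to [-1, 1]; returns (N, 1) intensities
        return self.net(coords)

# "Compression" amounts to overfitting the MLP to a single volume:
model = CoordinateMLP()
coords = torch.rand(1024, 3) * 2 - 1   # sampled voxel coordinates
target = torch.rand(1024, 1)           # their intensities (dummy data here)
loss = nn.functional.mse_loss(model(coords), target)
```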

Tree-structured INRs

To overcome this limitation, TINC proposes using Multi-Layer Perceptrons (MLPs) organized in a tree structure. Here's a breakdown:

  • The data is divided into local regions.
  • Each region is represented using an MLP.
  • These MLPs share parameters hierarchically based on spatial proximity, ensuring continuity between adjacent regions.

Method Description

Local Compression

The TINC method begins by dividing the target data into smaller, manageable blocks. Each block is independently compressed using an MLP. This concept borrows from ensemble learning, focusing on fitting simpler models to partitioned data to maintain high local fidelity.
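The sketch below illustrates the partitioning step, assuming an even octree-style split (an assumption for illustration; in TINC the block granularity is governed by the tree depth):

```python
import numpy as np

def partition(volume: np.ndarray, levels: int):
    """Recursively split a 3D volume into 8**levels equal blocks.

    Returns (origin, block) pairs; each block would then be fitted
    by its own small MLP in the local-compression step.
    """
    if levels == 0:
        return [((0, 0, 0), volume)]
    d, h, w = volume.shape
    blocks = []
    for i in range(2):
        for j in range(2):
            for k in range(2):
                sub = volume[i * d // 2:(i + 1) * d // 2,
                             j * h // 2:(j + 1) * h // 2,
                             k * w // 2:(k + 1) * w // 2]
                for (oz, oy, ox), b in partition(sub, levels - 1):
                    blocks.append(((oz + i * d // 2,
                                    oy + j * h // 2,
                                    ox + k * w // 2), b))
    return blocks
```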

Hierarchical Parameter Sharing

A tree structure organizes these MLPs, allowing for parameter sharing among nodes to exploit both local and non-local redundancies:

  • Regions that are close together (nearby leaves in the tree) share more parameters, capturing local similarities.
  • Regions that are far apart yet similar also benefit from parameters shared at higher tree levels, removing non-local redundancy while keeping adjacent blocks continuous (see the sketch after this list).
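One way to realize this is to let every tree node own a segment of MLP layers, so that a leaf region's network is the composition of the segments along its root-to-leaf path: siblings then literally reuse their ancestors' parameters. The sketch below follows that reading of the method; the per-level layer split, widths, and the `TreeNode`/`leaf_network` names are assumptions for illustration.

```python
import torch
import torch.nn as nn

class TreeNode(nn.Module):
    """Each node owns one MLP segment; a leaf's full network composes
    the segments on its root-to-leaf path, so nearby regions share all
    of their ancestor layers (the hierarchical parameter sharing)."""
    def __init__(self, in_dim: int, out_dim: int, n_children: int = 0):
        super().__init__()
        self.segment = nn.Sequential(nn.Linear(in_dim, out_dim), nn.GELU())
        self.child_nodes = nn.ModuleList(
            TreeNode(out_dim, out_dim) for _ in range(n_children))

def leaf_network(path, head: nn.Module) -> nn.Module:
    """Stack the segments along a root-to-leaf path, plus an output head."""
    return nn.Sequential(*(node.segment for node in path), head)

# Two-level example: one root segment shared by all 8 octants.
root = TreeNode(3, 64, n_children=8)
octant0 = leaf_network([root, root.child_nodes[0]], nn.Linear(64, 1))
out = octant0(torch.rand(10, 3))  # (10, 1) predicted intensities
```

Because adjacent octants share the root segment, their outputs vary smoothly across block boundaries, which is the continuity property described above.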

Experimental Results

Data and Metrics

Two main datasets were used:

  1. Medical Data: Slices from the HiP-CT dataset covering organs like lungs, heart, kidneys, and brain.
  2. Biological Data: Sub-micrometer resolution neuronal images from mouse brains.

TINC was evaluated against several state-of-the-art methods, including:

  • Traditional tools: JPEG, H.264, HEVC
  • Machine learning-based methods: DVC, SGA+BB, SSF
  • Other INR-based compressors: SCI, NeRV, NeRF

The primary evaluation metrics included Peak Signal-to-Noise Ratio (PSNR), Structural Similarity Index Measure (SSIM), and binary accuracy for biological data.
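These metrics are standard; for reference, a minimal sketch of PSNR and the thresholded binary accuracy is shown below (the 16-bit `peak` default is an assumption, and SSIM would typically come from a library such as scikit-image):

```python
import numpy as np

def psnr(x: np.ndarray, y: np.ndarray, peak: float = 65535.0) -> float:
    """Peak Signal-to-Noise Ratio in dB (peak assumes 16-bit data)."""
    mse = np.mean((x.astype(np.float64) - y.astype(np.float64)) ** 2)
    return 10.0 * np.log10(peak ** 2 / mse)

def binary_accuracy(x: np.ndarray, y: np.ndarray, threshold: float) -> float:
    """Fraction of voxels whose foreground/background label survives
    compression; thresholds such as 200 and 500 appear in the results."""
    return float(np.mean((x > threshold) == (y > threshold)))
```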

Strong Numerical Results

The results reported in the paper's Table 1 are worth highlighting:

  • Medical Data: TINC achieved a PSNR of 52.02 dB and an SSIM of 0.9897 across all tested compression ratios, surpassing most other methods; even at high compression ratios, its PSNR reached 50.59 dB.
  • Biological Data: For binary accuracy thresholds of 200 and 500, TINC scored 0.9945 and 0.9970, respectively, also excelling at high compression ratios.

Visual Integrity

TINC preserved the fine details and structural continuity in 3D medical images. Supplementary figures demonstrated that TINC minimized the artifacts common in methods like SCI and HEVC, making it more reliable for critical applications such as medical imaging.

Flexibility and Adaptability

Tree Levels and Parameter Allocation

The study showed how TINC's performance could be tuned by varying tree levels and parameter allocation:

  • Tree Depth: Deeper trees improved performance on data with rich details, but at the cost of a larger parameter budget.
  • Parameter Allocation: Distributing parameters according to local and non-local data features allowed compression levels to be tailored across regions, improving overall efficiency (a sketch follows this list).
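A sketch of such detail-aware allocation is shown below; the variance proxy for local richness and the `allocate_parameters` name are illustrative assumptions, not the paper's exact criterion:

```python
import numpy as np

def allocate_parameters(blocks, total_params: int) -> np.ndarray:
    """Split a global parameter budget across blocks in proportion to
    local detail, proxied here by intensity variance."""
    scores = np.array([b.var() for b in blocks], dtype=np.float64) + 1e-12
    weights = scores / scores.sum()
    return np.maximum(1, (weights * total_params).astype(int))
```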

Practical and Theoretical Implications

Practical

TINC's ability to compress large-volume data with high fidelity and efficiency makes it a powerful tool for fields like medical imaging and biological research. Its performance under high compression ratios suggests it could significantly reduce storage and transmission costs.

Theoretical

The hierarchical parameter sharing mechanism opens new avenues in neural network design, particularly for tasks requiring fine-grained data representation. Future work could explore adaptive tree structures and more complex parameter sharing mechanisms to further enhance performance.

Future Directions

  • Speed Improvements: The current compression (per-volume fitting) process is computationally intensive. Integrating meta-learning could accelerate this stage by learning good initial parameters (a minimal sketch appears after this list).
  • Adaptiveness: Developing methods to automatically adjust tree structures based on data complexity could make TINC more universally applicable.
  • Extended Applications: TINC's approach could be extended to compress other types of high-dimensional data, such as 4D medical imaging or functional MRI data, broadening its utility.
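On the first point, here is a minimal Reptile-style sketch of how a meta-learned initialization could speed up per-volume fitting (the algorithm choice, hyperparameters, and `reptile_init` name are assumptions; the paper only suggests the direction):

```python
import copy
import torch

def reptile_init(model, tasks, inner_steps=8, inner_lr=1e-2, meta_lr=0.1):
    """Meta-learn an initialization: briefly fit each sample volume,
    then nudge the shared init toward the adapted weights, so unseen
    volumes compress in fewer optimization steps."""
    for coords, target in tasks:  # each task is one (coords, target) volume
        adapted = copy.deepcopy(model)
        opt = torch.optim.SGD(adapted.parameters(), lr=inner_lr)
        for _ in range(inner_steps):
            opt.zero_grad()
            loss = torch.nn.functional.mse_loss(adapted(coords), target)
            loss.backward()
            opt.step()
        with torch.no_grad():
            for p, q in zip(model.parameters(), adapted.parameters()):
                p += meta_lr * (q - p)  # Reptile outer update
    return model
```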

Conclusion

Overall, the TINC approach significantly advances data compression techniques by combining the strengths of INRs with a novel hierarchical structure. This method shows great promise in both enhancing compression fidelity and ensuring flexibility across diverse data types. As research progresses, TINC has the potential to become a staple technology in data-intensive fields.
