
Machine Learning based Post Processing Artifact Reduction in HEVC Intra Coding (1912.13100v1)

Published 30 Dec 2019 in eess.IV

Abstract: Lossy compression techniques produce various artifacts such as blurring, distortion at block boundaries, ringing, and contouring effects, especially at low bit rates. To reduce these compression artifacts, various Convolutional Neural Network (CNN) based post-processing techniques have been explored in recent years. The latest video coding standard, HEVC, adopts two post-processing filtering operations, namely the de-blocking filter (DBF) followed by sample adaptive offset (SAO). These operations consume extra signaling bits and become an overhead on the network. In this paper we propose a new deep learning based algorithm for the SAO filtering operation. We design a variable filter size Sub-layered Deeper CNN (SDCNN) architecture to improve the filtering operation and incorporate large-stride convolutional and deconvolutional layers for further speed-up. We also demonstrate that a deeper architecture can be effectively trained with the features learnt in a shallow network using data augmentation and transfer learning based techniques. Experimental results show that the proposed network outperforms other networks in terms of PSNR and SSIM measurements on widely available benchmark video sequences, and also achieves an average bit rate reduction of 4.1% compared to the HEVC baseline.
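
The abstract only outlines the SDCNN design at a high level. The sketch below is a minimal, hypothetical PyTorch rendering of the ideas it mentions (variable filter sizes, a large-stride convolution for speed-up, and a deconvolution layer to restore resolution), not the authors' actual network; layer counts, channel widths, and kernel sizes are assumptions.

```python
# Hypothetical sketch of an SDCNN-style post-processing network (not the
# authors' exact architecture): variable filter sizes, a large-stride
# convolution for speed-up, and a deconvolution to restore resolution.
import torch
import torch.nn as nn

class SDCNNSketch(nn.Module):
    def __init__(self, channels=64):
        super().__init__()
        # Variable filter sizes in the early layers (assumed 9x9 then 5x5).
        self.features = nn.Sequential(
            nn.Conv2d(1, channels, kernel_size=9, padding=4),
            nn.ReLU(inplace=True),
            nn.Conv2d(channels, channels, kernel_size=5, padding=2),
            nn.ReLU(inplace=True),
        )
        # Large-stride convolution: deeper layers operate on a
        # downsampled feature map, which speeds up inference.
        self.down = nn.Conv2d(channels, channels, kernel_size=3,
                              stride=2, padding=1)
        self.mid = nn.Sequential(
            nn.ReLU(inplace=True),
            nn.Conv2d(channels, channels, kernel_size=3, padding=1),
            nn.ReLU(inplace=True),
        )
        # Deconvolution (transposed convolution) restores full resolution.
        self.up = nn.ConvTranspose2d(channels, channels, kernel_size=4,
                                     stride=2, padding=1)
        self.out = nn.Conv2d(channels, 1, kernel_size=3, padding=1)

    def forward(self, decoded_frame):
        x = self.features(decoded_frame)
        x = self.mid(self.down(x))
        x = self.out(torch.relu(self.up(x)))
        # Residual formulation: predict a correction that is added back
        # to the decoded (artifact-laden) frame.
        return decoded_frame + x

# Usage: enhance one decoded luma frame (batch of 1, single channel).
frame = torch.rand(1, 1, 64, 64)
restored = SDCNNSketch()(frame)
```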
