Adaptative Diffraction Image Registration for 4D-STEM to optimize ACOM Pattern Matching

arXiv:2305.02124
Published May 3, 2023 in cond-mat.mtrl-sci and eess.IV

Abstract

The technique known as 4D-STEM has recently emerged as a powerful tool for the local characterization of crystalline structures in materials, such as cathode materials for Li-ion batteries or perovskite materials for photovoltaics. However, the use of new detectors optimized for electron diffraction patterns and other advanced techniques requires constant adaptation of methodologies to address the challenges associated with crystalline materials. In this study, we present a novel image processing method to improve pattern matching in the determination of crystalline orientations and phases. Our approach uses sub-pixel adaptive image processing to register and reconstruct electron diffraction signals in large 4D-STEM datasets. By using adaptive prominence together with linear filters such as mean and Gaussian blur, we are able to improve the quality of the diffraction pattern registration. The resulting data compression rate of 10^3 is well-suited for the era of big data and provides a significant enhancement in the performance of the entire ACOM data processing method. Our approach is evaluated using dedicated metrics, which demonstrate a marked improvement in phase recognition. Our results demonstrate that this data preparation method not only enhances the quality of the resulting image but also boosts the confidence level in the analysis of the outcomes related to determining crystal orientation and phase. Additionally, it mitigates user bias that can be introduced through parameter manipulation during application of the method.
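The abstract's preprocessing idea (denoise each diffraction pattern with linear filters, then keep only diffraction spots whose prominence over the local background passes an adaptive threshold) can be sketched as follows. This is a hypothetical illustration under our own assumptions, not the authors' implementation: the filter sizes, the wide-blur background estimate, and the `prominence_frac` threshold rule are all choices made here for demonstration.

```python
import numpy as np
from scipy.ndimage import gaussian_filter, uniform_filter, maximum_filter

def preprocess_pattern(pattern, mean_size=3, sigma=1.0, prominence_frac=0.1):
    """Denoise with mean + Gaussian blur, then mask low-prominence peaks.

    A sketch of adaptive-prominence filtering: the prominence of each
    pixel is estimated as its smoothed intensity minus a local background
    (a much wider Gaussian blur), and the threshold adapts to the
    pattern's own maximum prominence.
    """
    # linear filters mentioned in the abstract: mean, then Gaussian blur
    smoothed = uniform_filter(pattern.astype(float), size=mean_size)
    smoothed = gaussian_filter(smoothed, sigma=sigma)

    # local background estimate via a wide blur (assumed heuristic)
    background = gaussian_filter(smoothed, sigma=8 * sigma)
    prominence = smoothed - background

    # adaptive threshold: a fraction of this pattern's peak prominence
    threshold = prominence_frac * prominence.max()

    # keep local maxima (5x5 neighborhoods) with sufficient prominence
    peaks = (smoothed == maximum_filter(smoothed, size=5)) & (prominence > threshold)
    return smoothed, peaks

# toy pattern: a bright central beam plus two Bragg-like spots on Poisson noise
rng = np.random.default_rng(0)
img = rng.poisson(2.0, size=(64, 64)).astype(float)
for (r, c) in [(32, 32), (20, 44), (44, 20)]:
    img[r, c] += 200.0

smoothed, peaks = preprocess_pattern(img)
print(int(peaks.sum()))  # number of retained diffraction spots
```

Because the threshold is derived from each pattern's own prominence statistics rather than a fixed user-chosen cutoff, this kind of scheme illustrates how parameter-dependent user bias can be reduced, in line with the abstract's claim.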
