Non-rigid 3D motion estimation at high temporal resolution from prospectively undersampled k-space data using low-rank MR-MOTUS (2007.00488v1)
Abstract: With the recent introduction of the MR-LINAC, an MR-scanner combined with a radiotherapy LINAC, MR-based motion estimation has become of increasing interest to (retrospectively) characterize tumor and organ-at-risk motion during radiotherapy. To this end, we introduce low-rank MR-MOTUS, a framework to retrospectively reconstruct time-resolved non-rigid 3D+t motion-fields from a single low-resolution reference image and prospectively undersampled k-space data acquired during motion. Low-rank MR-MOTUS exploits spatio-temporal correlations in internal body motion with a low-rank motion model, and inverts a signal model that relates motion-fields directly to a reference image and k-space data. The low-rank model reduces the degrees-of-freedom, memory consumption and reconstruction times by assuming a factorization of space-time motion-fields into spatial and temporal components. Low-rank MR-MOTUS was employed to estimate motion in 2D/3D abdominothoracic scans and 3D head scans. Data were acquired using golden-ratio radial readouts. Reconstructed 2D and 3D respiratory motion-fields were respectively validated against time-resolved and respiratory-resolved image reconstructions, and the head motion against static image reconstructions from fully-sampled data acquired right before and right after the motion. Results show that 2D+t respiratory motion can be estimated retrospectively at 40.8 motion-fields-per-second, 3D+t respiratory motion at 7.6 motion-fields-per-second and 3D+t head-neck motion at 9.3 motion-fields-per-second. The validations show good consistency with image reconstructions. The proposed framework can estimate time-resolved non-rigid 3D motion-fields, which makes it possible to characterize drifts and intra- and inter-cycle patterns in breathing motion during radiotherapy, and could form the basis for real-time MR-guided radiotherapy.
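To illustrate the core idea of the low-rank motion model, the sketch below shows how factorizing space-time motion-fields into spatial and temporal components reduces the number of unknowns and lets a single time point's 3D motion-field be recovered on the fly. This is a minimal illustration only; the grid size, number of time points, rank, and variable names (Phi, Psi) are assumptions, not the authors' implementation.

```python
import numpy as np

# Assumed problem dimensions (illustrative, not from the paper).
n_voxels = 64 * 64 * 32   # voxels in the low-resolution reference image
n_dims = 3                # 3D displacement per voxel
n_time = 500              # number of reconstructed time points
rank = 3                  # assumed low-rank model order

# Fully time-resolved motion-fields would need one 3D displacement
# per voxel per time point if stored explicitly.
full_dof = n_voxels * n_dims * n_time

# Low-rank factorization D ≈ Phi @ Psi.T:
#   Phi : (n_voxels * n_dims, rank)  spatial components
#   Psi : (n_time, rank)             temporal components
Phi = np.random.randn(n_voxels * n_dims, rank)
Psi = np.random.randn(n_time, rank)

lowrank_dof = Phi.size + Psi.size
print(f"full DOF:     {full_dof:,}")
print(f"low-rank DOF: {lowrank_dof:,} (~{full_dof / lowrank_dof:.0f}x fewer unknowns)")

# The motion-field at a single time point t is reconstructed on demand,
# without ever forming the full space-time matrix.
t = 123
d_t = (Phi @ Psi[t]).reshape(n_voxels, n_dims)  # displacement vectors at time t
```

The same factorized form is what makes a 3D+t reconstruction at several motion-fields-per-second tractable: the solver only has to estimate the spatial and temporal factors rather than every voxel's displacement at every time point.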