Overview
Recent rapid advances in medical imaging and automated image analysis will continue, allowing significant gains in our understanding of life and disease processes and in our ability to deliver quality healthcare. A few of the synergistic developments, involving a number of disciplines, are highlighted.
Learning objectives
After reading this chapter you will be able to:
• recognize the limitations of current imaging technology;
• appreciate the trends and ongoing developments in medical imaging.
Trends
“A picture is worth a thousand words.”
The rapid advances of the last two or three decades in medical imaging technology, which have delivered high-resolution, three-dimensional anatomical and physiological images, are continuing apace, enabling ever more powerful advances in diagnosis and intervention. Improved, miniature detectors are pushing spatial resolution below 1 mm, which will require larger computer memories and storage capacities, as well as improved software, to visualize the resulting larger data sets interactively. Advances in post-processing, especially in automated registration, segmentation, classification and rendering, will be required (Van Leemput et al., 1999; Huber and Hebert, 2003; Way et al., 2006). The availability of multimodality imaging, such as combined CT/PET scanners, is increasing, along with the means to share such images around the clinical setting and remotely, fueling improvements in PACS and telemedicine systems (Section 4.3).
The inverse problem
A basic task in most imaging modalities is to reconstruct an image from minimally invasive measurements made by a number of sensors. The inverse problem is to determine the properties of the unknown system from the observed measurements. The goal of the reconstruction can be either structural information, such as the anatomy provided by CT or MRI, or functional information, as in nuclear medicine imaging or electrical impedance tomography (EIT). A key feature of inverse problems is their ill-posedness, i.e. they do not fulfil the classical requirements of existence, uniqueness and stability under data perturbations. The last aspect is especially important since real-world measurements always contain noise; approximation methods for solving inverse problems with minimal sensitivity to noise, so-called regularization methods, are being studied. A simple illustration of regularization is sketched below.
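The following is a minimal sketch, not taken from the chapter, of how regularization stabilizes an ill-posed reconstruction. It assumes a linear forward model b = Ax + noise with an ill-conditioned blurring operator A, and uses Tikhonov (ridge) regularization as one representative regularization method; the names A, b, lam and the problem sizes are illustrative assumptions.

```python
# Illustrative sketch: Tikhonov regularization for a linear inverse problem
# b = A x + noise, where the forward operator A is ill-conditioned.
import numpy as np

rng = np.random.default_rng(0)

# Ill-conditioned forward model: a Gaussian smoothing (blurring) operator.
n = 50
t = np.linspace(0, 1, n)
A = np.exp(-((t[:, None] - t[None, :]) ** 2) / (2 * 0.03 ** 2))
A /= A.sum(axis=1, keepdims=True)

# True object and noisy measurements (noise level is an assumption).
x_true = np.sin(2 * np.pi * t) + (t > 0.5)
b = A @ x_true + 0.01 * rng.standard_normal(n)

# Naive inversion amplifies the measurement noise (ill-posedness).
x_naive = np.linalg.solve(A, b)

# Tikhonov regularization trades data fit against solution size:
#   x_lam = argmin ||A x - b||^2 + lam * ||x||^2
#         = (A^T A + lam I)^{-1} A^T b
lam = 1e-3
x_reg = np.linalg.solve(A.T @ A + lam * np.eye(n), A.T @ b)

print("naive reconstruction error:      ", np.linalg.norm(x_naive - x_true))
print("regularized reconstruction error:", np.linalg.norm(x_reg - x_true))
```

In this sketch the regularization parameter lam controls the trade-off between fitting the noisy data and suppressing noise amplification; choosing it well is itself an active research question.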