3D reconstructions of selected anatomical structures have been created in QuickTime VR (QTVR) format. Because the contours of every anatomical object in the AnatLab knowledge base have been mapped, true-color, highly realistic 3D volumetric renderings of user-specified groupings of structures can be generated in QuickTime VR format, allowing interactive rotation and viewing. A library of QTVR 3D reconstructions used at the UIC Medical School and at the University of Texas at Tyler is located at http://anatomy.nmhmchicago.net.
ImageMagick (http://www.imagemagick.org/) was used to mask out all but the selected anatomical objects from the axial sections. The same tools were used to add model-like effects such as transparency and tint.
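The per-slice masking step can be sketched in pure Python. This is only an illustration of the idea, not the actual ImageMagick pipeline: the labels, colors, and the tiny 3x3 "slice" below are invented for demonstration.

```python
# Sketch of per-slice masking: keep only pixels belonging to selected
# anatomical objects, apply a tint, and make everything else transparent.
# All data here is invented for illustration.

def mask_slice(labels, pixels, keep, tint=(1.0, 0.8, 0.8)):
    """Return an RGBA image keeping only pixels whose label is in `keep`."""
    out = []
    for label_row, pixel_row in zip(labels, pixels):
        row = []
        for label, (r, g, b) in zip(label_row, pixel_row):
            if label in keep:
                # Tint the retained pixel and mark it fully opaque.
                row.append((int(r * tint[0]), int(g * tint[1]),
                            int(b * tint[2]), 255))
            else:
                # Fully transparent: this object is masked out.
                row.append((0, 0, 0, 0))
        out.append(row)
    return out

labels = [["skull", "skull", "brain"],
          ["skull", "brain", "brain"],
          ["skin",  "skin",  "brain"]]
pixels = [[(200, 190, 180)] * 3 for _ in range(3)]

masked = mask_slice(labels, pixels, keep={"skull"})
print(masked[0][0])  # tinted, opaque skull pixel: (200, 152, 144, 255)
print(masked[2][0])  # masked-out skin pixel: (0, 0, 0, 0)
```

In the real workflow the equivalent operation runs over full-resolution axial sections, with one mask per anatomical object derived from the mapped contours.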
OsiriX is an image-processing application dedicated to DICOM images, specifically designed for the navigation and visualization of multimodality and multidimensional images. OsiriX was used to create the 3D volumetric renderings in QuickTime VR format.
The jQuery Reel plugin (http://jquery.vostrel.cz/reel) was used to create browser-based interactive animations from the QTVR 3D movie frames. Reel 1.2.1 is an established jQuery plugin that takes an ordinary image tag and transforms it into an interactive 360° object movie, panorama, or stop-motion animation. Leader lines identifying anatomical objects were added to the QTVR frames.
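The core of any Reel-style 360° object movie is mapping the current rotation angle to one of N prerendered frames. A minimal sketch of that lookup, with an illustrative frame count (not a value from the actual exhibit):

```python
# Sketch of the frame lookup behind a 360-degree object movie: N prerendered
# QTVR frames cover a full rotation, and the viewer shows the frame nearest
# to the current rotation angle. The 36-frame count is illustrative.

def frame_for_angle(angle_deg, n_frames=36):
    """Map a rotation angle (degrees) to a 0-based frame index."""
    step = 360.0 / n_frames          # degrees of rotation per frame
    return int(round(angle_deg / step)) % n_frames

print(frame_for_angle(0))     # 0
print(frame_for_angle(90))    # 9 (10 degrees per frame with 36 frames)
print(frame_for_angle(-10))   # 35 -- negative drag wraps around
```

A drag handler only needs to convert pointer displacement into an angle and swap in the corresponding frame image, which is essentially what Reel does with its filmstrip footage.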
The Volume Renderer allows interactive, three-dimensional examination of individual anatomical features in a web browser window. The bounding coordinates of thousands of anatomical features have been extracted from the planar images of the National Library of Medicine's Visible Human Male dataset and stored in a high-performance database architecture. The Volume Renderer web page lets a visitor pick from a menu any single feature, or any combination of features, to view in isolation. The server code retrieves the coordinates from the database and constructs in real time a high-resolution 3D rendering of the chosen features, which can be rotated and viewed from multiple angles in the browser; the renderings can also be magnified and sliced. This project was developed by Steve Huntley through the University of Illinois Chicago Medical School under a grant from the Buonacorsi Foundation.
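The core reconstruction step can be sketched as turning a feature's per-slice contour coordinates into a voxel volume. This is a simplified stand-in for the Renderer's server code, with invented contour data and grid size; the real system reads its coordinates from the database described above.

```python
# Hypothetical sketch: rasterize a feature's per-slice contours into a
# binary voxel volume that a renderer could then shade. Contours and the
# 8x8 grid are invented for illustration.

def point_in_polygon(x, y, poly):
    """Standard ray-casting test: is (x, y) inside the closed polygon?"""
    inside = False
    j = len(poly) - 1
    for i in range(len(poly)):
        (xi, yi), (xj, yj) = poly[i], poly[j]
        if (yi > y) != (yj > y) and x < (xj - xi) * (y - yi) / (yj - yi) + xi:
            inside = not inside
        j = i
    return inside

def rasterize_volume(contours_by_slice, width, height):
    """Stack per-slice contours into a 3D list of 0/1 voxels."""
    volume = []
    for poly in contours_by_slice:
        # Sample each pixel at its center (x + 0.5, y + 0.5).
        slice_ = [[1 if point_in_polygon(x + 0.5, y + 0.5, poly) else 0
                   for x in range(width)] for y in range(height)]
        volume.append(slice_)
    return volume

# Two slices of a made-up feature: a square contour shrinking between slices.
contours = [[(1, 1), (6, 1), (6, 6), (1, 6)],
            [(2, 2), (5, 2), (5, 5), (2, 5)]]
vol = rasterize_volume(contours, 8, 8)
print(sum(map(sum, vol[0])), "voxels in slice 0")  # 25
print(sum(map(sum, vol[1])), "voxels in slice 1")  # 9
```

Stacking every axial slice of the Visible Human Male this way yields the volumetric data that rotation, magnification, and slicing operate on.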
Our biomedical visualization interns worked with the AnatLab Visible Human Male (VHM) skull data in Mimics. Using retopologizing techniques in both Autodesk 3ds Max and Mudbox, they created a 3D digital reconstruction of the skull identical in appearance to the VHM data. This work was subsequently used in the jibber exhibit by Steve Landers for NMHM Chicago. (research of AZimmerman and JRogers)
Future work includes 3D surface rendering with true anatomical colors, using texture-unwrapping techniques, 3D scanning resources, and point-cloud software such as MeshLab.
Anthony Viola of Smith + Gill Architecture used the surface model of the VHM skull to create a 3D augmented reality presentation for the TTI Vanguard Next conference in December 2012 (http://www.ttivanguard.com/conference/2012/next12.html), accompanying a talk by Leslie Ventsch, Director of Design at Smith + Gill Architecture, and Mike Doyle, founder of NMHM Chicago. Anthony also demonstrated the use of augmented reality in the firm's architectural work to the museum's biomedical visualization (BVIS) student researchers.
The museum's biomedical visualization interns used the AR-media plugin to create an AR 3D model of Einstein's brain based on a 3D model digitally sculpted from pictures of Einstein's brain supplied by the National Museum of Health and Medicine. (research of AZimmerman and JRogers)