
Study On Key Technologies Of Handheld Augmented Reality Neuronavigation And The Establishment Of Its System

Posted on: 2015-12-30
Degree: Doctor
Type: Dissertation
Country: China
Candidate: W W Deng
Full Text: PDF
GTID: 1224330464955353
Subject: Biomedical engineering

Abstract/Summary:
Neurosurgical operation is the most direct and effective way to treat neurosurgical diseases, and its outcome largely depends on the accuracy of the preoperative surgical approach and the precision of intraoperative localization of the focus. The neurosurgeon needs to locate the target precisely, remove the focus completely, and protect the nearby normal structures and tissues from unintended damage; this goal is difficult to achieve with the traditional approach to neurosurgery. The Image Guided Neurosurgery System (IGNS) is a medical auxiliary positioning device that facilitates neurosurgical operations. The concept of IGNS is to guide an operation using the patient's preoperative images, primarily by tracking the surgical tools and displaying their position relative to the patient as virtual tools rendered on the images. IGNS helps neurosurgeons make surgical plans, locate instruments and reduce damage to normal tissue, and it has therefore become routine equipment in minimally invasive neurosurgery.

IGNS makes up for the deficiencies of traditional neurosurgery. However, most existing IGNS are based on virtual reality (VR) technology, which displays computer-generated virtual tools on the patient's medical images. Because the relative position between the surgical tools and the images is shown only on the navigation screen, the surgeon has to switch his or her view back and forth between the navigation screen and the surgical field to obtain the navigation information, which disrupts concentration on the surgical field. Meanwhile, the separation of the navigation information from the surgical field makes that information harder to interpret in the context of the actual surgical field, especially after the patient's head has been draped. These problems limit the role of IGNS in neurosurgery.

In this thesis, we present an augmented reality (AR) neuronavigation system built around a tablet PC and a wireless router that avoids the above problems of a traditional IGNS; we use the term AR-IGNS to describe this mobile, wireless form of IGNS. By directly combining computer-generated virtual information, such as the segmented tumor, with the actual surgical scene, the new system displays the navigation information on the tablet screen to guide the surgery. Since the tablet is placed beside the surgeon and the surgical field, the surgeon can see both the navigation information on the tablet screen and the actual surgical field without large view switching. The AR-IGNS provides three navigation modes, each presenting one type of navigation information. This thesis first describes the three navigation modes and then the key technologies of the new system. The thesis is organized into the following four parts.

Part 1 studies the overlaid image navigation mode based on the combination of virtual and actual information. This is the most important mode of the AR-IGNS. In this mode, the AR-IGNS superimposes the preoperatively segmented brain tumor onto the actual surgical scene and displays the overlaid image on the tablet screen to aid the surgery. Through spatial registration, dynamic tracking and camera calibration, the system transforms image information from the virtual space to the tablet screen space, as sketched below.
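The overlaid image mode chains three coordinate transformations: virtual (image) space to actual (patient) space from marker registration, actual space to tablet space from tracking, and tablet space to tablet screen from camera calibration. The following is a minimal sketch of such a chain, assuming 4x4 homogeneous matrices for the first two steps and a 3x3 pinhole intrinsic matrix for the last, with the tablet space treated as the camera frame; the function names and toy values are illustrative only, not the thesis implementation.

```python
import numpy as np

def compose_virtual_to_screen(T_virtual_to_actual, T_actual_to_tablet, K_camera):
    """Map a point from virtual (image) space onto the tablet screen.

    T_virtual_to_actual : 4x4 result of marker-based registration
    T_actual_to_tablet  : 4x4 pose from the tracking system
    K_camera            : 3x3 intrinsic matrix from camera calibration
    """
    def to_screen(p_virtual):
        p = np.append(np.asarray(p_virtual, dtype=float), 1.0)    # homogeneous point
        p_tablet = T_actual_to_tablet @ (T_virtual_to_actual @ p)  # virtual -> actual -> tablet
        uvw = K_camera @ p_tablet[:3]                              # pinhole projection
        return uvw[:2] / uvw[2]                                    # pixel coordinates (u, v)
    return to_screen

# Toy example: identity registration/tracking and assumed intrinsics.
K = np.array([[800.0,   0.0, 320.0],
              [  0.0, 800.0, 240.0],
              [  0.0,   0.0,   1.0]])
project = compose_virtual_to_screen(np.eye(4), np.eye(4), K)
print(project([0.01, 0.02, 0.5]))   # screen position of a virtual point
```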
In this mode, the system first uses several artificial markers to compute the spatial relationship between the virtual and actual spaces by singular value decomposition; it then uses the tracking system to track the patient, the tablet and the tools simultaneously and obtains the transformation from the actual space to the tablet space; finally, it obtains the transformation from the tablet space to the tablet screen by camera calibration. Combining these three transformations, the system superimposes the images in virtual space onto the tablet screen space and displays the overlaid images on the tablet screen. This navigation mode solves the separation of navigation information and surgical field found in a traditional IGNS by superimposing the navigation information directly onto the actual surgical field.

Part 2 studies the sectional image navigation mode based on the combination of virtual and actual information. The sectional image is generated according to the relative position between the tablet and the actual patient and is displayed on the tablet screen. In this mode, the AR-IGNS first fuses the segmented tumor information with the patient's original imaging data; it then transforms the tablet's position in the actual space into the virtual space using the tracking system and generates the corresponding sectional image, with fused tumor information, by resampling; finally, it transmits the fused sectional image to the tablet over a wireless local area network and displays it on the tablet screen. When the tablet moves, the fused sectional image updates accordingly. The fused sectional image navigation mode combines the sectional image with the actual surgical field, avoiding large view switching between them. Furthermore, this mode provides much richer navigation information and makes it easier to understand.

Part 3 studies the projection image navigation mode based on the combination of virtual and actual information. In this mode, the tablet screen is treated as the visualization plane, and the projection image is generated by projecting the patient's imaging data in parallel onto the tablet screen according to the relative position between the tablet and the actual patient's head. For angiographic imaging, the Maximum Intensity Projection (MIP) image clearly shows the intracranial vascular information. The AR-IGNS then fuses the MIP image with the segmented tumor to show the relative position between the vessels and the region of interest. When the tablet moves, the fused image updates accordingly. A traditional IGNS only shows MIP images on the navigation screen, whereas the AR-IGNS displays the fusion of the MIP image and the segmented tumor on the tablet screen, enriching the navigation information.

Part 4 studies image segmentation in the virtual space. Because of the particularity and complexity of medical images, this thesis applies the random walk algorithm to brain tumor segmentation. As an interactive, graph-theory-based segmentation method, the random walk algorithm achieves fast and accurate brain tumor segmentation by simply defining two sets of seed points: one set inside the tumor region and the other in the background region. The segmented tumor can then be visualized quickly. Illustrative code sketches of the registration, slice resampling, projection and segmentation steps described in Parts 1-4 follow below.
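The marker-based registration of Part 1 solves for the rigid transform between corresponding marker positions in virtual and actual space by singular value decomposition. The thesis does not list its code; below is a sketch of the standard least-squares formulation (Arun/Kabsch style) that such an SVD registration commonly follows, not the thesis's own implementation.

```python
import numpy as np

def register_rigid_svd(src, dst):
    """Least-squares rigid transform (R, t) mapping src points onto dst points.

    src, dst : (N, 3) arrays of corresponding marker positions,
               e.g. in virtual and actual space.
    Returns a 4x4 homogeneous transform.
    """
    src, dst = np.asarray(src, float), np.asarray(dst, float)
    c_src, c_dst = src.mean(axis=0), dst.mean(axis=0)
    H = (src - c_src).T @ (dst - c_dst)                          # cross-covariance
    U, _, Vt = np.linalg.svd(H)
    D = np.diag([1.0, 1.0, np.sign(np.linalg.det(Vt.T @ U.T))])  # guard against reflection
    R = Vt.T @ D @ U.T
    t = c_dst - R @ c_src
    T = np.eye(4)
    T[:3, :3], T[:3, 3] = R, t
    return T
```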
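The sectional image of Part 2 is obtained by resampling the fused volume on the plane defined by the tracked tablet pose. A minimal sketch of one way to do this oblique-plane resampling with trilinear interpolation follows; the pose-to-axes conversion, voxel spacing and slice size here are assumptions for illustration, not the thesis implementation.

```python
import numpy as np
from scipy.ndimage import map_coordinates

def oblique_slice(volume, center, u_axis, v_axis, size=256, spacing=1.0):
    """Resample a size x size oblique slice from a 3D volume.

    center         : slice centre in voxel coordinates (z, y, x)
    u_axis, v_axis : orthonormal in-plane directions in voxel space,
                     e.g. derived from the tracked tablet pose
    """
    u = np.asarray(u_axis, float); u /= np.linalg.norm(u)
    v = np.asarray(v_axis, float); v /= np.linalg.norm(v)
    r = (np.arange(size) - size / 2.0) * spacing
    uu, vv = np.meshgrid(r, r, indexing="ij")
    pts = (np.asarray(center, float)[:, None, None]
           + u[:, None, None] * uu + v[:, None, None] * vv)      # (3, size, size) sample grid
    return map_coordinates(volume, pts, order=1, mode="nearest")  # trilinear interpolation
```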
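The projection mode of Part 3 rests on maximum intensity projection of the angiographic volume fused with the tumor segmentation. In the simplest, axis-aligned case (the thesis instead projects along the direction given by the tablet pose), the fusion reduces to two array reductions; this is an illustrative sketch only.

```python
import numpy as np

def mip_with_tumor_overlay(angio_volume, tumor_mask, axis=0):
    """Fuse a maximum intensity projection with a projected tumor mask.

    angio_volume : 3D angiographic intensity volume
    tumor_mask   : 3D boolean mask of the segmented tumor
    Returns (mip_image, tumor_silhouette) projected along `axis`.
    """
    mip = angio_volume.max(axis=axis)          # brightest voxel along each ray
    silhouette = tumor_mask.any(axis=axis)     # tumor footprint in the same view
    return mip, silhouette
```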
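Part 4's segmentation uses the random walk algorithm with two seed sets. scikit-image ships an implementation of this algorithm; the usage sketch below relies on it rather than the thesis's own code, and the seed coordinates and beta value are illustrative.

```python
import numpy as np
from skimage.segmentation import random_walker

def segment_tumor(volume, tumor_seeds, background_seeds, beta=130):
    """Random-walk segmentation from two user-defined seed sets.

    volume           : 3D image (e.g. a contrast-enhanced MR volume)
    tumor_seeds      : list of (z, y, x) voxels placed inside the tumor
    background_seeds : list of (z, y, x) voxels placed in normal tissue
    Returns a boolean tumor mask.
    """
    labels = np.zeros(volume.shape, dtype=np.uint8)
    for z, y, x in tumor_seeds:
        labels[z, y, x] = 1                    # label 1: tumor seeds
    for z, y, x in background_seeds:
        labels[z, y, x] = 2                    # label 2: background seeds
    result = random_walker(volume, labels, beta=beta)
    return result == 1
```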
By localizing the position and boundary of the tumor in the three navigation modes described above, the AR-IGNS helps the surgeon remove the tumor more accurately.

In conclusion, in view of the problems of a traditional IGNS, this thesis proposes three new navigation modes based on overlaid images, sectional images and MIP images, respectively. All three modes combine virtual navigation information with the actual surgical field through AR technology and display the navigation information on the tablet screen. Meanwhile, they use the random walk algorithm to interactively segment the brain tumor and fuse it with the virtual navigation information. By avoiding large view switching during surgery and making the navigation information easier to understand, the AR-IGNS lets the surgeon concentrate on the surgical field and improves the precision of the surgery. The feasibility and clinical applicability of the three navigation modes were verified in both skull specimen and clinical experiments.

Based on the research in this thesis, one paper was published in an SCI-indexed journal, one paper was under revision at another SCI-indexed journal, one paper was published in a national core journal, and two invention patents were applied for. In addition, the AR-IGNS was exhibited at the 15th China International Industry Fair and was awarded an Outstanding Exhibit prize in the university exhibition area.
Keywords/Search Tags:Image Guided Neurosurgery, Augmented reality, Tablet, Image segmentation, Maximum intensity projection