 This application claims the benefit of U.S. Provisional Patent Application Serial No. 60/331,712, entitled “New Features For Virtual Colonoscopy” and filed Nov. 21, 2001, which is incorporated herein by reference in its entirety.
 The present disclosure relates to a system and method for performing a volume based three-dimensional virtual examination. More particularly, the disclosure relates to a virtual examination system and method providing enhanced visualization and navigational properties.
 Two-dimensional (“2D”) visualization of human organs using medical imaging devices has been widely used for patient diagnosis. Currently available medical imaging devices include computed tomography (“CT”) and magnetic resonance imaging (“MRI”), for example. Three-dimensional (“3D”) images can be formed by stacking and interpolating between two-dimensional pictures produced from the scanning machines. Imaging an organ and visualizing its volume in three-dimensional space would be beneficial due to the lack of physical intrusion and the ease of data manipulation. However, the exploration of the three-dimensional volume image must be properly performed in order to fully exploit the advantages of virtually viewing an organ from the inside.
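 By way of illustration only, the stacking-and-interpolation step described above may be sketched as follows. This is a minimal sketch under stated assumptions: the slice layout (lists of rows of scalar intensities) and the linear interpolation scheme are illustrative choices, not the specific reconstruction method of any scanning device.

```python
# Illustrative sketch: build a 3D volume by stacking 2D axial slices and
# linearly interpolating intermediate slices between consecutive pairs.

def interpolate_slices(slice_a, slice_b, t):
    """Linearly blend two 2D slices (lists of rows), with 0 <= t <= 1."""
    return [[(1 - t) * a + t * b for a, b in zip(row_a, row_b)]
            for row_a, row_b in zip(slice_a, slice_b)]

def stack_volume(slices, inserts_per_gap=1):
    """Stack axial slices into a volume, inserting interpolated slices in each gap."""
    volume = []
    for a, b in zip(slices, slices[1:]):
        volume.append(a)
        for k in range(1, inserts_per_gap + 1):
            volume.append(interpolate_slices(a, b, k / (inserts_per_gap + 1)))
    volume.append(slices[-1])
    return volume

# Two 2x2 slices with one interpolated slice inserted between them.
vol = stack_volume([[[0, 0], [0, 0]], [[2, 2], [2, 2]]], inserts_per_gap=1)
print(len(vol))      # 3 slices in the resulting volume
print(vol[1][0][0])  # interpolated middle slice value: 1.0
```

In practice a reconstruction would operate on full-resolution scan data and may use higher-order interpolation; the linear blend above is the simplest instance of the stacking idea.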
 When viewing the 3D volume virtual image of an environment, a functional model must be used to explore the virtual space. One possible model is a virtual “camera” that serves as a point of reference from which the viewer explores the virtual space. Camera control for navigation within a general 3D virtual environment has been previously studied. There are two conventional types of camera control offered for navigation of virtual space. The first gives the operator full control of the camera, allowing the operator to manipulate the camera into different positions and orientations to achieve the desired view. The operator will in effect pilot the camera. This allows the operator to explore a particular section of interest while ignoring other sections. However, complete control of a camera in a large domain would be tedious and tiring, and an operator might not view all the important features between the starting and finishing points of the exploration.
 The second technique of camera control is a planned navigational method, which assigns the camera a predetermined path to take and which cannot be accidentally changed by the operator. This is akin to having an engaged “autopilot”. This allows the operator to concentrate on the virtual space being viewed, and not have to worry about steering into walls of the environment being examined. However, this second technique does not give the viewer the flexibility to alter the course or investigate an interesting area viewed along the flight path.
 It would be desirable to use a combination of the two navigation techniques described above to realize the advantages of both techniques while minimizing their respective drawbacks. It would be desirable to apply a flexible navigation technique to the examination of human or animal organs that are represented in virtual 3D space in order to perform a non-intrusive, painless, and thorough examination. The desired navigational technique would further allow for a complete examination of a virtual organ in 3D space by an operator, allowing flexibility while ensuring a smooth path and complete examination through and around the organ. It would additionally be desirable to display the exploration of the organ in a real-time setting by using a technique that minimizes the computations necessary for viewing the organ. The desired technique should also be equally applicable to exploring any virtual object.
 Radiologists and other specialists have historically been trained to analyze scan data consisting of two-dimensional slices. However, while stacks of such slices may be useful for analysis, they do not provide an efficient or intuitive means to navigate through a virtual organ, especially one as tortuous and complex as the colon. There remains a need for a virtual examination system providing data in a conventional format for analysis while, in addition, allowing an operator to easily navigate a virtual organ.
 A preferred embodiment of the present disclosure generates a three-dimensional visualization image of an object such as a human organ using volume visualization techniques and explores the virtual image using a guided navigation system, which allows the operator to travel along a predefined flight path and to adjust both the position and viewing angle to a particular portion of interest in the image away from the predefined path in order to identify polyps, cysts or other abnormal features in the organ.
 An aspect of the present disclosure relates to a method for performing a three-dimensional internal virtual examination of at least one organ. According to this aspect, the organ is scanned with a radiological scanning device to produce scan data representative of the organ, which is then used to create a three-dimensional volume representation of the organ that includes volume elements. The scan data includes a sequence of axial images. Using the three-dimensional representation, a defined flight path is generated and guided navigation through the three-dimensional representation is performed. Simultaneously with the display of the guided navigation, one of the sequence of axial images is displayed, wherein the displayed image corresponds to the current location along the defined path.
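 The guided-navigation idea above, in which the camera follows a predefined flight path while still permitting operator adjustment away from that path, may be sketched as follows. The decay-based blending of the operator's offset back toward the planned path is an illustrative assumption for the sketch, not the specific algorithm of the disclosure.

```python
# Illustrative sketch of guided navigation: the camera follows a predefined
# flight path, but an operator-supplied offset ("nudge") lets it deviate
# toward a region of interest; the offset decays each step so the camera
# drifts back toward the planned path.

def guided_positions(path, offsets, decay=0.5):
    """Yield camera positions: each path point plus a decaying operator offset."""
    carry = (0.0, 0.0, 0.0)
    positions = []
    for point, nudge in zip(path, offsets):
        # Decay the accumulated deviation, then add the operator's new nudge.
        carry = tuple(decay * c + n for c, n in zip(carry, nudge))
        positions.append(tuple(p + c for p, c in zip(point, carry)))
    return positions

path = [(0, 0, z) for z in range(4)]                     # straight planned path
offsets = [(0, 0, 0), (1, 0, 0), (0, 0, 0), (0, 0, 0)]   # one sideways nudge
cams = guided_positions(path, offsets)
print(cams[1][0])  # deviates to 1.0 at the nudged step
print(cams[3][0])  # decays back toward the path: 0.25
```

The point of the sketch is the combination the Background calls for: planned navigation supplies the baseline trajectory, while the operator retains local control that cannot permanently steer the camera off course.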
 Another aspect of the present disclosure relates to an operator interface for a three-dimensional virtual examination system of an object, wherein the virtual examination includes a guided navigation along a defined path within a three-dimensional volume representation of the object created from scan data comprising a sequence of two-dimensional axial images, with the volume elements of the representation generated from these axial images. According to this aspect, the operator interface includes a display screen having a plurality of simultaneously visible sub-windows. Within a first of these sub-windows, volume elements responsive to the defined path and to an operator's input during the guided navigation are displayed in real-time. In a second of these sub-windows, one of the two-dimensional images corresponding to a current location along the defined path is displayed. The operator interface can also be stored as instructions on a computer-readable medium that, upon execution, cause a processor to provide the operator interface.
 Accordingly, system and method embodiments are provided for generating a three-dimensional visualization image of an object such as an organ using volume visualization techniques and exploring the image using a guided navigation system, which allows the operator to travel along a flight path and to adjust the view to a particular portion of the image of interest in order, for example, to identify polyps, cysts or other abnormal features in the visualized organ. One or more series of two-dimensional renditions of the organ, correlated to the flight path location, can be provided to an operator to assist in analyzing the organ. The three-dimensional representation, a display of the flight path, and the two-dimensional slices are all simultaneously displayed to the operator.
 Further objects, features and advantages of the present disclosure will become apparent from the following detailed description taken in conjunction with the accompanying figures showing preferred embodiments of the disclosure, in which:
 While the methods and systems described herein may be applied to any object to be examined, the preferred embodiment to be described is the examination of an organ in the human body, specifically the colon. The colon is long and twisted, which makes it especially suited for a virtual examination, sparing the patient the monetary expense as well as the discomfort and increased hazard of a physical probe. Other examples of organs that can be examined include the lungs, stomach and portions of the gastrointestinal system, the heart and blood vessels.
 As shown in
 An exemplary single CT scan would use an X-ray beam of 5 mm width, a pitch of 1:1 to 2:1, and a 40 cm field of view, with the scan performed from the top of the splenic flexure of the colon to the rectum.
 Discrete data representations of the object can be produced by other methods besides scanning. Voxel data representing an object can be derived from a geometric model by techniques described in U.S. Pat. No. 5,038,302 entitled “Method of Converting Continuous Three-Dimensional Geometrical Representations into Discrete Three-Dimensional Voxel-Based Representations Within a Three-Dimensional Voxel-Based System” by Kaufman, issued Aug. 8, 1991, filed Jul. 26, 1988, which is hereby incorporated by reference in its entirety. Additionally, data can be produced by a computer model of an image, which can be converted to three-dimensional voxels and explored in accordance with this disclosure.
 The method described in
 The steps described in conjunction with
 Turning to
 Methods for computing a centerline inside the area of interest are well known in the art (see, e.g., U.S. Pat. No. 5,971,767 entitled “SYSTEM AND METHOD FOR PERFORMING A THREE-DIMENSIONAL VIRTUAL EXAMINATION” by Kaufman et al.; issued Oct. 26, 1999 and incorporated by reference herein in its entirety).
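 One well-known family of centerline techniques penalizes a shortest path by its distance from the organ wall, so that the cheapest path between two endpoints hugs the center of the lumen. The sketch below illustrates that idea on a small 2D grid; the grid, endpoints, and cost function are illustrative assumptions, not the specific method of the incorporated patent.

```python
# Illustrative sketch: distance-field-penalized shortest path as a centerline.
# Cells far from the wall are cheap to traverse, so the optimal path runs
# along the middle of the open region.
import heapq
from collections import deque

def wall_distance(grid):
    """Multi-source BFS distance from each open cell (1) to the nearest wall (0)."""
    h, w = len(grid), len(grid[0])
    dist = [[0 if grid[y][x] == 0 else None for x in range(w)] for y in range(h)]
    q = deque((y, x) for y in range(h) for x in range(w) if grid[y][x] == 0)
    while q:
        y, x = q.popleft()
        for dy, dx in ((1, 0), (-1, 0), (0, 1), (0, -1)):
            ny, nx = y + dy, x + dx
            if 0 <= ny < h and 0 <= nx < w and dist[ny][nx] is None:
                dist[ny][nx] = dist[y][x] + 1
                q.append((ny, nx))
    return dist

def centerline(grid, start, end):
    """Dijkstra with step cost 1/wall_distance: prefers maximally clear cells."""
    dist = wall_distance(grid)
    h, w = len(grid), len(grid[0])
    best = {start: 0.0}
    pq = [(0.0, start, [start])]
    while pq:
        cost, (y, x), path = heapq.heappop(pq)
        if (y, x) == end:
            return path
        for dy, dx in ((1, 0), (-1, 0), (0, 1), (0, -1)):
            ny, nx = y + dy, x + dx
            if 0 <= ny < h and 0 <= nx < w and grid[ny][nx] == 1:
                c = cost + 1.0 / dist[ny][nx]
                if c < best.get((ny, nx), float("inf")):
                    best[(ny, nx)] = c
                    heapq.heappush(pq, (c, (ny, nx), path + [(ny, nx)]))
    return None

# A corridor three cells tall: the centerline should run along the middle row.
grid = [[0] * 7] + [[0] + [1] * 5 + [0] for _ in range(3)] + [[0] * 7]
line = centerline(grid, (2, 1), (2, 5))
print(line)  # stays on row 2, the center of the corridor
```

Production centerline extraction for a colon operates in 3D and typically combines such distance fields with topological thinning; the 2D version above shows only the core cost idea.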
 Referring to
 Turning now to
 In the system
 The scanned data need not be converted to its 3D representation until the visualization-rendering engine requires it to be in 3D form. This saves computational steps and memory storage space.
 The virtual navigation terminal
 The scanning device
 An important feature in system
 Some of the applicable techniques may be further enhanced in virtual colonoscopy applications through the use of a number of additional techniques that are described in U.S. Pat. No. 6,343,936 entitled “SYSTEM AND METHOD FOR PERFORMING A THREE-DIMENSIONAL VIRTUAL EXAMINATION, NAVIGATION AND VISUALIZATION” by Kaufman et al.; issued Feb. 7, 2002, which is incorporated herein by reference in its entirety. These improvements, described briefly below, include improved colon cleansing, volume rendering, additional fly-path determination techniques, and alternative hardware embodiments.
 An improved electronic colon cleansing technique employs modified bowel preparation operations followed by image segmentation operations, such that fluid and stool remaining in the colon during a computed tomographic (“CT”) or magnetic resonance imaging (“MRI”) scan can be detected and removed from the virtual colonoscopy images. Through the use of such techniques, conventional physical washing of the colon, and its associated inconvenience and discomfort, is minimized or completely avoided.
 In addition to image segmentation and texture mapping, volume-rendering techniques may be used in connection with virtual colonoscopy procedures to further enhance the fidelity of the resulting image. Methods for volume rendering are well known to those of ordinary skill in the pertinent art.
 Referring to
 As shown in
 Turning to
 Referring to
 Furthermore, each window
 The screen
 Turning to
 Embodiments of the present disclosure provide a user interface displaying both two-dimensional and three-dimensional data. Organs within the body are, by nature, three-dimensional. Conventional medical imaging devices, however, as explained herein, create stacks of two-dimensional images when acquiring scan data. Radiologists and other specialists, therefore, have historically been trained to review and analyze these two-dimensional images. As a result, most doctors are comfortable viewing two-dimensional images even if three-dimensional reconstructions or virtualizations are available.
 However, many organs are not simple convex objects but, instead, can be tortuous or have many branches. While a doctor may be comfortable analyzing two-dimensional images, navigating through complex organs is very difficult using two-dimensional images alone. Navigating with the two-dimensional images would involve manually scrolling through the images to move in the “z” direction (along the major axis of the body) and panning the images to move in the “x” and “y” directions. In this manner an operator can traverse the organ looking for areas of interest.
 On the other hand, three-dimensional flight paths, as described herein, are intuitive, efficient tools to virtually travel through volumetric renderings of human organs either automatically or manually. During a flight path tour, each point along the flight path is represented by a coordinate (x, y, z). According to embodiments of the present disclosure, these coordinates are used to automatically scroll and pan the series of two-dimensional images that doctors are used to analyzing. Thus, the operator does not have to manually navigate through an organ in two dimensions but, instead, can let the present virtualization system advance along the organ while the operator concentrates on analyzing each two-dimensional image.
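 The coordinate-driven scroll-and-pan described above may be sketched as follows. The slice spacing, viewport size, and coordinate conventions here are illustrative assumptions; an actual system would derive them from the scan geometry.

```python
# Illustrative sketch of automatic scroll-and-pan: each flight-path point
# (x, y, z) selects the axial slice nearest to z (scroll) and centers the
# 2D view on (x, y) (pan), so the 2D window tracks the 3D navigation.

def slice_and_pan(point, slice_spacing=5.0, viewport=(512, 512)):
    """Map a 3D flight-path point to (axial slice index, pan offset)."""
    x, y, z = point
    index = round(z / slice_spacing)                   # scroll: nearest axial slice
    pan = (x - viewport[0] / 2, y - viewport[1] / 2)   # pan: center view on (x, y)
    return index, pan

idx, pan = slice_and_pan((300.0, 260.0, 24.0))
print(idx)  # 24.0 / 5.0 rounds to slice 5
print(pan)  # (44.0, 4.0): shift needed to center the path point
```

Because this mapping is evaluated at every step of the flight path, the operator sees the familiar two-dimensional image automatically advance and recenter as the virtual camera moves, without any manual scrolling.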
 The foregoing merely illustrates the principles of the disclosure. It will thus be appreciated that those skilled in the art will be able to devise numerous systems, apparatus and methods which, although not explicitly shown or described herein, embody the principles of the disclosure and are thus within the spirit and scope of the disclosure as defined by its claims.
 For example, the methods and systems described herein could be applied to virtually examine an animal, fish or inanimate object. Besides the stated uses in the medical field, applications of the technique could be used to detect the contents of sealed objects that cannot be opened. The technique could also be used inside an architectural structure such as a building or cavern and enable the operator to navigate through the structure.
 These and other features and advantages of the present disclosure may be readily ascertained by one of ordinary skill in the pertinent art based on the teachings herein. It is to be understood that the teachings of the present disclosure may be implemented in various forms of hardware, software, firmware, special purpose processors, or combinations thereof.
 Most preferably, the teachings of the present disclosure are implemented as a combination of hardware and software. Moreover, the software is preferably implemented as an application program tangibly embodied on a program storage unit. The application program may be uploaded to, and executed by, a machine comprising any suitable architecture. Preferably, the machine is implemented on a computer platform having hardware such as one or more central processing units (“CPU”), a random access memory (“RAM”), and input/output (“I/O”) interfaces. The computer platform may also include an operating system and microinstruction code. The various processes and functions described herein may be either part of the microinstruction code or part of the application program, or any combination thereof, which may be executed by a CPU. In addition, various other peripheral units may be connected to the computer platform such as an additional data storage unit and a printing unit.
 It is to be further understood that, because some of the constituent system components and methods depicted in the accompanying drawings are preferably implemented in software, the actual connections between the system components or the process function blocks may differ depending upon the manner in which embodiments of the present disclosure are programmed. Given the teachings herein, one of ordinary skill in the pertinent art will be able to contemplate these and similar implementations or configurations of the present invention.
 Although the illustrative embodiments have been described herein with reference to the accompanying drawings, it is to be understood that the present invention is not limited to those precise embodiments, and that various changes and modifications may be effected therein by one of ordinary skill in the pertinent art without departing from the scope or spirit of the present disclosure. All such changes and modifications are intended to be included within the scope of the present invention as set forth in the appended claims.