Title:
Display of two-dimensional and three-dimensional views during virtual examination
Kind Code:
A1
Abstract:
A system and method are provided for generating a three-dimensional visualization image of an object such as an organ using volume visualization techniques and exploring the image using guided navigation that allows the operator to travel along a flight path and to adjust the view to a particular portion of the image of interest in order, for example, to identify polyps, cysts or other abnormal features in the visualized organ, wherein one or more series of two-dimensional renditions of the organ, correlated to the flight path location, are provided to an operator to assist in analyzing the organ, and the three-dimensional representation, a display of the flight path, and two-dimensional slices are simultaneously displayed to the operator.


Inventors:
Kreeger, Kevin (East Setauket, NY, US)
Li, Bin (Centereach, NY, US)
Dachille, Frank C. (Amityville, NY, US)
Meade, Jeff (Bay Shore, NY, US)
Application Number:
10/301,034
Publication Date:
07/17/2003
Filing Date:
11/21/2002
Assignee:
KREEGER KEVIN
LI BIN
DACHILLE FRANK C.
MEADE JEFF
Primary Class:
International Classes:
G06K9/00; G06T7/00; G06T15/08; G06T17/00; (IPC1-7): G06T17/00
Attorney, Agent or Firm:
F. CHAU & ASSOCIATES, LLP (Suite 501, East Meadow, NY, 11554, US)
Claims:

What is claimed is:



1. A method for performing a three-dimensional virtual examination of at least one object, the method comprising: scanning said object with a scanning device and producing scan data representative of said object, said scan data comprising a sequence of two-dimensional images of said object; creating a three-dimensional volume representation of said object comprising volume elements from said scan data; selecting a start position from said three-dimensional volume representation; generating a defined path from said start position and extending within said three-dimensional volume representation; performing a guided navigation of said three-dimensional representation along said path; and displaying in real time volume elements responsive to said path during said guided navigation and simultaneously displaying at least one of the sequence of two-dimensional images based on a current location along the defined path.

2. A method as defined in claim 1 wherein displaying in real time is further responsive to an operator's input.

3. A method as defined in claim 1 wherein said start position comprises a sub-voxel position.

4. A method as defined in claim 1, further comprising: moving along the defined path in response to the current location; and changing the at least one displayed image to correspond to the current location.

5. A method as defined in claim 1 wherein the object is an organ within a body.

6. A method as defined in claim 5 wherein said sequence of two-dimensional images comprises axial images of said organ.

7. A method as defined in claim 5 wherein the organ is a colon.

8. A method as defined in claim 5 wherein the organ is a lung.

9. A method as defined in claim 5 wherein the organ is a heart.

10. A method as defined in claim 1, further comprising displaying a view of the defined path simultaneously with the display of the volume elements.

11. A method as defined in claim 1 wherein the current location includes x, y and z coordinates and the at least one displayed image corresponds to the z coordinate and is displayed centered around the x and y coordinates.

12. A method as defined in claim 1 wherein the current location includes x, y and z coordinates and the at least one displayed image corresponds to the y coordinate and is displayed centered around the x and z coordinates.

13. A method as defined in claim 1 wherein the current location includes x, y and z coordinates and the at least one displayed image corresponds to the x coordinate and is displayed centered around the y and z coordinates.

14. A method as defined in claim 1, further comprising: generating from the three-dimensional volume representation a sequence of two-dimensional images along the defined flight path, the images aligned with a second axis; and displaying a particular image aligned with the second axis corresponding to the current location along the defined path simultaneously with the display of the volume elements.

15. A method as defined in claim 14 wherein the second axis is one of a coronal axis, a sagittal axis, and an axis perpendicular to the defined path.

16. A method for performing a three-dimensional internal virtual examination of at least one organ, the method comprising: scanning said organ with a scanning device and producing scan data representative of said organ, said scan data comprising a sequence of two-dimensional axial images of said organ; creating a three-dimensional volume representation of said organ comprising volume elements from said scan data; selecting a start position from said three-dimensional volume representation; generating a defined path from said start position and extending within said three-dimensional volume representation; performing a guided navigation of said three-dimensional representation along said path; and displaying in real time volume elements responsive to said path during said guided navigation and simultaneously displaying one of the sequence of axial images based on a current location along the defined path.

17. A method as defined in claim 16 wherein displaying in real time is further responsive to an operator's input.

18. A method as defined in claim 16 wherein said start position comprises a sub-voxel position.

19. A method as defined in claim 18, further comprising changing the one displayed image to correspond to the current location in response to the current location moving along the defined path.

20. A method as defined in claim 18 wherein the organ is a colon.

21. A method as defined in claim 18 wherein the organ is a lung.

22. A method as defined in claim 18 wherein the organ is a heart.

23. A method as defined in claim 18, further comprising displaying a view of the defined path simultaneously with the display of the volume elements.

24. A method as defined in claim 18 wherein the current location includes x, y and z coordinates and the one displayed image corresponds to the z coordinate and is displayed centered around the x and y coordinates.

25. A method as defined in claim 18, further comprising: generating from the three-dimensional volume representation a sequence of two-dimensional images along the defined flight path aligned with a second axis; and displaying a particular two-dimensional image from the sequence of images aligned with the second axis simultaneously with the display of the volume elements, said particular two-dimensional image corresponding to the current location along the defined path.

26. A method as defined in claim 25 wherein the second axis is one of a coronal axis, a sagittal axis, and an axis perpendicular to the defined path.

27. An operator interface for a three-dimensional virtual examination system of an object wherein said virtual examination includes a guided navigation along a defined path within a three-dimensional volume representation of said object created from scanning data comprising a sequence of two-dimensional axial images of said object and then generating volume elements of the representation based on the axial images, the operator interface comprising: a display screen having a plurality of sub-windows simultaneously visible; a first of said sub-windows configured to display in real time volume elements responsive to said defined path and to an operator's input during the guided navigation; and a second of said sub-windows configured to display one of the two-dimensional images corresponding to a current location along the defined path.

28. An interface as defined in claim 27 wherein the object is an organ within a body.

29. An interface as defined in claim 28 wherein the organ is a colon.

30. An interface as defined in claim 28 wherein the organ is a lung.

31. An interface as defined in claim 27, further comprising a third of said sub-windows configured to display the defined path.

32. An interface as defined in claim 27, further comprising a second sequence of two-dimensional images of said object, said second sequence of images generated from the volume elements and oriented along a second axis different from the axial images.

33. An interface as defined in claim 32 wherein the second axis is one of a coronal axis, a sagittal axis, and an axis perpendicular to the defined path.

34. An interface as defined in claim 32, further comprising a third of said sub-windows configured to display a particular image from the second sequence of images based on the current position along the defined path.

35. A computer-readable medium bearing instructions for an operator interface for a three-dimensional virtual examination system of an object wherein said virtual examination includes a guided navigation along a defined path within a three-dimensional volume representation of said object created from scanning data comprising a sequence of two-dimensional axial images of said object and then generating volume elements of the representation based on the axial images, said instructions arranged, when executed by one or more processors, to cause the one or more processors to: provide a display screen having a plurality of sub-windows simultaneously visible; display in a first of said sub-windows in real time volume elements responsive to said defined path and an operator's input during the guided navigation; and display in a second of said sub-windows one of the two-dimensional images corresponding to a current location along the defined path.

36. A program storage device readable by machine, tangibly embodying a program of instructions executable by the machine to perform method steps for performing a three-dimensional virtual examination of at least one object, the method steps comprising: scanning with a scanning device and producing scan data representative of said object, said scan data comprising a sequence of two-dimensional images of said object; creating a three-dimensional volume representation of said object comprising volume elements from said scan data; selecting a start volume element and a finish volume element from said three-dimensional volume representation; generating a defined path between said start and finish volume elements; performing a guided navigation of said three-dimensional representation along said path; and displaying in real time said volume elements responsive to said path and to an operator's input during said guided navigation and simultaneously displaying at least one of the sequence of two-dimensional images based on a current location along the defined path.

37. An apparatus for performing a three-dimensional virtual examination of at least one object, the apparatus comprising: scanning means for scanning with a scanning device and producing scan data representative of said object, said scan data comprising a sequence of two-dimensional images of said object; volume-rendering means for creating a three-dimensional volume representation of said object comprising volume elements from said scan data; selection means for selecting a start volume element and a finish volume element from said three-dimensional volume representation; flight-path means for generating a defined path between said start and finish volume elements; navigational means for performing a guided navigation of said three-dimensional representation along said path; and display means for displaying in real time said volume elements responsive to said path and to an operator's input during said guided navigation and simultaneously displaying at least one of the sequence of two-dimensional images based on a current location along the defined path.

38. An apparatus for performing a three-dimensional virtual examination of at least one object, the apparatus comprising: a scanning device for receiving a plurality of two-dimensional image slices of at least one object; a rendering device in signal communication with the scanning device for rendering a three-dimensional volume representation of the plurality of two-dimensional image slices; a processing device in signal communication with the rendering device for locating a first set of features along a centerline within the rendered three-dimensional volume representation; an indexing device in signal communication with the processing device for matching at least one feature in the rendered three-dimensional volume representation with a corresponding two-dimensional image slice; and a display device in signal communication with the indexing device for displaying both of the rendered three-dimensional volume representation and the matched two-dimensional image slice.

Description:

CROSS-REFERENCE

[0001] This application claims the benefit of U.S. Provisional Patent Application Serial No. 60/331,712, entitled “New Features For Virtual Colonoscopy” and filed Nov. 21, 2001, which is incorporated herein by reference in its entirety.

BACKGROUND

[0002] The present disclosure relates to a system and method for performing a volume based three-dimensional virtual examination. More particularly, the disclosure relates to a virtual examination system and method providing enhanced visualization and navigational properties.

[0003] Two-dimensional (“2D”) visualization of human organs using medical imaging devices has been widely used for patient diagnosis. Currently available medical imaging devices include computed tomography (“CT”) and magnetic resonance imaging (“MRI”), for example. Three-dimensional (“3D”) images can be formed by stacking and interpolating between two-dimensional pictures produced from the scanning machines. Imaging an organ and visualizing its volume in three-dimensional space would be beneficial due to the lack of physical intrusion and the ease of data manipulation. However, the exploration of the three-dimensional volume image must be properly performed in order to fully exploit the advantages of virtually viewing an organ from the inside.
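The slice-stacking and interpolation mentioned above can be illustrated with a short sketch. This is a deliberately simplified linear interpolation between adjacent slices, not the reconstruction method of the disclosure; the function name and `upsample` parameter are illustrative:

```python
import numpy as np

def stack_slices(slices, upsample=2):
    """Stack 2D scan slices into a 3D volume, linearly interpolating
    `upsample - 1` intermediate slices between each adjacent pair."""
    volume = [slices[0]]
    for a, b in zip(slices, slices[1:]):
        for k in range(1, upsample):
            t = k / upsample
            volume.append((1 - t) * a + t * b)  # linear blend of neighbours
        volume.append(b)
    return np.stack(volume)

# Two 2x2 slices with one interpolated slice in the gap between them
slices = [np.zeros((2, 2)), np.full((2, 2), 4.0)]
vol = stack_slices(slices, upsample=2)
# vol.shape == (3, 2, 2); the middle slice is the average of its neighbours
```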

[0004] When viewing the 3D volume virtual image of an environment, a functional model must be used to explore the virtual space. One possible model is a virtual “camera” that can be used as a point of reference for the viewer to explore the virtual space. Camera control in the context of navigation within a general 3D virtual environment has been previously studied. There are two conventional types of camera control offered for navigation of virtual space. The first gives the operator full control of the camera, allowing the operator to manipulate the camera into different positions and orientations to achieve the desired view. The operator will in effect pilot the camera. This allows the operator to explore a particular section of interest while ignoring other sections. However, complete control of a camera in a large domain would be tedious and tiring, and an operator might not view all the important features between the starting and finishing points of the exploration.

[0005] The second technique of camera control is a planned navigational method, which assigns the camera a predetermined path to take and which cannot be accidentally changed by the operator. This is akin to having an engaged “autopilot”. This allows the operator to concentrate on the virtual space being viewed, and not have to worry about steering into walls of the environment being examined. However, this second technique does not give the viewer the flexibility to alter the course or investigate an interesting area viewed along the flight path.

[0006] It would be desirable to use a combination of the two navigation techniques described above to realize the advantages of both techniques while minimizing their respective drawbacks. It would be desirable to apply a flexible navigation technique to the examination of human or animal organs that are represented in virtual 3D space in order to perform a non-intrusive painless and thorough examination. The desired navigational technique would further allow for a complete examination of a virtual organ in 3D space by an operator, allowing flexibility while ensuring a smooth path and complete examination through and around the organ. It would be additionally desirable to be able to display the exploration of the organ in a real time setting by using a technique that minimizes the computations necessary for viewing the organ. The desired technique should also be equally applicable to exploring any virtual object.

[0007] Radiologists and other specialists have historically been trained to analyze scan data consisting of two-dimensional slices. However, while stacks of such slices may be useful for analysis, they do not provide an efficient or intuitive means to navigate through a virtual organ, especially one as tortuous and complex as the colon. There remains a need for a virtual examination system providing data in a conventional format for analysis while, in addition, allowing an operator to easily navigate a virtual organ.

SUMMARY

[0008] A preferred embodiment of the present disclosure generates a three-dimensional visualization image of an object such as a human organ using volume visualization techniques and explores the virtual image using a guided navigation system, which allows the operator to travel along a predefined flight path and to adjust both the position and viewing angle to a particular portion of interest in the image away from the predefined path in order to identify polyps, cysts or other abnormal features in the organ.

[0009] An aspect of the present disclosure relates to a method for performing a three-dimensional internal virtual examination of at least one organ. According to this aspect, the organ is scanned with a radiological scanning device to produce scan data representative of the organ, which is then used to create a three-dimensional volume representation of the organ that includes volume elements. The scan data includes a sequence of axial images. Using the three-dimensional representation, a defined flight path is generated and guided navigation through the three-dimensional representation is performed. Simultaneously with the display of the guided navigation, one of the sequence of axial images is displayed, wherein the displayed image corresponds to the current location along the defined path.
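The correspondence just described, between the current flight-path location and the axial image displayed alongside the 3D view, amounts to an index lookup along the scan axis. A minimal sketch (the helper name and slice geometry are assumptions for illustration, not taken from the disclosure):

```python
def axial_slice_index(z_mm, slice_thickness_mm, num_slices):
    """Map the camera's position along the scan axis (in mm) to the
    index of the axial image to display, clamped to the valid range."""
    idx = int(round(z_mm / slice_thickness_mm))
    return max(0, min(num_slices - 1, idx))

# A camera at z = 37.6 mm in a stack of 1 mm slices shows slice 38
axial_slice_index(37.6, 1.0, 450)  # → 38
```

As the camera moves along the defined path, re-evaluating this lookup each frame keeps the displayed 2D slice synchronized with the 3D view.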

[0010] Another aspect of the present disclosure relates to an operator interface for a three-dimensional virtual examination system of an object wherein the virtual examination includes a guided navigation along a defined path within a three-dimensional volume representation of the object created from scanning data comprising a sequence of two-dimensional axial images of the object and then generating volume elements of the representation based on these axial images. According to this aspect, the operator interface includes a display screen having a plurality of sub-windows simultaneously visible. Within a first of these sub-windows volume elements responsive to the defined path and an operator's input during the guided navigation are displayed in real-time. In a second of these sub-windows one of the two-dimensional images corresponding to a current location along the defined path is displayed. This operator interface can be stored as instructions on a computer-readable medium as well to cause, upon execution thereof, a processor to provide the operator interface.

[0011] Accordingly, system and method embodiments are provided for generating a three-dimensional visualization image of an object such as an organ using volume visualization techniques and exploring the image using a guided navigation system, which allows the operator to travel along a flight path and to adjust the view to a particular portion of the image of interest in order, for example, to identify polyps, cysts or other abnormal features in the visualized organ. One or more series of two-dimensional renditions of the organ, correlated to the flight path location, can be provided to an operator to assist in analyzing the organ. The three-dimensional representation, a display of the flight path, and the two-dimensional slices are simultaneously displayed to the operator.

BRIEF DESCRIPTION OF THE DRAWINGS

[0012] Further objects, features and advantages of the present disclosure will become apparent from the following detailed description taken in conjunction with the accompanying figures showing preferred embodiments of the disclosure, in which:

[0013] FIG. 1 shows a flow chart of the steps for performing a virtual examination of an object, specifically a colon, in accordance with the disclosure;

[0014] FIG. 2 shows an illustration of a “submarine” camera model which performs guided navigation in the virtual organ;

[0015] FIG. 3 shows a diagram illustrating a two dimensional cross-section of a volumetric colon which contains the flight path;

[0016] FIG. 4 shows a diagram of a system used to perform a virtual examination of a human organ in accordance with the disclosure;

[0017] FIG. 5 shows an exemplary representation of a colon and accompanying flight-path generated according to an embodiment of the present disclosure;

[0018] FIG. 6 shows an exemplary display of a two-dimensional slice of scan data according to an embodiment of the present disclosure;

[0019] FIG. 7 shows the colon of FIG. 5 intersected by a plane oriented perpendicular to the flight-path;

[0020] FIG. 8 shows an exemplary operator interface screen according to embodiments of the present disclosure; and

[0021] FIG. 9 shows a block diagram of a system embodiment based on a personal computer bus architecture.

DETAILED DESCRIPTION OF PREFERRED EMBODIMENTS

[0022] While the methods and systems described herein may be applied to any object to be examined, the preferred embodiment to be described is the examination of an organ in the human body, specifically the colon. The colon is long and twisted, which makes it especially suited for a virtual examination saving the patient monetary expense as well as the discomfort and increased hazard of a physical probe. Other examples of organs that can be examined include the lungs, stomach and portions of the gastrointestinal system, the heart and blood vessels.

[0023] As shown in FIG. 1, a method for performing a virtual examination of an object such as a colon is indicated generally by the reference numeral 100. The method 100 illustrates the steps necessary to perform a virtual colonoscopy using volume visualization techniques. Step 101 prepares the colon to be scanned in order to be viewed for examination if required by either the doctor or the particular scanning instrument. This preparation could include cleansing the colon with a “cocktail” or liquid, which enters the colon after being orally ingested and passed through the stomach. The cocktail forces the patient to expel waste material that is present in the colon. One example of a substance used is Golytely. Additionally, in the case of the colon, air or carbon dioxide can be forced into the colon in order to expand it to make the colon easier to scan and examine. This is accomplished with a small tube placed in the rectum, with approximately 1,000 cc of air pumped into the colon to distend it. Depending upon the type of scanner used, it may be necessary for the patient to drink a contrast substance such as barium to coat any unexpunged stool in order to distinguish the waste in the colon from the colon walls themselves. Alternatively, the method for virtually examining the colon can remove the virtual waste prior to or during the virtual examination as explained later in this specification. Step 101 does not need to be performed in all examinations, as indicated by the dashed line in FIG. 1.

[0024] Step 103 scans the organ that is to be examined. The scanner can be an apparatus well known in the art, such as a spiral CT-scanner for scanning a colon or a Zenith MRI machine for scanning a lung labeled with xenon gas, for example. The scanner must be able to take multiple images from different positions around the body during suspended respiration, in order to produce the data necessary for the volume visualization. For example, data can be acquired using a GE/CTI spiral mode scanner operating in a helical mode of 5 mm, 1.5-2.0:1 pitch, reconstructed in 1 mm slices, where the pitch is adjusted based upon the patient's height in a known manner. A routine imaging protocol of 120 kVp and 200-280 mA can be utilized for this operation. The data can be acquired and reconstructed as 1 mm thick slice images having an array size of 512×512 pixels in the field of view, which varies from 34 to 40 cm depending on the patient's size. The number of such slices generally varies under these conditions from 300 to 450, depending on the patient's height. The image data set is converted to volume elements or voxels.
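For orientation, the voxel dimensions implied by the protocol above follow directly from the field of view, matrix size, and slice thickness. A back-of-the-envelope sketch (in practice the scanner reports these values in the image headers; the function name is illustrative):

```python
def voxel_size_mm(field_of_view_cm, matrix_size, slice_thickness_mm):
    """Return (x, y, z) voxel dimensions in mm implied by the scan
    protocol: in-plane size = field of view / matrix size."""
    in_plane = field_of_view_cm * 10.0 / matrix_size  # cm → mm per pixel
    return (in_plane, in_plane, slice_thickness_mm)

# A 40 cm field of view over a 512x512 matrix with 1 mm slices:
voxel_size_mm(40, 512, 1.0)  # ≈ (0.78, 0.78, 1.0) mm
```

This is why paragraph [0027] below can speak of voxels of approximately 1 cubic mm.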

[0025] An example of a single CT-image would use an X-ray beam of 5 mm width, 1:1 to 2:1 pitch, with a 40 cm field-of-view being performed from the top of the splenic flexure of the colon to the rectum.

[0026] Discrete data representations of the object can be produced by other methods besides scanning. Voxel data representing an object can be derived from a geometric model by techniques described in U.S. Pat. No. 5,038,302 entitled “Method of Converting Continuous Three-Dimensional Geometrical Representations into Discrete Three-Dimensional Voxel-Based Representations Within a Three-Dimensional Voxel-Based System” by Kaufman, issued Aug. 8, 1991, filed Jul. 26, 1988, which is hereby incorporated by reference in its entirety. Additionally, data can be produced by a computer model of an image, which can be converted to three-dimensional voxels and explored in accordance with this disclosure.

[0027] Step 104 converts the scanned images into three-dimensional volume elements (“voxels”). In a preferred embodiment for examining a colon, the scan data is reformatted into 5 mm thick slices at increments of 1 mm or 2.5 mm and reconstructed in 1 mm slices, with each slice represented as a matrix of 512 by 512 pixels. By doing this, voxels of approximately 1 cubic mm are created. Thus a large number of 2D slices are generated depending upon the length of the scan. The set of 2D slices is then reconstructed to 3D voxels. The conversion process of 2D images from the scanner into 3D voxels can either be performed by the scanning machine itself or by a separate machine such as a computer implementing techniques that are well known in the art (see, e.g., U.S. Pat. No. 4,985,856 entitled “Method and Apparatus for Storing, Accessing, and Processing Voxel-based Data” by Kaufman et al., issued Jan. 15, 1991, filed Nov. 11, 1988, which is hereby incorporated by reference in its entirety).
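The reformatting described in step 104 amounts to resampling the slice stack onto a grid of roughly cubic voxels. A deliberately simplified nearest-neighbour version, not the reconstruction method of the referenced patents (all names and the `target_mm` parameter are assumptions for illustration):

```python
import numpy as np

def to_isotropic(volume, spacing_mm, target_mm=1.0):
    """Resample a voxel volume to approximately cubic voxels of side
    `target_mm` by nearest-neighbour index lookup along each axis."""
    new_shape = tuple(int(round(n * s / target_mm))
                      for n, s in zip(volume.shape, spacing_mm))
    # For each output index, pick the nearest source index on that axis
    idx = [np.minimum((np.arange(m) * target_mm / s).astype(int), n - 1)
           for m, n, s in zip(new_shape, volume.shape, spacing_mm)]
    return volume[np.ix_(idx[0], idx[1], idx[2])]

# A 4-slice stack with 2.5 mm slice spacing becomes 10 slices of ~1 mm
vol = np.arange(4 * 2 * 2, dtype=float).reshape(4, 2, 2)
iso = to_isotropic(vol, spacing_mm=(2.5, 1.0, 1.0))
# iso.shape == (10, 2, 2)
```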

[0028] Step 105 allows the operator to define the portion of the selected organ to be examined. A physician may be interested in a particular section of the colon likely to develop polyps. The physician can view a two dimensional slice overview map to indicate the section to be examined. A starting point and finishing point of a path to be viewed can be indicated by the physician/operator. A conventional computer and computer interface (e.g., keyboard, mouse or spaceball) can be used to designate the portion of the colon that is to be inspected. A grid system with coordinates can be used for keyboard entry or the physician/operator can “click” on the desired points. The entire image of the colon can also be viewed if desired.

[0029] Step 107 performs the planned or guided navigation operation of the virtual organ being examined. Performing a guided navigation operation is defined as navigating through an environment along a predefined or automatically predetermined flight path, which can be manually adjusted by an operator at any time. After the scan data has been converted to 3D voxels, the inside of the organ is traversed from the selected start to the selected finishing point. The virtual examination is modeled on having a tiny viewpoint or “camera” traveling through the virtual space with a view direction or “lens” pointing towards the finishing point. The guided navigation technique provides a level of interaction with the camera, so that the camera can navigate through a virtual environment automatically in the case of no operator interaction, and at the same time, allow the operator to manipulate the camera when necessary. The preferred embodiment of achieving guided navigation is to use a physically based camera model that employs potential fields to control the movement of the camera, as is further detailed with respect to FIG. 2.
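The potential-field camera model of step 107 can be illustrated with a single update: an attractive force pulls the camera toward the next flight-path point while a short-range repulsion pushes it away from wall samples, so unattended navigation follows the path without collisions. This is a minimal sketch, not the physically based model detailed with respect to FIG. 2; the names, force law, and constants are assumptions:

```python
import numpy as np

def camera_step(pos, path_target, wall_points, step=0.5, repulse=2.0):
    """One guided-navigation update combining path attraction with
    inverse-square-style repulsion from nearby wall samples."""
    force = path_target - pos                    # attraction along the path
    for w in wall_points:
        d = pos - w
        dist = np.linalg.norm(d)
        if dist > 1e-6:
            force += repulse * d / dist**3       # short-range wall repulsion
    norm = np.linalg.norm(force)
    return pos + step * force / norm if norm > 1e-6 else pos

# Midway between two symmetric wall samples, the repulsions cancel and
# the camera advances along the path toward the target
pos = camera_step(np.array([0.0, 0.0, 0.0]), np.array([10.0, 0.0, 0.0]),
                  [np.array([0.0, 1.0, 0.0]), np.array([0.0, -1.0, 0.0])])
```

An operator input would simply be added as a third force term, which is what lets guided navigation blend automatic and manual control.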

[0030] Step 109, which can be performed concurrently with step 107, displays the inside of the organ from the viewpoint of the camera model along the selected pathway of the guided navigation operation. Three-dimensional displays can be generated using techniques well known in the art such as the marching cubes technique, for example. In order to produce a real time display of the colon, a technique is used that reduces the vast number of data computations necessary for the display of the virtual organ.

[0031] The method described in FIG. 1 can also be applied to scanning multiple organs in a body at the same time. For example, a patient may be examined for cancerous growths in both the colon and lungs. The method of FIG. 1 would be modified to scan all the areas of interest in step 103 and to select the current organ to be examined in step 105. For example the physician/operator may initially select the colon to virtually explore and later explore the lung. Alternatively, two different doctors with different specialties may virtually explore different scanned organs relating to their respective specialties. Following step 109, the next organ to be examined is selected and its portion will be defined and explored. This continues until all organs that need examination have been processed.

[0032] The steps described in conjunction with FIG. 1 can also be applied to the exploration of any object that can be represented by volume elements. For example, an architectural structure or inanimate object can be represented and explored in the same manner.

[0033] Turning to FIG. 2, a “submarine camera” model that performs guided navigation in a virtual organ is indicated generally by the reference numeral 200. The model 200 depicts a viewpoint control model that performs the guided navigation technique of step 107. When there is no operator control during guided navigation, the default navigation is similar to that of planned navigation, which automatically directs the camera along a flight path from one selected end of the colon to the other. During the planned navigation phase, the camera stays at the center of the colon for obtaining better views of the colonic surface. When an interesting region is encountered, the operator of the virtual camera using guided navigation can interactively bring the camera close to a specific region and direct the motion and angle of the camera to study the interesting area in detail, without inadvertently colliding with the walls of the colon. The operator can control the camera with a standard interface device such as a keyboard or mouse, or with a nonstandard device such as a spaceball. In order to fully operate a camera in a virtual environment, six degrees of freedom for the camera are required. The camera must be able to move in the horizontal, vertical, and depth or Z direction (axes 217), as well as being able to rotate in another three degrees of freedom (axes 219) to allow the camera to move and scan all sides and angles of a virtual environment.
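The six degrees of freedom described above can be sketched as a camera state carrying three translational and three rotational components. The naming (yaw, pitch, roll) and the `advance` helper are illustrative conventions, not terms from the disclosure:

```python
import math

class SubmarineCamera:
    """Six-degree-of-freedom viewpoint: translation along x, y, z plus
    three rotations (yaw, pitch, roll)."""
    def __init__(self):
        self.x = self.y = self.z = 0.0           # three translational DOF
        self.yaw = self.pitch = self.roll = 0.0  # three rotational DOF

    def advance(self, dist):
        """Move forward along the current view direction; roll does not
        affect the direction of travel, only the image orientation."""
        self.x += dist * math.cos(self.pitch) * math.cos(self.yaw)
        self.y += dist * math.cos(self.pitch) * math.sin(self.yaw)
        self.z += dist * math.sin(self.pitch)

cam = SubmarineCamera()
cam.yaw = math.pi / 2  # turn to face along +y
cam.advance(2.0)       # moves the camera ~2 units along +y
```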

[0034] Methods for computing a centerline inside the area of interest are well known in the art (see, e.g., U.S. Pat. No. 5,971,767 entitled “SYSTEM AND METHOD FOR PERFORMING A THREE-DIMENSIONAL VIRTUAL EXAMINATION” by Kaufman et al.; issued Oct. 26, 1999 and incorporated by reference herein in its entirety).

[0035] Referring to FIG. 3, a two-dimensional cross-section of a volumetric colon containing a flight path is indicated generally by the reference numeral 300. The cross-section 300 includes the final flight path for the camera model down the center of the colon, as indicated by “x”s, and at least one starting location 301 or 303 near one end of the colon.

[0036] Turning now to FIG. 4, a system used to perform a virtual examination of a human organ in accordance with the disclosure is indicated generally by the reference numeral 400. The system 400 is for performing the virtual examination of an object such as a human organ using the techniques described herein. A patient 401 lies on a platform 402, while a scanning device 405 scans the area that contains the organ or organs to be examined. The scanning device 405 contains a scanning portion 403 that takes images of the patient and an electronics portion 406. The electronics portion 406 includes an interface 407, a central processing unit 409, a memory 411 for temporarily storing the scanning data, and a second interface 413 for sending data to a virtual navigation platform or terminal 416. The interfaces 407 and 413 may be included in a single interface component or may be the same component. The components in the portion 406 are connected together with conventional connectors.

[0037] In the system 400, the data provided from the scanning portion 403 of the device 405 is transferred to unit 409 for processing and is stored in memory 411. The central processing unit 409 converts the scanned 2D data to 3D voxel data and stores the results in another portion of the memory 411. Alternatively, the converted data may be sent directly to the interface unit 413 for transfer to the virtual navigation terminal 416. The conversion of the 2D data could also take place at the virtual navigation terminal 416 after the data is transmitted from the interface 413. In the preferred embodiment, the converted data is transmitted over a carrier 414 to the virtual navigation terminal 416 so that an operator can perform the virtual examination. The data may also be transported in other conventional ways, such as by storing the data on a storage medium and physically transporting it to terminal 416, or by satellite transmission, for example.
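
In its simplest form, the 2D-to-3D conversion performed by unit 409 amounts to stacking the sequence of scanned slices into a voxel grid, with the slice index becoming the z coordinate. A minimal sketch (array and function names are illustrative, not from the disclosure):

```python
import numpy as np

def slices_to_volume(slices):
    """Stack a sequence of 2D scan slices into a 3D voxel volume.

    Each slice is a 2D array of intensities; the slice index becomes
    the z coordinate of the resulting (z, y, x) volume.
    """
    return np.stack(slices, axis=0)

# e.g. four hypothetical axial slices of 8x8 pixels -> a 4x8x8 voxel volume
scan = [np.full((8, 8), i, dtype=np.int16) for i in range(4)]
volume = slices_to_volume(scan)
```

Real scanners additionally carry per-slice spacing and orientation metadata, which a practical conversion must honor; this sketch shows only the geometric stacking.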

[0038] The scanned data need not be converted to its 3D representation until the visualization-rendering engine requires it to be in 3D form. This saves computational steps and memory storage space.

[0039] The virtual navigation terminal 416 includes a screen 417 for viewing the virtual organ or other scanned image, an electronics portion 415 and an interface control 419 such as a keyboard, mouse or spaceball. The electronics portion 415 includes an interface port 421, a central processing unit 423, optional components 427 for running the terminal and a memory 425. The components in the terminal 416 are connected together with conventional connectors. The converted voxel data is received in the interface port 421 and stored in the memory 425. The central processing unit 423 then assembles the 3D voxels into a virtual representation and runs the submarine camera model as described for FIG. 2 to perform the virtual examination. As the submarine camera travels through the virtual organ, the visibility technique is used to compute only those areas that are visible from the virtual camera and to display them on the screen 417. A graphics accelerator can also be used in generating the representations. The operator can use the interface device 419 to indicate which portion of the scanned body is to be explored. The interface device 419 can further be used to control and move the submarine camera as desired, as detailed for FIG. 2. The terminal portion 415 can be, for example, the Cube-4 dedicated system box, generally available from the Department of Computer Science at the State University of New York at Stony Brook.

[0040] The scanning device 405 and terminal 416, or parts thereof, can be part of the same unit. A single platform would be used to receive the scan image data, convert it to 3D voxels if necessary and perform the guided navigation.

[0041] An important feature in system 400 is that the virtual organ can be examined at a later time without the presence of the patient. Additionally, the virtual examination could take place while the patient is being scanned. The scan data can also be sent to multiple terminals, which would allow more than one doctor to view the inside of the organ simultaneously. Thus, a doctor in New York could view the same portion of a patient's organ at the same time as a doctor in California while discussing the case. Alternatively, the data can be viewed at different times. Two or more doctors could perform their own examinations of the same data in a difficult case. Multiple virtual navigation terminals could be used to view the same scan data. By reproducing the organ as a virtual organ with a discrete set of data, there are a multitude of benefits in areas such as accuracy, cost and possible data manipulations.

[0042] Some of the applicable techniques may be further enhanced in virtual colonoscopy applications through the use of a number of additional techniques that are described in U.S. Pat. No. 6,343,936 entitled “SYSTEM AND METHOD FOR PERFORMING A THREE-DIMENSIONAL VIRTUAL EXAMINATION, NAVIGATION AND VISUALIZATION” by Kaufman et al.; issued Feb. 7, 2002, which is incorporated herein by reference in its entirety. These improvements, described briefly below, include improved colon cleansing, volume rendering, additional fly-path determination techniques, and alternative hardware embodiments.

[0043] An improved electronic colon cleansing technique employs modified bowel preparation operations followed by image segmentation operations, such that fluid and stool remaining in the colon during a computed tomographic (“CT”) or magnetic resonance imaging (“MRI”) scan can be detected and removed from the virtual colonoscopy images. Through the use of such techniques, conventional physical washing of the colon, and its associated inconvenience and discomfort, is minimized or completely avoided.
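
As a rough illustration of the segmentation step (not the actual algorithm of U.S. Pat. No. 6,343,936, which is considerably more sophisticated): orally administered contrast agent makes residual fluid appear bright in a CT scan, so tagged voxels can be detected by intensity and relabeled as air. The threshold and intensity values below are hypothetical:

```python
import numpy as np

def remove_tagged_fluid(volume, fluid_threshold=200, air_value=-1000):
    """Relabel voxels brighter than the fluid threshold as air.

    Illustrative threshold segmentation: contrast-tagged fluid is bright,
    so any voxel at or above `fluid_threshold` is replaced with the air
    intensity, electronically "cleansing" the virtual colon.
    """
    cleansed = volume.copy()
    cleansed[cleansed >= fluid_threshold] = air_value
    return cleansed

# Hypothetical CT values in Hounsfield units:
# air ~ -1000, soft tissue ~ 40, contrast-tagged fluid ~ 500.
volume = np.array([[-1000, 40, 500],
                   [500, 40, -1000]], dtype=np.int16)
cleansed = remove_tagged_fluid(volume)
```

A production technique must also handle partial-volume voxels at the fluid-air and fluid-tissue boundaries, which a single global threshold cannot resolve.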

[0044] In addition to image segmentation and texture mapping, volume-rendering techniques may be used in connection with virtual colonoscopy procedures to further enhance the fidelity of the resulting image. Methods for volume rendering are well known to those of ordinary skill in the pertinent art.
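
One of the simplest such well-known methods is ray casting; as a toy stand-in for a full compositing renderer, a maximum-intensity projection casts one ray per pixel straight through the volume and keeps the brightest voxel encountered:

```python
import numpy as np

def mip_render(volume):
    """Maximum-intensity projection along the z axis: one axis-aligned
    ray per (y, x) pixel, keeping the brightest voxel on each ray."""
    return volume.max(axis=0)

volume = np.zeros((4, 3, 3), dtype=np.int16)
volume[2, 1, 1] = 900           # a single bright voxel inside the volume
image = mip_render(volume)      # 3x3 projected image
```

Full volume rendering instead composites opacity and color front-to-back along arbitrarily oriented rays, but the per-ray traversal structure is the same.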

[0045] Referring to FIG. 5, an exemplary representation of a colon and accompanying flight-path generated according to an embodiment of the present disclosure is indicated generally by the reference numeral 500. The representation 500 depicts a human colon 502 showing a centerline flight path 504. As the operator travels through the virtual organ along this flight path, two-dimensional images of the current position are displayed.

[0046] As shown in FIG. 6, an exemplary display of a two-dimensional slice of scan data according to an embodiment of the present disclosure is indicated generally by the reference numeral 600. The slice 600 is shown while advancing along the flight path, and the operator interface displays the virtual organ along with the slice for the current “z” coordinate and pans the image of that slice so that the current “x, y” position is in the center of the image. Thus, in this arrangement, the two-dimensional slices are axial slices, where convention has the z-axis pointing towards the head. However, once the scan data has been converted into a three-dimensional volume, two-dimensional slices oriented on other planes can be generated and viewed as well. For example, the two-dimensional images displayed to the operator can be oriented on the sagittal plane, the coronal plane, or perpendicular to the flight path.
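
The slice selection and panning described above can be sketched as follows (function and parameter names are illustrative): given the current flight-path position (x, y, z), take the axial slice at z and crop a window centered on (x, y):

```python
import numpy as np

def centered_axial_view(volume, x, y, z, half=2):
    """Return a (2*half+1)-square window of the axial slice at z,
    panned so that voxel (x, y) sits at the window center.

    The slice is edge-padded so positions near the border of the
    scan still yield a full-sized window."""
    slice_2d = volume[z]
    padded = np.pad(slice_2d, half, mode="edge")
    return padded[y:y + 2 * half + 1, x:x + 2 * half + 1]

# Hypothetical 10x10x10 volume with distinct voxel values.
volume = np.arange(10 * 10 * 10).reshape(10, 10, 10)
view = centered_axial_view(volume, x=4, y=6, z=3)
```

As the camera advances, calling this with each new (x, y, z) along the flight path produces the automatically scrolled and panned display of FIG. 6.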

[0047] Turning to FIG. 7, the colon of FIG. 5 intersected by a plane oriented perpendicular to the flight-path is indicated generally by the reference numeral 700. The intersected colon 700, for example, depicts a plane 704 indicating a two-dimensional image perpendicular to the flight path 702 through the colon 703.
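
Extracting the perpendicular slice 704 amounts to sampling the volume over a plane whose normal is the local flight-path direction. A nearest-neighbor sketch under that assumption (helper names and the sampling scheme are illustrative; practical systems use interpolated resampling):

```python
import numpy as np

def perpendicular_slice(volume, center, direction, size=5):
    """Sample a size x size slice through `center`, oriented
    perpendicular to the flight-path `direction` (nearest-neighbor)."""
    d = np.asarray(direction, dtype=float)
    d /= np.linalg.norm(d)
    # Build two unit vectors u, v spanning the plane orthogonal to d.
    helper = np.array([1.0, 0.0, 0.0])
    if abs(d @ helper) > 0.9:          # avoid a near-parallel helper axis
        helper = np.array([0.0, 1.0, 0.0])
    u = np.cross(d, helper)
    u /= np.linalg.norm(u)
    v = np.cross(d, u)
    out = np.zeros((size, size), dtype=volume.dtype)
    half = size // 2
    for i in range(size):
        for j in range(size):
            p = np.rint(center + (i - half) * u + (j - half) * v).astype(int)
            p = np.clip(p, 0, np.array(volume.shape) - 1)
            out[i, j] = volume[tuple(p)]
    return out

volume = np.zeros((9, 9, 9), dtype=np.int16)
volume[4, 4, 4] = 7                     # a single marked voxel
sl = perpendicular_slice(volume, center=np.array([4.0, 4.0, 4.0]),
                         direction=[0.0, 0.0, 1.0])
```

Recomputing this slice at each flight-path point keeps the perpendicular view of sub-window 2106 synchronized with the camera.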

[0048] Referring to FIG. 8, an exemplary operator interface screen according to embodiments of the present disclosure is indicated generally by the reference numeral 2100. The screen 2100 includes a number of sub-windows that simultaneously provide an operator with graphical information from a number of different perspectives. The center sub-window 2104 displays the inside of the virtual organ. An arrow or marker 2105 helps orient the operator along the projected flight path. A complete view of this flight path, along with the entire organ, is depicted in sub-window 2102. Operator controls 2108 are near the bottom of the screen 2100 and are used to control the travel through the virtual organ. The rendering of the virtual organ, as well as the control of flight through the organ, has been described earlier and is not repeated here. Four other sub-windows, 2106, 2114, 2112 and 2110, provide two-dimensional images along the perpendicular plane, axial plane, sagittal plane and coronal plane, respectively. The displayed two-dimensional image is based on the current position along the flight path through the virtual organ. Each of these windows can include a marker, for example 2115, 2113 and 2111, to help orient the operator along the flight path.

[0049] Furthermore, each window 2106, 2114, 2112, and 2110 has a respective control for scrolling through two-dimensional images such as scroll bar 2115. Accordingly, the operator can traverse the flight path manually, in either direction, using this scroll bar.

[0050] The screen 2100 is exemplary in nature and a skilled artisan would recognize many equivalent alternatives within the scope of the present disclosure. For example, not all sub-windows 2106, 2114, 2112 and 2110 need to be displayed.

[0051] Turning to FIG. 9, a system embodiment based on a personal computer bus architecture is indicated generally by the reference numeral 900. The system 900 includes an alternate hardware embodiment suitable for deployment on a personal computer (“PC”), as illustrated. The system 900 includes a processor 910 that preferably takes the form of a high speed, multitasking processor, such as, for example, a Pentium III processor operating at a clock speed in excess of 400 MHz. The processor 910 is coupled to a conventional bus structure 920 that provides for high-speed parallel data transfer. Also coupled to the bus structure 920 are a main memory 930, a graphics board 940, and a volume rendering board 950. The graphics board 940 is preferably one that can perform texture mapping, such as, for example, a Diamond Viper v770 Ultra board manufactured by Diamond Multimedia Systems. The volume rendering board 950 can take the form of the VolumePro board from Mitsubishi Electric, for example, which is based on U.S. Pat. Nos. 5,760,781 and 5,847,711, which are hereby incorporated by reference in their entirety. A display device 945, such as a conventional SVGA or RGB monitor, is operably coupled to the graphics board 940 for displaying the image data. A scanner interface board 960 is also provided for receiving data from an imaging scanner, such as an MRI or CT scanner, for example, and transmitting such data to the bus structure 920. The scanner interface board 960 may be an application specific interface product for a selected imaging scanner or can take the form of a general-purpose input/output card. The PC based system 900 will generally include an I/O interface 970 for coupling I/O devices 980, such as a keyboard, digital pointer or mouse, and the like, to the processor 910. Alternatively, the I/O interface can be coupled to the processor 910 via the bus 920.

[0052] Embodiments of the present disclosure provide a user interface displaying both two-dimensional and three-dimensional data. Organs within the body are, by nature, three-dimensional. Conventional medical imaging devices, however, as explained herein, create stacks of two-dimensional images when acquiring scan data. Radiologists and other specialists, therefore, have historically been trained to review and analyze these two-dimensional images. As a result, most doctors are comfortable viewing two-dimensional images even if three-dimensional reconstructions or virtualizations are available.

[0053] However, many organs are not simple convex objects but, instead, can be tortuous or have many branches. While a doctor may be comfortable analyzing two-dimensional images, performing navigation through complex organs is very difficult using merely two-dimensional images. Navigating using the two-dimensional images would include manually scrolling through the images to move in the “z” direction (along the major axis of the body) and panning the images to move in the “x” and “y” directions. In this manner, an operator can traverse the organ looking for areas of interest.

[0054] On the other hand, three-dimensional flight paths, as described herein, are intuitive, efficient tools to virtually travel through volumetric renderings of human organs either automatically or manually. During a flight path tour, each point along the flight path is represented by a coordinate (x, y, z). According to embodiments of the present disclosure, these coordinates are used to automatically scroll and pan the series of two-dimensional images that doctors are used to analyzing. Thus, the operator does not have to manually navigate through an organ in two dimensions but, instead, can let the present virtualization system advance along the organ while the operator concentrates on analyzing each two-dimensional image.
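
The correlation described above reduces to a simple mapping from the flight-path coordinate to a slice index and pan center for each orthogonal viewer. A sketch with illustrative names (the disclosure does not prescribe this interface):

```python
def sync_2d_viewers(x, y, z):
    """Map a flight-path position (x, y, z) to the slice index and pan
    center of each orthogonal 2D viewer, so the viewers scroll and pan
    automatically as the camera advances along the flight path."""
    return {
        "axial":    {"slice": z, "pan_center": (x, y)},
        "sagittal": {"slice": x, "pan_center": (y, z)},
        "coronal":  {"slice": y, "pan_center": (x, z)},
    }

# Hypothetical flight-path point: the axial viewer jumps to slice 87
# and pans so that (12, 40) is centered, and likewise for the others.
views = sync_2d_viewers(12, 40, 87)
```

Calling this once per flight-path step drives all three two-dimensional sub-windows of FIG. 8 from the same coordinate.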

[0055] The foregoing merely illustrates the principles of the disclosure. It will thus be appreciated that those skilled in the art will be able to devise numerous systems, apparatus and methods which, although not explicitly shown or described herein, embody the principles of the disclosure and are thus within the spirit and scope of the disclosure as defined by its claims.

[0056] For example, the methods and systems described herein could be applied to virtually examine an animal, fish or inanimate object. Besides the stated uses in the medical field, the technique could be used to detect the contents of sealed objects that cannot be opened. The technique could also be used inside an architectural structure such as a building or cavern, enabling the operator to navigate through the structure.

[0057] These and other features and advantages of the present disclosure may be readily ascertained by one of ordinary skill in the pertinent art based on the teachings herein. It is to be understood that the teachings of the present disclosure may be implemented in various forms of hardware, software, firmware, special purpose processors, or combinations thereof.

[0058] Most preferably, the teachings of the present disclosure are implemented as a combination of hardware and software. Moreover, the software is preferably implemented as an application program tangibly embodied on a program storage unit. The application program may be uploaded to, and executed by, a machine comprising any suitable architecture. Preferably, the machine is implemented on a computer platform having hardware such as one or more central processing units (“CPU”), a random access memory (“RAM”), and input/output (“I/O”) interfaces. The computer platform may also include an operating system and microinstruction code. The various processes and functions described herein may be either part of the microinstruction code or part of the application program, or any combination thereof, which may be executed by a CPU. In addition, various other peripheral units may be connected to the computer platform such as an additional data storage unit and a printing unit.

[0059] It is to be further understood that, because some of the constituent system components and methods depicted in the accompanying drawings are preferably implemented in software, the actual connections between the system components or the process function blocks may differ depending upon the manner in which embodiments of the present disclosure are programmed. Given the teachings herein, one of ordinary skill in the pertinent art will be able to contemplate these and similar implementations or configurations of the present invention.

[0060] Although the illustrative embodiments have been described herein with reference to the accompanying drawings, it is to be understood that the present invention is not limited to those precise embodiments, and that various changes and modifications may be effected therein by one of ordinary skill in the pertinent art without departing from the scope or spirit of the present disclosure. All such changes and modifications are intended to be included within the scope of the present invention as set forth in the appended claims.