US20080080770A1 - Method and system for identifying regions in an image - Google Patents


Info

Publication number
US20080080770A1
US20080080770A1 (application US11/558,715)
Authority
US
United States
Prior art keywords
image
voxel
region
regional
interest
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
US11/558,715
Other versions
US8923577B2
Inventor
Paulo Ricardo Mendonca
Rahul Bhotika
Wesley David Turner
Jingbin Wang
Saad Ahmed Sirohey
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
General Electric Co
Original Assignee
General Electric Co
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by General Electric Co filed Critical General Electric Co
Priority to US11/558,715 (granted as US8923577B2)
Assigned to GENERAL ELECTRIC COMPANY reassignment GENERAL ELECTRIC COMPANY ASSIGNMENT OF ASSIGNORS INTEREST (SEE DOCUMENT FOR DETAILS). Assignors: TURNER, WESLEY DAVID, WANG, JINGBIN, BHOTIKA, RAHUL, MENDONCA, PAULO RICARDO, SIROHEY, SAAD AHMED
Priority to JP2007245122A (JP5438267B2)
Priority to DE102007046250A (DE102007046250A1)
Publication of US20080080770A1
Assigned to US ARMY, SECRETARY OF THE ARMY reassignment US ARMY, SECRETARY OF THE ARMY CONFIRMATORY LICENSE (SEE DOCUMENT FOR DETAILS). Assignors: GENERAL ELECTRIC GLOBAL RESEARCH
Application granted
Publication of US8923577B2
Legal status: Active
Legal status: Adjusted expiration

Classifications

    • G06T7/11: Region-based segmentation
    • G06T7/0012: Biomedical image inspection
    • G06T7/143: Segmentation; edge detection involving probabilistic approaches, e.g. Markov random field [MRF] modelling
    • G06T2207/10081: Computed x-ray tomography [CT]
    • G06T2207/30028: Colon; small intestine
    • G06T2207/30061: Lung

Definitions

  • the source of radiation 12 may be positioned near a collimator 14 , which may be configured to shape a stream of radiation 16 that is emitted by the source of radiation 12 .
  • the stream of radiation 16 passes into the imaging volume containing the subject to be imaged, such as a human patient 18 .
  • the stream of radiation 16 may be generally fan-shaped or cone-shaped, depending on the configuration of the detector array, discussed below, as well as the desired method of data acquisition.
  • a portion 20 of radiation passes through or around the subject and impacts a detector array, represented generally at reference numeral 22 . Detector elements of the array produce electrical signals that represent the intensity of the incident X-ray beam. These signals are acquired and processed to reconstruct an image of the features within the subject.
  • the radiation source 12 is controlled by a system controller 24, which furnishes both power and control signals for CT examination sequences.
  • the detector 22 is coupled to the system controller 24 , which commands acquisition of the signals generated in the detector 22 .
  • the system controller 24 may also execute various signal processing and filtration functions, such as for initial adjustment of dynamic ranges, interleaving of digital image data, and so forth.
  • system controller 24 commands operation of the imaging system to execute examination protocols and to process acquired data.
  • system controller 24 also includes signal processing circuitry, typically based upon a general purpose or application-specific digital computer, associated memory circuitry for storing programs and routines executed by the computer, as well as configuration parameters and image data, interface circuits, and so forth.
  • the system controller 24 is coupled via a motor controller 32 to a rotational subsystem 26 and a linear positioning subsystem 28 .
  • the rotational subsystem 26 enables the X-ray source 12 , the collimator 14 and the detector 22 to be rotated one or multiple turns around the patient 18 .
  • the rotational subsystem 26 may rotate only one of the source 12 or the detector 22 or may differentially activate various X-ray emitters and/or detector elements arranged in a ring about the imaging volume.
  • the rotational subsystem 26 may include a gantry.
  • the system controller 24 may be utilized to operate the gantry.
  • the linear positioning subsystem 28 enables the patient 18 , or more specifically a patient table, to be displaced linearly.
  • the patient table may be linearly moved within the gantry to generate images of particular areas of the patient 18 .
  • the source of radiation 12 may be controlled by an X-ray controller 30 disposed within the system controller 24 .
  • the X-ray controller 30 is configured to provide power and timing signals to the X-ray source 12 .
  • system controller 24 is also illustrated comprising a data acquisition system 34 .
  • the detector 22 is coupled to the system controller 24 , and more particularly to the data acquisition system 34 .
  • the data acquisition system 34 receives data collected by readout electronics of the detector 22 .
  • the data acquisition system 34 typically receives sampled analog signals from the detector 22 and converts the data to digital signals for subsequent processing by a computer 36 .
  • the computer 36 typically is coupled to or incorporates the system controller 24 .
  • the data collected by the data acquisition system 34 may be transmitted to the computer 36 for subsequent processing and reconstruction.
  • the computer 36 may include or communicate with a memory 38 that can store data processed by the computer 36 or data to be processed by the computer 36 . It should be understood that any type of memory configured to store a large amount of data might be utilized by such an exemplary system 10 .
  • the memory 38 may be located at the acquisition system or may include remote components, such as network accessible memory media, for storing data, processing parameters, and/or routines for implementing the techniques described below.
  • the computer 36 may also be adapted to control features such as scanning operations and data acquisition that may be enabled by the system controller 24 . Furthermore, the computer 36 may be configured to receive commands and scanning parameters from an operator via an operator workstation 40 , which is typically equipped with a keyboard and other input devices (not shown). An operator may thereby control the system 10 via the input devices. Thus, the operator may observe the reconstructed image and other data relevant to the system from computer 36 , initiate imaging, and so forth.
  • a display 42 coupled to the operator workstation 40 may be utilized to observe the reconstructed image. Additionally, the scanned image may also be printed by a printer 44 , which may be coupled to the operator workstation 40 .
  • the display 42 and printer 44 may also be connected to the computer 36 , either directly or via the operator workstation 40 .
  • the operator workstation 40 may also be coupled to a picture archiving and communications system (PACS) 46 .
  • PACS 46 might be coupled to a remote system 48 , radiology department information system (RIS), hospital information system (HIS) or to an internal or external network, so that others at different locations may gain access to the image data.
  • the CAD unit 50 may include software configured to apply a CAD algorithm to the image data. Further, the CAD unit 50 may also be coupled to the display 42 , where the image data may be displayed on the display. In practice, the CAD unit 50 may be part of the data acquisition system 34 , or may be a completely separate component, typically remote from the data acquisition system 34 , and configured to analyze image data stored on a memory, such as the PACS 46 .
  • the computer 36 and operator workstation 40 may be coupled to other output devices, which may include standard or special purpose computer monitors and associated processing circuitry.
  • One or more operator workstations 40 may be further linked in the system for outputting system parameters, requesting examinations, viewing images, and so forth.
  • displays, printers, workstations, and similar devices supplied within the system may be local to the data acquisition components, or may be remote from these components, such as elsewhere within an institution or hospital, or in an entirely different location, linked to the image acquisition system via one or more configurable networks, such as the Internet, a virtual private network or the like.
  • an exemplary imaging system utilized in a present embodiment may be a CT scanning system 52 , as depicted in greater detail in FIG. 2 .
  • the CT scanning system 52 may be a multi-slice detector CT (MDCT) system that offers a wide array of axial coverage, high gantry rotational speed, and high spatial resolution.
  • the CT scanning system 52 may be a volumetric CT (VCT) system utilizing a cone-beam geometry and an area detector to allow the imaging of a volume, such as an entire internal organ of a subject, at high or low gantry rotational speeds.
  • the CT scanning system 52 is illustrated with a frame 54 and a gantry 56 that has an aperture 58 through which a patient 18 may be moved.
  • a patient table 60 may be positioned in the aperture 58 of the frame 54 and the gantry 56 to facilitate movement of the patient 18 , typically via linear displacement of the table 60 by the linear positioning subsystem 28 (see FIG. 1 ).
  • the gantry 56 is illustrated with the source of radiation 12 , such as an X-ray tube that emits X-ray radiation from a focal point 62 .
  • the stream of radiation is directed towards a cross section of the patient 18 including the heart.
  • the X-ray source 12 projects an X-ray beam from the focal point 62 and toward detector array 22 (see FIG. 1 ).
  • the collimator 14 (see FIG. 1 ), such as lead or tungsten shutters, typically defines the size and shape of the X-ray beam that emerges from the X-ray source 12 .
  • the detector 22 is generally formed by a plurality of detector elements, which detect the X-rays that pass through and around a subject of interest, such as the heart or chest. Each detector element produces an electrical signal that represents the intensity of the X-ray beam at the position of the element at the time the beam strikes the detector 22 .
  • the gantry 56 is rotated around the subject of interest so that a plurality of radiographic views may be collected by the computer 36 .
  • the detector 22 collects data related to the attenuated X-ray beams. Data collected from the detector 22 then undergoes pre-processing and calibration to condition the data to represent the line integrals of the attenuation coefficients of the scanned objects.
  • the processed data, commonly called projections, may then be filtered and backprojected to formulate an image of the scanned area.
  • a formulated image may incorporate, in certain modes, less or more than 360 degrees of projection data.
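The filter-and-backproject step described above can be illustrated with a minimal parallel-beam sketch. Nothing here is taken from the patent: the ramp filter, the rotation-based projector, the angle set, and the square phantom are all illustrative assumptions, written with NumPy/SciPy.

```python
import numpy as np
from scipy.ndimage import rotate

def radon(img, thetas):
    """Parallel-beam projections: rotate the image and sum along rows."""
    return np.array([rotate(img, -t, reshape=False, order=1).sum(axis=0)
                     for t in thetas])

def fbp(sino, thetas):
    """Filter each projection with a ramp filter, then backproject."""
    n = sino.shape[1]
    ramp = np.abs(np.fft.fftfreq(n))                       # ram-lak filter
    filtered = np.real(np.fft.ifft(np.fft.fft(sino, axis=1) * ramp, axis=1))
    recon = np.zeros((n, n))
    for t, proj in zip(thetas, filtered):
        smear = np.tile(proj, (n, 1))                      # constant along rays
        recon += rotate(smear, t, reshape=False, order=1)  # backproject
    return recon * np.pi / (2 * len(thetas))

thetas = np.linspace(0.0, 180.0, 60, endpoint=False)
phantom = np.zeros((64, 64))
phantom[24:40, 24:40] = 1.0                                # square test object
recon = fbp(radon(phantom, thetas), thetas)
```

A clinical scanner uses fan- or cone-beam geometry and calibrated apodized filters, so this sketch only shows the principle behind the "filtered and backprojected" wording above.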
  • the reconstructed image may be further analyzed using a CAD algorithm to enable a radiologist examining the image to identify certain anatomical structures in the image.
  • the CAD algorithm can be used to label lung nodules in CT images of the lungs and polyps in images of the colon, to identify lesions in liver images, to identify aneurysms in neural and cardiac images, to detect pulmonary embolisms in lung images, and to extract vessel trees and detect junctions and branches.
  • the CAD algorithm is configured to identify several regions of interest in the image and process the regions to compute a corresponding region score. The region score is then used to label the anatomical structures for further diagnosis.
  • the manner in which the algorithm is applied to the image is described in further detail in FIG. 3 .
  • FIG. 3 represents a flow chart of exemplary steps in carrying out a processing routine based upon CAD analysis.
  • the technique summarized in FIG. 3 begins at step 68 where image data may be acquired.
  • the image data acquisition of step 68 is typically initiated by an operator interfacing with the system via the operator workstation 40 (see FIG. 1 ).
  • Readout electronics detect signals generated by virtue of the impact of radiation on the scanner detector, and the system processes these signals to produce useful image data.
  • image data may also be accessed from image acquisition devices, such as, but not limited to, a magnetic resonance imaging (MRI) system, a positron emission tomography (PET) system, a single photon emission computed tomography (SPECT) system, or a digital radiography system.
  • While the image acquisition devices mentioned hereinabove may be used to directly acquire image data from a patient 18 (see FIG. 1 ), image data may instead include data from an archive site or data storage facility.
  • a region in the acquired image is identified.
  • the region includes a pixel of interest or a voxel of interest.
  • the region may comprise several voxels or pixels.
  • each voxel in the image is thresholded to identify the region.
  • the region usually represents various anatomical structures such as lung nodules, vessels, etc.
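The identification steps above (thresholding each voxel, then treating groups of connected voxels as regions) can be sketched as follows. The threshold value and the synthetic volume are illustrative assumptions, not values from the patent:

```python
import numpy as np
from scipy import ndimage

# synthetic volume standing in for a thresholdable CT sub-image (assumed values)
vol = np.zeros((32, 32, 32))
vol[5:9, 5:9, 5:9] = 100.0          # bright structure 1 (4x4x4 voxels)
vol[20:26, 20:26, 20:26] = 80.0     # bright structure 2 (6x6x6 voxels)

mask = vol > 50.0                                   # threshold each voxel
labels, n_regions = ndimage.label(mask)             # group connected voxels
sizes = ndimage.sum(mask, labels, range(1, n_regions + 1))
```

Each labeled component is then a candidate region whose voxels can be scored together, as described in the following steps.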
  • a regional response of the region is computed.
  • the regional responses of a non-linear filter are computed for the voxel of interest.
  • Regional response is defined as a response computed for the region and/or a specified neighborhood of voxels around the region.
  • the size of the voxel neighborhood can be set based on the size of the structures of interest to be identified. A particular embodiment uses the principal curvatures of isosurfaces at each voxel as the regional response.
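One common way to compute isosurface principal curvatures from image derivatives is to project the Hessian onto the isosurface tangent plane and take the nonzero eigenvalues of the scaled projection. This is a sketch of that general approach, not necessarily the patent's exact formulation, and the distance-map example is purely illustrative:

```python
import numpy as np

def principal_curvatures(vol, idx):
    """Principal curvatures of the isosurface of `vol` passing through voxel `idx`."""
    grads = np.gradient(vol)                        # [dI/dz, dI/dy, dI/dx]
    g = np.array([d[idx] for d in grads])
    H = np.array([[np.gradient(d)[j][idx] for j in range(3)] for d in grads])
    H = (H + H.T) / 2.0                             # symmetrize the FD Hessian
    n = g / np.linalg.norm(g)                       # isosurface normal
    P = np.eye(3) - np.outer(n, n)                  # tangent-plane projector
    S = P @ H @ P / np.linalg.norm(g)               # projected shape operator
    ev = np.linalg.eigvalsh(S)
    return sorted(ev, key=abs)[1:]                  # drop ~0 normal eigenvalue

# sanity check: isosurfaces of a distance map are spheres, curvature = 1/radius
zz, yy, xx = np.mgrid[0:33, 0:33, 0:33].astype(float)
dist = np.sqrt((zz - 16) ** 2 + (yy - 16) ** 2 + (xx - 16) ** 2)
k1, k2 = principal_curvatures(dist, (16, 16, 26))   # voxel at radius 10
```

For a sphere both principal curvatures are equal, which is why the two returned values should both be close to 1/10 here; a vessel-like tube would instead give one large and one near-zero curvature.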
  • a region score is computed for the identified region using a plurality of probabilistic models.
  • the region score represents a value that indicates a likelihood of the region belonging to a specific anatomical structure such as vessel or a nodule.
  • the region score comprises the probability of observing the given curvature data for the region due to a specific anatomical structure.
  • the region score comprises a function of the intensity data for the region.
  • the region is labeled by using the region score.
  • the labeled regions may be displayed to a radiologist with different colors for different anatomical structures for better visualization of the image.
  • the probabilistic models are derived by approximating or modeling an anatomical structure through a geometric shape or a combination of geometric shapes.
  • the anatomical structures are then represented using model parameters.
  • the distribution of the response is derived as a function of the model parameters.
  • Each model has its parameters and these parameters have a range of values that they can assume.
  • a probability distribution describing the likelihood of a particular parameter assuming a particular value is also derived. For deriving such a distribution, geometric and physical constraints are applied along with the incorporation of existing medical knowledge. In a more specific embodiment, the probability distributions of responses are computed for specific shape models that are absent and/or for noisy images.
  • the technique described in FIG. 3 can be applied to various medical images.
  • One specific application of the present technique is identifying nodules and vessels in CT images of the lungs used for the detection of lung cancer.
  • An example computation of a region score and the subsequent labeling of the image are described in detail below.
  • a Bayesian framework is used for the computation of the region score.
  • one of the models M i, each representing a possible label, is to be attached to the data ‘D’ associated with a voxel ‘x’.
  • the region score for each model M i can be computed by using Bayes' law:
  • p(M_i \mid D, x) = \frac{p(D \mid M_i, x) \, P(M_i \mid x)}{p(D \mid x)}
  • the label for the voxel ‘x’ can then be assigned through some function that compares the region score for the different models.
  • the label assigned is the one corresponding to the model with the maximum evidence, i.e.,
  • M^{*} = \arg\max_{M_i} \, p(M_i \mid D, x)
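The Bayes-law score and the maximum-evidence rule above can be sketched directly. The priors and likelihood values below are made-up illustrative numbers, not the patent's model densities:

```python
models = ["outlier", "nodule", "vessel"]
priors = {"outlier": 0.90, "nodule": 0.02, "vessel": 0.08}   # P(M_i | x), assumed
# p(D | M_i, x) for one voxel's regional response D (illustrative values)
likelihoods = {"outlier": 0.05, "nodule": 3.0, "vessel": 0.4}

evidence = sum(likelihoods[m] * priors[m] for m in models)   # p(D | x)
scores = {m: likelihoods[m] * priors[m] / evidence for m in models}  # Bayes' law
label = max(scores, key=scores.get)                          # maximum-evidence rule
```

Note that even with a strong outlier prior, a sufficiently high nodule likelihood dominates the posterior, which is the behavior the argmax rule relies on.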
  • the second path is to assume that each datum D j is independent of every other distinct datum in the set D given a choice of model M i and its parameters m i, again followed by marginalization over m i ∈ M i, i.e.,
  • P(M_i \mid D, x) = \frac{P(M_i \mid x)}{p(D \mid x)} \int_{\mathcal{M}_i} \prod_j p(D_j \mid m_i, M_i, x) \, p(m_i \mid M_i, x) \, dm_i \qquad \text{Equation (2)}
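Equation (2) can be evaluated numerically by gridding the model parameter and exploiting the independence assumption across the data. The Gaussian per-datum likelihood, the parameter grid, and the priors over assumed curvature ranges are all illustrative choices, not taken from the patent:

```python
import numpy as np

def model_evidence(data, grid, prior, sigma=0.1):
    """Marginal of Equation (2), up to the P(M_i|x)/p(D|x) prefactor:
    integral over m_i of prod_j p(D_j | m_i, M_i, x) * p(m_i | M_i, x) dm_i,
    with an assumed Gaussian per-datum likelihood and a gridded parameter."""
    lik = np.exp(-0.5 * ((data[:, None] - grid[None, :]) / sigma) ** 2) \
          / (sigma * np.sqrt(2.0 * np.pi))                  # p(D_j | m_i)
    integrand = lik.prod(axis=0) * prior                    # independence across j
    return float((integrand * (grid[1] - grid[0])).sum())   # marginalize over m_i

grid = np.linspace(0.0, 2.0, 401)
# illustrative parameter priors p(m_i | M_i, x): each model expects a curvature range
prior_nodule = np.where((grid > 0.3) & (grid < 0.7), 1.0 / 0.4, 0.0)
prior_vessel = np.where((grid > 1.0) & (grid < 1.6), 1.0 / 0.6, 0.0)

curvatures = np.array([0.48, 0.52, 0.50])                   # observed response D
ev_nodule = model_evidence(curvatures, grid, prior_nodule)
ev_vessel = model_evidence(curvatures, grid, prior_vessel)
```

Because the observed curvatures fall inside the nodule model's assumed parameter range, its marginal evidence dominates, which is exactly how the integral in Equation (2) discriminates between competing shape models.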
  • the third path is not to assume any independence but to marginalize over m i ∈ M i, i.e.,
  • a parametric model M i and the datum D j ∈ D for the set D associated with the voxel ‘x’ are defined.
  • an outlier model M 1 , a nodule model M 2 , a vessel model M 3 , a pleural or wall nodule model M 4 , a pleural or wall ridge model M 5 , and a junction model M 6 are considered.
  • the last five models, jointly referred to as anatomical models, are representative of the structures that can be found in lungs.
  • the first model is a catch-all for all structures that do not correspond to any one of the anatomical models.
  • the first model is used to account for variability in the anatomical models.
  • the regional response comprises curvature data of the voxel.
  • the manner in which the curvature data is computed is described below.
  • the principal curvatures ‘K’ of the isosurface are given by the following equation:
  • the curvature data thus obtained is compared to the curvature data of the nodule model M 2 to estimate the likelihood of the voxel to be labeled a nodule.
  • the manner in which nodule model is generated is described below.
  • the joint cumulative probability distribution function of the model parameters is computed from the expression for the fractional volume of the solid obtained from the ellipsoid M 2 when only a subset of its domain is considered.
  • the models for the vessel M 3 , a pleural or wall nodule M 4 , a pleural or wall ridge M 5 , and a junction M 6 can be generated using the same technique described above.
  • the region score generated by equation (4) can be compared with the probability density of the nodule model as described in equation (5) to determine the probability of the region being a nodule.
  • the computer 36 may use the computer aided diagnosis (CAD) unit 50 to identify regions of interest in the image.
  • the computer processor is configured to process the data corresponding to the regions of interest to produce a region score.
  • the region score is then compared to the various models that can be generated using the techniques described above to identify and label the anatomical structures in the image.
  • the above-described invention has several advantages, including robust and accurate identification and labeling of regions, because the computation takes into consideration a neighborhood of voxels around the region and probabilistic models computed from various geometric shapes.

Abstract

A method and system for visualizing regions in an image is provided. The method comprises computing a regional response around a region in the image, deriving a region score based on the regional response for the region and labeling the region in the image by comparing the region score to a plurality of probabilistic models.

Description

    CROSS REFERENCE TO RELATED APPLICATIONS
  • This application is related to Provisional Application U.S. Ser. No. 60/847,777, entitled “Local Anatomical Signatures”, filed Sep. 28, 2006, the contents of which are herein incorporated by reference and the benefit of priority to which is claimed under 35 U.S.C. 119(e).
  • STATEMENT REGARDING FEDERALLY SPONSORED RESEARCH & DEVELOPMENT
  • The US Government may have certain rights in this invention pursuant to subcontract number 1-0378 under prime contract number W81XWH-1-0378 awarded by the Department of Defense.
  • BACKGROUND
  • The invention relates generally to methods and an apparatus for identifying regions in images, and more particularly to methods and apparatus for labeling anatomical structures in medical images.
  • In many imaging applications, specifically medical imaging applications, the images are often analyzed to identify various anatomical structures such as organs, lesions, etc. For example, a chest X-ray radiograph or a computed tomography (CT) image can be employed to facilitate the detection of lung cancer. Specifically, CT images advantageously provide a description of anatomy in great detail and consequently are increasingly used for detecting and following the evolution of lesions that may lead to potential cancers.
  • For example, for the detection of lung and colon cancer, radiologists search for the presence of nodules and polyps in the lung and colon using advanced lung analysis (ALA) and computed tomographic colonography (CTC) techniques. Radiologists detect nodules in the lung by viewing axial slices of the chest. However, CT systems generally provide several images for a single CT scan. Consequently, a considerable amount of information is presented to the radiologist for use in interpreting the images and detecting suspect regions that may indicate disease. The considerable amount of data associated with a single CT scan makes interpretation a time-consuming process for the radiologist. Furthermore, this substantial amount of data may disadvantageously lead to missed cancer detection, as it is difficult to identify a suspicious area in an extensive amount of data. In addition, the sheer size of the CT volumes results in significant variability in radiological readings, and clinically important nodules may be missed.
  • Techniques variously described as computer aided detection, or computer assisted detection or computer assisted diagnosis, and often referred to by the acronym “CAD” have emerged as a viable approach for aiding the radiologists in the detection of lung nodules in chest radiographs and thoracic CT scans, as well as for detecting and diagnosing other anatomies and disease states.
  • Several CAD techniques have been developed to highlight the anatomical structures present in various regions in the image. In one specific technique, the regions are identified and labeled according to the local shape of their surrounding structures. In one more specific technique used to identify lung cancer, an Eigen analysis of a Hessian matrix is used as a tool to classify voxels as belonging to a vessel or a nodule. However, such techniques consider a very small neighborhood of each voxel needed to compute image derivatives and thus are not very robust and/or accurate in identifying the structures.
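The Hessian eigen-analysis mentioned above can be illustrated with a toy decision rule of the kind used in the vesselness literature (e.g. Frangi-style measures); the eigenvalue-ratio thresholds and the input eigenvalues are illustrative assumptions, not the patent's method:

```python
def classify_from_hessian_eigs(eigs, tol=0.25):
    """Toy voxel classifier from Hessian eigenvalues, sorted by magnitude.

    For bright structures on a dark background, blobs (nodules) have three
    similar large negative eigenvalues; tubes (vessels) have two large
    negative eigenvalues and one near zero.
    """
    l1, l2, l3 = sorted(eigs, key=abs)           # |l1| <= |l2| <= |l3|
    if l3 >= 0:
        return "other"
    if abs(l1) > tol * abs(l3):                  # all three comparable -> blob
        return "nodule"
    if abs(l2) > tol * abs(l3):                  # two comparable, one small -> tube
        return "vessel"
    return "other"
```

This kind of rule depends only on derivatives at a single voxel, which is precisely the small-neighborhood limitation the patent's region-based scoring is designed to overcome.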
  • It may therefore be desirable to develop a robust technique and system for processing image data that advantageously facilitates substantially superior initial shape-based identification of regions in an image that can be consequently used for the analysis of the object being examined.
  • BRIEF DESCRIPTION
  • Briefly, according to one embodiment of the invention, a method for assigning labels to regions in an image is provided. The method comprises deriving a probabilistic model for a plurality of geometrical structures, computing a regional response around a region in the image, computing a region score for each geometrical structure using the plurality of probabilistic models and labeling the region in the image based on the region score.
  • In another embodiment, a medical imaging system for labeling anatomical structures in an image is provided. The system comprises an image processor configured to compute a regional response around a voxel of interest in the image, compute a voxel score for each anatomical structure based on a plurality of probabilistic models, and label the voxel in the image based on the voxel score; and a display unit configured to display the image including the labeled anatomical regions.
  • In another embodiment, a computed tomography (CT) system for labeling anatomical structures in a CT image is provided. The system comprises an image processor configured to compute a regional response around a voxel of interest in the CT image, compute a voxel score for each anatomical structure based on a plurality of probabilistic models, and label the voxel in the image based on the voxel score. The system further comprises a display unit configured to display the CT image including the labeled anatomical structures.
  • In another embodiment, a computer-readable medium storing computer instructions for instructing a computer system to label regions in an image is provided. The computer instructions include deriving a probabilistic model for each of a plurality of geometrical structures, computing a regional response around a region in the image, computing a region score for each geometrical structure using the plurality of probabilistic models, and labeling the region in the image based on the region score.
  • DRAWINGS
  • These and other features, aspects, and advantages of the present invention will become better understood when the following detailed description is read with reference to the accompanying drawings in which like characters represent like parts throughout the drawings, wherein:
  • FIG. 1 is a diagrammatical view of an exemplary imaging system in the form of a CT imaging system for use in producing processed images and for analyzing the images and their underlying image data in accordance with aspects of the present technique;
  • FIG. 2 is a diagrammatical view of a physical implementation of the CT system of FIG. 1; and
  • FIG. 3 is a flow chart illustrating exemplary steps in logic for carrying out image data processing based upon CAD analysis of acquired image data, in accordance with aspects of the present technique.
  • DETAILED DESCRIPTION
  • FIG. 1 is a block diagram showing an imaging system 10 for acquiring and processing image data such as medical image data in accordance with the present technique. However, as will be appreciated by one skilled in the art, image data may also include seismic image data or topological image data. In the illustrated embodiment, the system 10 is a computed tomography (CT) system designed to acquire X-ray projection data, to reconstruct the projection data into an image, and to process the image data for display and analysis in accordance with the present technique. In the embodiment illustrated in FIG. 1, the imaging system 10 includes a source of X-ray radiation 12. In one exemplary embodiment, the source of X-ray radiation 12 is an X-ray tube. In other embodiments, the source of X-ray radiation 12 may be one or more solid-state X-ray emitters or, indeed, any other emitter capable of generating X-rays having a spectrum and energy useful for imaging a desired object.
  • The source of radiation 12 may be positioned near a collimator 14, which may be configured to shape a stream of radiation 16 that is emitted by the source of radiation 12. The stream of radiation 16 passes into the imaging volume containing the subject to be imaged, such as a human patient 18. The stream of radiation 16 may be generally fan-shaped or cone-shaped, depending on the configuration of the detector array, discussed below, as well as the desired method of data acquisition. A portion 20 of radiation passes through or around the subject and impacts a detector array, represented generally at reference numeral 22. Detector elements of the array produce electrical signals that represent the intensity of the incident X-ray beam. These signals are acquired and processed to reconstruct an image of the features within the subject.
  • The radiation source 12 is controlled by a system controller 24, which furnishes both power, and control signals for CT examination sequences. Moreover, the detector 22 is coupled to the system controller 24, which commands acquisition of the signals generated in the detector 22. The system controller 24 may also execute various signal processing and filtration functions, such as for initial adjustment of dynamic ranges, interleaving of digital image data, and so forth. In general, system controller 24 commands operation of the imaging system to execute examination protocols and to process acquired data. In the present context, system controller 24 also includes signal processing circuitry, typically based upon a general purpose or application-specific digital computer, associated memory circuitry for storing programs and routines executed by the computer, as well as configuration parameters and image data, interface circuits, and so forth.
  • In the embodiment illustrated in FIG. 1, the system controller 24 is coupled via a motor controller 32 to a rotational subsystem 26 and a linear positioning subsystem 28. In one embodiment, the rotational subsystem 26 enables the X-ray source 12, the collimator 14 and the detector 22 to be rotated one or multiple turns around the patient 18. In other embodiments, the rotational subsystem 26 may rotate only one of the source 12 or the detector 22 or may differentially activate various X-ray emitters and/or detector elements arranged in a ring about the imaging volume. In embodiments in which the source 12 and/or detector 22 are rotated, the rotational subsystem 26 may include a gantry. Thus, the system controller 24 may be utilized to operate the gantry. The linear positioning subsystem 28 enables the patient 18, or more specifically a patient table, to be displaced linearly. Thus, the patient table may be linearly moved within the gantry to generate images of particular areas of the patient 18.
  • Additionally, as will be appreciated by those skilled in the art, the source of radiation 12 may be controlled by an X-ray controller 30 disposed within the system controller 24. Particularly, the X-ray controller 30 is configured to provide power and timing signals to the X-ray source 12.
  • Further, the system controller 24 is also illustrated comprising a data acquisition system 34. In this exemplary embodiment, the detector 22 is coupled to the system controller 24, and more particularly to the data acquisition system 34. The data acquisition system 34 receives data collected by readout electronics of the detector 22. The data acquisition system 34 typically receives sampled analog signals from the detector 22 and converts the data to digital signals for subsequent processing by a computer 36.
  • The computer 36 typically is coupled to or incorporates the system controller 24. The data collected by the data acquisition system 34 may be transmitted to the computer 36 for subsequent processing and reconstruction. The computer 36 may include or communicate with a memory 38 that can store data processed by the computer 36 or data to be processed by the computer 36. It should be understood that any type of memory configured to store a large amount of data might be utilized by such an exemplary system 10. Moreover, the memory 38 may be located at the acquisition system or may include remote components, such as network accessible memory media, for storing data, processing parameters, and/or routines for implementing the techniques described below.
  • The computer 36 may also be adapted to control features such as scanning operations and data acquisition that may be enabled by the system controller 24. Furthermore, the computer 36 may be configured to receive commands and scanning parameters from an operator via an operator workstation 40, which is typically equipped with a keyboard and other input devices (not shown). An operator may thereby control the system 10 via the input devices. Thus, the operator may observe the reconstructed image and other data relevant to the system from computer 36, initiate imaging, and so forth.
  • A display 42 coupled to the operator workstation 40 may be utilized to observe the reconstructed image. Additionally, the scanned image may also be printed by a printer 44, which may be coupled to the operator workstation 40. The display 42 and printer 44 may also be connected to the computer 36, either directly or via the operator workstation 40. The operator workstation 40 may also be coupled to a picture archiving and communications system (PACS) 46. It should be noted that PACS 46 might be coupled to a remote system 48, radiology department information system (RIS), hospital information system (HIS) or to an internal or external network, so that others at different locations may gain access to the image data. Additionally, a computer aided diagnosis (CAD) unit 50 may be operably coupled to the computer 36. It may be noted that the CAD unit 50 may include software configured to apply a CAD algorithm to the image data. Further, the CAD unit 50 may also be coupled to the display 42, where the image data may be displayed on the display. In practice, the CAD unit 50 may be part of the data acquisition system 34, or may be a completely separate component, typically remote from the data acquisition system 34, and configured to analyze image data stored on a memory, such as the PACS 46.
  • It should be further noted that the computer 36 and operator workstation 40 may be coupled to other output devices, which may include standard or special purpose computer monitors and associated processing circuitry. One or more operator workstations 40 may be further linked in the system for outputting system parameters, requesting examinations, viewing images, and so forth. In general, displays, printers, workstations, and similar devices supplied within the system may be local to the data acquisition components, or may be remote from these components, such as elsewhere within an institution or hospital, or in an entirely different location, linked to the image acquisition system via one or more configurable networks, such as the Internet, a virtual private network or the like.
  • As noted above, an exemplary imaging system utilized in a present embodiment may be a CT scanning system 52, as depicted in greater detail in FIG. 2. The CT scanning system 52 may be a multi-slice detector CT (MDCT) system that offers a wide array of axial coverage, high gantry rotational speed, and high spatial resolution. Alternately, the CT scanning system 52 may be a volumetric CT (VCT) system utilizing a cone-beam geometry and an area detector to allow the imaging of a volume, such as an entire internal organ of a subject, at high or low gantry rotational speeds. The CT scanning system 52 is illustrated with a frame 54 and a gantry 56 that has an aperture 58 through which a patient 18 may be moved. A patient table 60 may be positioned in the aperture 58 of the frame 54 and the gantry 56 to facilitate movement of the patient 18, typically via linear displacement of the table 60 by the linear positioning subsystem 28 (see FIG. 1). The gantry 56 is illustrated with the source of radiation 12, such as an X-ray tube that emits X-ray radiation from a focal point 62. For cardiac imaging, the stream of radiation is directed towards a cross section of the patient 18 including the heart.
  • In typical operation, the X-ray source 12 projects an X-ray beam from the focal point 62 and toward detector array 22 (see FIG. 1). The collimator 14 (see FIG. 1), such as lead or tungsten shutters, typically defines the size and shape of the X-ray beam that emerges from the X-ray source 12. The detector 22 is generally formed by a plurality of detector elements, which detect the X-rays that pass through and around a subject of interest, such as the heart or chest. Each detector element produces an electrical signal that represents the intensity of the X-ray beam at the position of the element at the time the beam strikes the detector 22. The gantry 56 is rotated around the subject of interest so that a plurality of radiographic views may be collected by the computer 36.
  • Thus, as the X-ray source 12 and the detector 22 rotate, the detector 22 collects data related to the attenuated X-ray beams. Data collected from the detector 22 then undergoes pre-processing and calibration to condition the data to represent the line integrals of the attenuation coefficients of the scanned objects. The processed data, commonly called projections, may then be filtered and backprojected to formulate an image of the scanned area. A formulated image may incorporate, in certain modes, less or more than 360 degrees of projection data. Once reconstructed, the image produced by the system of FIGS. 1 and 2 reveals internal features 66 of the patient 18.
  • The reconstructed image may be further analyzed using a CAD algorithm to enable a radiologist examining the image to identify certain anatomical structures in the image. For example, the CAD algorithm can be used to label lung nodules in CT images of the lungs and polyps in images of the colon, identify lesions in liver images, identify aneurysms in neural and cardiac images, detect pulmonary embolisms in lung images, and perform vessel tree extraction and junction/branch detection. The CAD algorithm is configured to identify several regions of interest in the image and process the regions to compute a corresponding region score. The region score is then used to label the anatomical structures for further diagnosis. The manner in which the algorithm is applied to the image is described in further detail in FIG. 3.
  • FIG. 3 represents a flow chart of exemplary steps in carrying out a processing routine based upon CAD analysis. The technique summarized in FIG. 3 begins at step 68 where image data may be acquired. In a CT system, for example, the image data acquisition of step 68 is typically initiated by an operator interfacing with the system via the operator workstation 40 (see FIG. 1). Readout electronics detect signals generated by virtue of the impact of radiation on the scanner detector, and the system processes these signals to produce useful image data. However, as will be appreciated by one skilled in the art, image data may also be accessed from image acquisition devices, such as, but not limited to, a magnetic resonance imaging (MRI) system, a positron emission tomography (PET) system, a single photon emission computed tomography (SPECT) system, or a digital radiography system. In addition, while the image acquisition devices mentioned hereinabove may be used to directly acquire image data from a patient 18 (see FIG. 1), image data may instead include data from an archive site or data storage facility.
  • After acquiring the image, a region in the acquired image is identified. In one embodiment, the region includes a pixel of interest or a voxel of interest. The region may comprise several voxels or pixels. In one embodiment, each voxel in the image is thresholded to identify the region. A region usually represents an anatomical structure such as a lung nodule, a vessel, etc.
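The threshold-and-group step described above can be sketched as follows. This is a minimal illustration, not part of the disclosure: the threshold value, the 6-connectivity rule, and the toy volume are illustrative assumptions.

```python
import numpy as np
from collections import deque

def identify_regions(volume, threshold):
    """Threshold each voxel, then group surviving voxels into
    6-connected components; each component is a candidate region."""
    mask = volume > threshold
    labels = np.zeros(volume.shape, dtype=int)
    current = 0
    for seed in zip(*np.nonzero(mask)):
        if labels[seed]:
            continue
        current += 1                      # start a new region
        queue = deque([seed])
        labels[seed] = current
        while queue:                      # breadth-first flood fill
            x, y, z = queue.popleft()
            for dx, dy, dz in ((1, 0, 0), (-1, 0, 0), (0, 1, 0),
                               (0, -1, 0), (0, 0, 1), (0, 0, -1)):
                nb = (x + dx, y + dy, z + dz)
                if all(0 <= nb[i] < volume.shape[i] for i in range(3)) \
                        and mask[nb] and not labels[nb]:
                    labels[nb] = current
                    queue.append(nb)
    return labels, current

# Toy volume with two bright structures on a dark background.
vol = np.zeros((8, 8, 8))
vol[1:3, 1:3, 1:3] = 100.0
vol[5:7, 5:7, 5:7] = 120.0
labels, n = identify_regions(vol, threshold=50.0)
# n == 2: two disjoint candidate regions of interest
```

Each labeled component can then be passed on to the regional-response and scoring steps of FIG. 3.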
  • In step 72, a regional response of the region is computed. In one embodiment, the regional responses of a non-linear filter are computed for the voxel of interest. The regional response is defined as a response computed for the region and/or a specified neighborhood of voxels around the region. In one embodiment, the size of the voxel neighborhood can be set based on the size of the structures of interest to be identified. A particular embodiment uses the principal curvatures of isosurfaces at each voxel as the regional response.
  • In step 74, a region score is computed for the identified region using a plurality of probabilistic models. The region score represents a value that indicates a likelihood of the region belonging to a specific anatomical structure such as a vessel or a nodule. In one embodiment, the region score comprises the probability of observing the given curvature data for the region due to a specific anatomical structure. In another embodiment, the region score comprises a function of the intensity data for the region.
  • In step 76, the region is labeled by using the region score. In a further embodiment, the labeled regions may be displayed to a radiologist with different colors for different anatomical structures for better visualization of the image.
  • The probabilistic models are derived by approximating or modeling an anatomical structure through a geometric shape or a combination of geometric shapes. The anatomical structures are then represented using model parameters. The distribution of the response is derived as a function of the model parameters. Each model has its parameters and these parameters have a range of values that they can assume.
  • In a further embodiment, a probability distribution describing the likelihood that a particular parameter assumes a particular value is also derived. For deriving such a distribution, geometric and physical constraints are applied along with the incorporation of existing medical knowledge. In a more specific embodiment, the probability distribution of responses is computed for specific shape models that are absent and/or for noisy images.
  • The technique described in FIG. 3 can be applied to various medical images. One specific application of the present technique is identifying nodules and vessels in CT images of the lungs used for the detection of lung cancer. An example computation of a region score and the subsequent labeling of the image are described in detail below.
  • In one embodiment, a Bayesian framework is used for the computation of the region score. Let M = {Mi, i = 1, . . . , N} be a set of parametric models, each with a parameter mi in the domain Mi, that is, mi ∈ Mi. One of the Mi, representing each possible label, is to be attached to data ‘D’ associated with a voxel ‘x’. In a Bayesian formulation, the region score for each model Mi ∈ M can be computed by using Bayes' law:
  • p(Mi | D, x) = p(D | Mi, x) P(Mi | x) / p(D | x)
  • The label for the voxel ‘x’ can then be assigned through some function that compares the region score for the different models. In one embodiment, the label assigned is the one corresponding to the model with the maximum evidence, i.e.,
  • M* = arg max_{Mi} p(Mi | D, x)
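This Bayes-law scoring and maximum-evidence labeling rule can be illustrated numerically. The sketch below is not part of the disclosure: the three models (outlier, nodule, vessel), their likelihood values, and the uniform priors are made-up assumptions.

```python
import numpy as np

def region_scores(likelihoods, priors):
    """Bayes' law: p(Mi | D, x) = p(D | Mi, x) P(Mi | x) / p(D | x),
    with the evidence p(D | x) obtained by normalization."""
    unnorm = np.asarray(likelihoods, dtype=float) * np.asarray(priors, dtype=float)
    return unnorm / unnorm.sum()

# Hypothetical likelihood values p(D | Mi, x) at one voxel for three
# models -- outlier, nodule, vessel -- under uniform priors P(Mi | x).
scores = region_scores([0.01, 0.30, 0.05], [1/3, 1/3, 1/3])
label = int(np.argmax(scores))   # model with maximum evidence, M*
# label == 1: the nodule model wins at this voxel.
```

In practice the likelihood values come from the shape-model densities derived later in the description; the argmax is exactly the M* selection rule above.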
  • Assuming D to be a set of individual data Dj, i.e., D = {Dj, j = 1, . . . , M}, three alternative paths can be followed. The first path is to assume that each datum Dj is independent of every other distinct datum in the set D given a choice of model Mi, followed by marginalization over mi ∈ Mi, i.e.,
  • P(Mi | D, x) = [P(Mi | x) / p(D | x)] × ∏_j ∫_{Mi} p(Dj | mi, Mi, x) p(mi | Mi, x) dmi   Equation (1)
  • The second path is to assume that each datum Dj is independent of every other distinct datum in the set D given a choice of model Mi and its parameters mi, again followed by marginalization over mi ε Mi, i.e.,
  • P(Mi | D, x) = [P(Mi | x) / p(D | x)] × ∫_{Mi} ∏_j p(Dj | mi, Mi, x) p(mi | Mi, x) dmi   Equation (2)
  • The third path is not to assume any independence but to marginalize over mi ∈ Mi, i.e.,
  • P(Mi | D, x) = [P(Mi | x) / p(D | x)] × ∫_{Mi} p(D | mi, Mi, x) p(mi | Mi, x) dmi   Equation (3)
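The difference between the first two paths can be seen in a toy one-dimensional computation. The sketch below is illustrative only: the Gaussian likelihood, uniform prior, grid marginalization, and data values are all assumptions, not part of the disclosure. (When p(D | mi, Mi, x) factors over the Dj, equation (3) reduces to equation (2).)

```python
import numpy as np

# Toy one-dimensional setting: one model with a single parameter m,
# Gaussian per-datum likelihood, uniform prior on [-3, 3], and a
# grid approximation of the marginalization integrals.
m = np.linspace(-3.0, 3.0, 601)
dm = m[1] - m[0]
prior = np.full_like(m, 1.0 / 6.0)
data = np.array([0.4, -0.2, 0.1])

def lik(d, m, sigma=1.0):
    return np.exp(-0.5 * ((d - m) / sigma) ** 2) / (sigma * np.sqrt(2.0 * np.pi))

# Equation (1): data independent given the model only; marginalize per datum.
path1 = np.prod([np.sum(lik(d, m) * prior) * dm for d in data])

# Equation (2): data independent given model and parameter; marginalize once.
path2 = np.sum(np.prod([lik(d, m) for d in data], axis=0) * prior) * dm
```

Here path2 exceeds path1: because the three data values are close together, equation (2), which lets all data share a single parameter value, assigns the model more evidence than equation (1), which marginalizes the parameter separately for each datum.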
  • To compute the likelihood ‘p(Dj|mi, Mi, x)’ in equation (1), a parametric model Mi and the datum Dj ∈ D for the set D associated with the voxel ‘x’ are defined. In one embodiment, an outlier model M1, a nodule model M2, a vessel model M3, a pleural or wall nodule model M4, a pleural or wall ridge model M5, and a junction model M6 are considered. The last five models, jointly referred to as anatomical models, are representative of the structures that can be found in lungs. The first model is a catch-all for structures that do not correspond to any one of the anatomical models and is used to account for variability in the anatomical models.
  • As described in FIG. 3, in one embodiment the regional response comprises curvature data of the voxel. The manner in which the curvature data is computed is described below. A volume image ‘I’ is defined as a twice-differentiable mapping from a compact sub-domain of R^3 into R. For any given I0, I(x) = I0 defines an isosurface at points ‘x’ where ∇I(x) ≠ 0. The principal curvatures ‘K’ of the isosurface are given by the following equation:
  • K = min/max_{‖v‖ = 1} [ −v^T Z^T H Z v / ‖∇I‖ ]   Equation (4)
  • where H is the Hessian of I and the orthonormal columns of the 3×2 matrix Z span the null space of ∇I.
  • The matrix C = −Z^T H Z / ‖∇I‖ represents the curvature data of the volume image at voxel ‘x’. The curvature data thus obtained is compared to the curvature data of the nodule model M2 to estimate the likelihood that the voxel should be labeled a nodule. The manner in which the nodule model is generated is described below.
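Under the definitions above, the principal curvatures at a voxel are the eigenvalues of C. The sketch below is a minimal numerical illustration, not part of the disclosure: finite differences stand in for the derivatives, an SVD supplies the null-space basis Z, and the synthetic spherical volume and unit grid spacing are assumptions chosen so the answer is known.

```python
import numpy as np

def principal_curvatures(I, voxel, spacing=1.0):
    """Principal curvatures of the isosurface through `voxel`: the
    eigenvalues of C = -Z^T H Z / ||grad I||, where H is the Hessian of I
    and the orthonormal columns of Z span the null space of grad I."""
    grads = np.gradient(I, spacing)              # central differences
    g = np.array([d[voxel] for d in grads])      # gradient at the voxel
    H = np.empty((3, 3))                         # Hessian at the voxel
    for i in range(3):
        second = np.gradient(grads[i], spacing)
        for j in range(3):
            H[i, j] = second[j][voxel]
    _, _, Vt = np.linalg.svd(g.reshape(1, 3))    # rows 1,2 of Vt are normal to g
    Z = Vt[1:].T                                 # 3x2 null-space basis
    C = -Z.T @ H @ Z / np.linalg.norm(g)
    return np.sort(np.linalg.eigvalsh((C + C.T) / 2.0))

# Synthetic volume whose isosurfaces are spheres centred at (16, 16, 16).
xs = np.arange(32, dtype=float)
X, Y, W = np.meshgrid(xs, xs, xs, indexing="ij")
vol = (X - 16) ** 2 + (Y - 16) ** 2 + (W - 16) ** 2
k1, k2 = principal_curvatures(vol, (26, 16, 16))
# Sphere of radius 10: both principal curvatures equal -1/10 here.
```

For a sphere the two principal curvatures coincide, which is the signature the nodule model looks for; a tubular (vessel-like) structure would instead give one curvature near zero.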
  • Typically, the nodule model M2 chosen to represent a nodule is a solid ellipsoid with similar concentric ellipsoidal isosurfaces such that the outermost isosurface is an ellipsoid with semi-axes a = b ≤ c, where a, b and c are the lengths of the three semi-axes, i.e.,
  • M2: Π × Θ × Φ → R^3, (ρ, θ, φ) ↦ (a ρ cos θ cos φ, a ρ sin θ cos φ, c ρ sin φ)
  • where Π = [0, 1], Θ = [0, 2π), Φ = [−π/2, π/2], and each choice of ρ ∈ Π defines a different isosurface.
  • Let ‘x’ be a point in the range R2 ⊂ R^3 of M2, randomly chosen according to a uniform distribution on R2, i.e., the probability that ‘x’ will be inside a subregion of M2 depends only on the volume measure of the subregion. Consider the random variables Π2 and Φ2 given by:
  • Π2: R2 → R, x ↦ ρ(x) ∈ Π
  • Φ2: R2 → R, x ↦ φ(x) ∈ Φ
  • The joint cumulative probability distribution function of Π2 and Φ2, PΠ2,Φ2(ρ, φ), is computed from the expression for the fractional volume of the solid obtained from the ellipsoid M2 when only the subset (0, ρ) × Θ × (−π/2, φ) of its domain is considered, and is given by:
  • PΠ2,Φ2(ρ, φ) = ρ^3 (sin φ + 1) / 2
  • Therefore, the joint probability density pΠ2,Φ2(ρ, φ) of Π2 and Φ2 is:
  • pΠ2,Φ2(ρ, φ) = (3 ρ^2 cos φ / 2) I_Π(ρ) I_Φ(φ)
  • where I_Π and I_Φ denote the indicator functions of Π and Φ.
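This closed form can be checked by Monte Carlo simulation. The sketch below is an illustrative verification, not part of the disclosure: it samples points uniformly in the unit ball (whose linear image under (x, y, z) ↦ (ax, ay, cz) is the ellipsoid M2, a map that leaves ρ(x) and φ(x) unchanged) and compares the empirical joint CDF with ρ^3 (sin φ + 1) / 2; the sample size, evaluation point, and tolerance are assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

# Uniform samples in the unit ball; scaling by (a, a, c) would carry them
# to uniform samples in the ellipsoid without changing rho(x) or phi(x),
# so the unit ball suffices for checking the distribution.
n = 200_000
u = rng.normal(size=(n, 3))
u /= np.linalg.norm(u, axis=1, keepdims=True)       # uniform on the sphere
u *= rng.uniform(size=(n, 1)) ** (1.0 / 3.0)        # radius with r^2 density
rho = np.linalg.norm(u, axis=1)                     # rho(x) in [0, 1]
phi = np.arcsin(np.clip(u[:, 2] / rho, -1.0, 1.0))  # phi(x) in [-pi/2, pi/2]

# Empirical joint CDF at (rho0, phi0) versus the closed form.
rho0, phi0 = 0.8, 0.3
empirical = np.mean((rho <= rho0) & (phi <= phi0))
closed_form = rho0 ** 3 * (np.sin(phi0) + 1.0) / 2.0
```

With this sample size the empirical and closed-form values agree to roughly Monte Carlo accuracy (a few parts in a thousand).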
  • Now consider the random vector K: R2 → R^2, x ↦ κ(x) = (k1, k2). Using standard results on transformations of random variables and allowing a probability α2 of an outlier, the joint probability density pK2(K | a, c) of k1 and k2 is
  • pK2(K | a, c) = α2 pK1(κ | σ) + [3 (1 − α2) c^3 / (2 a^4 k2^{9/2} √((c^2 − a^2)(c^2 k1 − a^2 k2)))] I_{K2a,c}(K)   Equation (5)
  • where K2a,c is the set K2a,c = {K ∈ R^2 : a/c^2 ≤ k1, and max(k1, (c^2 k1 / a^4)^{1/3}) ≤ k2 < (c^2 / a^2) k1}.
  • The models for the vessel M3, the pleural or wall nodule M4, the pleural or wall ridge M5, and the junction M6 can be generated using the same technique described above. The curvature data computed from equation (4) can be evaluated under the probability density of the nodule model given in equation (5) to determine the probability of the region being a nodule.
  • As can be seen from the techniques described above, the computer 36 may use the computer-aided diagnosis (CAD) unit 50 to identify regions of interest in the image. The computer processor is configured to process the data corresponding to the regions of interest to produce a region score. The region score is then compared to the various models generated using the techniques described above to identify and label the anatomical structures in the image.
  • The above-described invention has several advantages, including robust and accurate identification and labeling of regions, because the computation takes into consideration a neighborhood of voxels around the region and probabilistic models computed from various geometrical shapes.
  • While only certain features of the invention have been illustrated and described herein, many modifications and changes will occur to those skilled in the art. It is, therefore, to be understood that the appended claims are intended to cover all such modifications and changes as fall within the true spirit of the invention.

Claims (33)

1. A method for assigning labels to regions in an image, the method comprising:
deriving a probabilistic model for each of a plurality of geometrical structures;
computing a regional response around a region in the image;
computing a region score for each geometrical structure using the plurality of probabilistic models; and
labeling the region in the image based on the region score.
2. The method of claim 1, wherein the geometrical structures comprise anatomical structures and the labeling comprises labeling the anatomical structures.
3. The method of claim 2, wherein the anatomical structure is one of a nodule, a vessel, a polyp, a tumor, a lesion, a fold, an aneurysm and a pulmonary embolism.
4. The method of claim 1, wherein the deriving the probabilistic models comprises:
modeling the anatomical structures using a plurality of geometric models; and
representing the anatomical structures using model parameters.
5. The method of claim 4, wherein computing the regional response comprises computing the regional response as a function of the geometrical models and the model parameters.
6. The method of claim 5, further comprising deriving a distribution of the regional response as a function of the model parameter.
7. The method of claim 6, wherein the deriving the distribution comprises applying a knowledgebase of anatomical and functional information.
8. The method of claim 6, further comprising deriving a distribution of the regional responses for noisy regions in the image.
9. The method of claim 1, wherein the regional response comprises a response at a pixel of interest or a voxel of interest.
10. The method of claim 1, wherein the regional response comprises a response at a neighborhood around a pixel of interest or a voxel of interest.
11. The method of claim 1, wherein the regional response comprises principal curvatures for the region.
12. The method of claim 1, wherein the regional response comprises a function of the intensity and/or texture data for the region.
13. The method of claim 1, further comprising identifying a set of regions from the image.
14. The method of claim 13, wherein the identifying comprises thresholding each pixel or voxel in the image to identify the regions of interest.
15. The method of claim 1, wherein the image comprises a medical image.
16. The method of claim 1, wherein the image comprises a two-dimensional image, a three-dimensional image, a four-dimensional image, or a five-dimensional image.
17. The method of claim 1, further comprising displaying the labeled regions and assigning a respective color for each label for visualizing the corresponding regions of interest.
18. A medical imaging system for labeling anatomical structures in an image, the system comprising,
an image processor configured to:
compute a regional response around a voxel of interest in the image;
compute a voxel score for each anatomical structure based on a plurality of probabilistic models;
label the voxel of interest in the image based on the voxel score; and
a display unit configured to display the image including the labeled anatomical regions.
19. The imaging system of claim 18, wherein the probabilistic models comprise histograms.
20. The imaging system of claim 19, wherein the histograms include parameters obtained from fixed shapes and a distribution of shapes.
21. The imaging system of claim 18, wherein the regional response comprises a geometric response for the image voxel.
22. The imaging system of claim 18, wherein the regional response comprises an intensity response for the image voxel.
23. The imaging system of claim 18, wherein the anatomical regions include vessels, nodules, polyps, folds, aneurysms, or junctions of vessel trees.
24. The imaging system of claim 18, wherein the imaging system comprises at least one of a computed tomography (CT) system, positron emission tomography (PET) system, a single photon emission computed tomography (SPECT) system, magnetic resonance imaging system, microscopy or a digital radiography system.
25. A computed tomography (CT) system for labeling anatomical structures in a CT image, the system comprising,
an image processor configured to:
compute a regional response around a voxel of interest in the CT image;
compute a voxel score for each anatomical structure based on a plurality of probabilistic models;
label the voxel of interest in the image based on the voxel score; and
a display unit configured to display the CT image including the labeled anatomical structures.
26. The CT system of claim 25, wherein the image processor is configured to develop the plurality of probabilistic models using a distribution of geometrical parameters.
27. The CT system of claim 25, wherein the image processor is configured to label the voxel of interest using probabilistic models for curvature data in a neighborhood of the voxel of interest.
28. A computer-readable medium storing computer instructions for instructing a computer system to label regions in an image, the computer instructions including:
deriving a probabilistic model for each of a plurality of geometrical structures;
computing a regional response around a region in the image;
computing a region score for each geometrical structure using the plurality of probabilistic models; and
labeling the region in the image based on the region score.
29. The system of claim 28, wherein the geometrical structures comprise anatomical structures.
30. The system of claim 29, wherein the deriving the probabilistic models comprises:
modeling the anatomical structures using a plurality of geometric models; and
representing the anatomical structures using model parameters.
31. The system of claim 28, wherein computing the regional response comprises computing the regional response as a function of the geometrical models and the model parameters.
32. The system of claim 30, further comprising deriving a distribution of the regional response as a function of the model parameter, wherein the deriving the distribution comprises applying a knowledgebase of anatomical and functional information.
33. The system of claim 31, further comprising deriving a distribution of the regional responses for noisy regions in the image.
US11/558,715 2006-09-28 2006-11-10 Method and system for identifying regions in an image Active 2033-10-01 US8923577B2 (en)

Priority Applications (3)

Application Number Priority Date Filing Date Title
US11/558,715 US8923577B2 (en) 2006-09-28 2006-11-10 Method and system for identifying regions in an image
JP2007245122A JP5438267B2 (en) 2006-09-28 2007-09-21 Method and system for identifying regions in an image
DE102007046250A DE102007046250A1 (en) 2006-09-28 2007-09-26 Characteristic assigning method for area of image, involves deriving probability model for geometrical structures, which have anatomic structure, and evaluating divisional answer at area in image

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
US84777706P 2006-09-28 2006-09-28
US11/558,715 US8923577B2 (en) 2006-09-28 2006-11-10 Method and system for identifying regions in an image

Publications (2)

Publication Number Publication Date
US20080080770A1 true US20080080770A1 (en) 2008-04-03
US8923577B2 US8923577B2 (en) 2014-12-30

Family

ID=39134710

Family Applications (1)

Application Number Title Priority Date Filing Date
US11/558,715 Active 2033-10-01 US8923577B2 (en) 2006-09-28 2006-11-10 Method and system for identifying regions in an image

Country Status (3)

Country Link
US (1) US8923577B2 (en)
JP (1) JP5438267B2 (en)
DE (1) DE102007046250A1 (en)

Families Citing this family (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP5745947B2 (en) * 2011-06-20 2015-07-08 株式会社日立メディコ Medical image processing apparatus and medical image processing method
KR102049336B1 (en) * 2012-11-30 2019-11-27 삼성전자주식회사 Apparatus and method for computer aided diagnosis
US9576107B2 (en) * 2013-07-09 2017-02-21 Biosense Webster (Israel) Ltd. Model based reconstruction of the heart from sparse samples
US11060433B2 (en) * 2013-09-18 2021-07-13 Advanced Technology Emission Solutions Inc. Retention of wires in an induction heated gaseous emissions treatment unit
US9058692B1 (en) 2014-04-16 2015-06-16 Heartflow, Inc. Systems and methods for image-based object modeling using multiple image acquisitions or reconstructions
WO2019245597A1 (en) * 2018-06-18 2019-12-26 Google Llc Method and system for improving cancer detection using deep learning

Citations (21)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US5255187A (en) * 1990-04-03 1993-10-19 Sorensen Mark C Computer aided medical diagnostic method and apparatus
US5617459A (en) * 1994-07-12 1997-04-01 U.S. Philips Corporation Method of processing images in order automatically to detect key points situated on the contour of an object and device for implementing this method
US5823993A (en) * 1994-02-18 1998-10-20 Lemelson; Jerome H. Computer controlled drug injection system and method
US5917929A (en) * 1996-07-23 1999-06-29 R2 Technology, Inc. User interface for computer aided diagnosis system
US6266435B1 (en) * 1993-09-29 2001-07-24 Shih-Ping Wang Computer-aided diagnosis method and system
US6272233B1 (en) * 1997-08-27 2001-08-07 Fuji Photo Film Co., Ltd. Method and apparatus for detecting prospective abnormal patterns
US6320976B1 (en) * 1999-04-01 2001-11-20 Siemens Corporate Research, Inc. Computer-assisted diagnosis method and system for automatically determining diagnostic saliency of digital images
US6434262B2 (en) * 1993-09-29 2002-08-13 Shih-Ping Wang Computer-aided diagnosis system and method
US20030179021A1 (en) * 2002-03-22 2003-09-25 Siemens Aktiengesellschaft Drive control circuit for a junction field-effect transistor
US20040146193A1 (en) * 2003-01-20 2004-07-29 Fuji Photo Film Co., Ltd. Prospective abnormal shadow detecting system
US20040151356A1 (en) * 2003-01-31 2004-08-05 University Of Chicago Method, system, and computer program product for computer-aided detection of nodules with three dimensional shape enhancement filters
US6817982B2 (en) * 2002-04-19 2004-11-16 Sonosite, Inc. Method, apparatus, and product for accurately determining the intima-media thickness of a blood vessel
US6865300B2 (en) * 2001-10-24 2005-03-08 Nik Multimedia, Inc. User definable image reference points
US20050105788A1 (en) * 2003-11-19 2005-05-19 Matthew William Turek Methods and apparatus for processing image data to aid in detecting disease
US6944330B2 (en) * 2000-09-07 2005-09-13 Siemens Corporate Research, Inc. Interactive computer-aided diagnosis method and system for assisting diagnosis of lung nodules in digital volumetric medical images
US6983063B1 (en) * 2000-06-29 2006-01-03 Siemens Corporate Research, Inc. Computer-aided diagnosis method for aiding diagnosis of three dimensional digital image data
US7024027B1 (en) * 2001-11-13 2006-04-04 Koninklijke Philips Electronics N.V. Method and apparatus for three-dimensional filtering of angiographic volume data
US20060079743A1 (en) * 2004-10-08 2006-04-13 Ferrant Matthieu D Methods and apparatus to facilitate visualization of anatomical shapes
US7058210B2 (en) * 2001-11-20 2006-06-06 General Electric Company Method and system for lung disease detection
US7233191B2 (en) * 2004-05-18 2007-06-19 Richtek Technology Corp. JFET driver circuit and JFET driving method
US7298879B2 (en) * 2002-11-20 2007-11-20 Koninklijke Philips Electronics N.V. Computer-aided detection of lung nodules

Family Cites Families (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JPH07323024A (en) 1994-06-01 1995-12-12 Konica Corp Image diagnosis supporting apparatus
DE69630935T2 (en) * 1995-09-29 2004-11-04 Koninklijke Philips Electronics N.V. Image processing method and apparatus for automatically detecting areas of a predetermined type of cancer in an intensity image
WO2000030021A1 (en) * 1998-11-13 2000-05-25 Arch Development Corporation System for detection of malignancy in pulmonary nodules
JP2004283188A (en) 2003-03-19 2004-10-14 Konica Minolta Holdings Inc Diagnostic imaging support device, diagnostic imaging support method, program and storage medium
JP2005246032A (en) 2004-02-04 2005-09-15 Fuji Photo Film Co Ltd Abnormal shadow detecting method, apparatus, and program
DE102006025374B4 (en) 2006-05-31 2008-03-13 Technische Universität Chemnitz A junction field effect transistor arrangement and method for driving a junction field effect transistor

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
Mendonça et al., "Model-based analysis of local shape for lesion detection in CT scans," MICCAI 2005, LNCS vol. 3749, pp. 688-695, October 2005 *

Cited By (21)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US7450780B2 (en) * 2003-09-08 2008-11-11 Siemens Medical Solutions, Usa, Inc. Similarity measures
US20050152617A1 (en) * 2003-09-08 2005-07-14 Mirada Solutions Limited A British Body Corporate Similarity measures
US20090252394A1 (en) * 2007-02-05 2009-10-08 Siemens Medical Solutions Usa, Inc. Computer Aided Detection of Pulmonary Embolism with Local Characteristic Features in CT Angiography
US8244012B2 (en) * 2007-02-05 2012-08-14 Siemens Medical Solutions Usa, Inc. Computer aided detection of pulmonary embolism with local characteristic features in CT angiography
US20100238170A1 (en) * 2007-10-15 2010-09-23 Koninklijke Philips Electronics N.V. Visualization of temporal data
US8872822B2 (en) * 2007-10-15 2014-10-28 Koninklijke Philips N.V. Visualization of temporal data
US8965071B2 (en) 2008-05-30 2015-02-24 Emory University Assessing tumor response to therapy
US20090296998A1 (en) * 2008-05-30 2009-12-03 Emory University Assessing Tumor Response to Therapy
WO2009155096A3 (en) * 2008-05-30 2010-03-25 Emory University Assessing tumor response to therapy
WO2009155096A2 (en) * 2008-05-30 2009-12-23 Emory University Assessing tumor response to therapy
WO2010034968A1 (en) * 2008-09-29 2010-04-01 Medicsight Plc. Computer-implemented lesion detection method and apparatus
US20100226535A1 (en) * 2009-03-05 2010-09-09 Microsoft Corporation Augmenting a field of view in connection with vision-tracking
WO2010143100A1 (en) 2009-06-10 2010-12-16 Koninklijke Philips Electronics N.V. Visualization apparatus for visualizing an image data set
CN102549623A (en) * 2009-06-10 2012-07-04 皇家飞利浦电子股份有限公司 Visualization apparatus for visualizing an image data set
US9129360B2 (en) 2009-06-10 2015-09-08 Koninklijke Philips N.V. Visualization apparatus for visualizing an image data set
US20110311116A1 (en) * 2010-06-17 2011-12-22 Creighton University System and methods for anatomical structure labeling
US20130144907A1 (en) * 2011-12-06 2013-06-06 Microsoft Corporation Metadata extraction pipeline
US9536044B2 (en) * 2011-12-06 2017-01-03 Microsoft Technology Licensing, Llc Metadata extraction pipeline
CN110692065A (en) * 2017-05-30 2020-01-14 国际商业机器公司 Surface-based object recognition
US11257210B2 (en) 2018-06-25 2022-02-22 The Royal Institution For The Advancement Of Learning / Mcgill University Method and system of performing medical treatment outcome assessment or medical condition diagnostic
US11682115B2 (en) 2019-08-04 2023-06-20 Brainlab Ag Atlas-based location determination of an anatomical region of interest

Also Published As

Publication number Publication date
JP2008080121A (en) 2008-04-10
US8923577B2 (en) 2014-12-30
JP5438267B2 (en) 2014-03-12
DE102007046250A1 (en) 2008-04-03

Similar Documents

Publication Publication Date Title
US8923577B2 (en) Method and system for identifying regions in an image
JP5138910B2 (en) 3D CAD system and method using projected images
US8208707B2 (en) Tissue classification in medical images
US6687329B1 (en) Computer aided acquisition of medical images
US7978886B2 (en) System and method for anatomy based reconstruction
EP1398722A2 (en) Computer aided processing of medical images
US20080144909A1 (en) Analysis of Pulmonary Nodules from Ct Scans Using the Contrast Agent Enhancement as a Function of Distance to the Boundary of the Nodule
US20060210131A1 (en) Tomographic computer aided diagnosis (CAD) with multiple reconstructions
EP3447733B1 (en) Selective image reconstruction
US8774485B2 (en) Systems and methods for performing segmentation and visualization of multivariate medical images
CN111316318B (en) Image feature annotation in diagnostic imaging
JP5048233B2 (en) Method and system for anatomical shape detection in a CAD system
JP7258744B2 (en) spectral computed tomography fingerprinting
US20110122134A1 (en) Image display of a tubular structure
US20160335785A1 (en) Method of repeat computer tomography scanning and system thereof
EP4336452A1 (en) Computer-implemented method for processing spectral computed tomography (spectral ct) data, computer program and spectral ct system
CN117475250A (en) Simulating pathology images based on anatomical structure data

Legal Events

Date Code Title Description
AS Assignment

Owner name: GENERAL ELECTRIC COMPANY, NEW YORK

Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNORS:MENDONCA, PAULO RICARDO;BHOTIKA, RAHUL;TURNER, WESLEY DAVID;AND OTHERS;REEL/FRAME:018508/0933;SIGNING DATES FROM 20061030 TO 20061110


FEPP Fee payment procedure

Free format text: PAYOR NUMBER ASSIGNED (ORIGINAL EVENT CODE: ASPN); ENTITY STATUS OF PATENT OWNER: LARGE ENTITY

STCF Information on status: patent grant

Free format text: PATENTED CASE

AS Assignment

Owner name: US ARMY, SECRETARY OF THE ARMY, MARYLAND

Free format text: CONFIRMATORY LICENSE;ASSIGNOR:GENERAL ELECTRIC GLOBAL RESEARCH;REEL/FRAME:034709/0285

Effective date: 20141204

CC Certificate of correction
MAFP Maintenance fee payment

Free format text: PAYMENT OF MAINTENANCE FEE, 4TH YEAR, LARGE ENTITY (ORIGINAL EVENT CODE: M1551)

Year of fee payment: 4

MAFP Maintenance fee payment

Free format text: PAYMENT OF MAINTENANCE FEE, 8TH YEAR, LARGE ENTITY (ORIGINAL EVENT CODE: M1552); ENTITY STATUS OF PATENT OWNER: LARGE ENTITY

Year of fee payment: 8