US20120134568A1 - Method and apparatus of using probabilistic atlas for feature removal/positioning - Google Patents


Info

Publication number
US20120134568A1
US20120134568A1 (application US 13/367,744)
Authority
US
United States
Prior art keywords
shape
breast
unit
image
mask
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Abandoned
Application number
US13/367,744
Inventor
Daniel Russakoff
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Individual
Original Assignee
Individual
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Individual filed Critical Individual
Priority to US 13/367,744
Publication of US20120134568A1
Status: Abandoned

Classifications

    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06V: IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V10/00: Arrangements for image or video recognition or understanding
    • G06V10/70: Arrangements for image or video recognition or understanding using pattern recognition or machine learning
    • G06V10/74: Image or video pattern matching; proximity measures in feature spaces
    • G06V10/75: Organisation of the matching processes, e.g. simultaneous or sequential comparisons of image or video features; coarse-fine approaches, e.g. multi-scale approaches; using context analysis; selection of dictionaries
    • G06V10/755: Deformable models or variational models, e.g. snakes or active contours
    • G06V10/7553: Deformable models or variational models based on shape, e.g. active shape models [ASM]
    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06V: IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V2201/00: Indexing scheme relating to image or video recognition or understanding
    • G06V2201/03: Recognition of patterns in medical or anatomical images
    • G06V2201/032: Recognition of patterns in medical or anatomical images of protuberances, polyps, nodules, etc.

Definitions

  • the present invention relates to a digital image processing technique, and more particularly to a method and apparatus for processing breast images and using a shape model for feature removal/positioning in breast images.
  • Mammography images are powerful tools used in diagnosis of medical problems of breasts.
  • An important feature in mammography images is the breast shape.
  • Clearly detected breast shapes can be used to identify breast abnormalities, such as skin retraction and skin thickening, which are characteristics of malignancy.
  • Clear breast shapes also facilitate automatic or manual comparative analysis between mammography images.
  • Accurate breast shapes may convey significant information relating to breast deformation, size, and shape evolution.
  • the position of the nipple with respect to the breast can be used to detect breast abnormalities.
  • Knowledge of the mammogram view is also important for analysis of breast images, since the mammogram view sets the direction and geometry of a breast in a mammogram image.
  • breast imaging variations pose challenges for both manual identification and computer-aided analysis of breast shapes.
  • Disclosed embodiments of this application address these and other issues by using methods and apparatuses for feature removal and positioning in breast images based on a shape modeling technique for breasts.
  • the methods and apparatuses also use an atlas for location of features in breasts.
  • the methods and apparatuses automatically determine views of mammograms using a shape modeling technique for breasts.
  • the methods and apparatuses perform automatic breast segmentation, and automatically determine nipple position in breasts.
  • the methods and apparatuses can be used for automatic detection of other features besides nipples in breasts.
  • the methods and apparatuses can be used for feature removal, feature detection, feature positioning, and segmentation for other anatomical parts besides breasts, by using shape modeling techniques for the anatomical parts and atlases for locations of features in the anatomical parts.
  • an image processing method comprises: accessing digital image data representing an image including an object; accessing reference data including a shape model relating to shape variation of objects from a baseline object, the objects and the baseline object being from a class of the object; and removing from the image an element not related to the object, by representing a shape of the object using the shape model.
  • an image processing method comprises: accessing digital image data representing an object; accessing reference data including a shape model relating to shape variation from a baseline object shape; and determining a view of the object, the determining step including performing shape registration for the object and for a mirror object of the object, by representing shapes of the object and of the mirror object using the shape model, to obtain an object registered shape and a mirror object registered shape, and identifying the view by performing a comparative analysis between at least one of the shape of the object, the shape of the mirror object, and the baseline object shape, and at least one of the object registered shape, the mirror object registered shape, and the baseline object shape.
  • an image processing method comprises: accessing digital image data representing an object; accessing reference data including a baseline object including an element, and a shape model relating to shape variation from the baseline object; and determining location of the element in the object, the determining step including generating a correspondence between a geometric part associated with the baseline object and a geometric part associated with the object, by representing a shape of the object using the shape model, to obtain a registered shape, and mapping the element from the baseline object onto the registered shape using the correspondence.
  • an image processing apparatus comprises: an image data input unit for providing digital image data representing an image including an object; a reference data unit for providing reference data including a shape model relating to shape variation of objects from a baseline object, the objects and the baseline object being from a class of the object; and a feature removal unit for removing from the image an element not related to the object, by representing a shape of the object using the shape model.
  • an image processing apparatus comprises: an image data input unit for providing digital image data representing an object; a reference data unit for providing reference data including a shape model relating to shape variation from a baseline object shape; and a view detection unit for determining a view of the object, the view detection unit determining a view by performing shape registration for the object and for a mirror object of the object, by representing shapes of the object and of the mirror object using the shape model, to obtain an object registered shape and a mirror object registered shape, and identifying the view by performing a comparative analysis between at least one of the shape of the object, the shape of the mirror object, and the baseline object shape, and at least one of the object registered shape, the mirror object registered shape, and the baseline object shape.
  • an image processing apparatus comprises: an image data input unit for providing digital image data representing an object; a reference data unit for providing reference data including a baseline object including an element, and a shape model relating to shape variation from the baseline object; and an element detection unit for determining location of the element in the object, the element detection unit determining location by generating a correspondence between a geometric part associated with the baseline object and a geometric part associated with the object, by representing a shape of the object using the shape model, to obtain a registered shape, and mapping the element from the baseline object onto the registered shape using the correspondence.
  • FIG. 1 is a general block diagram of a system including an image processing unit for feature removal/positioning according to an embodiment of the present invention
  • FIG. 2 is a block diagram of an image processing unit for feature removal/positioning according to an embodiment of the present invention
  • FIG. 3 is a flow diagram illustrating operations performed by an image processing unit for feature removal/positioning according to an embodiment of the present invention illustrated in FIG. 2 ;
  • FIG. 4 is a block diagram of an image processing unit for nipple detection according to an embodiment of the present invention illustrated in FIG. 2 ;
  • FIG. 5 is a flow diagram illustrating operations performed by an image operations unit included in an image processing unit for feature removal/positioning according to an embodiment of the present invention illustrated in FIG. 4 ;
  • FIG. 6 is a flow diagram illustrating operations performed by a shape registration unit included in an image processing unit for feature removal/positioning according to an embodiment of the present invention illustrated in FIG. 4 ;
  • FIG. 7 is a flow diagram illustrating exemplary operations performed by a feature removal and positioning unit included in an image processing unit for feature removal/positioning according to an embodiment of the present invention illustrated in FIG. 4 ;
  • FIG. 8A illustrates an exemplary baseline breast atlas shape with identified baseline nipple position for the ML view for a shape model stored in a reference data unit;
  • FIG. 8B illustrates exemplary deformation modes for a shape model stored in a reference data unit
  • FIG. 8C illustrates another set of exemplary deformation modes for a shape model stored in a reference data unit
  • FIG. 8D illustrates exemplary aspects of the operation of calculating a cost function by a shape registration unit for a registered shape according to an embodiment of the present invention illustrated in FIG. 6 ;
  • FIG. 8E illustrates exemplary results of the operation of performing shape registration for breast masks by a shape registration unit according to an embodiment of the present invention illustrated in FIG. 6 ;
  • FIG. 8F illustrates an exemplary ML view probabilistic atlas for probability of cancer in breasts stored in a reference data unit
  • FIG. 8G illustrates an exemplary CC view probabilistic atlas for probability of cancer in breasts stored in a reference data unit
  • FIG. 8H illustrates exemplary aspects of the operation of detecting nipple position for a breast image by an image processing unit for feature removal/positioning according to an embodiment of the present invention illustrated in FIG. 4 ;
  • FIG. 8I illustrates exemplary aspects of the operation of warping a breast mask to an atlas using triangulation by a feature removal and positioning unit according to an embodiment of the present invention illustrated in FIG. 7 ;
  • FIG. 8J illustrates exemplary aspects of the operation of bilinear interpolation according to an embodiment of the present invention illustrated in FIG. 7 ;
  • FIG. 9 is a block diagram of an image processing unit for artifact removal and breast segmentation according to a second embodiment of the present invention illustrated in FIG. 2 ;
  • FIG. 10A illustrates an exemplary output of an image processing unit for artifact removal and breast segmentation according to a second embodiment of the present invention illustrated in FIG. 9 ;
  • FIG. 10B illustrates another exemplary output of an image processing unit for artifact removal and breast segmentation according to a second embodiment of the present invention illustrated in FIG. 9 ;
  • FIG. 11 is a block diagram of an image processing unit for view detection according to a third embodiment of the present invention illustrated in FIG. 2 ;
  • FIG. 12 is a block diagram of an image processing unit for feature removal/positioning including a training system according to a fourth embodiment of the present invention.
  • FIG. 1 is a general block diagram of a system including an image processing unit for feature removal/positioning according to an embodiment of the present invention.
  • the system 100 illustrated in FIG. 1 includes the following components: an image input unit 28 ; an image processing unit 38 ; a display 68 ; an image output unit 58 ; a user input unit 78 ; and a printing unit 48 . Operation of the system 100 in FIG. 1 will become apparent from the following discussion.
  • the image input unit 28 provides digital image data.
  • Digital image data may be medical images such as mammogram images, brain scan images, X-ray images, etc. Digital image data may also be images of non-anatomical objects, images of people, etc.
  • Image input unit 28 may be one or more of any number of devices providing digital image data derived from a radiological film, a diagnostic image, a photographic film, a digital system, etc.
  • Such an input device may be, for example, a scanner for scanning images recorded on a film; a digital camera; a digital mammography machine; a recording medium such as a CD-R, a floppy disk, a USB drive, etc.; a database system which stores images; a network connection; an image processing system that outputs digital data, such as a computer application that processes images; etc.
  • the image processing unit 38 receives digital image data from the image input unit 28 and performs feature removal/positioning in a manner discussed in detail below.
  • a user, e.g., a radiology specialist at a medical facility, may view the output of image processing unit 38 via display 68 and may input commands to the image processing unit 38 via the user input unit 78 .
  • the user input unit 78 includes a keyboard 85 and a mouse 87 , but other conventional input devices could also be used.
  • the image processing unit 38 may perform additional image processing functions in accordance with commands received from the user input unit 78 .
  • the printing unit 48 receives the output of the image processing unit 38 and generates a hard copy of the processed image data.
  • the processed image data may be returned as an image file, e.g., via a portable recording medium or via a network (not shown).
  • the output of image processing unit 38 may also be sent to image output unit 58 that performs further operations on image data for various purposes.
  • the image output unit 58 may be a module that performs further processing of the image data; a database that collects and compares images; a database that stores and uses feature removal/positioning results received from image processing unit 38 ; etc.
  • FIG. 2 is a block diagram of an image processing unit 38 for feature removal/positioning according to an embodiment of the present invention.
  • the image processing unit 38 includes: an image operations unit 128 ; a shape registration unit 138 ; a feature removal and positioning unit 148 ; and a reference data unit 158 .
  • although the various components of FIG. 2 are illustrated as discrete elements, such an illustration is for ease of explanation, and it should be recognized that certain operations of the various components may be performed by the same physical device, e.g., by one or more microprocessors.
  • Image operations unit 128 receives digital image data from image input unit 28 .
  • Digital image data can be medical images, which may be obtained through medical imaging.
  • Digital image data may be mammography images, brain scan images, chest X-ray images, etc.
  • Digital image data may also be images of non-anatomical objects, images of people, etc.
  • operation of image processing unit 38 will next be described in the context of mammography images, for feature removal/positioning using a probabilistic atlas and/or a shape model for breasts.
  • the principles of the current invention apply equally to other areas of image processing, for feature removal/positioning using a probabilistic atlas and/or a shape model for other types of objects besides breasts.
  • Image operations unit 128 receives a set of breast images from image input unit 28 and may perform preprocessing and preparation operations on the breast images. Preprocessing and preparation operations performed by image operations unit 128 may include resizing, cropping, compression, color correction, etc., that change size and/or appearance of breast images. Image operations unit 128 may also extract breast shape information from breast images, and may store or extract information about breast images, such as views of mammograms.
  • Image operations unit 128 sends the preprocessed breast images to shape registration unit 138 , which performs shape registration for breasts in the breast images.
  • shape registration unit 138 represents breast shapes using a shape model, to obtain registered breast shapes.
  • Shape registration unit 138 retrieves information about the shape model from reference data unit 158 , which stores parameters that define the shape model.
  • Reference data unit 158 may also store one or more probabilistic atlases that include information about probability of breast structures at various locations inside breasts, and for various views of breasts recorded in mammograms. Breast structures recorded in probabilistic atlases may be, for example, cancer masses in breasts, benign formations in breasts, breast vessel areas, etc.
  • Feature removal and positioning unit 148 receives registered breast shapes from shape registration unit 138 .
  • Feature removal and positioning unit 148 retrieves data for a baseline breast image and/or data for a probabilistic atlas, from reference data unit 158 .
  • feature removal and positioning unit 148 uses data retrieved from reference data unit 158 to perform removal of features and/or geometric positioning and processing for registered breast shapes.
  • the output of feature removal and positioning unit 148 comprises breast images with identified features, and/or breast images from which certain features were removed.
  • the output of feature removal and positioning unit 148 may also include information about locations of removed features or locations of other features of interest in breasts, information about orientation/view of breast images, etc.
  • Feature removal and positioning unit 148 outputs breast images, together with positioning and/or feature removal information. Such output results may be output to image output unit 58 , printing unit 48 , and/or display 68 .
  • Image operations unit 128 , shape registration unit 138 , feature removal and positioning unit 148 , and reference data unit 158 are software systems/applications. They may also be implemented as purpose-built hardware, such as an FPGA, an ASIC, etc.
  • FIG. 3 is a flow diagram illustrating operations performed by an image processing unit 38 for feature removal/positioning according to an embodiment of the present invention illustrated in FIG. 2 .
  • Image operations unit 128 receives a breast image from image input unit 28 (S 201 ). Image operations unit 128 performs preprocessing and preparation operations on the breast image (S 203 ). Preprocessing and preparation operations performed by image operations unit 128 may include resizing, cropping, compression, color correction, etc., that change size and/or appearance of breast images. Image operations unit 128 also extracts breast shape information from the breast image (S 205 ), and stores or extracts information about the view of the breast image (S 207 ).
  • Image operations unit 128 sends the preprocessed breast image to shape registration unit 138 , which performs shape registration for the breast in the image to obtain a registered breast shape (S 209 ).
  • shape registration unit 138 uses a shape model for breast shapes (S 211 ).
  • the shape model describes how shape varies from breast to breast.
  • the shape model is retrieved from reference data unit 158 (S 211 ).
  • Feature removal and positioning unit 148 receives the registered breast shape from shape registration unit 138 .
  • Feature removal and positioning unit 148 retrieves data describing a baseline breast image, which is included in the shape model, from reference data unit 158 (S 215 ).
  • Feature removal and positioning unit 148 may also retrieve from reference data unit 158 data describing a probabilistic feature atlas (S 215 ).
  • the probabilistic atlas includes information about probability of features at various locations inside breasts.
  • using the data retrieved from reference data unit 158 , feature removal and positioning unit 148 performs removal of features from the breast image and/or geometric positioning and processing for the registered breast shape (S 217 ).
  • Feature removal and positioning unit 148 outputs the breast image with identified geometrical orientations, and/or from which certain features were removed (S 219 ). Such output results may be output to image output unit 58 , printing unit 48 , and/or display 68 .
  • FIG. 4 is a block diagram of an image processing unit 38 A for nipple detection according to an embodiment of the present invention illustrated in FIG. 2 .
  • the image processing unit 38 A according to this embodiment includes: an image operations unit 128 A; a shape registration unit 138 A; an atlas warping unit 340 ; a nipple detection unit 350 ; and a reference data unit 158 A.
  • the atlas warping unit 340 and the nipple detection unit 350 are included in a feature removal and positioning unit 148 A.
  • Image operations unit 128 A receives a set of breast images from image input unit 28 , and may perform preprocessing and preparation operations on the breast images. Preprocessing and preparation operations performed by image operations unit 128 A may include resizing, cropping, compression, color correction, etc., that change size and/or appearance of breast images. Image operations unit 128 A creates breast mask images including pixels that belong to the breasts in the breast images. Breast mask images are also called breast shape silhouettes in the current application. Breast mask images may be created, for example, by detecting breast borders or breast clusters, for the breasts shown in the breast images. Image operations unit 128 A may also store/extract information about breast images, such as views of the mammograms.
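The mask-creation step described above can be sketched in code. Thresholding followed by largest-connected-component selection is an assumed, simplified stand-in for the border/cluster detection the text mentions; the function names, the background threshold, and the 4-connectivity choice are all illustrative, not from the patent.

```python
# Sketch: produce a breast mask image (breast shape silhouette) from a
# mammogram by thresholding away the background and keeping the largest
# connected foreground region.
import numpy as np
from collections import deque

def breast_mask(image, background=0.1):
    """image: 2D float array in [0, 1]; returns a binary breast mask."""
    fg = image > background
    labels = np.zeros(image.shape, dtype=int)
    best_label, best_size, next_label = 0, 0, 1
    for y, x in zip(*np.nonzero(fg)):
        if labels[y, x]:
            continue
        # flood-fill one 4-connected component and measure its size
        queue, size = deque([(y, x)]), 0
        labels[y, x] = next_label
        while queue:
            cy, cx = queue.popleft()
            size += 1
            for ny, nx in ((cy-1, cx), (cy+1, cx), (cy, cx-1), (cy, cx+1)):
                if (0 <= ny < fg.shape[0] and 0 <= nx < fg.shape[1]
                        and fg[ny, nx] and not labels[ny, nx]):
                    labels[ny, nx] = next_label
                    queue.append((ny, nx))
        if size > best_size:
            best_label, best_size = next_label, size
        next_label += 1
    return labels == best_label   # largest region = breast silhouette
```

In practice the breast is by far the largest bright region in a mammogram, which is why keeping only the largest component discards labels, markers, and noise specks.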
  • Image operations unit 128 A sends the breast mask images to shape registration unit 138 A, which performs shape registration for breast mask images.
  • shape registration unit 138 A describes breast mask images using a shape model, to obtain registered breast shapes.
  • Shape registration unit 138 A retrieves information about the shape model from reference data unit 158 A, which stores parameters that define the shape model.
  • a shape model may consist of a baseline breast atlas shape and a set of deformation modes.
  • the baseline breast atlas shape is a mean breast shape representing the average shape of a breast for a given mammogram view.
  • Other baseline breast atlas shapes may also be used.
  • the deformation modes define directions for deformation from contour points of breasts in the breast images, onto corresponding contour points of the breast in the baseline breast atlas shape.
  • the shape model is obtained by training off-line, using large sets of training breast images.
  • a baseline breast atlas shape can be obtained from the sets of training breast images. Deformation modes, describing variation of shapes of training breast images from the baseline breast atlas shape, are also obtained during training.
  • a baseline breast atlas shape is generated during off-line training from a large number of training breast mask images.
  • the baseline breast atlas shape may be, for example, a mean breast shape obtained by aligning centers of mass of training breast mask images. The alignment of centers of mass of training breast mask images results in a probabilistic map in which the brighter a pixel is, the more likely it is for the pixel to appear in a training breast mask image.
  • a probability threshold may be applied to the probabilistic map, to obtain a mean breast shape in which every pixel has a high probability of appearing in a training breast mask image.
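The mean-shape construction just described, i.e. aligning centers of mass of training masks into a probabilistic map and then thresholding it, can be sketched as follows. The canvas size, threshold value, and function names are assumptions for illustration.

```python
# Sketch: derive a mean breast shape from binary training masks by
# centroid alignment, averaging into a probabilistic map, and thresholding.
import numpy as np

def mean_shape(masks, canvas=(256, 256), threshold=0.5):
    """masks: list of 2D binary numpy arrays (1 = breast pixel)."""
    acc = np.zeros(canvas, dtype=float)
    cy, cx = canvas[0] // 2, canvas[1] // 2
    for m in masks:
        ys, xs = np.nonzero(m)
        # shift so the mask's center of mass sits at the canvas center
        dy = int(round(cy - ys.mean()))
        dx = int(round(cx - xs.mean()))
        shifted = np.zeros(canvas, dtype=float)
        ys2, xs2 = ys + dy, xs + dx
        keep = (ys2 >= 0) & (ys2 < canvas[0]) & (xs2 >= 0) & (xs2 < canvas[1])
        shifted[ys2[keep], xs2[keep]] = 1.0
        acc += shifted
    prob_map = acc / len(masks)   # brighter = more likely to be a breast pixel
    return prob_map >= threshold  # mean breast shape as a binary mask
```

With `threshold=0.5`, a pixel survives only if it appears in at least half of the aligned training masks, matching the "high probability of appearing" criterion above.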
  • the baseline breast atlas shape illustrates a baseline breast.
  • the baseline breast atlas shape also includes a baseline nipple for the baseline breast.
  • the baseline nipple position is identified in the baseline breast atlas shape during off-line training.
  • training breast mask images are warped onto the baseline breast atlas shape during off-line training, to define parameterization of breast shape.
  • Control points may be placed along the edges of the baseline breast atlas shape.
  • a deformation grid is generated using the control points.
  • the control points are warped onto training breast mask images.
  • Shape representations for the training breast mask images are generated by the corresponding warped control points, together with centers of mass of the shapes defined by the warped control points. Additional details about generating shape representations for training breast images can be found in the co-pending non-provisional application titled “Method and Apparatus for Probabilistic Atlas Based on Shape Modeling Technique”, the entire contents of which are hereby incorporated by reference.
  • Principal modes of deformation between training breast mask images and the baseline breast atlas shape may be determined using the shape representations for the training breast mask images. Principal modes of deformation can be found using Principal Components Analysis (PCA) techniques. The principal components obtained from PCA represent modes of deformation between training breast mask images and the baseline breast atlas shape. Additional details regarding extraction of deformation modes are found in the co-pending non-provisional application titled “Method and Apparatus for Probabilistic Atlas Based on Shape Modeling Technique”, the entire contents of which are hereby incorporated by reference.
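A minimal sketch of the PCA step above, assuming each training shape representation is flattened into a vector of control-point coordinates; the vector layout and function names are illustrative, not the co-pending application's exact formulation.

```python
# Sketch: extract principal deformation modes from training shape vectors
# via PCA (computed here with an SVD of the mean-centered data matrix).
import numpy as np

def deformation_modes(shapes, n_modes=2):
    """shapes: (n_samples, 2*n_points) array of control-point coordinates.

    Returns the mean shape vector and the top principal components,
    which serve as the modes of deformation from the mean shape.
    """
    X = np.asarray(shapes, dtype=float)
    mean = X.mean(axis=0)                  # baseline (mean) shape vector
    centered = X - mean
    # rows of vt are the principal directions of shape variation
    _, _, vt = np.linalg.svd(centered, full_matrices=False)
    return mean, vt[:n_modes]
```

Keeping only the first few components retains the dominant, population-wide deformations while discarding per-image noise, which is the usual rationale for PCA shape models.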
  • the baseline breast atlas shape, and the modes of deformation between training breast mask images and the baseline breast atlas shape define a shape model.
  • Shape models can be obtained during off-line training, for each mammogram view. Shape models are stored in reference data unit 158 A.
  • a new breast mask shape received from image operations unit 128 A may then be represented using a shape model from reference data unit 158 A.
  • a breast mask shape may be expressed as a function of the baseline breast atlas shape, which may be a mean breast shape (B_a) in an exemplary embodiment, and of the shape model deformation modes, as:
  • B_mask_new = B_a + Σ_i α_i L_i + p
  • p is an offset (such as a 2D offset) to the mean breast shape B a to account for a rigid translation of the entire shape
  • an arbitrary breast mask may be expressed as a sum of the fixed mean breast shape (B_a), a linear combination of fixed deformation modes L_i multiplied by coefficients α_i, and a 2D offset p.
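The shape expression above can be sketched directly in code: the mean shape vector, the mode coefficients, and the rigid offset combine linearly. The interleaved (x0, y0, x1, y1, ...) vector layout is an assumption for illustration.

```python
# Sketch: synthesize an arbitrary breast mask contour as the mean shape
# B_a plus a linear combination of deformation modes L_i weighted by
# coefficients alpha_i, plus a rigid 2D offset p.
import numpy as np

def synthesize_shape(B_a, modes, alphas, p):
    """B_a: (2n,) mean-shape vector; modes: (k, 2n); alphas: (k,); p: (2,)."""
    shape = np.asarray(B_a, dtype=float) + np.asarray(alphas) @ np.asarray(modes)
    # apply the rigid translation p to every (x, y) control point
    offset = np.tile(p, len(B_a) // 2)
    return shape + offset
```

Fitting a new mask then amounts to searching for the coefficients α_i and offset p that make this synthesized contour best match the observed silhouette.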
  • Atlas warping unit 340 receives the registration results for the breast mask image B_mask_new from shape registration unit 138 A.
  • Atlas warping unit 340 then warps the breast mask image B_mask_new to the mean breast shape B_a_vi.
  • Atlas warping unit 340 may, alternatively, warp the breast mask image B_mask_new to a probabilistic feature atlas A_vi specific to the view v_i of the breast mask image B_mask_new.
  • the probabilistic feature atlas data is stored in reference data unit 158 A.
  • the probabilistic feature atlas A_vi includes an image of the mean breast shape B_a_vi for view v_i, together with probabilities for presence of a feature at each pixel in the mean breast shape B_a_vi.
  • the probabilistic atlas A_vi is a weighted pixel image, in which each pixel of the mean breast shape B_a_vi is weighted by the feature probability for that pixel.
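The weighted-pixel-image construction just described is a per-pixel product of the mean-shape mask and the probability map; a minimal sketch, with illustrative names:

```python
# Sketch: form the probabilistic-atlas image by weighting each pixel of
# the mean breast shape mask with the feature probability at that pixel.
import numpy as np

def weighted_atlas(mean_shape_mask, prob_map):
    """mean_shape_mask: binary 2D array; prob_map: same-shape probabilities.

    Pixels outside the mean shape go to zero; pixels inside carry the
    feature probability as their intensity.
    """
    return mean_shape_mask.astype(float) * prob_map
```

The result can be displayed directly: brighter pixels mark locations inside the mean shape where the feature (e.g. a cancer mass) is more probable.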
  • the probabilistic feature atlas is obtained by training off-line, using large sets of training breast images with previously identified feature structures.
  • Features recorded in probabilistic atlases may be cancer masses in breasts, benign formations in breasts, breast vessel areas, etc.
  • the shapes of training breast images are represented as linear combinations of deformation modes obtained in training.
  • previously identified features in the training breast images are mapped to the baseline breast atlas shape obtained in training.
  • a probabilistic atlas containing probabilities for presence of a feature in the baseline breast atlas shape is obtained.
  • when atlas warping unit 340 warps the breast mask image B_mask_new to the probabilistic atlas A_vi or to the mean breast shape B_a_vi, a warped breast mask image B_mask_new_warped is obtained.
  • feature probability weights from the probabilistic atlas A_vi are associated with pixels in the warped image B_mask_new_warped.
  • the baseline nipple position from the mean breast shape B_a_vi is also associated with pixels in the warped image B_mask_new_warped.
  • Nipple detection unit 350 receives the warped breast mask image B_mask_new_warped from atlas warping unit 340, together with shape registration information of the form B_mask_new = B_a + Σ_i α_i L_i + p.
  • Nipple detection unit 350 warps the B_mask_new_warped image back to the original B_mask_new, and an image P_mask_new is obtained. Since the baseline nipple position has been identified in the baseline breast atlas shape during off-line training, and since B_mask_new_warped has the shape of the baseline breast atlas shape, the image P_mask_new includes the nipple position warped from B_mask_new_warped to B_mask_new. Hence, the image P_mask_new is an image of the breast mask image B_mask_new in which the position of the nipple has been identified. Therefore, the image P_mask_new includes nipple detection results for the original breast mask B_mask_new.
  • the image P_mask_new includes feature probabilities for various breast features, at various pixel locations inside the breast mask image B_mask_new.
  • the image P_mask_new is a weighted pixel image, in which each pixel of the breast mask image B_mask_new is weighted by the feature probability for that pixel.
  • the image P_mask_new is a weighted pixel image, in which each pixel of the breast mask image B_mask_new is weighted by the probability of cancer at that pixel.
  • mapping feature probabilities from a probabilistic atlas A vi ; to a breast mask image B mask — new to obtain a probability map for a feature in a breast mask image B mask — new can be found in the co-pending non-provisional application titled “Method and Apparatus of Using Probabilistic Atlas for Cancer Detection”, the entire contents of which are hereby incorporated by reference.
  • image P mask — new provides very useful information for detection of the nipple and for locating the position of the nipple with respect to the breast. If the image P mask — new includes cancer probabilities associated with pixels of the B mask — new breast mask from a probabilistic cancer atlas, image P mask — new provides information about the nipple position with respect to probable locations of cancer masses in the B mask — new breast mask. The position of the nipple with respect to the breast can be used to detect breast abnormalities. Since the position of the nipple with respect to the breast is influenced by breast abnormalities, information about nipple position and nipple proximity to high-probability cancer regions in the breast helps in the identification of cancer masses, structural changes, breast abnormalities, etc.
  • the initially identified nipple position in image P mask — new (and hence in breast mask image B mask — new ) can also be a starting point for performing a refinement of the nipple position.
  • Refinement of the nipple position can be performed, for example, in regions adjacent to or including the initially identified nipple position in image P mask — new .
  • Nipple detection unit 350 outputs the image P mask — new .
  • the image P mask — new may be output to image output unit 58 , printing unit 48 , and/or display 68 .
  • Image operations unit 128 A, shape registration unit 138 A, atlas warping unit 340 , nipple detection unit 350 , and reference data unit 158 A may be implemented as software systems/applications, or as purpose-built hardware such as FPGAs, ASICs, etc.
  • FIG. 5 is a flow diagram illustrating operations performed by an image operations unit 128 A included in an image processing unit 38 A for feature removal/positioning according to an embodiment of the present invention illustrated in FIG. 4 .
  • Image operations unit 128 A receives a raw or preprocessed breast image from image input unit 28 (S 401 ).
  • the breast image may be retrieved by image operations unit 128 A from, for example, a breast imaging apparatus, a database of breast images, etc.
  • Image operations unit 128 A may perform preprocessing operations on the breast image (S 403 ). Preprocessing operations may include resizing, cropping, compression, color correction, etc.
  • Image operations unit 128 A creates a breast mask image for the breast image (S 405 ).
  • the breast mask image includes pixels that belong to the breast.
  • the breast mask image may be created by detecting breast borders for the breast shown in the breast image.
  • Image operations unit 128 A may create a breast mask image by detecting breast borders using methods described in the US patent application titled “Method and Apparatus for Breast Border Detection”, application Ser. No. 11/366,495, by Daniel Russakoff and Akira Hasegawa, filed on Mar. 3, 2006, the entire contents of which are hereby incorporated by reference.
  • pixels in the breast image are represented in a multi-dimensional space, such as a 4-dimensional space with x-locations of pixels, y-locations of pixels, intensity value of pixels, and distance of pixels to a reference point.
  • K-means clustering of pixels is run in the multi-dimensional space, to obtain clusters for the breast image.
  • Cluster merging and connected components analysis are then run using relative intensity measures, brightness pixel values, and cluster size, to identify a cluster corresponding to the breast in the breast image.
  • a set of pixels, or a mask, containing breast pixels is obtained.
  • the set of pixels for a breast forms a breast mask B mask .
  • breast border detection techniques may also be used by image operations unit 128 A to obtain a breast mask image.
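The clustering steps above can be sketched as follows. This is a minimal, self-contained numpy illustration of k-means in the 4-dimensional (x, y, intensity, distance-to-reference) feature space, not the border-detection method of the incorporated application; the toy image, the reference point, the intensity scaling, and all names are hypothetical.

```python
import numpy as np

def kmeans(features, k, iters=20, seed=0):
    """Minimal Lloyd's k-means; features is an (n_points, n_dims) array."""
    rng = np.random.default_rng(seed)
    centers = features[rng.choice(len(features), size=k, replace=False)].astype(float)
    for _ in range(iters):
        dists = np.linalg.norm(features[:, None, :] - centers[None, :, :], axis=2)
        labels = dists.argmin(axis=1)
        for j in range(k):
            if np.any(labels == j):
                centers[j] = features[labels == j].mean(axis=0)
    return labels

# Hypothetical 8x8 image: a bright "breast" region on a dark background.
img = np.zeros((8, 8))
img[2:7, 0:4] = 1.0
ys, xs = np.indices(img.shape)
ref = (4, 0)  # assumed reference point on the chest-wall side
dist = np.hypot(ys - ref[0], xs - ref[1])

# 4-dimensional feature space: x, y, intensity, distance to reference point.
feats = np.stack([xs.ravel(), ys.ravel(), img.ravel() * 8.0, dist.ravel()], axis=1)
labels = kmeans(feats, k=2).reshape(img.shape)

# Identify the breast cluster as the one with the higher mean intensity.
breast_label = max(range(2),
                   key=lambda j: img[labels == j].mean() if np.any(labels == j) else -1.0)
b_mask = labels == breast_label
```

A real implementation would follow the clustering with the merging and connected-components analysis described above; here the brighter cluster alone plays the role of the breast mask B mask .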
  • Image operations unit 128 A also stores information about the breast image, such as information about the view of the mammogram (S 407 ). Examples of mammogram views are MLL (medio-lateral left), MLR (medio-lateral right), CCL (cranio-caudal left), CCR (cranio-caudal right), LMLO (left medio-lateral oblique), and RMLO (right medio-lateral oblique).
  • Image operations unit 128 A outputs the breast mask image, and information about the view of the breast image (S 409 ), to shape registration unit 138 A.
  • FIG. 6 is a flow diagram illustrating operations performed by a shape registration unit 138 A included in an image processing unit 38 A for feature removal/positioning according to an embodiment of the present invention illustrated in FIG. 4 .
  • Shape registration unit 138 A fits the breast mask image B mask — new with its correct shape representation as a linear combination of the deformation modes
  • shape registration unit 138 A uses a cost function defined as the mean distance to edge. For a (p x , p y , λ i ) parameter set, shape registration unit 138 A calculates the new shape resulting from this parameter set by the formula Shape = B a — vi + Σ λ i L i + p, where B a — vi is the baseline breast atlas shape for view v i , L i are the deformation modes, and p = (p x , p y ) is a 2D offset
  • the center of mass (Shape.COM) of Shape is then calculated (S 480 ).
  • For each shape point on the exterior (border) of Shape, shape registration unit 138 A generates a ray containing the Shape.COM and the shape point, finds the intersection point of the ray with the edge of B mask — new , and calculates how far the shape point is from the intersection point obtained in this manner. This technique is further illustrated in FIG. 8D .
  • the minimum distance from the shape point to the edge of B mask — new is calculated.
  • the mean distance from the Shape points to the edges of the breast mask image B mask — new is then calculated (S 482 ). Optimized λ i values and a 2D offset p are selected for which the mean distance of the shape points of Shape to the edges of the breast mask image B mask — new attains a minimum (S 484 ).
  • Shape registration unit 138 A may use the downhill simplex method, also known as the Nelder-Mead or the amoeba algorithm (S 486 ), to fit the breast mask image B mask — new with its correct shape representation, by minimizing distances of the edge shape points of Shape to the edges of the breast mask image B mask — new .
  • the downhill simplex method is a multidimensional minimization algorithm that does not require derivatives.
  • the downhill simplex algorithm is typically very robust.
  • the k+2 parameters (p x , p y , λ 1 , . . . , λ k ) form a simplex in a multi-dimensional space.
  • the Nelder-Mead method minimizes the selected cost function, by moving points of the simplex to decrease the cost function.
  • a point of the simplex may be moved by reflections against a plane generated by other simplex points, by reflection and expansion of the simplex obtained from a previous reflection, by contraction of the simplex, etc.
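The shape fit described above can be sketched with a toy model. The sketch below assumes a circular mean shape with a single deformation mode, uses scipy's Nelder-Mead implementation of the downhill simplex method, and replaces the ray-to-edge cost with a simpler nearest-point mean distance; all names, shapes, and parameter values are hypothetical.

```python
import numpy as np
from scipy.optimize import minimize

# Hypothetical toy shape model: a circular mean shape of 32 edge points
# plus a single deformation mode that stretches it horizontally.
t = np.linspace(0, 2 * np.pi, 32, endpoint=False)
mean_shape = np.stack([np.cos(t), np.sin(t)], axis=1)   # stand-in for B_a_vi
mode = np.stack([np.cos(t), np.zeros_like(t)], axis=1)  # stand-in for L_1

def synthesize(params):
    """Shape = B_a + lambda * L + p for the parameter set (p_x, p_y, lambda)."""
    px, py, lam = params
    return mean_shape + lam * mode + np.array([px, py])

# "Observed" mask edge: the mean shape stretched and shifted.
target = synthesize([0.5, -0.3, 0.7])

def cost(params):
    """Simplified stand-in for the mean distance-to-edge cost: mean
    distance from each shape point to the nearest target edge point."""
    shape = synthesize(params)
    d = np.linalg.norm(shape[:, None] - target[None], axis=2)
    return d.min(axis=1).mean()

# Downhill simplex (Nelder-Mead) minimization; no derivatives are needed.
res = minimize(cost, x0=[0.1, -0.1, 0.1], method="Nelder-Mead")
print(np.round(res.x, 2))  # should land close to [0.5, -0.3, 0.7]
```

Because the cost is derivative-free and somewhat irregular (the nearest-point assignment switches discretely), a simplex method is a natural fit, which matches the robustness claim above.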
  • shape registration unit 138 A outputs the shape registration results for the breast mask image B mask — new to atlas warping unit 340 (S 492 ).
  • FIG. 7 is a flow diagram illustrating exemplary operations performed by a feature removal and positioning unit 148 A included in an image processing unit 38 A for feature removal/positioning according to an embodiment of the present invention illustrated in FIG. 4 .
  • FIG. 7 illustrates exemplary operations that may be performed by an atlas warping unit 340 and a nipple detection unit 350 included in a feature removal and positioning unit 148 A.
  • Atlas warping unit 340 warps the registered shape for breast mask image B mask — new to a probabilistic atlas A vi , or to a baseline breast atlas shape B a — vi , associated with the view v i of the breast mask image B mask — new .
  • Warping to the probabilistic atlas A vi or to the baseline breast atlas shape B a — vi may be performed by triangulating the breast mask B mask — new based on its center of mass and edge points (S 501 ).
  • each triangle in the breast mask B mask — new corresponds to a triangle in the probabilistic atlas A vi and to a triangle in the baseline breast atlas shape B a — vi (S 503 ), as the probabilistic atlas A vi has the shape of the baseline breast atlas shape B a — vi .
  • Pixels inside corresponding triangles of the atlas A vi (or B a — vi ) can be warped back and forth into triangles of breast mask B mask — new , using a bilinear interpolation in 2D (S 503 ).
  • the bilinear interpolation in 2D may be performed by multiplying each of the triangle vertices by appropriate relative weights, as further described at FIG. 8J .
  • Nipple detection unit 350 warps back corresponding triangles of the atlas A vi (or B a — vi ), to triangles in breast mask B mask — new (S 505 ).
  • the nipple position for the breast mask image B mask — new is the warped nipple position from triangles of the baseline breast atlas shape B a — vi (or probabilistic atlas A vi ) to triangles of the breast mask image B mask — new (S 507 ).
  • an image with a location for the nipple is obtained for the breast mask B mask — new (S 507 ).
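The triangle-to-triangle transfer of the nipple position in steps S 505 /S 507 can be sketched with barycentric coordinates, one standard way to realize this kind of correspondence. The triangles, the nipple coordinates, and the function names below are hypothetical.

```python
import numpy as np

def barycentric(p, a, b, c):
    """Barycentric coordinates of point p in triangle (a, b, c)."""
    m = np.array([[b[0] - a[0], c[0] - a[0]],
                  [b[1] - a[1], c[1] - a[1]]], dtype=float)
    u, v = np.linalg.solve(m, np.asarray(p, dtype=float) - np.asarray(a, dtype=float))
    return np.array([1.0 - u - v, u, v])

def warp_point(p, tri_src, tri_dst):
    """Map p from a source triangle to the corresponding destination triangle."""
    w = barycentric(p, *tri_src)
    return w @ np.asarray(tri_dst, dtype=float)

# Hypothetical corresponding triangles: one in the atlas, one in the mask.
tri_atlas = [(0.0, 0.0), (4.0, 0.0), (0.0, 4.0)]
tri_mask = [(1.0, 1.0), (9.0, 1.0), (1.0, 9.0)]

nipple_atlas = (1.0, 1.0)  # assumed baseline nipple position in the atlas triangle
nipple_mask = warp_point(nipple_atlas, tri_atlas, tri_mask)
print(nipple_mask)  # [3. 3.]
```

The barycentric weights are preserved across the two triangles, so the warped point keeps the same relative position inside the mask triangle that the baseline nipple had inside the atlas triangle.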
  • Feature probabilities associated with pixels in triangles of the atlas image A vi may become associated with pixels in triangles of breast mask B mask — new , as further described in the co-pending non-provisional application titled “Method and Apparatus of Using Probabilistic Atlas for Cancer Detection”, the entire contents of which are hereby incorporated by reference.
  • the image with an identified nipple location may also contain feature probability values associated with image pixels, for features such as cancer structures, benign structures, etc.
  • FIG. 8A illustrates an exemplary baseline breast atlas shape for the ML view, with identified nipple position.
  • the exemplary baseline breast atlas shape for the ML view is included in a shape model stored in a reference data unit 158 .
  • the baseline breast atlas shape in FIG. 8A is a mean breast shape representing the set of pixels that have 95% or more chance of appearing in a breast mask image in the ML view.
  • the nipple N has been identified on the mean breast shape.
  • FIG. 8B illustrates exemplary deformation modes for a shape model stored in the reference data unit 158 .
  • the breast shape I 510 in FIG. 8B is an exemplary baseline breast atlas shape (mean shape, in this case) for the ML view.
  • the first 3 modes (L 1 , L 2 , L 3 ) of deformation are shown.
  • the first mode of deformation is L 1 .
  • Contours D 2 and D 3 define the deformation mode L 1 .
  • the deformation mode L 1 consists of directions and proportional length of movement for each contour point from the D 2 contour to a corresponding contour point from the D 3 contour.
  • Contours D 4 and D 5 define the second deformation mode L 2 .
  • contours D 6 and D 7 define the third deformation mode L 3 .
  • the deformation modes shown in FIG. 8B may be obtained by training, using techniques described in the co-pending non-provisional application titled “Method and Apparatus for Probabilistic Atlas Based on Shape Modeling Technique”, the entire contents of which are hereby incorporated by reference.
  • FIG. 8C illustrates another set of exemplary deformation modes for a shape model stored in the reference data unit 158 .
  • the deformation modes shown in FIG. 8C were obtained by training a shape model using 4900 training breast images of ML view, using techniques described in the co-pending non-provisional application titled “Method and Apparatus for Probabilistic Atlas Based on Shape Modeling Technique”, the entire contents of which are hereby incorporated by reference. 17 deformation modes, capturing 99% of the variance in the breast images data set, were obtained.
  • the representations of the first 4 modes L 1 , L 2 , L 3 and L 4 are shown in FIG. 8C .
  • the representations of the first 4 modes L 1 , L 2 , L 3 and L 4 together capture 85% of the data's variance.
  • the mean breast shape (baseline breast atlas shape) for the ML view is plotted with dots (points), while the arrows represent the distance traveled by one point for that mode from ⁇ 2 standard deviations to +2 standard deviations of the mean breast shape.
  • Mode L 1 captures 52% of the variance in the breast images data set
  • mode L 2 captures 18% of the variance in the breast images data set
  • mode L 3 captures 10% of the variance in the breast images data set
  • mode L 4 captures 4% of the variance in the breast images data set.
  • the rest of the deformation modes (L 5 to L 17 ) are not shown.
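Deformation modes of the kind shown in FIGS. 8B and 8C are commonly obtained as principal components of aligned training shapes. The sketch below is an illustrative PCA on synthetic landmark data, not the procedure of the incorporated application; the training set, landmark count, and variable names are hypothetical.

```python
import numpy as np

rng = np.random.default_rng(1)

# Hypothetical training set: 100 shapes of 16 (x, y) landmarks, built as a
# circular mean shape plus random amounts of one dominant "stretch" direction.
t = np.linspace(0, 2 * np.pi, 16, endpoint=False)
mean = np.stack([np.cos(t), np.sin(t)], axis=1).ravel()            # flattened mean shape
stretch = np.stack([np.cos(t), np.zeros_like(t)], axis=1).ravel()  # true deformation direction
shapes = mean + rng.normal(0, 1, (100, 1)) * stretch + rng.normal(0, 0.01, (100, 32))

# Deformation modes = eigenvectors of the landmark covariance matrix.
centered = shapes - shapes.mean(axis=0)
cov = centered.T @ centered / (len(shapes) - 1)
evals, evecs = np.linalg.eigh(cov)
order = np.argsort(evals)[::-1]          # sort modes by captured variance
evals, modes = evals[order], evecs[:, order].T

explained = evals / evals.sum()          # fraction of variance per mode
```

In this synthetic setup the first mode recovers the planted "stretch" direction and captures nearly all of the variance, mirroring how modes L 1 to L 4 capture most of the variance in the 4900-image training set above.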
  • FIG. 8D illustrates exemplary aspects of the operation of calculating a cost function by a shape registration unit 138 A for a registered shape according to an embodiment of the present invention illustrated in FIG. 6 .
  • a shape bounded by contour C 512 is obtained from the formula Shape = B a — vi + Σ λ i L i + p
  • B a — vi is a mean breast shape for view v i of the breast mask B mask — new
  • the center of mass COM for the Shape bounded by contour C 512 is found.
  • the distance to edge is the distance d between points S 1 and S 2 .
  • Distances d are obtained for all points on the contour (perimeter) C 512 of Shape, and a cost function is obtained as the mean of all distances d.
  • FIG. 8E illustrates exemplary results of the operation of performing shape registration for breast masks by a shape registration unit 138 A according to an embodiment of the present invention illustrated in FIG. 6 .
  • breast masks I 513 and I 514 are fit with shape representations.
  • the shape registration results bounded by contours C 513 and C 514 are effectively describing the shapes of breast masks I 513 and I 514 .
  • the downhill simplex algorithm was used by shape registration unit 138 A to obtain the shape registration results shown in FIG. 8E .
  • FIG. 8F illustrates an exemplary ML view probabilistic atlas for probability of cancer in breasts stored in the reference data unit 158 .
  • the contour C 515 is the contour of the mean breast shape (baseline breast atlas shape) B a — ML for the ML view.
  • the region R 515 A indicates the highest probability of cancer, followed by regions R 515 B, then R 515 C, and R 515 D.
  • the probability for cancer is largest in the center of a breast, and decreases towards edges of the mean breast shape.
  • FIG. 8G illustrates an exemplary CC view probabilistic atlas for probability of cancer in breasts stored in the probabilistic atlas reference data unit 158 .
  • the contour C 516 is the contour of the mean breast shape for the CC view.
  • the region R 516 A indicates the highest probability of cancer, followed by regions R 516 B, then R 516 C, and R 516 D.
  • the probability for cancer is largest in the center left region of a breast, and decreases towards edges of the mean breast shape.
  • FIG. 8H illustrates exemplary aspects of the operation of detecting nipple position for a breast image by an image processing unit 38 A for feature removal/positioning according to an embodiment of the present invention illustrated in FIG. 4 .
  • a breast image I 518 is input by image operations unit 128 A.
  • Image operations unit 128 A extracts a breast mask image I 519 for the breast image I 518 .
  • Shape registration unit 138 A performs shape registration for the breast mask image, by representing the shape of the breast mask using a shape model.
  • the shape registration contour C 520 fits the shape of the breast mask from image I 519 .
  • Atlas warping unit 340 warps the breast mask registered shape I 520 to a probabilistic atlas (or alternatively to a baseline breast atlas shape) I 522 that includes a detected baseline nipple N.
  • Atlas warping unit 340 performs warping by generating a correspondence between pixels of the breast mask registered shape I 520 and pixels of the probabilistic atlas (or of the baseline breast atlas shape) I 522 .
  • nipple detection unit 350 warps the probabilistic atlas (or baseline breast atlas shape) I 522 onto the breast mask registered shape I 520 , hence obtaining an image I 523 with detected nipple position N′ corresponding to the baseline nipple position N, for the breast image I 518 .
  • FIG. 8I illustrates exemplary aspects of the operation of warping a breast mask to an atlas using triangulation by a feature removal and positioning unit 148 A according to an embodiment of the present invention illustrated in FIG. 7 .
  • Atlas warping unit 340 warps a registered shape S 530 for a breast mask image B mask — new I 530 to a probabilistic atlas A vi (or to a baseline breast atlas shape) A 532 shown in image I 532 .
  • Warping to the probabilistic atlas A vi (or to the baseline breast atlas shape) A 532 is performed by triangulating the breast mask shape S 530 based on its center of mass COM_ 530 and edge points.
  • a test point P_ 530 is used to generate a triangle in the breast mask shape S 530 .
  • a triangle T_ 530 is generated using the center of mass COM_ 530 and the test point P_ 530 and touching the edges of mask shape S 530 .
  • the triangle is warped to probabilistic atlas A vi (or to baseline breast atlas shape) A 532 onto a corresponding triangle T_ 532 , with the COM_ 530 and the test point P_ 530 mapped to corresponding points PC_ 532 and P_ 532 .
  • the probabilistic atlas A vi (or baseline breast atlas shape) A 532 is then warped onto registered shape S 530 by warping each triangle T_ 532 back onto the corresponding triangle T_ 530 of the breast mask B mask — new I 530 .
  • the nipple position from the probabilistic atlas A vi (or baseline breast atlas shape) A 532 is hence warped onto the registered shape S 530 associated with the breast mask image B mask — new I 530 .
  • FIG. 8J illustrates exemplary aspects of the operation of bilinear interpolation according to an embodiment of the present invention illustrated in FIG. 7 .
  • the pixels inside corresponding triangles of the atlas A vi (or baseline breast atlas shape B a — vi ) can be warped back and forth to triangles in breast mask B mask — new , using a bilinear interpolation.
  • bilinear interpolation in 2D is performed by multiplying each of the vertices by appropriate relative weights as described in FIG. 8J .
  • the pixel intensity at point D can be obtained as I(D) = (wA·I(A) + wB·I(B) + wC·I(C)) / T abc , where:
  • T abc is the area of triangle ABC
  • wA is the area of triangle BCD
  • wB is the area of triangle ACD
  • wC is the area of triangle ABD
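The area-weighted interpolation above can be sketched directly; for an interior point D the three sub-triangle areas sum to T abc , so the weights are normalized barycentric coordinates. The triangle coordinates and vertex intensities below are hypothetical.

```python
def tri_area(p, q, r):
    """Unsigned area of triangle pqr via the 2D cross product."""
    return 0.5 * abs((q[0] - p[0]) * (r[1] - p[1]) - (r[0] - p[0]) * (q[1] - p[1]))

def interpolate(d, a, b, c, ia, ib, ic):
    """I(D) = (wA*I(A) + wB*I(B) + wC*I(C)) / T_abc with the areas above."""
    t_abc = tri_area(a, b, c)
    wa, wb, wc = tri_area(b, c, d), tri_area(a, c, d), tri_area(a, b, d)
    return (wa * ia + wb * ib + wc * ic) / t_abc

# Hypothetical triangle and vertex intensities.
a, b, c = (0.0, 0.0), (4.0, 0.0), (0.0, 4.0)
val = interpolate((1.0, 1.0), a, b, c, ia=10.0, ib=20.0, ic=30.0)
print(val)  # 17.5
```

Vertices closer to D get larger opposite-triangle areas and hence larger weights, which is what makes the interpolation continuous across triangle edges.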
  • FIG. 9 is a block diagram of an image processing unit 38 B for artifact removal and breast segmentation according to a second embodiment of the present invention illustrated in FIG. 2 .
  • the image processing unit 38 B according to this embodiment includes: an image operations unit 128 B; a shape registration unit 138 B; an optional atlas warping unit 340 ; an artifact removal unit 360 ; and a reference data unit 158 B.
  • the atlas warping unit 340 and the artifact removal unit 360 are included in a feature removal and positioning unit 148 B.
  • Image operations unit 128 B receives a breast image from image input unit 28 , and may perform preprocessing and preparation operations on the breast image. Preprocessing and preparation operations performed by image operations unit 128 B may include resizing, cropping, compression, color correction, etc., that change size and/or appearance of the breast image. Image operations unit 128 B creates a breast mask image. Breast mask images may be created, for example, by detecting breast borders or breast clusters for the breasts shown in the breast image. Image operations unit 128 B may also store/extract information about the breast image, such as view of mammogram.
  • Image operations unit 128 B may perform preprocessing and breast mask extraction operations in a similar manner to image operations unit 128 A described in FIG. 5 .
  • Image operations unit 128 B may create a breast mask image by detecting breast borders using methods described in the US patent application titled “Method and Apparatus for Breast Border Detection”, application Ser. No. 11/366,495, by Daniel Russakoff and Akira Hasegawa, filed on Mar. 3, 2006, the entire contents of which are hereby incorporated by reference. Other methods may also be used to create a breast mask image.
  • Image operations unit 128 B sends the breast mask images to shape registration unit 138 B, which performs shape registration for the breast mask image.
  • shape registration unit 138 B describes the breast mask image using a shape model, to obtain a registered breast shape.
  • Shape registration unit 138 B retrieves information about the shape model from reference data unit 158 B, which stores parameters that define the shape model.
  • the reference data unit 158 B is similar to reference data unit 158 A from FIG. 4 .
  • Reference data unit 158 B stores shape models, and may also store probabilistic atlases for breast features.
  • a shape model and an optional probabilistic atlas stored by reference data unit 158 B can be generated off-line, using training breast images. Details on generation of a breast shape model and a probabilistic atlas using sets of training breast images can be found in the co-pending non-provisional application titled “Method and Apparatus for Probabilistic Atlas Based on Shape Modeling Technique”, the entire contents of which are hereby incorporated by reference.
  • a shape model stored by reference data unit 158 B includes a baseline breast atlas image and a set of deformation modes.
  • a shape model stored by reference data unit 158 B is similar to a shape model stored by reference data unit 158 A as described at FIG. 4 , with two differences.
  • One difference is that the nipple of the baseline breast atlas shape need not be identified and marked for the baseline breast atlas shape stored by reference data unit 158 B.
  • the second difference pertains to the method of generation of the shape model during off-line training.
  • the training breast images used to generate the shape model for reference data unit 158 B off-line are preferably breast images without artifacts (such as tags, noise, frames, image scratches, lead markers, imaging plates, etc.), anomalies, or unusual structures.
  • Training breast images without artifacts may be obtained by removing artifacts, anomalies, or unusual structures from the images manually or automatically, before off-line training.
  • the baseline breast atlas shape obtained as described in the co-pending non-provisional application titled “Method and Apparatus for Probabilistic Atlas Based on Shape Modeling Technique”, the entire contents of which are hereby incorporated by reference illustrates a baseline breast without artifacts, anomalies, or unusual structures.
  • the deformation modes obtained as described in the co-pending non-provisional application titled “Method and Apparatus for Probabilistic Atlas Based on Shape Modeling Technique”, the entire contents of which are hereby incorporated by reference, describe variations between shapes of training breast images and the baseline breast atlas shape.
  • Reference data unit 158 B stores information for shape models for breasts, for various views of mammograms.
  • Shape registration unit 138 B may perform shape registration in a manner similar to shape registration unit 138 A, as described at FIG. 6 .
  • Optional atlas warping unit 340 receives the registration results for a breast mask image from shape registration unit 138 B, and warps the breast mask image to the baseline breast atlas shape from the shape model associated with the view of the breast mask image.
  • Atlas warping unit 340 performs warping of breast mask images to baseline breast atlas shapes or to probabilistic atlases, as described in FIG. 4 and FIG. 7 .
  • With image processing unit 38 B, it is possible to remove artifacts, such as tags, noise, frames, image scratches, lead markers, imaging plates, etc., from a breast image, and to perform an accurate segmentation of the breast in the breast image.
  • image operations unit 128 B obtains a breast mask image B T — mask .
  • Shape registration unit 138 B then performs shape registration for the breast mask image B T — mask .
  • Shape registration unit 138 B expresses the breast mask image B T — mask as a function of the baseline breast atlas shape, which may be a mean breast shape (B a ), and shape model deformation modes, as: Breast Shape = B a + Σ λ i L i + p
  • Shape registration unit 138 B retrieves baseline breast atlas shape data and deformation modes from reference data unit 158 B.
  • Since the shape model stored in reference data unit 158 B was generated using training breast shape images without artifacts, anomalies, or unusual structures, the resulting Breast Shape will optimally fit the original breast mask image B T — mask , except for any artifacts that were present in the original breast mask image B T — mask .
  • the artifacts present in the original breast mask image B T — mask have not been learned by the shape model stored in reference data unit 158 B, and will not be fit.
  • the Breast Shape represents a segmentation of the breast in the breast mask image B T — mask , without the artifacts that may have been present in breast mask image B T — mask .
  • Artifact removal unit 360 receives the Breast Shape together with the breast mask image B T — mask from shape registration unit 138 B, and may extract artifacts by subtracting the Breast Shape from the breast mask image B T — mask , to obtain an artifact mask image I Art .
  • Artifact removal unit 360 can then apply the artifact mask image I Art to the original breast image I T , to identify artifact positions in the original breast image I T and remove the artifacts. Artifact removal unit 360 outputs a breast image I T ′ without artifacts.
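The subtraction-based artifact extraction described above can be sketched with boolean masks. The toy mask, the tag location, and the variable names (b_t_mask, breast_shape, i_art) are hypothetical stand-ins for B T — mask , the fitted Breast Shape, and I Art .

```python
import numpy as np

# Hypothetical 6x6 breast mask B_T_mask containing a tag artifact (top right),
# and the fitted Breast Shape, which the shape model keeps artifact-free.
b_t_mask = np.zeros((6, 6), dtype=bool)
b_t_mask[1:5, 0:3] = True      # breast region
b_t_mask[0, 4:6] = True        # tag artifact
breast_shape = np.zeros((6, 6), dtype=bool)
breast_shape[1:5, 0:3] = True  # fitted shape covers only the breast

# Artifact mask I_Art: mask pixels the fitted shape does not explain.
i_art = b_t_mask & ~breast_shape

# Apply the artifact mask to the original image I_T to remove artifacts.
i_t = np.where(b_t_mask, 1.0, 0.0)  # stand-in for the original breast image
i_t_clean = np.where(i_art, 0.0, i_t)
print(int(i_art.sum()), i_t_clean[0, 4])  # 2 0.0
```

Because the shape model never learned artifacts, the fitted shape cannot cover the tag pixels, and the set difference isolates exactly those pixels for removal.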
  • breast segmentation with artifact removal may be combined with feature detection.
  • artifact removal may be achieved for an original breast image I T together with cancer detection using a probabilistic cancer atlas and/or comparative left-right breast analysis, as described in the co-pending non-provisional application titled “Method and Apparatus of Using Probabilistic Atlas for Cancer Detection”, the entire contents of which are hereby incorporated by reference.
  • Image operations unit 128 B, shape registration unit 138 B, optional atlas warping unit 340 , artifact removal unit 360 , and reference data unit 158 B may be implemented as software systems/applications, or as purpose-built hardware such as FPGAs, ASICs, etc.
  • FIG. 10A illustrates an exemplary output of an image processing unit 38 B for artifact removal and breast segmentation according to a second embodiment of the present invention illustrated in FIG. 9 .
  • a breast mask I 581 with a tag T 582 is segmented by image processing unit 38 B using a shape model that was constrained to remain within the shape space of typical breasts without artifacts.
  • the final segmented breast shape I 583 obtained by image processing unit 38 B does not contain the tag T 582 , as the segmented breast shape is constrained by the shape model to resemble a breast.
  • FIG. 10B illustrates another exemplary output of an image processing unit 38 B for artifact removal and breast segmentation according to a second embodiment of the present invention illustrated in FIG. 9 .
  • a breast mask I 591 with a skin fold T 592 is segmented by image processing unit 38 B using a shape model that was constrained to remain within the shape space of typical breasts without artifacts.
  • the final segmented breast shape I 593 obtained by image processing unit 38 B does not contain the skin fold T 592 , as the segmented breast shape is constrained by the shape model to resemble a breast.
  • FIG. 11 is a block diagram of an image processing unit 38 C for view detection according to a third embodiment of the present invention illustrated in FIG. 2 .
  • the image processing unit 38 C according to this embodiment includes: an image operations unit 128 C; a shape registration unit 138 C; a view decision unit 148 C; and a reference data unit 158 C.
  • the view decision unit 148 C is a feature removal and positioning unit.
  • Image operations unit 128 C receives a breast image from image input unit 28 , and may perform preprocessing and preparation operations on the breast image. Preprocessing and preparation operations performed by image operations unit 128 C may include resizing, cropping, compression, color correction, etc., that change size and/or appearance of the breast image. Image operations unit 128 C creates a breast mask image. Breast mask images may be created, for example, by detecting breast borders or breast clusters for the breasts shown in the breast image. Image operations unit 128 C may also store/extract information about the breast image, such as view of mammogram.
  • Image operations unit 128 C may perform preprocessing and breast mask extraction operations in a similar manner to image operations unit 128 A described in FIG. 5 .
  • Image operations unit 128 C may create a breast mask image by detecting breast borders using methods described in the US patent application titled “Method and Apparatus for Breast Border Detection”, application Ser. No. 11/366,495, by Daniel Russakoff and Akira Hasegawa, filed on Mar. 3, 2006, the entire contents of which are hereby incorporated by reference.
  • Image operations unit 128 C sends the breast mask images to shape registration unit 138 C, which performs shape registration for the breast mask image.
  • shape registration unit 138 C describes the breast mask image using a shape model, to obtain a registered breast shape.
  • Shape registration unit 138 C retrieves information about the shape model from reference data unit 158 C, which stores parameters that define the shape model.
  • the reference data unit 158 C is similar to reference data unit 158 A from FIG. 4 .
  • the reference data unit 158 C stores shape models, and may also store probabilistic atlases.
  • a shape model stored by reference data unit 158 C can be generated off-line, using training breast images. Details on generation of a breast shape model using sets of training breast images can be found in the co-pending non-provisional application titled “Method and Apparatus for Probabilistic Atlas Based on Shape Modeling Technique”, the entire contents of which are hereby incorporated by reference.
  • a shape model stored by reference data unit 158 C includes a baseline breast atlas image and a set of deformation modes.
  • Shape registration unit 138 C may perform shape registration in a manner similar to shape registration unit 138 A, as described at FIG. 6 .
  • Shape registration unit 138 C receives from image operations unit 128 C a breast mask image B mask of unknown mammogram view.
  • B mask could be, for example, an ML mammogram view for which the view direction of left or right is not known.
  • Shape registration unit 138 C fits the breast mask image B mask to a shape model M associated with one of left or right views, and obtains a registered image R 1 . Shape registration unit 138 C then flips the breast mask image B mask about a vertical axis to obtain a flipped breast mask B mask — Flipped , and then fits the flipped breast mask image B mask — Flipped to the same shape model M, to obtain a registered image R 2 .
  • View detection unit 148 C receives breast mask images B mask and B mask — Flipped , and registered images R 1 and R 2 . View detection unit 148 C then compares the fit of R 1 to B mask , and the fit of R 2 to B mask — Flipped . If the fit of R 1 to B mask is better than the fit of R 2 to B mask — Flipped , then the view associated with shape model M is the view of the breast image B mask . On the other hand, if the fit of R 2 to B mask — Flipped is better than the fit of R 1 to B mask , then the view associated with shape model M is the view of breast image B mask — Flipped . The view direction of the breast mask image B mask is hence detected. View detection results are output to printing unit 48 , display 68 , and/or image output unit 58 .
  • the view of breast mask image B_mask may also be detected by comparison to a baseline shape.
  • Let B_a be the baseline breast atlas shape associated with the shape model M.
  • View detection unit 148C compares the differences between R1 and B_a, and the differences between R2 and B_a. If the differences between R1 and B_a are smaller than the differences between R2 and B_a, then the view associated with baseline breast atlas shape B_a (and hence with shape model M) is the view of breast image B_mask. On the other hand, if the differences between R2 and B_a are smaller than the differences between R1 and B_a, then the view associated with baseline breast atlas shape B_a (and hence with shape model M) is the view of breast image B_mask_Flipped.
  • the view of breast mask image B_mask may also be detected by direct comparison of B_mask and B_mask_Flipped with B_a, without performing shape registration of B_mask and B_mask_Flipped. If the differences between B_mask and B_a are smaller than the differences between B_mask_Flipped and B_a, then the view associated with baseline breast atlas shape B_a is the view of breast image B_mask. On the other hand, if the differences between B_mask and B_a are larger than the differences between B_mask_Flipped and B_a, then the view associated with baseline breast atlas shape B_a is the view of breast image B_mask_Flipped.
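The view-detection comparison described above can be sketched in a few lines. This is an illustrative sketch only, not the patented implementation: the masks are assumed to be binary numpy arrays, the `register_to_model` callable stands in for the shape registration unit, and the Dice overlap used as the "fit" measure is an assumption (the application does not specify a particular similarity metric).

```python
import numpy as np

def dice(a, b):
    """Dice overlap between two binary masks (1.0 = identical).
    Used here as a stand-in for the unspecified fit measure."""
    inter = np.logical_and(a, b).sum()
    return 2.0 * inter / (a.sum() + b.sum())

def detect_view(b_mask, register_to_model, model_view="left"):
    """Decide whether b_mask matches the shape model's view or its mirror.

    register_to_model: hypothetical callable that fits a mask to the
    shape model M and returns the registered image (R1 or R2).
    """
    b_flipped = np.fliplr(b_mask)        # flip about the vertical axis
    r1 = register_to_model(b_mask)       # fit the original mask
    r2 = register_to_model(b_flipped)    # fit the flipped mask
    # a better fit of R1 to B_mask means the image is in the model's view
    if dice(r1, b_mask) >= dice(r2, b_flipped):
        return model_view
    return "right" if model_view == "left" else "left"
```

With an identity-like registration that always returns the baseline shape, this reduces to the direct-comparison variant described in the last bullet above.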
  • FIG. 12 is a block diagram of an image processing unit 39 for feature removal/positioning including a training system 772 according to a fourth embodiment of the present invention.
  • the image processing unit 39 includes the following components: an image operations unit 620 ; a baseline shape unit 710 ; a shape parameterization unit 720 ; a deformation analysis unit 730 ; a training shape registration unit 740 ; an atlas output unit 750 ; an image operations unit 128 ; a shape registration unit 138 ; a feature removal and positioning unit 148 ; and a reference data unit 158 .
  • Image operations unit 620 , baseline shape unit 710 , shape parameterization unit 720 , deformation analysis unit 730 , training shape registration unit 740 , and atlas output unit 750 are included in a training system 772 .
  • Training shape registration unit 740 and atlas output unit 750 are optional, and may be included depending on the application.
  • Image operations unit 128 , shape registration unit 138 , feature removal and positioning unit 148 , and reference data unit 158 are included in an operation system 38 .
  • Operation of the image processing unit 39 can generally be divided into two stages: (1) training; and (2) operation for positioning and for feature removal or detection.
  • the image operations unit 620 , baseline shape unit 710 , shape parameterization unit 720 , deformation analysis unit 730 , training shape registration unit 740 , and atlas output unit 750 train to generate a shape model and a probabilistic feature atlas for breast shapes.
  • the knowledge accumulated through training by training system 772 is sent to reference data unit 158 .
  • Image operations unit 620 and shape model unit 630 train to generate a shape model.
  • Optional probabilistic atlas generation unit 640 trains to generate a probabilistic atlas.
  • the shape model and the probabilistic atlas are sent and stored in reference data unit 158 .
  • the image operations unit 128 , the shape registration unit 138 , the feature removal and positioning unit 148 , and the reference data unit 158 may function in like manner to the corresponding elements of the first, second, or third embodiments illustrated in FIGS. 4 , 9 , and 11 , or as a combination of two or more of the first, second, and third embodiments illustrated in FIGS. 4 , 9 , and 11 .
  • reference data unit 158 provides reference data training knowledge to shape registration unit 138 and to feature removal and positioning unit 148 , for use in nipple detection, view detection, and artifact removal from breast images.
  • the principles involved in the operation for artifact removal from breast images have been described in FIGS. 9 , 5 , 6 , 7 , 8 A, 8 B, 8 C, 8 D, 8 E, 8 F, 8 G, 8 H, 8 I and 8 J.
  • the principles involved in the operation for view detection for new breast images have been described in FIGS. 11 , 5 , 6 , 7 , 8 A, 8 B, 8 C, 8 D, 8 E, 8 F, 8 G, 8 H, 8 I and 8 J.
  • image operations unit 620 receives a set of training breast images from image input unit 28 , performs preprocessing and preparation operations on the breast images, creates training breast mask images, and stores/extracts information about breast images, such as view of mammograms. Additional details regarding operation of image operations unit 620 are described in the co-pending non-provisional application titled “Method and Apparatus for Probabilistic Atlas Based on Shape Modeling Technique”, the entire contents of which are hereby incorporated by reference. Image operations unit 620 may create breast mask images by extracting breast borders using methods described in the US patent application titled “Method and Apparatus for Breast Border Detection”, application Ser. No.
  • Baseline shape unit 710 receives training breast mask images from image operations unit 620 , and generates a baseline breast atlas shape such as, for example, a mean breast shape, from the training breast mask images.
  • Baseline shape unit 710 may align the centers of mass of the training breast mask images. The alignment of centers of mass of training breast mask images results in a probabilistic map in which the brighter a pixel is, the more likely it is for the pixel to appear in a training breast mask image. A probability threshold may then be applied to the probabilistic map, to obtain a baseline breast atlas shape, such as, for example, a mean breast shape. Additional details regarding operation of baseline shape unit 710 are described in the co-pending non-provisional application titled “Method and Apparatus for Probabilistic Atlas Based on Shape Modeling Technique”, the entire contents of which are hereby incorporated by reference.
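The center-of-mass alignment and thresholding described above can be sketched as follows. This is a simplified illustration under stated assumptions: the training masks are binary numpy arrays of equal size, nearest-neighbor shifting is used for the alignment, and the 0.5 probability threshold is an assumed value (the application leaves the threshold unspecified).

```python
import numpy as np
from scipy import ndimage

def baseline_shape(masks, threshold=0.5):
    """Baseline (e.g. mean) breast shape from binary training masks,
    via center-of-mass alignment and probability thresholding."""
    h, w = masks[0].shape
    target = np.array([(h - 1) / 2.0, (w - 1) / 2.0])  # common center pixel
    prob_map = np.zeros((h, w))
    for m in masks:
        cy, cx = ndimage.center_of_mass(m)             # mask center of mass
        shift = target - np.array([cy, cx])
        # align the mask so its center of mass sits at the common center
        prob_map += ndimage.shift(m.astype(float), shift, order=0)
    prob_map /= len(masks)  # brighter pixel = covered by more training masks
    return prob_map >= threshold, prob_map
```

The returned probability map is the intermediate "brighter pixel, more likely" image described above; thresholding it yields the baseline breast atlas shape.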
  • Shape parameterization unit 720 receives the training breast mask images and the baseline breast atlas shape, and warps the training breast mask images onto the baseline breast atlas shape, to define parameterization of breast shape.
  • Shape parameterization unit 720 may use shape parameterization techniques adapted from “Automatic Generation of Shape Models Using Nonrigid Registration with a Single Segmented Template Mesh” by G. Heitz, T. Rohlfing and C. Maurer, Proceedings of Vision, Modeling and Visualization, 2004, the entire contents of which are hereby incorporated by reference.
  • Control points may be placed along the edges of the baseline breast atlas shape.
  • a deformation grid is generated using the control points. Using the deformation grid, the control points are warped onto training breast mask images.
  • Shape information for training breast mask images is then given by the corresponding warped control points together with centers of mass of the shapes defined by the warped control points.
  • Warping of control points from the baseline breast atlas shape onto training breast mask images may be performed by non-rigid registration, with B-splines transformations used to define warps from baseline breast atlas shape to training breast mask images.
  • Shape parameterization unit 720 may perform non-rigid registration using techniques discussed in “Automatic Construction of 3-D Statistical Deformation Models of the Brain Using Nonrigid Registration”, by D. Rueckert, A. Frangi and J. Schnabel, IEEE Transactions on Medical Imaging, 22(8), p. 1014-1025, August 2003, the entire contents of which are hereby incorporated by reference.
  • Shape parameterization unit 720 outputs shape representations for training breast mask images. Additional details regarding operation of shape parameterization unit 720 are described in the co-pending non-provisional application titled “Method and Apparatus for Probabilistic Atlas Based on Shape Modeling Technique”, the entire contents of which are hereby incorporated by reference.
  • Deformation analysis unit 730 uses breast shape parameterization results to learn a shape model that describes how shape varies from breast to breast. Using representations of shape for the training breast mask images, deformation analysis unit 730 finds the principal modes of deformation between the training breast mask images and the baseline breast atlas shape. Deformation analysis unit 730 may use Principal Components Analysis (PCA) techniques to find the principal modes of deformation. The principal components obtained from PCA represent modes of deformation between training breast mask images and the baseline breast atlas shape. Additional details regarding operation of deformation analysis unit 730 are described in the co-pending non-provisional application titled “Method and Apparatus for Probabilistic Atlas Based on Shape Modeling Technique”, the entire contents of which are hereby incorporated by reference.
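The PCA step described above can be sketched with a plain SVD. This is a generic illustration, not the patented implementation: each training shape is assumed to be flattened into a row vector of control-point coordinates, and the top principal components are taken as the deformation modes.

```python
import numpy as np

def deformation_modes(shapes, n_modes=2):
    """PCA over flattened shape vectors: returns the mean shape, the top
    principal modes of deformation, and their variances."""
    X = np.asarray(shapes, dtype=float)   # one row per training shape
    mean = X.mean(axis=0)
    # SVD of the centered data yields the principal components
    _, s, vt = np.linalg.svd(X - mean, full_matrices=False)
    modes = vt[:n_modes]                  # top deformation modes
    variance = s[:n_modes] ** 2 / (len(X) - 1)
    return mean, modes, variance

def synthesize(mean, modes, coeffs):
    """A shape in the model: baseline plus a linear combination of modes."""
    return mean + np.asarray(coeffs) @ modes
```

`synthesize` expresses the shape model of the following bullets: the baseline breast atlas shape plus a linear combination of the learned deformation modes.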
  • the baseline breast atlas shape and the modes of deformation between training breast mask images and the baseline breast atlas shape define a shape model.
  • a shape model can be obtained for each mammogram view.
  • Shape model information is sent to reference data unit 158 , to be used during operation of image processing unit 39 .
  • Training shape registration unit 740 receives data that defines the shape model. Training shape registration unit 740 then fits training breast mask images with their correct shape representations, which are linear combinations of the principal modes of shape variation. Training shape registration unit 740 may use the downhill simplex method, also known as the Nelder-Mead or amoeba algorithm, to optimize parameters of the shape model for each training breast mask image in the training dataset, and optimally describe training breast mask images using the shape model. Additional details regarding operation of training shape registration unit 740 are described in the co-pending non-provisional application titled “Method and Apparatus for Probabilistic Atlas Based on Shape Modeling Technique”, the entire contents of which are hereby incorporated by reference.
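The downhill simplex fit described above can be sketched with SciPy's Nelder-Mead optimizer. This is an illustrative sketch: the squared-error cost between the synthesized shape and the target shape vector is an assumption, since the application does not spell out the cost function used here.

```python
import numpy as np
from scipy.optimize import minimize

def fit_shape(target, mean, modes):
    """Optimize mode coefficients so that mean + coeffs @ modes best
    matches the target shape vector, using downhill simplex
    (Nelder-Mead), as described above."""
    def cost(coeffs):
        registered = mean + coeffs @ modes   # shape model instance
        return np.sum((registered - target) ** 2)

    res = minimize(cost, x0=np.zeros(len(modes)), method="Nelder-Mead")
    return res.x, mean + res.x @ modes
```

The returned coefficients are the shape representation of the training mask: its position in the space spanned by the principal modes of deformation.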
  • Atlas output unit 750 receives from training shape registration unit 740 the results of shape registration for the set of training breast mask images analyzed.
  • the set of training breast mask images have features that have been previously localized. Features could be cancer structures, benign structures, vessel areas, etc.
  • Using the shape registration results, the localized features in the training breast mask images are mapped from the training breast mask images onto the baseline breast atlas shape.
  • An atlas is created with locations of the features in the baseline breast atlas shape. Since a large number of training breast mask images with previously localized features are used, the atlas is a probabilistic atlas that gives the probability for feature presence at each pixel inside the baseline breast atlas shape. One probabilistic atlas may be generated for each mammogram view.
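Once the localized features have been mapped into atlas space, the probabilistic atlas described above reduces to a per-pixel frequency. A minimal sketch, assuming the mapped features are binary masks already warped onto the baseline breast atlas shape:

```python
import numpy as np

def build_atlas(feature_masks):
    """Probabilistic feature atlas: the fraction of training masks in
    which the feature is present at each pixel of the baseline shape
    (one such atlas would be built per mammogram view)."""
    stack = np.asarray(feature_masks, dtype=float)  # masks in atlas space
    return stack.mean(axis=0)                       # per-pixel probability
```

With many training images, this empirical frequency approximates the probability of feature presence at each pixel inside the baseline breast atlas shape.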
  • the probabilistic feature atlases for various breast views are sent to reference data unit 158 , to be used during operation of image processing unit 39 . Additional details regarding operation of atlas output unit 750 are described in the co-pending non-provisional application titled “Method and Apparatus for Probabilistic Atlas Based on Shape Modeling Technique”, the entire contents of which are hereby incorporated by reference.
  • Image operations unit 620 , baseline shape unit 710 , shape parameterization unit 720 , deformation analysis unit 730 , training shape registration unit 740 , atlas output unit 750 , image operations unit 128 , shape registration unit 138 , feature removal and positioning unit 148 , and probabilistic atlas reference unit 158 are software systems/applications.
  • Image operations unit 620 , baseline shape unit 710 , shape parameterization unit 720 , deformation analysis unit 730 , training shape registration unit 740 , atlas output unit 750 , image operations unit 128 , shape registration unit 138 , feature removal and positioning unit 148 , and probabilistic atlas reference unit 158 may also be purpose built hardware such as FPGA, ASIC, etc.
  • Methods and apparatuses disclosed in this application can be used for breast segmentation, artifact removal, mammogram view identification, nipple detection, etc. Methods and apparatuses disclosed in this application can be combined with methods and apparatuses disclosed in the co-pending non-provisional application titled “Method and Apparatus of Using Probabilistic Atlas for Cancer Detection”, the entire contents of which are hereby incorporated by reference, to perform breast segmentation, artifact removal, mammogram view identification, and nipple detection, together with cancer detection for mammography images.
  • Shape models and probabilistic atlases generated using techniques described in the co-pending non-provisional application titled “Method and Apparatus for Probabilistic Atlas Based on Shape Modeling Technique”, the entire contents of which are hereby incorporated by reference, can be used for breast segmentation, artifact removal, mammogram view identification, nipple detection, and cancer detection. Additional applications, such as temporal subtraction between breast images can be implemented using methods and apparatuses disclosed in this application, and methods and apparatuses disclosed in “Method and Apparatus of Using Probabilistic Atlas for Cancer Detection”.
  • the methods and apparatuses disclosed in this application can be used for automatic detection of other features besides nipples in breasts.
  • the methods and apparatuses can be used for feature removal, feature detection, feature positioning, and segmentation for other anatomical parts besides breasts, by using shape modeling techniques for the anatomical parts and atlases for locations of features in the anatomical parts.
  • the methods and apparatuses disclosed in this application can be coupled with methods and apparatuses from “Method and Apparatus of Using Probabilistic Atlas for Cancer Detection” using shape models and probabilistic atlases generated as described in “Method and Apparatus for Probabilistic Atlas Based on Shape Modeling Technique”, to perform feature removal, feature detection, feature positioning, and object segmentation for other objects and anatomical objects besides breasts, and other features besides cancer structures or breast features.

Abstract

Methods and apparatuses process images. The method according to one embodiment accesses digital image data representing an image including an object; accesses reference data including a shape model relating to shape variation of objects from a baseline object, the objects and the baseline object being from a class of the object; and removes from the image an element not related to the object, by representing a shape of the object using the shape model.

Description

    CROSS REFERENCE TO RELATED APPLICATIONS
  • This application is a Divisional of co-pending U.S. application Ser. No. 11/640,960 filed on Dec. 19, 2006, and for which priority is claimed under 35 U.S.C. §120; the entire contents of U.S. application Ser. No. 11/640,960 are hereby incorporated by reference.
  • This non-provisional application is related to the following non-provisional applications/patents: U.S. application Ser. No. 11/640,946 filed on Dec. 19, 2006, now U.S. Pat. No. 7,907,768, titled “Method and Apparatus for Probabilistic Atlas Based on Shape Modeling Technique”; and U.S. application Ser. No. 11/640,947 filed on Dec. 19, 2006, now U.S. Pat. No. 7,792,348, titled “Method and Apparatus of Using Probabilistic Atlas for Cancer Detection” which were filed concurrently with U.S. application Ser. No. 11/640,960 which is the parent of the present application; the entire contents of all of the above patent applications and patents are hereby incorporated by reference.
  • BACKGROUND OF THE INVENTION
  • 1. Field of the Invention
  • The present invention relates to a digital image processing technique, and more particularly to a method and apparatus for processing breast images and using a shape model for feature removal/positioning in breast images.
  • 2. Description of the Related Art
  • Mammography images are powerful tools used in diagnosis of medical problems of breasts. An important feature in mammography images is the breast shape. Clearly detected breast shapes can be used to identify breast abnormalities, such as skin retraction and skin thickening, which are characteristics of malignancy. Clear breast shapes also facilitate automatic or manual comparative analysis between mammography images. Accurate breast shapes may convey significant information relating to breast deformation, size, and shape evolution. The position of the nipple with respect to the breast can be used to detect breast abnormalities. Knowledge of the mammogram view is also important for analysis of breast images, since the mammogram view sets the direction and geometry of a breast in a mammogram image.
  • Unclear or inaccurate breast shapes may obscure abnormal breast growth and deformation. Mammography images with unclear, unusual, or abnormal breast shapes or breast borders pose challenges when used in software applications that process and compare breast images.
  • Due to the way the mammogram acquisition process works, the region where the breast tapers off has decreased breast contour contrast, which makes breast borders unclear and poses challenges for breast segmentation. Non-uniform background regions, tags, labels, or scratches present in mammography images may obscure the breast shape and create problems for processing of breast images. Reliable breast shape detection is further complicated by variations in anatomical shapes of breasts and medical imaging conditions. Such variations include: 1) anatomical shape variations between breasts of various people or between breasts of the same person; 2) lighting variations in breast images taken at different times; 3) pose and view changes in mammograms; 4) change in anatomical structure of breasts due to the aging of people; etc. Such breast imaging variations pose challenges for both manual identification and computer-aided analysis of breast shapes.
  • Disclosed embodiments of this application address these and other issues by using methods and apparatuses for feature removal and positioning in breast images based on a shape modeling technique for breasts. The methods and apparatuses also use an atlas for location of features in breasts. The methods and apparatuses automatically determine views of mammograms using a shape modeling technique for breasts. The methods and apparatuses perform automatic breast segmentation, and automatically determine nipple position in breasts. The methods and apparatuses can be used for automatic detection of other features besides nipples in breasts. The methods and apparatuses can be used for feature removal, feature detection, feature positioning, and segmentation for other anatomical parts besides breasts, by using shape modeling techniques for the anatomical parts and atlases for locations of features in the anatomical parts.
  • SUMMARY OF THE INVENTION
  • The present invention is directed to methods and apparatuses for processing images. According to a first aspect of the present invention, an image processing method comprises: accessing digital image data representing an image including an object; accessing reference data including a shape model relating to shape variation of objects from a baseline object, the objects and the baseline object being from a class of the object; and removing from the image an element not related to the object, by representing a shape of the object using the shape model.
  • According to a second aspect of the present invention, an image processing method comprises: accessing digital image data representing an object; accessing reference data including a shape model relating to shape variation from a baseline object shape; and determining a view of the object, the determining step including performing shape registration for the object and for a mirror object of the object, by representing shapes of the object and of the mirror object using the shape model, to obtain an object registered shape and a mirror object registered shape, and identifying the view by performing a comparative analysis between at least one of the shape of the object, the shape of the mirror object, and the baseline object shape, and at least one of the object registered shape, the mirror object registered shape, and the baseline object shape.
  • According to a third aspect of the present invention, an image processing method comprises: accessing digital image data representing an object; accessing reference data including a baseline object including an element, and a shape model relating to shape variation from the baseline object; and determining location of the element in the object, the determining step including generating a correspondence between a geometric part associated with the baseline object and a geometric part associated with the object, by representing a shape of the object using the shape model, to obtain a registered shape, and mapping the element from the baseline object onto the registered shape using the correspondence.
  • According to a fourth aspect of the present invention, an image processing apparatus comprises: an image data input unit for providing digital image data representing an image including an object; a reference data unit for providing reference data including a shape model relating to shape variation of objects from a baseline object, the objects and the baseline object being from a class of the object; and a feature removal unit for removing from the image an element not related to the object, by representing a shape of the object using the shape model.
  • According to a fifth aspect of the present invention, an image processing apparatus comprises: an image data input unit for providing digital image data representing an object; a reference data unit for providing reference data including a shape model relating to shape variation from a baseline object shape; and a view detection unit for determining a view of the object, the view detection unit determining a view by performing shape registration for the object and for a mirror object of the object, by representing shapes of the object and of the mirror object using the shape model, to obtain an object registered shape and a mirror object registered shape, and identifying the view by performing a comparative analysis between at least one of the shape of the object, the shape of the mirror object, and the baseline object shape, and at least one of the object registered shape, the mirror object registered shape, and the baseline object shape.
  • According to a sixth aspect of the present invention, an image processing apparatus comprises: an image data input unit for providing digital image data representing an object; a reference data unit for providing reference data including a baseline object including an element, and a shape model relating to shape variation from the baseline object; and an element detection unit for determining location of the element in the object, the element detection unit determining location by generating a correspondence between a geometric part associated with the baseline object and a geometric part associated with the object, by representing a shape of the object using the shape model, to obtain a registered shape, and mapping the element from the baseline object onto the registered shape using the correspondence.
  • BRIEF DESCRIPTION OF THE DRAWINGS
  • Further aspects and advantages of the present invention will become apparent upon reading the following detailed description in conjunction with the accompanying drawings, in which:
  • FIG. 1 is a general block diagram of a system including an image processing unit for feature removal/positioning according to an embodiment of the present invention;
  • FIG. 2 is a block diagram of an image processing unit for feature removal/positioning according to an embodiment of the present invention;
  • FIG. 3 is a flow diagram illustrating operations performed by an image processing unit for feature removal/positioning according to an embodiment of the present invention illustrated in FIG. 2;
  • FIG. 4 is a block diagram of an image processing unit for nipple detection according to an embodiment of the present invention illustrated in FIG. 2;
  • FIG. 5 is a flow diagram illustrating operations performed by an image operations unit included in an image processing unit for feature removal/positioning according to an embodiment of the present invention illustrated in FIG. 4;
  • FIG. 6 is a flow diagram illustrating operations performed by a shape registration unit included in an image processing unit for feature removal/positioning according to an embodiment of the present invention illustrated in FIG. 4;
  • FIG. 7 is a flow diagram illustrating exemplary operations performed by a feature removal and positioning unit included in an image processing unit for feature removal/positioning according to an embodiment of the present invention illustrated in FIG. 4;
  • FIG. 8A illustrates an exemplary baseline breast atlas shape with identified baseline nipple position for the ML view for a shape model stored in a reference data unit;
  • FIG. 8B illustrates exemplary deformation modes for a shape model stored in a reference data unit;
  • FIG. 8C illustrates another set of exemplary deformation modes for a shape model stored in a reference data unit;
  • FIG. 8D illustrates exemplary aspects of the operation of calculating a cost function by a shape registration unit for a registered shape according to an embodiment of the present invention illustrated in FIG. 6;
  • FIG. 8E illustrates exemplary results of the operation of performing shape registration for breast masks by a shape registration unit according to an embodiment of the present invention illustrated in FIG. 6;
  • FIG. 8F illustrates an exemplary ML view probabilistic atlas for probability of cancer in breasts stored in a reference data unit;
  • FIG. 8G illustrates an exemplary CC view probabilistic atlas for probability of cancer in breasts stored in a reference data unit;
  • FIG. 8H illustrates exemplary aspects of the operation of detecting nipple position for a breast image by an image processing unit for feature removal/positioning according to an embodiment of the present invention illustrated in FIG. 4;
  • FIG. 8I illustrates exemplary aspects of the operation of warping a breast mask to an atlas using triangulation by a feature removal and positioning unit according to an embodiment of the present invention illustrated in FIG. 7;
  • FIG. 8J illustrates exemplary aspects of the operation of bilinear interpolation according to an embodiment of the present invention illustrated in FIG. 7;
  • FIG. 9 is a block diagram of an image processing unit for artifact removal and breast segmentation according to a second embodiment of the present invention illustrated in FIG. 2;
  • FIG. 10A illustrates an exemplary output of an image processing unit for artifact removal and breast segmentation according to a second embodiment of the present invention illustrated in FIG. 9;
  • FIG. 10B illustrates another exemplary output of an image processing unit for artifact removal and breast segmentation according to a second embodiment of the present invention illustrated in FIG. 9;
  • FIG. 11 is a block diagram of an image processing unit for view detection according to a third embodiment of the present invention illustrated in FIG. 2; and
  • FIG. 12 is a block diagram of an image processing unit for feature removal/positioning including a training system according to a fourth embodiment of the present invention.
  • DETAILED DESCRIPTION
  • Aspects of the invention are more specifically set forth in the accompanying description with reference to the appended figures. FIG. 1 is a general block diagram of a system including an image processing unit for feature removal/positioning according to an embodiment of the present invention. The system 100 illustrated in FIG. 1 includes the following components: an image input unit 28; an image processing unit 38; a display 68; an image output unit 58; a user input unit 78; and a printing unit 48. Operation of the system 100 in FIG. 1 will become apparent from the following discussion.
  • The image input unit 28 provides digital image data. Digital image data may be medical images such as mammogram images, brain scan images, X-ray images, etc. Digital image data may also be images of non-anatomical objects, images of people, etc. Image input unit 28 may be one or more of any number of devices providing digital image data derived from a radiological film, a diagnostic image, a photographic film, a digital system, etc. Such an input device may be, for example, a scanner for scanning images recorded on a film; a digital camera; a digital mammography machine; a recording medium such as a CD-R, a floppy disk, a USB drive, etc.; a database system which stores images; a network connection; an image processing system that outputs digital data, such as a computer application that processes images; etc.
  • The image processing unit 38 receives digital image data from the image input unit 28 and performs feature removal/positioning in a manner discussed in detail below. A user, e.g., a radiology specialist at a medical facility, may view the output of image processing unit 38, via display 68 and may input commands to the image processing unit 38 via the user input unit 78. In the embodiment illustrated in FIG. 1, the user input unit 78 includes a keyboard 85 and a mouse 87, but other conventional input devices could also be used.
  • In addition to performing feature removal/positioning in accordance with embodiments of the present invention, the image processing unit 38 may perform additional image processing functions in accordance with commands received from the user input unit 78. The printing unit 48 receives the output of the image processing unit 38 and generates a hard copy of the processed image data. In addition or as an alternative to generating a hard copy of the output of the image processing unit 38, the processed image data may be returned as an image file, e.g., via a portable recording medium or via a network (not shown). The output of image processing unit 38 may also be sent to image output unit 58 that performs further operations on image data for various purposes. The image output unit 58 may be a module that performs further processing of the image data; a database that collects and compares images; a database that stores and uses feature removal/positioning results received from image processing unit 38; etc.
  • FIG. 2 is a block diagram of an image processing unit 38 for feature removal/positioning according to an embodiment of the present invention. As shown in FIG. 2, the image processing unit 38 according to this embodiment includes: an image operations unit 128; a shape registration unit 138; a feature removal and positioning unit 148; and a reference data unit 158. Although the various components of FIG. 2 are illustrated as discrete elements, such an illustration is for ease of explanation and it should be recognized that certain operations of the various components may be performed by the same physical device, e.g., by one or more microprocessors.
  • Generally, the arrangement of elements for the image processing unit 38 illustrated in FIG. 2 performs preprocessing and preparation of digital image data, registration of shapes of objects from digital image data, and feature removal and positioning for objects in digital image data. Image operations unit 128 receives digital image data from image input unit 28. Digital image data can be medical images, which may be obtained through medical imaging. Digital image data may be mammography images, brain scan images, chest X-ray images, etc. Digital image data may also be images of non-anatomical objects, images of people, etc.
  • Operation of image processing unit 38 will be next described in the context of mammography images, for feature removal/positioning using a probabilistic atlas and/or a shape model for breasts. However, the principles of the current invention apply equally to other areas of image processing, for feature removal/positioning using a probabilistic atlas and/or a shape model for other types of objects besides breasts.
  • Image operations unit 128 receives a set of breast images from image input unit 28 and may perform preprocessing and preparation operations on the breast images. Preprocessing and preparation operations performed by image operations unit 128 may include resizing, cropping, compression, color correction, etc., that change size and/or appearance of breast images. Image operations unit 128 may also extract breast shape information from breast images, and may store or extract information about breast images, such as views of mammograms.
  • Image operations unit 128 sends the preprocessed breast images to shape registration unit 138, which performs shape registration for breasts in the breast images. For shape registration, shape registration unit 138 represents breast shapes using a shape model, to obtain registered breast shapes. Shape registration unit 138 retrieves information about the shape model from reference data unit 158, which stores parameters that define the shape model. Reference data unit 158 may also store one or more probabilistic atlases that include information about probability of breast structures at various locations inside breasts, and for various views of breasts recorded in mammograms. Breast structures recorded in probabilistic atlases may be, for example, cancer masses in breasts, benign formations in breasts, breast vessel areas, etc.
  • Feature removal and positioning unit 148 receives registered breast shapes from shape registration unit 138. Feature removal and positioning unit 148 retrieves data for a baseline breast image and/or data for a probabilistic atlas, from reference data unit 158. Using retrieved data from reference data unit 158, feature removal and positioning unit 148 performs removal of features and/or geometric positioning and processing for registered breast shapes. The output of feature removal and positioning unit 148 is breast images with identified features, and/or breast images from which certain features were removed. The output of feature removal and positioning unit 148 may also include information about locations of removed features or locations of other features of interest in breasts, information about orientation/view of breast images, etc. Feature removal and positioning unit 148 outputs breast images, together with positioning and/or feature removal information. Such output results may be output to image output unit 58, printing unit 48, and/or display 68.
  • Operation of the components included in image processing unit 38 illustrated in FIG. 2 will be next described with reference to FIG. 3. Image operations unit 128, shape registration unit 138, feature removal and positioning unit 148, and reference data unit 158 may be implemented as software systems/applications, or as purpose-built hardware such as FPGAs, ASICs, etc.
  • FIG. 3 is a flow diagram illustrating operations performed by an image processing unit 38 for feature removal/positioning according to an embodiment of the present invention illustrated in FIG. 2.
  • Image operations unit 128 receives a breast image from image input unit 28 (S201). Image operations unit 128 performs preprocessing and preparation operations on the breast image (S203). Preprocessing and preparation operations performed by image operations unit 128 may include resizing, cropping, compression, color correction, etc., that change size and/or appearance of breast images. Image operations unit 128 also extracts breast shape information from the breast image (S205), and stores or extracts information about the view of the breast image (S207).
  • Image operations unit 128 sends the preprocessed breast image to shape registration unit 138, which performs shape registration for the breast in the image to obtain a registered breast shape (S209). For shape registration, shape registration unit 138 uses a shape model for breast shapes (S211). The shape model describes how shape varies from breast to breast. The shape model is retrieved from reference data unit 158 (S213).
  • Feature removal and positioning unit 148 receives the registered breast shape from shape registration unit 138. Feature removal and positioning unit 148 retrieves data describing a baseline breast image, which is included in the shape model, from reference data unit 158 (S215). Feature removal and positioning unit 148 may also retrieve from reference data unit 158 data describing a probabilistic feature atlas (S215). The probabilistic atlas includes information about probability of features at various locations inside breasts. Using the retrieved data from reference data unit 158, feature removal and positioning unit 148 performs removal of features from the breast image and/or geometric positioning and processing for the registered breast shape (S217). Feature removal and positioning unit 148 outputs the breast image with identified geometrical orientations, and/or from which certain features were removed (S219). Such output results may be output to image output unit 58, printing unit 48, and/or display 68.
  • FIG. 4 is a block diagram of an image processing unit 38A for nipple detection according to an embodiment of the present invention illustrated in FIG. 2. As shown in FIG. 4, the image processing unit 38A according to this embodiment includes: an image operations unit 128A; a shape registration unit 138A; an atlas warping unit 340; a nipple detection unit 350; and a reference data unit 158A. The atlas warping unit 340 and the nipple detection unit 350 are included in a feature removal and positioning unit 148A.
  • Image operations unit 128A receives a set of breast images from image input unit 28, and may perform preprocessing and preparation operations on the breast images. Preprocessing and preparation operations performed by image operations unit 128A may include resizing, cropping, compression, color correction, etc., that change size and/or appearance of breast images. Image operations unit 128A creates breast mask images including pixels that belong to the breasts in the breast images. Breast mask images are also called breast shape silhouettes in the current application. Breast mask images may be created, for example, by detecting breast borders or breast clusters, for the breasts shown in the breast images. Image operations unit 128A may also store/extract information about breast images, such as views of the mammograms.
  • Image operations unit 128A sends the breast mask images to shape registration unit 138A, which performs shape registration for breast mask images. For shape registration, shape registration unit 138A describes breast mask images using a shape model, to obtain registered breast shapes. Shape registration unit 138A retrieves information about the shape model from reference data unit 158A, which stores parameters that define the shape model.
  • Each mammogram view is associated with a shape model. A shape model may consist of a baseline breast atlas shape and a set of deformation modes. In one embodiment, the baseline breast atlas shape is a mean breast shape representing the average shape of a breast for a given mammogram view. Other baseline breast atlas shapes may also be used. The deformation modes define directions for deformation from contour points of breasts in the breast images, onto corresponding contour points of the breast in the baseline breast atlas shape. The shape model is obtained by training off-line, using large sets of training breast images. A baseline breast atlas shape can be obtained from the sets of training breast images. Deformation modes, describing variation of shapes of training breast images from the baseline breast atlas shape, are also obtained during training. Details on generation of a breast shape model using sets of training breast images can be found in the co-pending non-provisional application titled “Method and Apparatus for Probabilistic Atlas Based on Shape Modeling Technique”, the entire contents of which are hereby incorporated by reference.
  • A baseline breast atlas shape is generated during off-line training from a large number of training breast mask images. The baseline breast atlas shape may be, for example, a mean breast shape obtained by aligning centers of mass of training breast mask images. The alignment of centers of mass of training breast mask images results in a probabilistic map in which the brighter a pixel is, the more likely it is for the pixel to appear in a training breast mask image. A probability threshold may be applied to the probabilistic map, to obtain a mean breast shape in which every pixel has a high probability of appearing in a training breast mask image. Hence, the baseline breast atlas shape illustrates a baseline breast. Additional details regarding generation of a baseline breast atlas shape/mean breast shape can be found in the co-pending non-provisional application titled “Method and Apparatus for Probabilistic Atlas Based on Shape Modeling Technique”, the entire contents of which are hereby incorporated by reference. The baseline breast atlas shape also includes a baseline nipple for the baseline breast. The baseline nipple position is identified in the baseline breast atlas shape during off-line training.
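For illustration only, the center-of-mass alignment, probabilistic map, and thresholding described above may be sketched in Python as follows. This is a simplified sketch under assumed conventions, not the implementation of the embodiment; the function name, canvas size, and nearest-pixel shifting are assumptions.

```python
import numpy as np

def mean_shape_from_masks(masks, canvas=(64, 64), threshold=0.95):
    """Align binary training breast masks by center of mass, average them
    into a probabilistic map, and threshold to get a baseline (mean) shape."""
    acc = np.zeros(canvas, dtype=float)
    cy, cx = canvas[0] // 2, canvas[1] // 2
    for m in masks:
        ys, xs = np.nonzero(m)
        # shift so each mask's center of mass lands on the canvas center
        ty = np.clip(np.round(ys - ys.mean() + cy).astype(int), 0, canvas[0] - 1)
        tx = np.clip(np.round(xs - xs.mean() + cx).astype(int), 0, canvas[1] - 1)
        shifted = np.zeros(canvas, dtype=float)
        shifted[ty, tx] = 1.0
        acc += shifted
    prob_map = acc / len(masks)              # brighter = more likely breast pixel
    return prob_map, prob_map >= threshold   # probabilistic map + mean shape
```

Thresholding at 0.95 keeps only pixels that appear in at least 95% of the aligned training masks, in line with the 95% figure used for the ML-view mean shape of FIG. 8A.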
  • To extract deformation modes for a shape model, training breast mask images are warped onto the baseline breast atlas shape during off-line training, to define parameterization of breast shape. Control points may be placed along the edges of the baseline breast atlas shape. A deformation grid is generated using the control points. Using the deformation grid, the control points are warped onto training breast mask images. Shape representations for the training breast mask images are generated by the corresponding warped control points, together with centers of mass of the shapes defined by the warped control points. Additional details about generating shape representations for training breast images can be found in the co-pending non-provisional application titled “Method and Apparatus for Probabilistic Atlas Based on Shape Modeling Technique”, the entire contents of which are hereby incorporated by reference.
  • Principal modes of deformation between training breast mask images and the baseline breast atlas shape may be determined using the shape representations for the training breast mask images. Principal modes of deformation can be found using Principal Components Analysis (PCA) techniques. The principal components obtained from PCA represent modes of deformation between training breast mask images and the baseline breast atlas shape. Additional details regarding extraction of deformation modes are found in the co-pending non-provisional application titled “Method and Apparatus for Probabilistic Atlas Based on Shape Modeling Technique”, the entire contents of which are hereby incorporated by reference.
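As an illustration of the PCA step, the principal deformation modes may be computed from the stacked shape representations via a singular value decomposition. This is a sketch with assumed names and array layouts, not the co-pending application's implementation.

```python
import numpy as np

def deformation_modes(shape_vectors, k):
    """PCA on stacked contour-point vectors (one row per training shape).
    Returns the mean shape vector, the first k deformation modes, and the
    variance captured by each mode."""
    X = np.asarray(shape_vectors, dtype=float)
    mean = X.mean(axis=0)
    # SVD of the centered data: rows of Vt are the principal directions
    _, s, Vt = np.linalg.svd(X - mean, full_matrices=False)
    modes = Vt[:k]                      # k principal deformation modes
    variance = s ** 2 / (len(X) - 1)    # variance captured by each mode
    return mean, modes, variance
```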
  • The baseline breast atlas shape, and the modes of deformation between training breast mask images and the baseline breast atlas shape define a shape model. Shape models can be obtained during off-line training, for each mammogram view. Shape models are stored in reference data unit 158A.
  • A new breast mask shape received from image operations unit 128A may then be represented using a shape model from reference data unit 158A. A breast mask shape may be expressed as a function of the baseline breast atlas shape, which may be a mean breast shape (Ba) in an exemplary embodiment, and of the shape model deformation modes, as:
  • Breast Shape = p + Ba + Σ_{i=1}^{k} αi Li    (1)
  • where p is an offset (such as a 2D offset) to the mean breast shape Ba to account for a rigid translation of the entire shape, Li, i=1 . . . k is the set of deformation modes of the shape model, and αi, i=1 . . . k are a set of parameters that define the deviations of Breast Shape from the mean breast shape along the axes associated with the principal deformation modes. The parameters αi, i=1 . . . k are specific to each breast mask. Hence, an arbitrary breast mask may be expressed as a sum of the fixed mean breast shape (Ba), a linear combination of fixed deformation modes Li multiplied by coefficients αi, and a 2D offset p. Details on how a mean breast shape/baseline breast atlas shape Ba and deformation modes Li, i=1 . . . k are obtained during training, using training breast images can be found in the co-pending non-provisional application titled “Method and Apparatus for Probabilistic Atlas Based on Shape Modeling Technique”, the entire contents of which are hereby incorporated by reference.
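Equation (1) is a plain linear combination, which may be evaluated, for illustration, as follows; the array layouts and names are assumptions.

```python
import numpy as np

def reconstruct_shape(B_a, modes, alphas, p):
    """Equation (1): Breast Shape = p + B_a + sum_i alpha_i * L_i.
    B_a: (n, 2) contour points of the mean shape; modes: (k, n, 2) deformation
    modes; alphas: (k,) deviation coefficients; p: (2,) rigid 2D offset."""
    B_a = np.asarray(B_a, dtype=float)
    deform = np.tensordot(np.asarray(alphas, dtype=float),
                          np.asarray(modes, dtype=float), axes=1)
    return B_a + deform + np.asarray(p, dtype=float)
```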
  • Each mammogram view vi is associated with a mean breast shape (Ba vi) specific to that view, and with a set of deformation modes Li vi, i=1 . . . kvi specific to that view.
  • For each breast mask image Bmask new received from image operations unit 128A, shape registration unit 138A retrieves the mean breast shape (Ba vi) and the set of deformation modes Li vi, i=1 . . . kvi associated with the view vi of the breast mask image Bmask new. Shape registration unit 138A next identifies the parameters αi, i=1 . . . kvi and the 2D offset p for the breast mask image Bmask new, to fit the breast mask image Bmask new with its correct shape representation of the form:
  • Breast Shape = Ba_vi + p + Σ_{i=1}^{k_vi} αi Li_vi
  • Atlas warping unit 340 receives the registration results for the breast mask image Bmask new from shape registration unit 138A. Registration results for the breast mask image Bmask new include the parameters αi, i=1 . . . kvi for the breast mask image Bmask new and the functional representation
  • Breast Shape = Ba_vi + p + Σ_{i=1}^{k_vi} αi Li_vi
  • for the breast mask image Bmask new. Atlas warping unit 340 then warps the breast mask image Bmask new to the mean breast shape Ba vi. Atlas warping unit 340 may, alternatively, warp the breast mask image Bmask new to a probabilistic feature atlas Avi specific to the view vi of the breast mask image Bmask new. The probabilistic feature atlas data is stored in reference data unit 158A.
  • The probabilistic feature atlas Avi includes an image of the mean breast shape Ba vi for view vi, together with probabilities for presence of a feature at each pixel in the mean breast shape Ba vi. Hence, the probabilistic atlas Avi is a weighted pixel image, in which each pixel of the mean breast shape Ba vi is weighted by the feature probability for that pixel.
  • The probabilistic feature atlas is obtained by training off-line, using large sets of training breast images with previously identified feature structures. Features recorded in probabilistic atlases may be cancer masses in breasts, benign formations in breasts, breast vessel areas, etc. The shapes of training breast images are represented as linear combinations of deformation modes obtained in training. Using the shape representations for the training breast images, previously identified features in the training breast images are mapped to the baseline breast atlas shape obtained in training. By overlapping feature positions from the training images onto the baseline breast atlas shape, a probabilistic atlas containing probabilities for presence of a feature in the baseline breast atlas shape is obtained. Additional details on generation of a probabilistic atlas using sets of training breast images with previously identified features can be found in the co-pending non-provisional application titled “Method and Apparatus for Probabilistic Atlas Based on Shape Modeling Technique”, the entire contents of which are hereby incorporated by reference.
  • After atlas warping unit 340 warps the breast mask image Bmask new to the probabilistic atlas Avi or to the mean breast shape Ba vi, a warped breast mask image Bmask new warped is obtained. Feature probability weights from the probabilistic atlas Avi are associated with pixels in the warped image Bmask new warped. The baseline nipple position from the mean breast shape Ba vi is also associated with pixels in the warped image Bmask new warped.
  • Nipple detection unit 350 receives the warped breast mask image Bmask new warped, together with shape registration information of the form
  • Breast Shape = Ba_vi + p + Σ_{i=1}^{k_vi} αi Li_vi,
  • that establishes a correspondence between pixels of Bmask new warped and pixels of Bmask new.
  • Nipple detection unit 350 warps the Bmask new warped image back to the original Bmask new, and an image Pmask new is obtained. Since the baseline nipple position has been identified in the baseline breast atlas shape during off-line training, and since Bmask new warped has the shape of the baseline breast atlas shape, the image Pmask new includes a warped nipple position from Bmask new warped to Bmask new. Hence, the image Pmask new is an image of the breast mask image Bmask new, in which the position of the nipple has been identified. Therefore, the image Pmask new includes nipple detection results for the original breast mask Bmask new.
  • If atlas warping unit 340 warped the breast mask image Bmask new to probabilistic atlas Avi, the image Pmask new includes feature probabilities for various breast features, at various pixel locations inside the breast mask image Bmask new. Hence, in this case, the image Pmask new is a weighted pixel image, in which each pixel of the breast mask image Bmask new is weighted by the feature probability for that pixel. If the feature is a cancer structure, for example, the image Pmask new is a weighted pixel image, in which each pixel of the breast mask image Bmask new is weighted by the probability for cancer at that pixel. Additional details on mapping feature probabilities from a probabilistic atlas Avi to a breast mask image Bmask new, to obtain a probability map for a feature in a breast mask image Bmask new, can be found in the co-pending non-provisional application titled “Method and Apparatus of Using Probabilistic Atlas for Cancer Detection”, the entire contents of which are hereby incorporated by reference.
  • The identified nipple position in image Pmask new provides very useful information about the nipple and its position with respect to the breast. If the image Pmask new includes cancer probabilities associated with pixels of the Bmask new breast mask from a probabilistic cancer atlas, image Pmask new provides information about the nipple position with respect to probable locations of cancer masses in the Bmask new breast mask. The position of the nipple with respect to the breast can be used to detect breast abnormalities. Since the position of the nipple with respect to the breast is influenced by breast abnormalities, information about nipple position and nipple proximity to high-probability cancer regions in the breast helps in the identification of cancer masses, structural changes, breast abnormalities, etc.
  • The initially identified nipple position in image Pmask new (and hence in breast mask image Bmask new) can also be a starting point for performing a refinement of the nipple position. Refinement of the nipple position can be performed, for example, in regions adjacent to or including the initially identified nipple position in image Pmask new.
  • Nipple detection unit 350 outputs the image Pmask new. The image Pmask new may be output to image output unit 58, printing unit 48, and/or display 68.
  • Image operations unit 128A, shape registration unit 138A, atlas warping unit 340, nipple detection unit 350, and reference data unit 158A may be implemented as software systems/applications, or as purpose-built hardware such as FPGAs, ASICs, etc.
  • FIG. 5 is a flow diagram illustrating operations performed by an image operations unit 128A included in an image processing unit 38A for feature removal/positioning according to an embodiment of the present invention illustrated in FIG. 4.
  • Image operations unit 128A receives a raw or preprocessed breast image from image input unit 28 (S401). The breast image may be retrieved by image operations unit 128A from, for example, a breast imaging apparatus, a database of breast images, etc. Image operations unit 128A may perform preprocessing operations on the breast image (S403). Preprocessing operations may include resizing, cropping, compression, color correction, etc.
  • Image operations unit 128A creates a breast mask image for the breast image (S405). The breast mask image includes pixels that belong to the breast. The breast mask image may be created by detecting breast borders for the breast shown in the breast image. Image operations unit 128A may create a breast mask image by detecting breast borders using methods described in the US patent application titled “Method and Apparatus for Breast Border Detection”, application Ser. No. 11/366,495, by Daniel Russakoff and Akira Hasegawa, filed on Mar. 3, 2006, the entire contents of which are hereby incorporated by reference. With the techniques described in the “Method and Apparatus for Breast Border Detection” application, pixels in the breast image are represented in a multi-dimensional space, such as a 4-dimensional space with x-locations of pixels, y-locations of pixels, intensity value of pixels, and distance of pixels to a reference point. K-means clustering of pixels is run in the multi-dimensional space, to obtain clusters for the breast image. Cluster merging and connected components analysis are then run using relative intensity measures, brightness pixel values, and cluster size, to identify a cluster corresponding to the breast in the breast image. A set of pixels, or a mask, containing breast pixels is obtained. The set of pixels for a breast forms a breast mask Bmask.
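For illustration, the 4-dimensional clustering step may be sketched with plain Lloyd-style k-means as below. This simplified sketch omits the cluster-merging and connected-components stages of the referenced method; the reference point, the feature normalization, and the parameter choices are assumptions.

```python
import numpy as np

def breast_mask(image, k=3, iters=20, seed=0):
    """Sketch of the 4-D clustering step: each pixel becomes a feature vector
    (x, y, intensity, distance to a reference corner); k-means groups pixels
    and the cluster with the highest mean intensity is taken as the breast."""
    h, w = image.shape
    ys, xs = np.mgrid[0:h, 0:w]
    dist = np.hypot(ys, xs)                          # distance to corner (0, 0)
    feats = np.stack([xs, ys, image, dist], axis=-1).reshape(-1, 4).astype(float)
    feats /= feats.std(axis=0) + 1e-9                # normalize feature scales
    rng = np.random.default_rng(seed)
    centers = feats[rng.choice(len(feats), k, replace=False)]
    for _ in range(iters):                           # plain Lloyd iterations
        labels = np.argmin(((feats[:, None] - centers) ** 2).sum(-1), axis=1)
        for j in range(k):
            if np.any(labels == j):
                centers[j] = feats[labels == j].mean(axis=0)
    # breast = cluster whose members have the highest mean raw intensity
    raw = image.reshape(-1)
    best = max(range(k),
               key=lambda j: raw[labels == j].mean() if np.any(labels == j) else -1)
    return (labels == best).reshape(h, w)
```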
  • Other breast border detection techniques may also be used by image operations unit 128A to obtain a breast mask image.
  • Image operations unit 128A also stores information about the breast image, such as information about the view of the mammogram (S407). Examples of mammogram views are MLL (medio-lateral left), MLR (medio-lateral right), CCL (cranio-caudal left), CCR (cranio-caudal right), RCC, LRR, LMLO (left medio-lateral oblique), and RMLO (right medio-lateral oblique). Image operations unit 128A outputs the breast mask image, and information about the view of the breast image (S409), to shape registration unit 138A.
  • FIG. 6 is a flow diagram illustrating operations performed by a shape registration unit 138A included in an image processing unit 38A for feature removal/positioning according to an embodiment of the present invention illustrated in FIG. 4.
  • Shape registration unit 138A receives from image operations unit 128A a preprocessed breast image, represented as a breast mask image Bmask new (S470). Information about the mammogram view vi of the breast image is also received (S470). Shape registration unit 138A retrieves from reference data unit 158A data that defines the shape model for that view, including a mean breast shape (Ba vi) and shape model deformation modes Li vi, i=1 . . . kvi for the view vi of the breast mask image Bmask new (S472).
  • Shape registration unit 138A fits the breast mask image Bmask new with its correct shape representation as a linear combination of the deformation modes,
  • Shape = Ba_vi + p + Σ_{i=1}^{k_vi} αi Li_vi,
  • by determining parameters αi, i=1 . . . kvi and the 2D offset p.
  • To fit the breast mask image Bmask new with its correct shape representation, shape registration unit 138A optimizes the αi values, together with an x offset px and a y offset py, for a total of k+2 parameters: (px, py, α), where α = (α1, α2, ..., αk) and p = (px, py) (S478). For optimization, shape registration unit 138A uses a cost function defined as the mean distance to edge. For a (px, py, α) parameter set, shape registration unit 138A calculates the new shape resulting from this parameter set by the formula
  • Shape = Ba_vi + p + Σ_{i=1}^{k_vi} αi Li_vi (S480).
  • The center of mass (Shape.COM) of Shape is then calculated (S480). For each shape point on the exterior (border) of Shape, shape registration unit 138A generates a ray containing the Shape.COM and the shape point, finds the intersection point of the ray with the edge of Bmask new, and calculates how far the shape point is from the intersection point obtained in this manner. This technique is further illustrated in FIG. 8D. In an alternative embodiment, the minimum distance from the shape point to the edge of Bmask new is calculated. The mean distance for the Shape points to the edges of the breast mask image Bmask new is then calculated (S482). Optimized αi values and 2D offset p are selected for which the mean distance of shape points of Shape to the breast mask image Bmask new edges attains a minimum (S484).
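The alternative cost variant mentioned above (minimum distance from each shape point to the breast mask edge, averaged over the shape points) may be sketched as follows; the 4-neighborhood edge definition and the (row, column) point convention are assumptions.

```python
import numpy as np

def mean_distance_to_edge(shape_points, mask):
    """Cost for shape registration (alternative variant): for each contour
    point of the candidate Shape, take the minimum distance to any edge
    pixel of the breast mask, then average. Points are (row, col)."""
    m = mask.astype(bool)
    # edge pixels: mask pixels with at least one background 4-neighbor
    interior = np.zeros_like(m)
    interior[1:-1, 1:-1] = (m[1:-1, 1:-1] & m[:-2, 1:-1] & m[2:, 1:-1]
                            & m[1:-1, :-2] & m[1:-1, 2:])
    ey, ex = np.nonzero(m & ~interior)
    edges = np.stack([ey, ex], axis=1).astype(float)
    pts = np.asarray(shape_points, dtype=float)
    d = np.linalg.norm(pts[:, None, :] - edges[None, :, :], axis=-1)
    return d.min(axis=1).mean()   # mean nearest-edge distance over shape points
```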
  • Shape registration unit 138A may use the downhill simplex method, also known as the Nelder-Mead or amoeba algorithm (S486), to fit the breast mask image Bmask new with its correct shape representation, by minimizing the distances of the edge shape points of Shape to the edges of the breast mask image Bmask new. The downhill simplex method is a minimization algorithm for scalar-valued functions that does not require derivatives, and it is typically very robust.
  • With the Nelder-Mead method, the k+2 parameters (px, py, α) form a simplex in a multi-dimensional space. The Nelder-Mead method minimizes the selected cost function, by moving points of the simplex to decrease the cost function. A point of the simplex may be moved by reflections against a plane generated by other simplex points, by reflection and expansion of the simplex obtained from a previous reflection, by contraction of the simplex, etc.
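For illustration, the (px, py, α) optimization may be sketched with SciPy's Nelder-Mead implementation, assuming SciPy is available; the cost callback here stands in for the mean distance-to-edge function, and all names are assumptions.

```python
import numpy as np
from scipy.optimize import minimize

def fit_shape(B_a, modes, cost, x0=None):
    """Optimize the k+2 parameters (p_x, p_y, alpha_1..alpha_k) with the
    downhill simplex (Nelder-Mead) method; `cost` scores candidate contour
    points, e.g. mean distance to the breast-mask edge."""
    k = len(modes)
    B_a, modes = np.asarray(B_a, float), np.asarray(modes, float)

    def objective(params):
        p, alphas = params[:2], params[2:]
        # Shape = B_a + p + sum_i alpha_i * L_i
        return cost(B_a + np.tensordot(alphas, modes, axes=1) + p)

    x0 = np.zeros(k + 2) if x0 is None else np.asarray(x0, float)
    res = minimize(objective, x0, method='Nelder-Mead',
                   options={'xatol': 1e-8, 'fatol': 1e-8, 'maxiter': 5000})
    return res.x[:2], res.x[2:], res.fun  # offset p, coefficients alpha, cost
```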
  • Once parameters of the shape model are optimized for the breast mask image Bmask new, shape registration unit 138A outputs the shape registration results for the breast mask image Bmask new to atlas warping unit 340 (S492).
  • FIG. 7 is a flow diagram illustrating exemplary operations performed by a feature removal and positioning unit 148A included in an image processing unit 38A for feature removal/positioning according to an embodiment of the present invention illustrated in FIG. 4. FIG. 7 illustrates exemplary operations that may be performed by an atlas warping unit 340 and a nipple detection unit 350 included in a feature removal and positioning unit 148A.
  • Atlas warping unit 340 warps the registered shape for breast mask image Bmask new to a probabilistic atlas Avi, or to a baseline breast atlas shape Ba vi, associated with the view vi of the breast mask image Bmask new. Warping to probabilistic atlas Avi or to the baseline breast atlas shape Ba vi may be performed by triangulating the breast mask Bmask new based on its center of mass and edge points (S501). After shape registration has been performed by shape registration unit 138A, each triangle in the breast mask Bmask new corresponds to a triangle in the probabilistic atlas Avi and to a triangle in the baseline breast atlas shape Ba vi (S503), as the probabilistic atlas Avi has the shape of the baseline breast atlas shape Ba vi. Pixels inside corresponding triangles of the atlas Avi (or Ba vi) can be warped back and forth into triangles of breast mask Bmask new, using a bilinear interpolation in 2D (S503). In an exemplary implementation, the bilinear interpolation in 2D may be performed by multiplying each of the triangle vertices by appropriate relative weights, as further described in FIG. 8J.
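The warp by "relative weights" on the triangle vertices is a barycentric-coordinate interpolation, which may be sketched, for one point and one pair of corresponding triangles, as follows; the function name and conventions are assumptions.

```python
import numpy as np

def warp_point(pt, tri_src, tri_dst):
    """Map a point from a source triangle to the corresponding destination
    triangle: express it in barycentric coordinates (the relative weights on
    the source vertices), then recombine the destination vertices with the
    same weights."""
    a, b, c = (np.asarray(v, dtype=float) for v in tri_src)
    T = np.column_stack([b - a, c - a])
    u, v = np.linalg.solve(T, np.asarray(pt, dtype=float) - a)
    w = np.array([1 - u - v, u, v])                  # barycentric weights
    return w @ np.asarray(tri_dst, dtype=float)      # same weights, new vertices
```

Warping a whole triangle's pixels (or the baseline nipple position) back and forth reduces to applying this mapping with the source and destination triangles swapped as needed.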
  • Nipple detection unit 350 warps back corresponding triangles of the atlas Avi (or Ba vi), to triangles in breast mask Bmask new (S505). The nipple position for the breast mask image Bmask new is the warped nipple position from triangles of the baseline breast atlas shape Ba vi (or probabilistic atlas Avi) to triangles of the breast mask image Bmask new (S507). Hence, an image with a location for the nipple is obtained for the breast mask Bmask new (S507).
  • Feature probabilities associated with pixels in triangles of the atlas image Avi may become associated with pixels in triangles of breast mask Bmask new, as further described in the co-pending non-provisional application titled “Method and Apparatus of Using Probabilistic Atlas for Cancer Detection”, the entire contents of which are hereby incorporated by reference. Hence, the image with an identified nipple location may also contain feature probability values associated with image pixels, for features such as cancer structures, benign structures, etc.
  • FIG. 8A illustrates an exemplary baseline breast atlas shape for the ML view, with identified nipple position. The exemplary baseline breast atlas shape for the ML view is included in a shape model stored in a reference data unit 158. The baseline breast atlas shape in FIG. 8A is a mean breast shape representing the set of pixels that have 95% or more chance of appearing in a breast mask image in the ML view. The nipple N has been identified on the mean breast shape.
  • FIG. 8B illustrates exemplary deformation modes for a shape model stored in the reference data unit 158. The breast shape I510 in FIG. 8B is an exemplary baseline breast atlas shape (mean shape, in this case) for the ML view.
  • The first 3 modes (L1, L2, L3) of deformation are shown. The first mode of deformation is L1. Contours D2 and D3 define the deformation mode L1. The deformation mode L1 consists of directions and proportional length of movement for each contour point from the D2 contour to a corresponding contour point from the D3 contour. Contours D4 and D5 define the second deformation mode L2, and contours D6 and D7 define the third deformation mode L3.
  • The deformation modes shown in FIG. 8B may be obtained by training, using techniques described in the co-pending non-provisional application titled “Method and Apparatus for Probabilistic Atlas Based on Shape Modeling Technique”, the entire contents of which are hereby incorporated by reference.
  • FIG. 8C illustrates another set of exemplary deformation modes for a shape model stored in the reference data unit 158. The deformation modes shown in FIG. 8C were obtained by training a shape model using 4900 training breast images of ML view, using techniques described in the co-pending non-provisional application titled “Method and Apparatus for Probabilistic Atlas Based on Shape Modeling Technique”, the entire contents of which are hereby incorporated by reference. 17 deformation modes, capturing 99% of the variance in the breast images data set, were obtained. The representations of the first 4 modes L1, L2, L3 and L4 are shown in FIG. 8C. The representations of the first 4 modes L1, L2, L3 and L4 together capture 85% of the data's variance. For each mode shown in FIG. 8C, the mean breast shape (baseline breast atlas shape) for the ML view is plotted with dots (points), while the arrows represent the distance traveled by one point for that mode from −2 standard deviations to +2 standard deviations of the mean breast shape. Mode L1 captures 52% of the variance in the breast images data set, mode L2 captures 18% of the variance in the breast images data set, mode L3 captures 10% of the variance in the breast images data set, and mode L4 captures 4% of the variance in the breast images data set. The rest of the deformation modes (L5 to L17) are not shown.
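For illustration, selecting how many deformation modes to retain for a target fraction of total variance (99% above) may be sketched as follows; the function name is an assumption and the variance values used below are illustrative.

```python
import numpy as np

def modes_for_variance(variances, target=0.99):
    """Given per-mode variances from PCA (in descending order), return how
    many deformation modes are needed to capture `target` of the total."""
    v = np.asarray(variances, dtype=float)
    frac = np.cumsum(v) / v.sum()               # cumulative explained variance
    return int(np.searchsorted(frac, target) + 1)
```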
  • FIG. 8D illustrates exemplary aspects of the operation of calculating a cost function by a shape registration unit 138A for a registered shape according to an embodiment of the present invention illustrated in FIG. 6. Shape registration is performed for the breast mask Bmask new I511 using a parameter set αi, i=1 . . . k, and a 2D offset p. A shape bounded by contour C512 is obtained from the formula
  • Shape = Ba_vi + p + Σ_{i=1}^{k} αi Li_vi,
  • where Ba_vi is the mean breast shape for view vi of the breast mask Bmask new, and Li_vi, i=1 . . . k_vi, are the shape model deformation modes. The center of mass COM for the Shape bounded by contour C512 is found. For a point S1 on the contour (perimeter) of Shape, a line is drawn through the COM point. The line intersects the contour of breast mask Bmask new I511 at point S2. The distance to edge is the distance d between points S1 and S2. Distances d are obtained for all points on the contour (perimeter) C512 of Shape, and a cost function is obtained as the mean of all distances d.
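The shape synthesis formula and the distance-to-edge cost described above can be sketched as follows. This is an illustrative NumPy sketch, not part of the disclosure: the function names are hypothetical, and the ray march is a coarse pixel-stepped approximation of the exact line-contour intersection.

```python
import numpy as np

def synthesize_shape(mean_shape, modes, alphas, offset):
    """Shape = Ba_vi + p + sum_i alpha_i * Li_vi.

    mean_shape: (N, 2) contour points of the mean breast shape Ba_vi
    modes:      (k, N, 2) deformation modes Li_vi
    alphas:     (k,) mode weights
    offset:     (2,) rigid 2D offset p
    """
    return mean_shape + offset + np.tensordot(alphas, modes, axes=1)

def distance_to_edge_cost(shape_pts, mask):
    """Mean distance-to-edge cost: for each contour point S1, march along
    the ray from the shape's center of mass through S1 until the ray
    leaves the mask (point S2), and average the distances |S1 - S2|."""
    com = shape_pts.mean(axis=0)
    h, w = mask.shape
    dists = []
    for p in shape_pts:
        direction = p - com
        n = float(np.linalg.norm(direction))
        if n == 0.0:
            continue
        direction = direction / n
        t = 0.0  # march outward from the COM until the ray exits the mask
        while True:
            q = com + t * direction
            x, y = int(round(q[0])), int(round(q[1]))
            if not (0 <= x < w and 0 <= y < h) or mask[y, x] == 0:
                break
            t += 1.0
        dists.append(abs(t - n))
    return float(np.mean(dists))
```

A well-registered shape yields a small cost, since its contour points lie close to the mask boundary along every COM ray.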
  • FIG. 8E illustrates exemplary results of the operation of performing shape registration for breast masks by a shape registration unit 138A according to an embodiment of the present invention illustrated in FIG. 6. As shown in FIG. 8E, breast masks I513 and I514 are fit with shape representations. The shape registration results bounded by contours C513 and C514 effectively describe the shapes of breast masks I513 and I514. The downhill simplex algorithm was used by shape registration unit 138A to obtain the shape registration results shown in FIG. 8E.
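The downhill simplex fit mentioned above could be sketched, for example, with SciPy's Nelder-Mead implementation; the function name `fit_shape` and the parameter layout (2D offset first, then mode weights) are assumptions for illustration, not the disclosed implementation.

```python
import numpy as np
from scipy.optimize import minimize

def fit_shape(mean_shape, modes, cost_fn):
    """Optimize the 2D offset p and mode weights alpha_1..alpha_k with the
    downhill simplex (Nelder-Mead / amoeba) algorithm, so that the
    synthesized shape minimizes cost_fn."""
    k = len(modes)

    def objective(params):
        p, alphas = params[:2], params[2:]
        shape = mean_shape + p + np.tensordot(alphas, modes, axes=1)
        return cost_fn(shape)

    x0 = np.zeros(2 + k)  # start at the mean shape with zero offset
    res = minimize(objective, x0, method='Nelder-Mead')
    return res.x[:2], res.x[2:], res.fun
```

Nelder-Mead needs no gradients, which suits the distance-to-edge cost: that cost is computed by ray marching over a pixel mask and is not differentiable.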
  • FIG. 8F illustrates an exemplary ML view probabilistic atlas for probability of cancer in breasts stored in the reference data unit 158. For the ML view probabilistic atlas in FIG. 8F, the contour C515 is the contour of the mean breast shape (baseline breast atlas shape) Ba ML for the ML view. The region R515A indicates the highest probability of cancer, followed by regions R515B, then R515C, and R515D. As shown in the probabilistic atlas, the probability for cancer is largest in the center of a breast, and decreases towards edges of the mean breast shape.
  • FIG. 8G illustrates an exemplary CC view probabilistic atlas for probability of cancer in breasts stored in the reference data unit 158. For the CC view probabilistic atlas in FIG. 8G, the contour C516 is the contour of the mean breast shape for the CC view. The region R516A indicates the highest probability of cancer, followed by regions R516B, then R516C, and R516D. As shown in the probabilistic atlas, the probability for cancer is largest in the center left region of a breast, and decreases towards edges of the mean breast shape.
  • FIG. 8H illustrates exemplary aspects of the operation of detecting nipple position for a breast image by an image processing unit 38A for feature removal/positioning according to an embodiment of the present invention illustrated in FIG. 4. As illustrated in FIG. 8H, a breast image I518 is input by image operations unit 128A. Image operations unit 128A extracts a breast mask image I519 for the breast image I518. Shape registration unit 138A performs shape registration for the breast mask image, by representing the shape of the breast mask using a shape model. The shape registration contour C520 fits the shape of the breast mask from image I519. Atlas warping unit 340 warps the breast mask registered shape I520 to a probabilistic atlas (or alternatively to a baseline breast atlas shape) I522 that includes a detected baseline nipple N. Atlas warping unit 340 performs warping by generating a correspondence between pixels of the breast mask registered shape I520 and pixels of the probabilistic atlas (or of the baseline breast atlas shape) I522. Using the correspondence, nipple detection unit 350 warps the probabilistic atlas (or baseline breast atlas shape) I522 onto the breast mask registered shape I520, hence obtaining an image I523 with detected nipple position N′ corresponding to the baseline nipple position N, for the breast image I518.
  • FIG. 8I illustrates exemplary aspects of the operation of warping a breast mask to an atlas using triangulation by a feature removal and positioning unit 148A according to an embodiment of the present invention illustrated in FIG. 7.
  • Atlas warping unit 340 warps a registered shape S530 for a breast mask image Bmask new I530 to a probabilistic atlas Avi (or to a baseline breast atlas shape) A532 shown in image I532. Warping to probabilistic atlas Avi (or to baseline breast atlas shape) A532 is performed by triangulating the breast mask shape S530 based on its center of mass COM_530 and edge points. A test point P_530 is used to generate a triangle in the breast mask shape S530. For example, a triangle T_530 is generated using the center of mass COM_530 and the test point P_530 and touching the edges of mask shape S530. The triangle is warped to probabilistic atlas Avi (or to baseline breast atlas shape) A532 onto a corresponding triangle T_532, with the COM_530 and the test point P_530 mapped to corresponding points PC_532 and P_532. The probabilistic atlas Avi (or baseline breast atlas shape) A532 is then warped onto registered shape S530 by warping each triangle T_532 back onto the corresponding triangle T_530 of the breast mask Bmask new I530. The nipple position of the probabilistic atlas Avi (or baseline breast atlas shape) A532 is hence warped onto registered shape S530 associated with the breast mask image Bmask new I530.
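The fan triangulation around the center of mass, and the per-triangle correspondence used for warping, might be sketched as follows. This is an illustrative NumPy sketch with hypothetical helper names; it assumes the shape and atlas contours are sampled with the same number of corresponding edge points.

```python
import numpy as np

def fan_triangles(contour, com):
    """Triangulate a shape as a fan around its center of mass: one
    triangle per pair of consecutive contour (edge) points."""
    n = len(contour)
    return [np.array([com, contour[i], contour[(i + 1) % n]])
            for i in range(n)]

def affine_from_triangles(src_tri, dst_tri):
    """Affine map carrying triangle src_tri onto dst_tri (3x2 arrays);
    it sends each point of the source triangle to the corresponding
    point of the destination triangle."""
    src_h = np.hstack([np.asarray(src_tri, float), np.ones((3, 1))])
    A, *_ = np.linalg.lstsq(src_h, np.asarray(dst_tri, float), rcond=None)
    return A  # apply with: np.array([x, y, 1.0]) @ A

def warp_point(p, A):
    """Map point p through the affine transform A."""
    return np.array([p[0], p[1], 1.0]) @ A
```

Warping back from the atlas to the registered shape uses the same machinery with source and destination triangles exchanged.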
  • FIG. 8J illustrates exemplary aspects of the operation of bilinear interpolation according to an embodiment of the present invention illustrated in FIG. 7. The pixels inside corresponding triangles of the atlas Avi (or baseline breast atlas shape Ba_vi) can be warped back and forth to triangles in breast mask Bmask new, using a bilinear interpolation. For a correspondence between two triangles, bilinear interpolation in 2D is performed by multiplying each of the vertices by appropriate relative weights as described in FIG. 8J. Given a triangle with vertices A, B, and C, the pixel intensity at point D can be obtained as:

  • D = (A·wA + B·wB + C·wC)/Tabc  (2)
  • where A, B, and C are pixel intensities at triangle vertices, Tabc is the area of triangle ABC, wA is the area of triangle BCD, wB is the area of triangle ACD, and wC is the area of triangle ABD, so that Tabc=wA+wB+wC. Hence, given pixels A, B, and C of a triangle inside atlas Avi (or inside Ba vi), and corresponding pixels A′, B′, and C′ of a corresponding triangle in breast mask Bmask new, a pixel D inside triangle ABC can be warped to a pixel D′ inside triangle A′B′C′, using equation (2) in triangle A′B′C′.
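Equation (2) can be written directly in code. The following NumPy sketch (with hypothetical function names) computes the triangle-area weights and the interpolated intensity:

```python
import numpy as np

def tri_area(p, q, r):
    """Unsigned area of triangle pqr via the 2D cross product."""
    return 0.5 * abs((q[0] - p[0]) * (r[1] - p[1])
                     - (r[0] - p[0]) * (q[1] - p[1]))

def interp_intensity(a, b, c, d, ia, ib, ic):
    """Equation (2): intensity at point d inside triangle abc, as the
    area-weighted (barycentric) mix of vertex intensities ia, ib, ic."""
    t = tri_area(a, b, c)     # Tabc
    wa = tri_area(b, c, d)    # weight of A: area of triangle BCD
    wb = tri_area(a, c, d)    # weight of B: area of triangle ACD
    wc = tri_area(a, b, d)    # weight of C: area of triangle ABD
    return (ia * wa + ib * wb + ic * wc) / t
```

At a vertex the opposite sub-triangle fills the whole area and the other two weights vanish, so the interpolation reproduces the vertex intensity exactly, consistent with Tabc = wA + wB + wC.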
  • FIG. 9 is a block diagram of an image processing unit 38B for artifact removal and breast segmentation according to a second embodiment of the present invention illustrated in FIG. 2. As shown in FIG. 9, the image processing unit 38B according to this embodiment includes: an image operations unit 128B; a shape registration unit 138B; an optional atlas warping unit 340; an artifact removal unit 360; and a reference data unit 158B. The atlas warping unit 340 and the artifact removal unit 360 are included in a feature removal and positioning unit 148B.
  • Image operations unit 128B receives a breast image from image input unit 28, and may perform preprocessing and preparation operations on the breast image. Preprocessing and preparation operations performed by image operations unit 128B may include resizing, cropping, compression, color correction, etc., that change size and/or appearance of the breast image. Image operations unit 128B creates a breast mask image. Breast mask images may be created, for example, by detecting breast borders or breast clusters for the breasts shown in the breast image. Image operations unit 128B may also store/extract information about the breast image, such as view of mammogram.
  • Image operations unit 128B may perform preprocessing and breast mask extraction operations in a similar manner to image operations unit 128A described in FIG. 5. Image operations unit 128B may create a breast mask image by detecting breast borders using methods described in the US patent application titled “Method and Apparatus for Breast Border Detection”, application Ser. No. 11/366,495, by Daniel Russakoff and Akira Hasegawa, filed on Mar. 3, 2006, the entire contents of which are hereby incorporated by reference. Other methods may also be used to create a breast mask image.
  • Image operations unit 128B sends the breast mask images to shape registration unit 138B, which performs shape registration for the breast mask image. For shape registration, shape registration unit 138B describes the breast mask image using a shape model, to obtain a registered breast shape. Shape registration unit 138B retrieves information about the shape model from reference data unit 158B, which stores parameters that define the shape model.
  • The reference data unit 158B is similar to reference data unit 158A from FIG. 4. Reference data unit 158B stores shape models, and may also store probabilistic atlases for breast features. A shape model and an optional probabilistic atlas stored by reference data unit 158B can be generated off-line, using training breast images. Details on generation of a breast shape model and a probabilistic atlas using sets of training breast images can be found in the co-pending non-provisional application titled “Method and Apparatus for Probabilistic Atlas Based on Shape Modeling Technique”, the entire contents of which are hereby incorporated by reference. A shape model stored by reference data unit 158B includes a baseline breast atlas image and a set of deformation modes. A shape model stored by reference data unit 158B is similar to a shape model stored by reference data unit 158A as described at FIG. 4, with two differences. One difference is that the nipple of the baseline breast atlas shape need not be identified and marked for the baseline breast atlas shape stored by reference data unit 158B. The second difference pertains to the method of generation of the shape model during off-line training. The training breast images used to generate the shape model for reference data unit 158B off-line are preferably breast images without artifacts (such as tags, noise, frames, image scratches, lead markers, imaging plates, etc.), anomalies, or unusual structures. Training breast images without artifacts may be obtained by removing artifacts, anomalies, or unusual structures from the images manually or automatically, before off-line training. 
In that case, the baseline breast atlas shape obtained as described in the co-pending non-provisional application titled “Method and Apparatus for Probabilistic Atlas Based on Shape Modeling Technique”, the entire contents of which are hereby incorporated by reference, illustrates a baseline breast without artifacts, anomalies, or unusual structures. The deformation modes obtained as described in the co-pending non-provisional application titled “Method and Apparatus for Probabilistic Atlas Based on Shape Modeling Technique”, the entire contents of which are hereby incorporated by reference, describe variations between shapes of training breast images and the baseline breast atlas shape. Hence, linear combinations of the deformation modes will produce breast shapes without artifacts, anomalies, or unusual structures, because the deformation modes were obtained from training breast images that did not include artifacts, anomalies, or unusual structures. Reference data unit 158B stores information for shape models for breasts, for various views of mammograms.
  • Shape registration unit 138B may perform shape registration in a manner similar to shape registration unit 138A, as described at FIG. 6. Optional atlas warping unit 340 receives the registration results for a breast mask image from shape registration unit 138B, and warps the breast mask image to the baseline breast atlas shape from the shape model associated with the view of the breast mask image. Atlas warping unit 340 performs warping of breast mask images to baseline breast atlas shapes or to probabilistic atlases, as described in FIG. 4 and FIG. 7.
  • Using the image processing unit 38B it is possible to remove artifacts, such as tags, noise, frames, image scratches, lead markers, imaging plates, etc., from a breast image and perform an accurate segmentation of the breast in the breast image. For a breast image IT including artifacts, image operations unit 128B obtains a breast mask image BT mask. Shape registration unit 138B then performs shape registration for the breast mask image BT mask. Shape registration unit 138B expresses the breast mask image BT mask as a function of the baseline breast atlas shape, which may be a mean breast shape (Ba), and shape model deformation modes, as:
  • Breast Shape = Ba + p + Σ_{i=1}^{k} αi Li,
  • where Li, i=1 . . . k is the set of deformation modes of the shape model, αi, i=1 . . . k are a set of parameters optimized by shape registration unit 138B for breast mask image BT mask, and p is an offset (such as a 2D offset) to the mean breast shape Ba to account for a rigid translation of the entire shape. Shape registration unit 138B retrieves baseline breast atlas shape data and deformation modes from reference data unit 158B. Since the shape model stored in reference data unit 158B was generated using training breast shape images without artifacts, anomalies, or unusual structures, the Breast Shape obtained from
  • Breast Shape = Ba + p + Σ_{i=1}^{k} αi Li
  • with optimized αi and p parameters will not include artifacts, anomalies, or unusual structures. In other words, the Breast Shape will optimize a fit to the original breast mask image BT mask, except for the artifacts that were present in the original breast mask image BT mask. The artifacts present in the original breast mask image BT mask have not been learned by the shape model stored in reference data unit 158B, and will not be fit. Hence, the Breast Shape represents a segmentation of the breast in the breast mask image BT mask, without the artifacts that may have been present in breast mask image BT mask.
  • Artifact removal unit 360 receives the Breast Shape together with the breast mask image BT mask from shape registration unit 138B, and may extract artifacts by subtracting the Breast Shape from the breast mask image BT mask, to obtain an artifact mask image IArt.
  • Artifact removal unit 360 can then apply the artifact mask image IArt to the original breast image IT, to identify artifact positions in the original breast image IT and remove the artifacts. Artifact removal unit 360 outputs a breast image IT′ without artifacts.
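The artifact extraction by subtraction might look like the following sketch, assuming binary NumPy masks of equal size; the function names are hypothetical illustrations, not the disclosed implementation.

```python
import numpy as np

def artifact_mask(breast_mask, breast_shape_mask):
    """Artifacts = pixels of the raw breast mask not explained by the
    model-constrained segmentation (the Breast Shape)."""
    return np.logical_and(breast_mask > 0,
                          breast_shape_mask == 0).astype(np.uint8)

def remove_artifacts(image, art_mask, fill_value=0):
    """Blank out artifact pixels in the original breast image."""
    cleaned = image.copy()
    cleaned[art_mask > 0] = fill_value
    return cleaned
```

Because the shape model was trained only on artifact-free breasts, anything in the raw mask but outside the fitted shape (tags, frames, skin folds, markers) lands in the artifact mask.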
  • If the reference data unit 158B contains a probabilistic feature atlas, and atlas warping unit 340 is present in image processing unit 38B, breast segmentation with artifact removal may be combined with feature detection. For example, artifact removal may be achieved for an original breast image IT together with cancer detection using a probabilistic cancer atlas and/or comparative left-right breast analysis, as described in the co-pending non-provisional application titled “Method and Apparatus of Using Probabilistic Atlas for Cancer Detection”, the entire contents of which are hereby incorporated by reference.
  • Image operations unit 128B, shape registration unit 138B, optional atlas warping unit 340, artifact removal unit 360, and reference data unit 158B are software systems/applications. Image operations unit 128B, shape registration unit 138B, optional atlas warping unit 340, artifact removal unit 360, and reference data unit 158B may also be purpose built hardware such as FPGA, ASIC, etc.
  • FIG. 10A illustrates an exemplary output of an image processing unit 38B for artifact removal and breast segmentation according to a second embodiment of the present invention illustrated in FIG. 9. A breast mask I581 with a tag T582 is segmented by image processing unit 38B using a shape model that was constrained to remain within the shape space of typical breasts without artifacts. The final segmented breast shape I583 obtained by image processing unit 38B does not contain the tag T582, as the segmented breast shape is constrained by the shape model to resemble a breast.
  • FIG. 10B illustrates another exemplary output of an image processing unit 38B for artifact removal and breast segmentation according to a second embodiment of the present invention illustrated in FIG. 9. A breast mask I591 with a skin fold T592 is segmented by image processing unit 38B using a shape model that was constrained to remain within the shape space of typical breasts without artifacts. The final segmented breast shape I593 obtained by image processing unit 38B does not contain the skin fold T592, as the segmented breast shape is constrained by the shape model to resemble a breast.
  • FIG. 11 is a block diagram of an image processing unit 38C for view detection according to a third embodiment of the present invention illustrated in FIG. 2. As shown in FIG. 11, the image processing unit 38C according to this embodiment includes: an image operations unit 128C; a shape registration unit 138C; a view detection unit 148C; and a reference data unit 158C. The view detection unit 148C is a feature removal and positioning unit.
  • Image operations unit 128C receives a breast image from image input unit 28, and may perform preprocessing and preparation operations on the breast image. Preprocessing and preparation operations performed by image operations unit 128C may include resizing, cropping, compression, color correction, etc., that change size and/or appearance of the breast image. Image operations unit 128C creates a breast mask image. Breast mask images may be created, for example, by detecting breast borders or breast clusters for the breasts shown in the breast image. Image operations unit 128C may also store/extract information about the breast image, such as view of mammogram.
  • Image operations unit 128C may perform preprocessing and breast mask extraction operations in a similar manner to image operations unit 128A described in FIG. 5. Image operations unit 128C may create a breast mask image by detecting breast borders using methods described in the US patent application titled “Method and Apparatus for Breast Border Detection”, application Ser. No. 11/366,495, by Daniel Russakoff and Akira Hasegawa, filed on Mar. 3, 2006, the entire contents of which are hereby incorporated by reference.
  • Image operations unit 128C sends the breast mask images to shape registration unit 138C, which performs shape registration for the breast mask image. For shape registration, shape registration unit 138C describes the breast mask image using a shape model, to obtain a registered breast shape. Shape registration unit 138C retrieves information about the shape model from reference data unit 158C, which stores parameters that define the shape model.
  • The reference data unit 158C is similar to reference data unit 158A from FIG. 4. The reference data unit 158C stores shape models, and may also store probabilistic atlases.
  • A shape model stored by reference data unit 158C can be generated off-line, using training breast images. Details on generation of a breast shape model using sets of training breast images can be found in the co-pending non-provisional application titled “Method and Apparatus for Probabilistic Atlas Based on Shape Modeling Technique”, the entire contents of which are hereby incorporated by reference. A shape model stored by reference data unit 158C includes a baseline breast atlas image and a set of deformation modes.
  • Shape registration unit 138C may perform shape registration in a manner similar to shape registration unit 138A, as described at FIG. 6. Shape registration unit 138C receives from image operations unit 128C a breast mask image Bmask of unknown mammogram view. Bmask could be, for example, an ML mammogram view for which the view direction of left or right is not known.
  • Shape registration unit 138C fits the breast mask image Bmask to a shape model M associated with one of left or right views, and obtains a registered image R1. Shape registration unit 138C then flips the breast mask image Bmask about a vertical axis to obtain a flipped breast mask Bmask Flipped, and then fits the flipped breast mask image Bmask Flipped to the same shape model M, to obtain a registered image R2.
  • View detection unit 148C receives breast mask images Bmask and Bmask Flipped, and registered images R1 and R2. View detection unit 148C then compares the fit of R1 to Bmask, and the fit of R2 to Bmask Flipped. If the fit of R1 to Bmask is better than the fit of R2 to Bmask Flipped, then the view associated with shape model M is the view of the breast image Bmask. On the other hand, if the fit of R2 to the Bmask Flipped is better than the fit of R1 to Bmask, then the view associated with shape model M is the view of breast image Bmask Flipped. The view direction of the breast mask image Bmask is hence detected. View detection results are output to printing unit 48, display 68, or image output unit 58.
  • The view of breast mask image Bmask may also be detected by comparison to a baseline shape. Let Ba be the baseline breast atlas shape associated with the shape model M. View detection unit 148C compares the differences between R1 and Ba, and the differences between R2 and Ba. If the differences between R1 and Ba are smaller than the differences between R2 and Ba, then the view associated with baseline breast atlas shape Ba (and hence with shape model M) is the view of breast image Bmask. On the other hand, if the differences between R2 and Ba are smaller than the differences between R1 and Ba, then the view associated with baseline breast atlas shape Ba (and hence with shape model M) is the view of breast image Bmask Flipped.
  • The view of breast mask images Bmask may also be detected by direct comparison of Bmask and Bmask Flipped with Ba, without performing shape registration of Bmask and Bmask Flipped. If the differences between Bmask and Ba are smaller than the differences between Bmask Flipped and Ba, then the view associated with baseline breast atlas shape Ba is the view of breast image Bmask. On the other hand, if the differences between Bmask and Ba are larger than the differences between Bmask Flipped and Ba, then the view associated with baseline breast atlas shape Ba is the view of breast image Bmask Flipped.
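The view-direction decision by comparing the mask and its horizontal mirror against the baseline shape Ba can be sketched as below. This illustration assumes binary masks of equal size already aligned with the atlas frame, and uses a simple pixel-disagreement fit error; the function names are hypothetical.

```python
import numpy as np

def mask_diff(a, b):
    """Fit error: fraction of pixels where the binary masks disagree."""
    return float(np.mean(a.astype(bool) != b.astype(bool)))

def detect_view(bmask, baseline, fit_error=mask_diff):
    """Compare the mask and its horizontal mirror against the baseline
    atlas shape Ba; the better match decides the view direction."""
    flipped = bmask[:, ::-1]           # flip about a vertical axis
    e_orig = fit_error(bmask, baseline)
    e_flip = fit_error(flipped, baseline)
    return 'same_as_atlas' if e_orig <= e_flip else 'mirrored'
```

The same comparison structure applies when the fit errors come from full shape registrations R1 and R2 rather than from direct mask differences.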
  • FIG. 12 is a block diagram of an image processing unit 39 for feature removal/positioning including a training system 772 according to a fourth embodiment of the present invention. As shown in FIG. 12, the image processing unit 39 includes the following components: an image operations unit 620; a baseline shape unit 710; a shape parameterization unit 720; a deformation analysis unit 730; a training shape registration unit 740; an atlas output unit 750; an image operations unit 128; a shape registration unit 138; a feature removal and positioning unit 148; and a reference data unit 158. Image operations unit 620, baseline shape unit 710, shape parameterization unit 720, deformation analysis unit 730, training shape registration unit 740, and atlas output unit 750 are included in a training system 772. Training shape registration unit 740 and atlas output unit 750 are optional, and may be included depending on the application. Image operations unit 128, shape registration unit 138, feature removal and positioning unit 148, and reference data unit 158 are included in an operation system 38.
  • Operation of the image processing unit 39 can generally be divided into two stages: (1) training; and (2) operation for positioning and for feature removal or detection.
  • The principles involved in the training stage have been described in the co-pending non-provisional application titled “Method and Apparatus for Probabilistic Atlas Based on Shape Modeling Technique”, the entire contents of which are hereby incorporated by reference. In accordance with this fourth embodiment illustrated in FIG. 12, the image operations unit 620, baseline shape unit 710, shape parameterization unit 720, deformation analysis unit 730, training shape registration unit 740, and atlas output unit 750 train to generate a shape model and a probabilistic feature atlas for breast shapes. The knowledge accumulated through training by training system 772 is sent to reference data unit 158. Image operations unit 620, baseline shape unit 710, shape parameterization unit 720, and deformation analysis unit 730 train to generate a shape model. Optional training shape registration unit 740 and atlas output unit 750 train to generate a probabilistic atlas. The shape model and the probabilistic atlas are sent to and stored in reference data unit 158.
  • In accordance with this fourth embodiment of the present invention, the image operations unit 128, the shape registration unit 138, the feature removal and positioning unit 148, and the reference data unit 158 may function in like manner to the corresponding elements of the first, second, or third embodiments illustrated in FIGS. 4, 9, and 11, or as a combination of two or more of the first, second, and third embodiments illustrated in FIGS. 4, 9, and 11. During regular operation of image processing unit 39, reference data unit 158 provides reference data training knowledge to shape registration unit 138 and to feature removal and positioning unit 148, for use in nipple detection, view detection, and artifact removal from breast images. The principles involved in the operation for nipple detection for new breast images have been described in FIGS. 4, 5, 6, 7, 8A, 8B, 8C, 8D, 8E, 8F, 8G, 8H, 8I and 8J. The principles involved in the operation for artifact removal from breast images have been described in FIGS. 9, 5, 6, 7, 8A, 8B, 8C, 8D, 8E, 8F, 8G, 8H, 8I and 8J. The principles involved in the operation for view detection for new breast images have been described in FIGS. 11, 5, 6, 7, 8A, 8B, 8C, 8D, 8E, 8F, 8G, 8H, 8I and 8J.
  • During the training stage, image operations unit 620 receives a set of training breast images from image input unit 28, performs preprocessing and preparation operations on the breast images, creates training breast mask images, and stores/extracts information about breast images, such as view of mammograms. Additional details regarding operation of image operations unit 620 are described in the co-pending non-provisional application titled “Method and Apparatus for Probabilistic Atlas Based on Shape Modeling Technique”, the entire contents of which are hereby incorporated by reference. Image operations unit 620 may create breast mask images by extracting breast borders using methods described in the US patent application titled “Method and Apparatus for Breast Border Detection”, application Ser. No. 11/366,495, by Daniel Russakoff and Akira Hasegawa, filed on Mar. 3, 2006, the entire contents of which are hereby incorporated by reference. Other breast border detection techniques can also be used by image operations unit 620 to obtain shape mask images for breast images.
  • Baseline shape unit 710 receives training breast mask images from image operations unit 620, and generates a baseline breast atlas shape such as, for example, a mean breast shape, from the training breast mask images. Baseline shape unit 710 may align the centers of mass of the training breast mask images. The alignment of centers of mass of training breast mask images results in a probabilistic map in which the brighter a pixel is, the more likely it is for the pixel to appear in a training breast mask image. A probability threshold may then be applied to the probabilistic map, to obtain a baseline breast atlas shape, such as, for example, a mean breast shape. Additional details regarding operation of baseline shape unit 710 are described in the co-pending non-provisional application titled “Method and Apparatus for Probabilistic Atlas Based on Shape Modeling Technique”, the entire contents of which are hereby incorporated by reference.
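The center-of-mass alignment, probabilistic map, and thresholding steps above might be sketched as follows, assuming same-size binary masks and integer-pixel shifts; the function name and threshold default are illustrative assumptions.

```python
import numpy as np

def baseline_shape(masks, threshold=0.5):
    """Align training masks on their centers of mass, average them into a
    probabilistic map (brighter = more often covered), and threshold the
    map to obtain a baseline atlas shape such as a mean breast shape."""
    h, w = masks[0].shape
    acc = np.zeros((h, w), float)
    target = np.array([h / 2.0, w / 2.0])
    for m in masks:
        ys, xs = np.nonzero(m)
        com = np.array([ys.mean(), xs.mean()])
        dy, dx = np.round(target - com).astype(int)
        shifted = np.roll(np.roll(m.astype(float), dy, axis=0), dx, axis=1)
        acc += shifted
    prob = acc / len(masks)
    return (prob >= threshold).astype(np.uint8), prob
```

The threshold trades off shape size against consensus: a higher threshold keeps only pixels covered by most training breasts.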
  • Shape parameterization unit 720 receives the training breast mask images and the baseline breast atlas shape, and warps the training breast mask images onto the baseline breast atlas shape, to define parameterization of breast shape. Shape parameterization unit 720 may use shape parameterization techniques adapted from “Automatic Generation of Shape Models Using Nonrigid Registration with a Single Segmented Template Mesh” by G. Heitz, T. Rohlfing and C. Maurer, Proceedings of Vision, Modeling and Visualization, 2004, the entire contents of which are hereby incorporated by reference. Control points may be placed along the edges of the baseline breast atlas shape. A deformation grid is generated using the control points. Using the deformation grid, the control points are warped onto training breast mask images. Shape information for training breast mask images is then given by the corresponding warped control points together with centers of mass of the shapes defined by the warped control points. Warping of control points from the baseline breast atlas shape onto training breast mask images may be performed by non-rigid registration, with B-splines transformations used to define warps from baseline breast atlas shape to training breast mask images. Shape parameterization unit 720 may perform non-rigid registration using techniques discussed in “Automatic Construction of 3-D Statistical Deformation Models of the Brain Using Nonrigid Registration”, by D. Rueckert, A. Frangi and J. Schnabel, IEEE Transactions on Medical Imaging, 22(8), p. 1014-1025, August 2003, the entire contents of which are hereby incorporated by reference. Shape parameterization unit 720 outputs shape representations for training breast mask images. 
Additional details regarding operation of shape parameterization unit 720 are described in the co-pending non-provisional application titled “Method and Apparatus for Probabilistic Atlas Based on Shape Modeling Technique”, the entire contents of which are hereby incorporated by reference.
  • Deformation analysis unit 730 uses breast shape parameterization results to learn a shape model that describes how shape varies from breast to breast. Using representations of shape for the training breast mask images, deformation analysis unit 730 finds the principal modes of deformation between the training breast mask images and the baseline breast atlas shape. Deformation analysis unit 730 may use Principal Components Analysis (PCA) techniques to find the principal modes of deformation. The principal components obtained from PCA represent modes of deformation between training breast mask images and the baseline breast atlas shape. Additional details regarding operation of deformation analysis unit 730 are described in the co-pending non-provisional application titled “Method and Apparatus for Probabilistic Atlas Based on Shape Modeling Technique”, the entire contents of which are hereby incorporated by reference.
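The PCA step can be sketched with an SVD of the centered shape vectors. The following illustration (hypothetical names, flattened (x, y) coordinates as rows) keeps enough principal components to explain a requested fraction of the variance, mirroring figures such as FIG. 8C where 17 modes capture 99% of the variance.

```python
import numpy as np

def deformation_modes(shape_vectors, variance_kept=0.99):
    """PCA over flattened training shape vectors: the top principal
    components are the deformation modes.

    shape_vectors: (num_shapes, 2N) array, one flattened contour per row.
    Returns the mean shape vector, the (k, 2N) modes, and the per-mode
    explained-variance ratios."""
    X = np.asarray(shape_vectors, float)
    mean = X.mean(axis=0)
    U, s, Vt = np.linalg.svd(X - mean, full_matrices=False)
    var = s ** 2
    frac = np.cumsum(var) / var.sum()
    k = int(np.searchsorted(frac, variance_kept) + 1)
    return mean, Vt[:k], var[:k] / var.sum()
```

Each returned row of `Vt` is one deformation mode; linear combinations of these modes added to the mean reproduce the training-set shape variation.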
  • The baseline breast atlas shape and the modes of deformation between training breast mask images and the baseline breast atlas shape, define a shape model. A shape model can be obtained for each mammogram view. Shape model information is sent to reference data unit 158, to be used during operation of image processing unit 39.
  • Training shape registration unit 740 receives data that defines the shape model. Training shape registration unit 740 then fits training breast mask images with their correct shape representations, which are linear combinations of the principal modes of shape variation. Shape registration unit 740 may use the downhill simplex method, also known as the Nelder-Mead or the amoeba algorithm, to optimize parameters of the shape model for each training breast mask image in the training dataset, and optimally describe training breast mask images using the shape model. Additional details regarding operation of training shape registration unit 740 are described in the co-pending non-provisional application titled “Method and Apparatus for Probabilistic Atlas Based on Shape Modeling Technique”, the entire contents of which are hereby incorporated by reference.
  • Atlas output unit 750 receives from training shape registration unit 740 the results of shape registration for the set of training breast mask images analyzed. The set of training breast mask images have features that have been previously localized. Features could be cancer structures, benign structures, vessel areas, etc. Using shape registration results, the localized features in the training breast mask images are mapped from the training breast mask images onto the baseline breast atlas shape. An atlas is created with locations of the features in the baseline breast atlas shape. Since a large number of training breast mask images with previously localized features are used, the atlas is a probabilistic atlas that gives the probability for feature presence at each pixel inside the baseline breast atlas shape. One probabilistic atlas may be generated for each mammogram view. The probabilistic feature atlases for various breast views are sent to reference data unit 158, to be used during operation of image processing unit 39. Additional details regarding operation of atlas output unit 750 are described in the co-pending non-provisional application titled “Method and Apparatus for Probabilistic Atlas Based on Shape Modeling Technique”, the entire contents of which are hereby incorporated by reference.
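The per-pixel probability accumulation performed by atlas output unit 750 can be sketched as below. It assumes the localized feature masks have already been warped into the baseline atlas frame via the shape registration results; the function name is illustrative.

```python
import numpy as np

def build_probabilistic_atlas(feature_masks):
    """Accumulate localized feature masks into a per-pixel probability map.

    feature_masks: iterable of (H, W) boolean arrays, one per training image,
                   True where the feature (e.g. a cancer structure) was marked,
                   all already mapped onto the baseline atlas shape.
    Returns an (H, W) float array: the fraction of training images in which
    the feature appears at each atlas pixel.
    """
    masks = np.stack([np.asarray(m, dtype=bool) for m in feature_masks])
    return masks.mean(axis=0)
```

With a large training set, each atlas pixel approximates the probability of feature presence at that location, which is what makes the atlas probabilistic.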
  • Image operations unit 620, baseline shape unit 710, shape parameterization unit 720, deformation analysis unit 730, training shape registration unit 740, atlas output unit 750, image operations unit 128, shape registration unit 138, feature removal and positioning unit 148, and probabilistic atlas reference unit 158 are software systems/applications. These units may also be implemented as purpose-built hardware such as FPGAs, ASICs, etc.
  • Methods and apparatuses disclosed in this application can be used for breast segmentation, artifact removal, mammogram view identification, nipple detection, etc. Methods and apparatuses disclosed in this application can be combined with methods and apparatuses disclosed in the co-pending non-provisional application titled “Method and Apparatus of Using Probabilistic Atlas for Cancer Detection”, the entire contents of which are hereby incorporated by reference, to perform breast segmentation, artifact removal, mammogram view identification, and nipple detection, together with cancer detection for mammography images. Shape models and probabilistic atlases generated using techniques described in the co-pending non-provisional application titled “Method and Apparatus for Probabilistic Atlas Based on Shape Modeling Technique”, the entire contents of which are hereby incorporated by reference, can be used for breast segmentation, artifact removal, mammogram view identification, nipple detection, and cancer detection. Additional applications, such as temporal subtraction between breast images, can be implemented using methods and apparatuses disclosed in this application and methods and apparatuses disclosed in “Method and Apparatus of Using Probabilistic Atlas for Cancer Detection”.
  • The methods and apparatuses disclosed in this application can be used for automatic detection of other features besides nipples in breasts. The methods and apparatuses can be used for feature removal, feature detection, feature positioning, and segmentation for other anatomical parts besides breasts, by using shape modeling techniques for the anatomical parts and atlases for locations of features in the anatomical parts. The methods and apparatuses disclosed in this application can be coupled with methods and apparatuses from “Method and Apparatus of Using Probabilistic Atlas for Cancer Detection” using shape models and probabilistic atlases generated as described in “Method and Apparatus for Probabilistic Atlas Based on Shape Modeling Technique”, to perform feature removal, feature detection, feature positioning, and object segmentation for other objects and anatomical objects besides breasts, and other features besides cancer structures or breast features.
  • Although detailed embodiments and implementations of the present invention have been described above, it should be apparent that various modifications are possible without departing from the spirit and scope of the present invention.

Claims (12)

1. An image processing method, said method comprising:
accessing digital image data representing an object;
accessing reference data including
a baseline object including an element, and
a shape model relating to shape variation from said baseline object; and
determining location of said element in said object, said determining step including
generating a correspondence between a geometric part associated with said baseline object and a geometric part associated with said object, by representing a shape of said object using said shape model, to obtain a registered shape, and
mapping said element from said baseline object onto said registered shape using said correspondence.
2. The image processing method as recited in claim 1, wherein
said shape model includes deformation modes to describe shape deformation between said shape of said object and said baseline object, and
said generating sub-step represents said shape of said object using said shape model by fitting said shape of said object using combinations of said deformation modes.
3. The image processing method as recited in claim 2, wherein said generating sub-step fits said shape of said object to linear combinations of said deformation modes, by optimizing linear coefficients for said deformation modes.
4. The image processing method as recited in claim 1, wherein said object is a breast, and said element is a nipple of said breast.
5. The image processing method as recited in claim 1, further comprising:
training using a plurality of training objects to generate said baseline object and said shape model, said training step including
generating said baseline object by aligning shapes of said plurality of training objects using centers of mass of said plurality of training objects, and
determining deformation modes using Principal Component Analysis, to describe shape deformations between said shapes of said plurality of training objects and said baseline object.
6. The image processing method as recited in claim 1, wherein said generating step includes
triangulating said registered shape using center of mass and edge points of said registered shape to obtain a plurality of triangles, and
generating said correspondence between said plurality of triangles and a plurality of baseline triangles in said baseline object.
7. An image processing apparatus, said apparatus comprising:
an image data input unit for providing digital image data representing an object;
a reference data unit for providing reference data including
a baseline object including an element, and
a shape model relating to shape variation from said baseline object; and
an element detection unit for determining location of said element in said object, said element detection unit determining location by
generating a correspondence between a geometric part associated with said baseline object and a geometric part associated with said object, by representing a shape of said object using said shape model, to obtain a registered shape, and
mapping said element from said baseline object onto said registered shape using said correspondence.
8. The apparatus according to claim 7, wherein
said shape model includes deformation modes to describe shape deformation between said shape of said object and said baseline object, and
said element detection unit represents said shape of said object using said shape model by fitting said shape of said object using combinations of said deformation modes.
9. The apparatus according to claim 8, wherein said element detection unit fits said shape of said object to linear combinations of said deformation modes, by optimizing linear coefficients for said deformation modes.
10. The apparatus according to claim 7, wherein said object is a breast, and said element is a nipple of said breast.
11. The apparatus according to claim 7, further comprising:
a training unit for training using a plurality of training objects to generate said baseline object and said shape model, said training unit training by
generating said baseline object by aligning shapes of said plurality of training objects using centers of mass of said plurality of training objects, and
determining deformation modes using Principal Component Analysis, to describe shape deformations between said shapes of said plurality of training objects and said baseline object.
12. The apparatus according to claim 7, wherein said element detection unit generates said correspondence by
triangulating said registered shape using center of mass and edge points of said registered shape to obtain a plurality of triangles, and
generating said correspondence between said plurality of triangles and a plurality of baseline triangles in said baseline object.
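Claims 6 and 12 recite triangulating the registered shape and generating a correspondence between those triangles and the baseline triangles. A standard way to realize the subsequent element mapping is barycentric interpolation between corresponding triangles; the claims do not prescribe this exact method, so the sketch below is one plausible realization with hypothetical helper names.

```python
import numpy as np

def barycentric_coords(p, tri):
    """Barycentric coordinates of 2-D point p in triangle tri (3x2 array)."""
    a, b, c = tri
    m = np.column_stack([b - a, c - a])      # edge vectors as a 2x2 system
    u, v = np.linalg.solve(m, p - a)
    return np.array([1.0 - u - v, u, v])

def map_point(p, src_tri, dst_tri):
    """Map point p from a source triangle into the corresponding destination
    triangle by reusing its barycentric coordinates."""
    w = barycentric_coords(np.asarray(p, float), np.asarray(src_tri, float))
    return w @ np.asarray(dst_tri, float)
```

In this reading, an element localized inside a baseline triangle is carried onto the registered shape (or vice versa) by evaluating its barycentric weights in one triangle and applying them to the corresponding triangle's vertices.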
US13/367,744 2006-12-19 2012-02-07 Method and apparatus of using probabilistic atlas for feature removal/positioning Abandoned US20120134568A1 (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
US13/367,744 US20120134568A1 (en) 2006-12-19 2012-02-07 Method and apparatus of using probabilistic atlas for feature removal/positioning

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
US11/640,960 US8135199B2 (en) 2006-12-19 2006-12-19 Method and apparatus of using probabilistic atlas for feature removal/positioning
US13/367,744 US20120134568A1 (en) 2006-12-19 2012-02-07 Method and apparatus of using probabilistic atlas for feature removal/positioning

Related Parent Applications (1)

Application Number Title Priority Date Filing Date
US11/640,960 Division US8135199B2 (en) 2006-12-19 2006-12-19 Method and apparatus of using probabilistic atlas for feature removal/positioning

Publications (1)

Publication Number Publication Date
US20120134568A1 true US20120134568A1 (en) 2012-05-31

Family

ID=39527307

Family Applications (2)

Application Number Title Priority Date Filing Date
US11/640,960 Expired - Fee Related US8135199B2 (en) 2006-12-19 2006-12-19 Method and apparatus of using probabilistic atlas for feature removal/positioning
US13/367,744 Abandoned US20120134568A1 (en) 2006-12-19 2012-02-07 Method and apparatus of using probabilistic atlas for feature removal/positioning

Family Applications Before (1)

Application Number Title Priority Date Filing Date
US11/640,960 Expired - Fee Related US8135199B2 (en) 2006-12-19 2006-12-19 Method and apparatus of using probabilistic atlas for feature removal/positioning

Country Status (2)

Country Link
US (2) US8135199B2 (en)
JP (3) JP2008178672A (en)

Cited By (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20110142308A1 (en) * 2009-12-10 2011-06-16 Canon Kabushiki Kaisha Information processing apparatus, information processing method, and storage medium
US20120114213A1 (en) * 2009-07-17 2012-05-10 Koninklijke Philips Electronics N.V. Multi-modality breast imaging
US20170055844A1 (en) * 2015-08-27 2017-03-02 Canon Kabushiki Kaisha Apparatus and method for acquiring object information

Families Citing this family (19)

Publication number Priority date Publication date Assignee Title
US8634610B2 (en) * 2008-06-20 2014-01-21 The Trustees Of The University Of Pennsylvania System and method for assessing cancer risk
US8139832B2 (en) * 2008-12-12 2012-03-20 Hologic, Inc. Processing medical images of the breast to detect anatomical abnormalities therein
EP2504781A4 (en) * 2009-11-26 2014-05-14 Agency Science Tech & Res A method for construction and use of a probabilistic atlas for diagnosis and prediction of a medical outcome
US9454823B2 (en) * 2010-07-28 2016-09-27 Varian Medical Systems, Inc. Knowledge-based automatic image segmentation
WO2012112907A2 (en) * 2011-02-17 2012-08-23 Dartmouth College System and method for providing registration between breast shapes before and during surgery
CN102956035A (en) * 2011-08-25 2013-03-06 深圳市蓝韵实业有限公司 Preprocessing method and preprocessing system used for extracting breast regions in mammographic images
CN103020969B (en) * 2012-12-25 2015-12-23 中国科学院深圳先进技术研究院 A kind of disposal route of CT image liver segmentation and system
EP2779090B1 (en) * 2013-03-11 2018-09-19 Siemens Healthcare GmbH Assignment of localisation data
US9305358B2 (en) * 2013-07-01 2016-04-05 Kabushiki Kaisha Toshiba Medical image processing
GB201320688D0 (en) * 2013-11-22 2014-01-08 Materialise Nv System and method for constructing a statistical shape model
CN103637815B (en) * 2013-12-18 2015-08-19 深圳市安健科技有限公司 The determination method and system of mammary gland automatic exposure reference zone
JP6738332B2 (en) * 2014-12-16 2020-08-12 コーニンクレッカ フィリップス エヌ ヴェKoninklijke Philips N.V. Correspondence probability map driven visualization
WO2017116512A1 (en) * 2015-12-28 2017-07-06 Metritrack, Inc. System and method for the coregistration of medical image data
US11246551B2 (en) 2016-09-20 2022-02-15 KUB Technologies, Inc. System and method for computer aided detection (CAD) in a breast specimen radiograph
US10503998B2 (en) 2016-11-07 2019-12-10 Gracenote, Inc. Recurrent deep neural network system for detecting overlays in images
JP6643416B2 (en) * 2018-08-01 2020-02-12 キヤノン株式会社 Image processing apparatus, image processing method, and program
JP7125312B2 (en) * 2018-09-07 2022-08-24 富士フイルムヘルスケア株式会社 MAGNETIC RESONANCE IMAGING APPARATUS, IMAGE PROCESSING APPARATUS, AND IMAGE PROCESSING METHOD
JP7304508B2 (en) * 2019-02-19 2023-07-07 株式会社シンクアウト Information processing system and information processing program
US11315221B2 (en) * 2019-04-01 2022-04-26 Canon Medical Systems Corporation Apparatus and method for image reconstruction using feature-aware deep learning

Citations (4)

Publication number Priority date Publication date Assignee Title
US6453058B1 (en) * 1999-06-07 2002-09-17 Siemens Corporate Research, Inc. Computer-assisted diagnosis method using correspondence checking and change detection of salient features in digital images
US20070047790A1 (en) * 2005-08-30 2007-03-01 Agfa-Gevaert N.V. Method of Segmenting Anatomic Entities in Digital Medical Images
US20080075367A1 (en) * 2006-09-21 2008-03-27 Microsoft Corporation Object Detection and Recognition System
US20080205757A1 (en) * 2005-02-10 2008-08-28 Koninklijke Philips Electronics N. V. Method, A System And A Computer Program For Segmenting A Surface In A Multidimensional Dataset

Family Cites Families (20)

Publication number Priority date Publication date Assignee Title
JP2720383B2 (en) * 1987-03-11 1998-03-04 株式会社豊田中央研究所 Image density region detection device
JPH0279177A (en) * 1988-09-16 1990-03-19 Fuji Photo Film Co Ltd Method for detecting contour of breast in radiation picture
US5111512A (en) * 1991-05-14 1992-05-05 At&T Bell Laboratories Method for signature verification
JP3486461B2 (en) * 1994-06-24 2004-01-13 キヤノン株式会社 Image processing apparatus and method
US6356272B1 (en) * 1996-08-29 2002-03-12 Sanyo Electric Co., Ltd. Texture information giving method, object extracting method, three-dimensional model generating method and apparatus for the same
JPH10305015A (en) * 1997-05-09 1998-11-17 Toshiba Iyou Syst Eng Kk Image displaying method utilizing after-image effect and device therefor
US6118887A (en) * 1997-10-10 2000-09-12 At&T Corp. Robust multi-modal method for recognizing objects
JP2000342558A (en) * 1999-06-04 2000-12-12 Konica Corp Image positioning processor and inter-picture arithmetic processor
GB0006598D0 (en) * 2000-03-17 2000-05-10 Isis Innovation Three-dimensional reconstructions from images
WO2002014982A2 (en) * 2000-08-11 2002-02-21 Holomage, Inc. Method of and system for generating and viewing multi-dimensional images
US6873718B2 (en) * 2001-10-12 2005-03-29 Siemens Corporate Research, Inc. System and method for 3D statistical shape model for the left ventricle of the heart
US7305131B2 (en) * 2002-10-01 2007-12-04 Hewlett-Packard Development Company, L.P. Extracting graphical bar codes from an input image
JP2006527057A (en) * 2003-06-13 2006-11-30 コーニンクレッカ フィリップス エレクトロニクス エヌ ヴィ Segmentation of 3D images
US7212664B2 (en) * 2003-08-07 2007-05-01 Mitsubishi Electric Research Laboratories, Inc. Constructing heads from 3D models and 2D silhouettes
JP4027867B2 (en) * 2003-09-10 2007-12-26 株式会社日立ハイテクノロジーズ Biomagnetic field measurement device
US7190826B2 (en) * 2003-09-16 2007-03-13 Electrical Geodesics, Inc. Measuring the location of objects arranged on a surface, using multi-camera photogrammetry
JP2006234494A (en) * 2005-02-23 2006-09-07 Aisin Seiki Co Ltd Object recognizing
US20070081706A1 (en) * 2005-09-28 2007-04-12 Xiang Zhou Systems and methods for computer aided diagnosis and decision support in whole-body imaging
US7885455B2 (en) * 2006-04-14 2011-02-08 UTC Fire & Security Americas Corporation, Inc Method of combining images of multiple resolutions to produce an enhanced active appearance model
US7907768B2 (en) * 2006-12-19 2011-03-15 Fujifilm Corporation Method and apparatus for probabilistic atlas based on shape modeling technique

Patent Citations (4)

Publication number Priority date Publication date Assignee Title
US6453058B1 (en) * 1999-06-07 2002-09-17 Siemens Corporate Research, Inc. Computer-assisted diagnosis method using correspondence checking and change detection of salient features in digital images
US20080205757A1 (en) * 2005-02-10 2008-08-28 Koninklijke Philips Electronics N. V. Method, A System And A Computer Program For Segmenting A Surface In A Multidimensional Dataset
US20070047790A1 (en) * 2005-08-30 2007-03-01 Agfa-Gevaert N.V. Method of Segmenting Anatomic Entities in Digital Medical Images
US20080075367A1 (en) * 2006-09-21 2008-03-27 Microsoft Corporation Object Detection and Recognition System

Cited By (5)

Publication number Priority date Publication date Assignee Title
US20120114213A1 (en) * 2009-07-17 2012-05-10 Koninklijke Philips Electronics N.V. Multi-modality breast imaging
US8977018B2 (en) * 2009-07-17 2015-03-10 Koninklijke Philips N.V. Multi-modality breast imaging
US20110142308A1 (en) * 2009-12-10 2011-06-16 Canon Kabushiki Kaisha Information processing apparatus, information processing method, and storage medium
US8768018B2 (en) * 2009-12-10 2014-07-01 Canon Kabushiki Kaisha Information processing apparatus, information processing method, and storage medium
US20170055844A1 (en) * 2015-08-27 2017-03-02 Canon Kabushiki Kaisha Apparatus and method for acquiring object information

Also Published As

Publication number Publication date
US20080144940A1 (en) 2008-06-19
JP2013172989A (en) 2013-09-05
US8135199B2 (en) 2012-03-13
JP2013146585A (en) 2013-08-01
JP2008178672A (en) 2008-08-07

Similar Documents

Publication Publication Date Title
US8135199B2 (en) Method and apparatus of using probabilistic atlas for feature removal/positioning
US7792348B2 (en) Method and apparatus of using probabilistic atlas for cancer detection
US7907768B2 (en) Method and apparatus for probabilistic atlas based on shape modeling technique
US8958625B1 (en) Spiculated malignant mass detection and classification in a radiographic image
Sluimer et al. Toward automated segmentation of the pathological lung in CT
US7653263B2 (en) Method and system for volumetric comparative image analysis and diagnosis
US6766043B2 (en) Pleural nodule detection from CT thoracic images
US8345943B2 (en) Method and apparatus for registration and comparison of medical images
EP1465109A2 (en) Method for automated analysis of digital chest radiographs
US9659390B2 (en) Tomosynthesis reconstruction with rib suppression
US20060004278A1 (en) Method, system, and computer software product for feature-based correlation of lesions from multiple images
Hogeweg et al. Clavicle segmentation in chest radiographs
US20070003118A1 (en) Method and system for projective comparative image analysis and diagnosis
US20070014448A1 (en) Method and system for lateral comparative image analysis and diagnosis
WO1999005641A1 (en) Method for detecting interval changes in radiographs
US20060050944A1 (en) Nipple detection apparatus and program
Yoshida Local contralateral subtraction based on bilateral symmetry of lung for reduction of false positives in computerized detection of pulmonary nodules
EP1956554B1 (en) Visual enhancement of interval changes using a temporal subtraction technique
JP2004188202A (en) Automatic analysis method of digital radiograph of chest part
Klinder et al. Lobar fissure detection using line enhancing filters
Iakovidis et al. Robust model-based detection of the lung field boundaries in portable chest radiographs supported by selective thresholding
Gleason et al. Automatic screening of polycystic kidney disease in x-ray CT images of laboratory mice
Katartzis et al. A MRF-based approach for the measurement of skin thickness in mammography
Nikitenko Hierarchical elastic registration of mammograms

Legal Events

Date Code Title Description
STCB Information on status: application discontinuation

Free format text: ABANDONED -- FAILURE TO RESPOND TO AN OFFICE ACTION