WO1996038815A1 - Automatic border delineation and dimensioning of regions using contrast enhanced imaging - Google Patents


Info

Publication number
WO1996038815A1
Authority
WO
WIPO (PCT)
Prior art keywords
border
pixel
recited
point
baseline
Prior art date
Application number
PCT/US1996/008257
Other languages
French (fr)
Inventor
Harold Levene
Original Assignee
Molecular Biosystems, Inc.
Priority date
Filing date
Publication date
Application filed by Molecular Biosystems, Inc. filed Critical Molecular Biosystems, Inc.
Priority to JP8536746A priority Critical patent/JPH11506950A/en
Priority to AU59629/96A priority patent/AU5962996A/en
Priority to EP96916909A priority patent/EP0829068A1/en
Publication of WO1996038815A1 publication Critical patent/WO1996038815A1/en

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00 Image analysis
    • G06T7/10 Segmentation; Edge detection
    • G06T7/12 Edge-based segmentation
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06V IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V10/00 Arrangements for image or video recognition or understanding
    • G06V10/20 Image preprocessing
    • G06V10/28 Quantising the image, e.g. histogram thresholding for discrimination between background and foreground patterns
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00 Indexing scheme for image analysis or image enhancement
    • G06T2207/10 Image acquisition modality
    • G06T2207/10132 Ultrasound image
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00 Indexing scheme for image analysis or image enhancement
    • G06T2207/20 Special algorithmic details
    • G06T2207/20092 Interactive image processing based on input by user
    • G06T2207/20104 Interactive definition of region of interest [ROI]
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00 Indexing scheme for image analysis or image enhancement
    • G06T2207/30 Subject of image; Context of image processing
    • G06T2207/30004 Biomedical image processing
    • G06T2207/30048 Heart; Cardiac

Definitions

  • the present invention relates in general to a method for processing ultrasound images of a patient's organs and tissue and, in particular, to a method for delineating borders and dimensioning regions in such images
  • ROIs regions of interest
  • SPECT single photon emission computed tomography
  • PET positron emission tomography
  • CT computed tomography
  • MRI magnetic resonance imaging
  • ROI region of interest
  • the ultrasonic transducer utilized is placed on a body surface overlying the area to be imaged, and sound waves are directed toward that area.
  • the transducer detects reflected sound waves and the attached scanner translates the data into video images.
  • the amount of energy reflected depends upon the frequency of the transmission and the acoustic properties of the substance. Changes in the substance's acoustic properties (e.g. variance in the acoustic impedance) are most prominent at the interfaces of different acoustic densities and compressibilities, such as liquid-solid or liquid-gas. Consequently, when ultrasonic energy is directed through tissue, organ structures generate sound reflection signals for detection by the ultrasonic scanner. These signals can be intensified by the proper use of a contrast agent.
  • there are several types of contrast agents, including liquid emulsions, solids, encapsulated fluids and those which employ the use of gas.
  • the latter agents are of particular importance because of their efficiency as a reflector of ultrasound. Resonant gas bubbles scatter sound a thousand times more efficiently than a solid particle of the same size.
  • These types of agents include free bubbles of gas as well as those which are encapsulated by a shell material.
  • Contrast enhanced images have the property that the agent's presence in a particular ROI produces a contrast visually recognizable from surrounding regions that are not suffused with the agent.
  • MCE myocardial contrast echocardiography
  • the ejection fraction is a global measure of systolic function, while regional wall motion is a local measure.
  • EF ejection fraction
  • ESV is the end-systolic ventricular volume
  • the computerized image processing starts with a human operator selecting three image frames from a cardiac cycle: the opening end-diastolic frame, the end-systolic frame, and the closing end-diastolic frame. Once selected, the operator defines the endocardial and epicardial borders on each of the three selected frames. After the borders are defined for the first three frames, they are refined and the borders in the other frames from other points within the cardiac cycle are automatically determined by Geiser et al.'s process.
  • the ventricle may not be completely
  • contrast agent it is not likely that all areas are simultaneously opacified. For example, attenuation and the effects of shadowing may produce an image whereby one region of the left ventricle is at maximum brightness while, in other regions, no contrast is observed at all.
  • identification of the border region during end-diastole or end-systole might lead to either an over- or under-estimation of the motion of the ventricle. If the ejection fraction or regional wall motion is over-estimated, the cardiologist might rule out a suspicion of ischemia when it is in fact present. On the other hand, if the ejection fraction or regional wall motion is under-estimated, then the cardiologist might suspect ischemia where none is present and send the patient on to a more expensive diagnostic procedure (e.g. angiography or nuclear imaging) or an expensive and invasive therapeutic procedure (e.g. angioplasty).
  • the present invention is a novel system and method for automatically identifying borders of regions of interest within an image of a patient's organ or tissue. Initially, the operator of the system identifies a given set of images that will be taken for the system to
  • the set of the organ in question is the heart
  • images selected for analysis will usually be images that are taken at the same point in the
  • the system begins to generate images - before, during and after the administration of a contrast agent.
  • the system begins its automatic processing.
  • the steps of the processing include the identification of baseline image frames, identification of baseline intensities for each given pixel in the ROI, baseline subtraction on a per-pixel basis, determining a probability of signal-to-noise ratio for each pixel, and thresholding each pixel to determine if a pixel belongs to an area inside the border region or an area outside the border region.
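  Those processing steps can be sketched end-to-end for a single pixel (a minimal illustration; the function name, the noise estimate, and the S/N threshold of 2.0 are our assumptions, not the patent's):

```python
import statistics

def classify_pixel(baseline_values, peak_value):
    """Per-pixel sketch: estimate a baseline intensity, subtract it from the
    peak post-contrast intensity, and threshold the resulting S/N ratio."""
    baseline = statistics.mean(baseline_values)        # baseline intensity
    noise = statistics.pstdev(baseline_values) or 1.0  # crude noise estimate (assumed)
    signal = max(peak_value - baseline, 0.0)           # baseline subtraction, clamped at 0
    snr = signal / noise
    # High S/N suggests contrast wash-in: classify as inside the border region.
    return "inside" if snr > 2.0 else "outside"
```

  A pixel whose peak intensity barely exceeds its baseline would thus be classified as outside the border region.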
  • the method refines the set by locally minimizing a total cost function that relates a low value to points typically found on a contrast enhanced image.
  • Figure 1 depicts the manner in which ultrasound images are taken of a patient's heart by an ultrasound image processor that is used in accordance with the principles of
  • FIG 2 is a high level block diagram of one embodiment of an image processor unit that is used in accordance with the principles of the present invention.
  • Figures 3-7 depict a flow chart of the presently claimed border delineation method.
  • Figures 8(A) and 8(B) depict how the present system may select candidate heart chamber border pixels.
  • the present invention encompasses general methods for the imaging and diagnosis of any patient tissues or organs capable of being imaged, the present description will be given from the standpoint of imaging the human heart. In many ways, the problems involved with imaging the human heart for purposes of border delineation and dimensioning are more difficult than with other organs.
  • the present description of the method for imaging the heart may then be simplified in order to image other patient organs and tissues that do not experience such difficulties.
  • the present invention should not be limited merely to imaging the human heart, but encompasses all tissues and organs capable of being imaged.
  • the present description is based upon administration of a contrast agent used with ultrasound imaging methodology.
  • the present invention should not be limited merely to ultrasound, but also encompasses other methodologies that may (or may not) use a contrast agent that is uniquely suited to that particular methodology.
  • Ultrasound methodology is described in greater detail in co-pending and co-assigned patent application Serial Number 08/428,723 entitled “A METHOD FOR PROCESSING REAL-TIME CONTRAST ENHANCED ULTRASONIC IMAGES", filed on April 25, 1995 by Levene et al., and herein incorporated by reference.
  • Ultrasound imaging systems are well known in the art. Typical systems are manufactured by, for example, Hewlett Packard Company; Acuson, Inc.; Toshiba America Medical Systems, Inc.; and Advanced Technology Laboratories. These systems are employed for two-dimensional imaging. Another type of imaging system is based on
  • ultrasound contrast agents are also well-known in the art. They include,
  • gaseous agents are of particular importance because of their efficiency as a reflector of ultrasound. Resonant gas bubbles scatter sound a thousand times more efficiently than a solid particle of the same size. These types of agents include free bubbles of gas as well as those which are encapsulated
  • the contrast agent may be administered via any of the known routes. These routes include, but are not limited to intravenous (IV), intramuscular (IM), intraarterial (IA), and intracardiac (IC).
  • IV intravenous
  • IM intramuscular
  • IA intraarterial
  • IC intracardiac
  • tissue or organ that receives a flow of blood may have images processed in the manner of the invention.
  • These tissues/organs may include, but are not limited to the kidneys, liver, brain, testes, muscles, and heart.
  • Short axis views may bisect the heart at different planes, at the level of the mitral valve, at the level of the papillary muscles, or at the level of the apex, for example.
  • the apical four chamber view with the transducer slightly tilted gives the five chamber view, where the aorta is visualized with the usual four chambers.
  • FIG. 1 a cut-away view of patient 30 attached to echocardiographic transducer 36 is shown. A transducer is placed on the patient,
  • Images may alternatively be acquired transthoracically or
  • An injection (34) of contrast agent is made into the patient's vein so that the contrast agent reaches the heart and interacts with the ultrasound waves generated by transducer 36. Sound waves reflected and detected at transducer 36 are sent as input into image processing system 38.
  • As the contrast agent enters into various heart regions, image processing system
  • Tissue areas that do not brighten when expected may indicate a disease condition in the area (e.g. poor or no circulation, presence of thrombus, necrosis or the like).
  • Image processing system 38 comprises diagnostic ultrasound scanner 40, optional analog-to-digital converter 42, image processor 44, digital-to-analog converter 56, and color monitor 58.
  • Ultrasound scanner 40 encompasses any means of radiating ultrasound waves to the region of interest and
  • Scanner 40 could comprise transducer 36 and a means of producing electrical signals in accordance with the reflected waves detected. It will be appreciated that such scanners are well known in the art.
  • the electrical signals generated by scanner 40 could either be digital or analog. If
  • the signals are digital, then the current embodiment could input those signals into image processor 44 directly. Otherwise, an optional A/D converter 42 could be used to convert
  • Image processor 44 takes these digital signals and processes them to provide video images as output.
  • the current embodiment of image processor 44 comprises a central processing unit 46, trackball 48 for user-supplied input of predefined regions of interest, keyboard 50, and memory 52.
  • Memory 52 may be large enough to retain several video images and store the border delineation method 54 of the present invention.
  • CPU 46 thus analyzes the video images according to stored border delineation method 54.
  • After a given video image is processed by image processor 44, the video image is output in digital form to D/A converter 56. D/A converter 56 thereby supplies color monitor 58 with an analog signal capable of being rendered on the monitor. It will be appreciated that the present invention could alternatively use a digital color monitor, in which case D/A converter 56 would be optional.
  • FIGS 3-7 are flowcharts describing the border delineation method as currently embodied. The method starts at step 100 with the operator selecting a point of interest in the cardiac cycle where
  • grey scale (not contrast-enhanced) ultrasound imaging is started at step 102.
  • a decision is made as to whether to process the current image. If the image is at the point of interest in the cardiac cycle, then the image is processed at steps 104 through 108. Otherwise, it is not processed.
  • Noncontrast enhanced imaging is continued until a sufficient number of initial baseline images are taken at step 110. These initial images, together with later images taken after the contrast agent has "washed out", form the basis of the entirety of the baseline images.
  • a contrast agent is administered to the patient at step 114 and "washes into" the chambers of the heart first, then slowly perfuses into the tissues of the heart muscles themselves. The images are then captured at the selected point(s) in the cardiac cycle until the contrast agent is no longer present in the heart's chamber at steps 116 through 122. This could be determined by selecting a "trigger" region of interest (T-ROI) that is used to identify whether the contrast agent is in the heart chamber. A most advantageous T-ROI would be somewhere in the heart chamber, because the heart chamber receives the contrast agent prior to perfusion in the heart muscle.
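  The T-ROI "trigger" test described above could be realized as a simple intensity comparison (a sketch; the function name and the sensitivity factor k are our assumptions, not the patent's):

```python
def contrast_present(t_roi_intensities, baseline_mean, baseline_std, k=3.0):
    """True while the mean intensity of the trigger ROI (T-ROI) remains
    significantly above its pre-contrast baseline, i.e. while contrast
    agent is still present in the heart chamber."""
    mean = sum(t_roi_intensities) / len(t_roi_intensities)
    return mean > baseline_mean + k * baseline_std
```

  Imaging would continue while this test returns True and stop (or switch to collecting post-contrast baseline frames) once it returns False.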
  • T-ROI trigger region of interest
  • image motion correction is performed to improve the quality of the images at step 128. This may be done either manually or in an automated fashion. If done manually, for example, the operator would indicate on each image to what extent and in what direction one image would need to move to register with a reference image. Such a manual method is described in "Digital Subtraction Myocardial Contrast Echocardiography: Design and Application of a New Analysis Program for Myocardial Perfusion Imaging," M. Halmann et al., J. Am. Soc. Echocardiogr. 7:355-362 (1994).
  • After motion correction is performed, the operator then preselects a general region of interest on a given frame in order to give the process an initial region in which to locate the border at step 130. This may be accomplished by having the operator circle the region of interest with a light pen on an interactive video screen or by
  • This selected region is used only to restrict the search area for the endocardium border in order to reduce the processing time.
  • a properly selected region should include the left ventricle surrounded by myocardial tissue. The analysis then begins on each pixel within the ROI.
  • the set of true baseline frames are selected from the set of initial, pre-contrast frames and the set of post-contrast frames. Steps 134, 136, and 138 depict three different ways in which this set may be formed. First, the operator could manually select all of the baseline frames. Second, the operator identifies an area clearly within the left ventricle
  • the standard deviation of the pixel intensity is calculated. For any given pixel, the data points over time are compared against the computed standard deviation in step 144. If the pixel intensity is within the standard deviation for a putative baseline value, then the pixel data point is considered a baseline value.
  • the pixel data point is outside the standard deviation, and the data point is removed from any further consideration at step 146.
  • the linear regression analysis is then re-calculated, including the standard deviation. This defines an iterative process for each pixel over time.
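  The iterative baseline-selection loop of steps 142-146 might look like the following sketch (the one-standard-deviation rejection rule and the stopping conditions are our reading of the text; names are ours):

```python
def fit_line(points):
    """Ordinary least-squares fit of intensity vs. time; returns (slope, intercept)."""
    n = len(points)
    sx = sum(t for t, _ in points)
    sy = sum(v for _, v in points)
    sxx = sum(t * t for t, _ in points)
    sxy = sum(t * v for t, v in points)
    denom = n * sxx - sx * sx
    slope = (n * sxy - sx * sy) / denom if denom else 0.0
    intercept = (sy - slope * sx) / n
    return slope, intercept

def select_baseline(points, n_iter=5):
    """Iteratively drop (time, intensity) data points lying more than one
    standard deviation from the fitted baseline, then re-fit."""
    kept = list(points)
    for _ in range(n_iter):
        slope, intercept = fit_line(kept)
        residuals = [v - (slope * t + intercept) for t, v in kept]
        sd = (sum(r * r for r in residuals) / len(residuals)) ** 0.5
        new = [(t, v) for (t, v), r in zip(kept, residuals) if abs(r) <= sd]
        if len(new) == len(kept) or len(new) < 3:
            break  # converged, or too few points left to fit sensibly
        kept = new
    return kept
```

  A wash-in data point far above the pre-contrast trend would be rejected, leaving only true baseline values for the subsequent subtraction step.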
  • the pixels of the chamber are determined, at step 148. By clearly identifying the pixels of the chamber, the method may then discard these pixels from further consideration in delineating the border pixels.
  • the first step to accomplishing this goal is baseline subtraction. For each pixel in the ROI, another linear regression analysis is performed on the baseline pixel intensity over time at step 152. This provides a linear best-fit curve having a derived slope and intercept at step 154. For each non-baseline frame occurring at a given time, ti, the baseline intensity is derived from the linear curve as occurring for that particular time. The baseline value is then subtracted from the non-baseline pixel intensity at step 156.
  • the signal derived solely from the contrast, Si is determined.
  • the observed pixel intensity may have decreased to an extent to be less than the estimated baseline intensity. In such a case, Si is taken to be zero.
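  Steps 152-156 — deriving the baseline from the per-pixel regression line at time ti and clamping a negative difference to zero — reduce to a one-liner (slope and intercept come from the linear best-fit described above; the function name is ours):

```python
def contrast_signal(intensity, t, slope, intercept):
    """Signal due solely to contrast at time t: observed pixel intensity
    minus the baseline predicted by the regression line, with a negative
    difference taken as zero, per the text."""
    baseline = slope * t + intercept
    return max(intensity - baseline, 0.0)
```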
  • a composite signal-to-noise ratio (S/N) is determined from the signal, Sk, and the signals from the temporally adjacent heart cycles, Sk-1 and Sk+1. A peak signal may arise from spurious noise, so the signals are weighted according to the equation:
  • ROC receiver operating characteristic
  • the signal-to-noise ratio is then treated as a standardized, normal variable and the
  • P[(S/N)k] may be calculated as follows:
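  Treating (S/N)k as a standardized normal variable, its cumulative probability can be computed with the standard normal CDF. The patent's exact expression is not reproduced in this text, so the following is an assumption about the intended calculation:

```python
import math

def snr_probability(snr):
    """Cumulative probability P[(S/N)k] of a standard normal variable,
    via the error function: Phi(x) = (1 + erf(x / sqrt(2))) / 2."""
    return 0.5 * (1.0 + math.erf(snr / math.sqrt(2.0)))
```

  A high S/N thus maps to a probability near 1, and a probability threshold on this value separates myocardial pixels from ventricle pixels.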
  • the maximum signal-to-noise ratio over the non-baseline frames is determined. Because there is a
  • a probability threshold may be established distinguishing the two regions, with probabilities above the threshold identifying pixels in the myocardium and probabilities below the threshold identifying pixels in the left ventricle. This comparison is accomplished at step 164 and continues until all the pixels in the ROI have been analyzed.
  • cost weighting is used. In that case, if a small area within the chamber near the border is misclassified, an edge detection method will have
  • the center of mass of the ventricle pixels, (xc, yc), is then determined at step 172 and referred to as the center of the left ventricle:
  • m is the number of ventricle pixels.
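  The center-of-mass computation of step 172 reduces to the unweighted centroid of the m ventricle pixels (a sketch; names are ours):

```python
def center_of_mass(ventricle_pixels):
    """Unweighted centroid (xc, yc) of the m classified ventricle pixels."""
    m = len(ventricle_pixels)
    xc = sum(x for x, _ in ventricle_pixels) / m
    yc = sum(y for _, y in ventricle_pixels) / m
    return xc, yc
```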
  • the envelope (or border) of the ventricle pixels is now determined from the binary image.
  • the ventricular pixels are searched to find the points that have the minimum and maximum y value and the minimum and maximum x value - thus, defining a maximum of four points. It should be appreciated that the orientation of the images is not important. At each of these four locations, there may be one or more points; it is most convenient to pick a location with only one point, but is not necessary. In the case of all
  • any one of the points at any of the locations will suffice as the reference point of step 178.
  • the point is identified as the first point belonging to the border and it serves as the starting point of the envelope tracing method.
  • the envelope is traced by determining which adjacent point
  • the angle, θ1, of the reference point, (X2, Y2), is determined as follows:
  • Figures 8A and 8B depict the selection of candidate border points in the myocardium.
  • Figure 8A shows a color picture of a heart chamber (colored red in the Figure) surrounded by the dark myocardium.
  • Figure 8B shows an enlarged view of the region in Figure 8A that is bordered by the white box.
  • the border is gradually and automatically filled out (as depicted as the white solid curve).
  • the last border point selected is depicted as the white circle. From this last border point, the radial lines are sent out to help determine the next border point.
  • a candidate border point is found along each radial line, with the ventricular pixel nearest to the reference point chosen. Radial lines are radiated out over 180 degrees, from θ1 to θ1 + 180 degrees. The cost function is then calculated for each candidate point. If the cost of all points is above a threshold cost, then the angular range of radial lines is increased. The candidate point with the lowest cost is chosen as the adjacent border point and
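  The radial candidate search just described can be sketched as follows. The 10-degree ray spacing, the fixed search radius, and the caller-supplied cost function are our assumptions; the patent combines several cost factors that are not fully reproduced in this text:

```python
import math

def next_border_point(last_point, theta, ventricle_pixels, cost, max_r=50):
    """From the last border point, cast radial lines over 180 degrees
    starting at angle theta; along each ray take the nearest ventricle
    pixel as a candidate, then return the candidate with the lowest cost."""
    pixels = set(ventricle_pixels)
    candidates = []
    for deg in range(0, 181, 10):
        a = theta + math.radians(deg)
        for r in range(1, max_r + 1):          # walk outward along the ray
            p = (round(last_point[0] + r * math.cos(a)),
                 round(last_point[1] + r * math.sin(a)))
            if p in pixels:                    # nearest ventricle pixel on this ray
                candidates.append(p)
                break
    return min(candidates, key=cost) if candidates else None
```

  Increasing the angular range when every candidate's cost exceeds the threshold, as the text describes, would be a simple retry loop around this function.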
  • the cost function may have global and local factors. Global factors, for example, may emphasize a smoothness in the change of the area of the left ventricle over the cardiac cycle. Local factors emphasize regional border characteristics.
  • G(ps) may be determined using the Sobel operators, defined as:
  • the cost factor, c2, for the candidate point is:
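  The gradient G(ps) can be computed with the conventional 3x3 Sobel kernels, sketched below; the patent's exact expression for the cost factor c2 is not reproduced in this text, so only the gradient itself is shown:

```python
import math

def sobel_gradient(img, x, y):
    """Gradient magnitude and angle at (x, y) using the conventional 3x3
    Sobel kernels; img is a 2D list of intensities indexed as img[y][x]."""
    gx = (img[y-1][x+1] + 2*img[y][x+1] + img[y+1][x+1]
          - img[y-1][x-1] - 2*img[y][x-1] - img[y+1][x-1])
    gy = (img[y+1][x-1] + 2*img[y+1][x] + img[y+1][x+1]
          - img[y-1][x-1] - 2*img[y-1][x] - img[y-1][x+1])
    return math.hypot(gx, gy), math.atan2(gy, gx)
```

  The returned angle is also what the gradient-smoothness cost factor c3, described below in the text, would compare between the reference and candidate points.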
  • the angle of the gradient of the pixel intensity about the border should be slowly changing for a smooth contour.
  • the cost factor, c3, for the candidate point is given as:
  • θr is the angle of the gradient at the reference point
  • θc is the gradient angle for the candidate point
  • θ is the angle between a line from the reference point to the center of the ventricle and the candidate point.
  • the background of the summary image may consist of an average of the baseline frames. Superimposed upon this background is the border, which may be highlighted in a different color.
  • a possible format to display the border is depicted in Figure 8B as the solid white border line. The border is thus shown as the continuous broad white band that encloses the left ventricle chamber.

Abstract

The present invention is a novel system and method for automatically identifying borders of regions of interest within an image of a patient's organ or tissue. The system generates images - before, during and after the administration of a contrast agent. Once the set of images have been taken, the system begins automatic processing of the images. The steps of the processing include the identification of baseline image frames, identification of baseline intensities for each given pixel in the ROI, baseline subtraction on a per-pixel basis, determining a probability of signal-to-noise ratio for each pixel, and thresholding each pixel to determine if a pixel belongs to an area inside the border region or an area outside the border region. To exactly determine which pixels that are at the border, the method refines the set by locally minimizing a total cost function that relates a low value to points typically found on a contrast enhanced image. The border of the region of interest is thereby determined.

Description

AUTOMATIC BORDER DELINEATION AND DIMENSIONING OF REGIONS USING CONTRAST ENHANCED IMAGING
FIELD OF THE INVENTION
The present invention relates in general to a method for processing ultrasound
images of a patient's organs and tissue and, in particular, to a method for delineating
borders and dimensioning regions of the organs and tissues of the patient in such images.
BACKGROUND OF THE INVENTION
In medical diagnostic imaging, it is important to image regions of interest (ROIs)
within a patient and analyze these images to provide effective diagnosis of potential disease conditions. A necessary component of this diagnosis is the ability to discriminate
between various structures of the patient's tissues - including, but not limited to, organs,
tumors, vessels and the like - to identify the particular ROI for diagnosis.
The problems of structure identification are exacerbated in cases where the ROI is located in a tissue or organ that is moving significantly during the course of imaging. One
organ that experiences a good deal of movement during imaging is the heart. Several
imaging modalities are currently used. For example, it is known to use single photon emission computed tomography ("SPECT"), positron emission tomography ("PET"), computed tomography ("CT"), magnetic resonance imaging ("MRI"), angiography and
ultrasound. An overview of these different modalities is provided in: Cardiac Imaging - A
Companion to Braunwald's Heart Disease, edited by Melvin L. Marcus, Heinrich R.
Schelbert, David J. Skorton, and Gerald L. Wolf (W. B. Saunders Co., Philadelphia,
1991). One modality that has found particular usefulness is contrast enhanced ultrasound imaging. Briefly, this technique utilizes ultrasonic imaging, which is based on the
principle that waves of sound energy can be focused upon a "region of interest" ("ROI")
and reflected in such a way as to produce an image thereof. The ultrasonic transducer utilized is placed on a body surface overlying the area to be imaged, and sound waves are directed toward that area. The transducer detects reflected sound waves and the attached scanner translates the data into video images.
When ultrasonic energy is transmitted through a substance, the amount of energy reflected depends upon the frequency of the transmission and the acoustic properties of the substance. Changes in the substance's acoustic properties (e.g. variance in the acoustic impedance) are most prominent at the interfaces of different acoustic densities and compressibilities, such as liquid-solid or liquid-gas. Consequently, when ultrasonic energy is directed through tissue, organ structures generate sound reflection signals for detection by the ultrasonic scanner. These signals can be intensified by the proper use of a contrast agent.
There are several types of contrast agents including liquid emulsions, solids, encapsulated fluids and those which employ the use of gas. The latter agents are of particular importance because of their efficiency as a reflector of ultrasound. Resonant gas bubbles scatter sound a thousand times more efficiently than a solid particle of the same size. These types of agents include free bubbles of gas as well as those which are encapsulated by a shell material.
Contrast enhanced images have the property that their presence in a particular
ROI produce a contrast visually recognizable from surrounding regions that are not suffused with the agent. One example of this type of imaging is myocardial contrast echocardiography ("MCE"). In MCE, an intravascular injection of a contrast agent
washes into the patient's heart while, simultaneously, ultrasound waves are directed to and reflected from the heart - thereby producing a sequence of echocardiographic images. In the field of echocardiography, important diagnostic measures include: (1) analysis of regional wall motion; and (2) the determination of the ejection fraction. Abnormal systolic function is a diagnostic indication of cardiac disease; and measurements
of the ejection fraction and regional wall motion are most useful in detecting chronic
ischemia. The ejection fraction is a global measure of systolic function, while regional wall motion is a local measure.
The "ejection fraction" ("EF") is a widely used measure of the contractile ability of the ventricle. EF is defined as the ratio of the total ventricular stroke volume ("SV") to the end-diastolic ventricular volume ("EDV"). In equation form, we have:
EF = SV / EDV = (EDV - ESV) / EDV
where ESV is the end-systolic ventricular volume.
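The ejection-fraction relation can be stated directly in code (a trivial sketch; volumes in any consistent unit, e.g. mL):

```python
def ejection_fraction(edv, esv):
    """EF = SV / EDV = (EDV - ESV) / EDV, where SV is the stroke volume,
    EDV the end-diastolic volume, and ESV the end-systolic volume."""
    return (edv - esv) / edv
```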
Accurate determination of EF and wall motion, however, is based on a precise identification of certain heart structures of the patient, such as the left ventricle and the endocardial border in the left ventricle. Currently, identification of the endocardial border is made from non-contrast enhanced images. Endocardial borders in these non-contrast enhanced images are either manually traced by trained echocardiographers or determined by image processing methods tailored specifically for non-contrast enhanced images. Such an image processing method is described in "A Second-generation Computer-based Edge Detection Algorithm for Short-axis, Two-dimensional Echocardiographic Images: Accuracy and Improvement in Interobserver Variability," by Geiser et al., published in the Journal of the American Society of Echocardiography, Vol. 3, No. 2, March-April 1990 (pp. 79-90).
In Geiser et al.'s method, the computerized image processing starts with a human operator selecting three image frames from a cardiac cycle: the opening end-diastolic frame, the end-systolic frame, and the closing end-diastolic frame. Once selected, the operator defines the endocardial and epicardial borders on each of the three selected frames. After the borders are defined for the first three frames, they are refined and the borders in the other frames from other points within the cardiac cycle are automatically determined by Geiser et al.'s process.
The disadvantage with Geiser et al.'s process for identification of the endocardium is that it is performed without contrast enhancement of the heart's image. Without contrast enhancement, several imaging problems occur. For example, the fibers within the myocardium create more or less backscatter depending upon their orientation relative to the incident ultrasound beam - fibers that are parallel to the beam scatter less, so in these regions it is more difficult to differentiate the endocardium from the hypoechoic chamber region. These regions occur in the lateral regions of the image. Merely increasing the gain is not a satisfactory solution, because many instruments have gain dependent lateral resolution, so that the proper identification of the border is adversely affected. One way to avoid this difficulty is to image with contrast enhancement. The use of contrast agents, such as ALBUNEX® (a registered trademark of Molecular
Biosystems, Inc.), in echocardiograms has enhanced the image resolution of patient heart structures. By adding contrast agent into the heart's chamber, the chamber initially becomes greatly illuminated in comparison to the myocardium (including the endocardium). Later, once the agent has washed out of the chamber, the myocardium
remains illuminated relative to the chamber due to the perfusion of agent into the
myocardium tissue. In either case, the border region between the myocardium and the
chamber is greatly differentiated - even in the lateral regions, where the problem of differentiation is greatest without contrast enhancement.
Although the use of contrast agents has aided in the differentiation of the endocardium border, the typical method of border delineation remains a manual process of "eyeballing" the border by a trained cardiologist. However, there are still problems with manual methods of border identification. Specifically, a single frame of echocardiographic image data is selected during the time of approximate maximum ventricular opacification by the contrast agent. A trained echocardiographer then manually traces, in the echocardiographer's best judgment, what appears to be the endocardial border in that single frame. The echocardiographer's judgment is based on the perceived differences in the texture of the brightness in the image. For those frames where contrast agents have perfused into the myocardium while
agent is still in the left ventricle chamber, the difference in texture may be less apparent. Hence, this manual process leaves much to chance in accurately determining the endocardium border. Additionally, in the single chosen frame, the ventricle may not be completely
opacified - while all areas of the left ventricle may be opacified at some point during the
injection of contrast agent, it is not likely that all areas are simultaneously opacified. For example, attenuation and the effects of shadowing may produce an image whereby one region of the left ventricle is at maximum brightness while, in other regions, no contrast is observed at all.
Either of these problems may cause a border region of the left ventricle to be difficult to identify, leading to uncertainty in the diagnosis process. Specifically, improper
identification of the border region during end-diastole or end-systole might lead to either an over- or under-estimation of the motion of the ventricle. If the ejection fraction or regional wall motion is over-estimated, the cardiologist might rule out a suspicion of ischemia when it is in fact present. On the other hand, if the ejection fraction or regional wall motion is under-estimated, then the cardiologist might suspect ischemia where none is present and send the patient on to a more expensive diagnostic procedure (e.g. angiography or nuclear imaging) or an expensive and invasive therapeutic procedure (e.g. angioplasty).
Thus, it is desirable to develop a method for the accurate identification of the borders of patient tissues, such as the endocardial border of the heart.
It is, therefore, an object of the present invention to provide a method for such accurate border identification.
It is another object of the present invention to provide an improved method of diagnosis of ejection fraction and regional wall motion.

SUMMARY OF THE INVENTION

Other features and advantages of the present invention will be apparent from the
following description of the preferred embodiments, and from the claims.
The present invention is a novel system and method for automatically identifying borders of regions of interest within an image of a patient's organ or tissue. Initially, the operator of the system identifies a given set of images that will be taken for the system to
automatically analyze. For example, if the organ in question is the heart, then the set of
images selected for analysis will usually be images that are taken at the same point in the
cardiac cycle. Once the criteria for image set inclusion are determined (e.g. images from the same point in the cardiac cycle), the system begins to generate images - before, during and after the administration of a contrast agent. Once the set of images has been taken, the system begins its automatic processing. Broadly, the steps of the processing include the identification of baseline image frames, identification of baseline intensities for each given pixel in the ROI, baseline subtraction on a per-pixel basis, determination of a signal-to-noise ratio and its associated noise probability for each pixel, and thresholding of each pixel to determine whether the pixel belongs to an area inside the border region or an area outside the border region. To exactly determine which pixels are at the border, the method refines the set by locally minimizing a total cost function that assigns a low value to points of the kind typically found on a border in a contrast enhanced image. The border of the region of interest is thereby determined.
For a full understanding of the present invention, reference should now be made to the following detailed description of the preferred embodiments of the invention and to the accompanying drawings.

BRIEF DESCRIPTION OF THE DRAWINGS

The file of this patent contains at least one drawing executed in color. Copies of
this patent with color drawings will be provided by the Patent and Trademark Office upon request and payment of the necessary fee.
Figure 1 depicts the manner in which ultrasound images are taken of a patient's heart by an ultrasound image processor that is used in accordance with the principles of
the present invention.
Figure 2 is a high level block diagram of one embodiment of an image processor unit that is used in accordance with the principles of the present invention. Figures 3-7 depict a flow chart of the presently claimed border delineation method.
Figures 8(A) and 8(B) depict how the present system may select candidate heart chamber border pixels.
DETAILED DESCRIPTION OF THE INVENTION

Although the present invention encompasses general methods for the imaging and diagnosis of any patient tissues or organs capable of being imaged, the present description will be given from the standpoint of imaging the human heart. In many ways, the problems involved with imaging the human heart for purposes of border delineation and dimensioning are more difficult than with other organs.
One reason is that the regions on both sides of the border may be contrast enhanced. However, the chief reason is motion. The human heart, in the course of normal function, moves a great deal. As most border delineation methods require a number of images (some having the heart perfused with a contrast agent) to accurately determine the border, the movement of the heart tissue from frame-to-frame presents a problem when correlating the parts of heart tissue - especially when tissues do not
necessarily occupy the same pixel position in different frames. The present description of the method for imaging the heart may then be simplified in order to image other patient organs and tissues that do not experience such difficulties. Thus, the present invention should not be limited merely to imaging of the human heart, but encompasses all tissues
capable of being imaged.
Likewise, the present description is based upon administration of a contrast agent used with ultrasound imaging methodology. Again, the present invention should not be limited merely to ultrasound, but also encompasses other methodologies that may (or may not) use a contrast agent that is uniquely suited to that particular methodology. Ultrasound methodology is described in greater detail in co-pending and co-assigned patent application Serial Number 08/428,723 entitled "A METHOD FOR PROCESSING REAL-TIME CONTRAST ENHANCED ULTRASONIC IMAGES", filed on April 25, 1995 by Levene et al., and herein incorporated by reference.
Ultrasound imaging systems are well known in the art. Typical systems are manufactured by, for example, Hewlett Packard Company; Acuson, Inc.; Toshiba America Medical Systems, Inc.; and Advanced Technology Laboratories. These systems are employed for two-dimensional imaging. Another type of imaging system is based on
three-dimensional imaging. An example of this type of system is manufactured by, for example, TomTec Imaging Systems, Inc. The present invention may be employed with either two-dimensional or three-dimensional imaging systems. Likewise, ultrasound contrast agents are also well-known in the art. They include,
but are not limited to, liquid emulsions, solids, encapsulated fluids, encapsulated biocompatible gases, and combinations thereof. Fluorinated liquids and gases are especially useful in contrast compositions. The gaseous agents are of particular importance because of their efficiency as reflectors of ultrasound. Resonant gas bubbles scatter sound a thousand times more efficiently than a solid particle of the same size. These types of agents include free bubbles of gas as well as those which are encapsulated
by a shell material. The contrast agent may be administered via any of the known routes. These routes include, but are not limited to intravenous (IV), intramuscular (IM), intraarterial (IA), and intracardiac (IC).
It is appreciated that any tissue or organ that receives a flow of blood may have images processed in the manner of the invention. These tissues/organs may include, but are not limited to the kidneys, liver, brain, testes, muscles, and heart.
The angles and direction used to obtain views of the organs during imaging are well known in the art. For most organs, the various views used are only derived from the planes of the organ, as there is not a problem with lungs or ribs defining an acoustic window. Therefore, the views are termed sagittal, transverse, and longitudinal.
When imaging the heart, there are three orthogonal planes, the long axis, the short axis, and the four chamber axis. There are also apical, parasternal, subcostal, or suprasternal acoustic windows. The common names for the views that are derived from
these are the parasternal short axis, apical long axis, parasternal long axis, suprasternal long axis, subcostal short axis, subcostal four chamber, apical two chamber, and apical four chamber. Short axis views may bisect the heart at different planes, at the level of the mitral valve, at the level of the papillary muscles, or at the level of the apex, for example. Lastly, the apical four chamber view with the transducer slightly tilted gives the five chamber view, where the aorta is visualized with the usual four chambers. For a further
description of these various views, see Echocardiography, 5th edition, edited by Harvey Feigenbaum (Lea & Febiger, Philadelphia, 1994).
Referring now to Figure 1, a cut-away view of patient 30 attached to echocardiographic transducer 36 is shown. A transducer is placed on the patient,
proximate to heart muscle 32. Images may alternatively be acquired transthoracically or
transesophageally. An injection (34) of contrast agent is made into the patient's vein so that the contrast agent reaches the heart and interacts with the ultrasound waves generated by transducer 36. Sound waves reflected and detected at transducer 36 are sent as input into image processing system 38.
As the contrast agent enters into various heart regions, image processing system
38 detects an increased amplitude in the reflected ultrasound waves, which is characterized by a brightening of the image. Tissue areas that do not brighten when expected may indicate a disease condition in the area (e.g. poor or no circulation, presence of thrombus, necrosis or the like).
Referring now to Figure 2, an embodiment, in block diagram form, of image processing system 38 is depicted. Image processing system 38 comprises diagnostic ultrasound scanner 40, optional analog-to-digital converter 42, image processor 44, digital-to-analog converter 56, and color monitor 58. Ultrasound scanner 40 encompasses any means of radiating ultrasound waves to the region of interest and
detecting the reflected waves. Scanner 40 could comprise transducer 36 and a means of producing electrical signals in accordance with the reflected waves detected. It will be appreciated that such scanners are well known in the art.
The electrical signals generated by scanner 40 could either be digital or analog. If
the signals are digital, then the current embodiment could input those signals into image processor 44 directly. Otherwise, an optional A/D converter 42 could be used to convert
the analog signals.
Image processor 44 takes these digital signals and processes them to provide video images as output. The current embodiment of image processor 44 comprises a central processing unit 46, trackball 48 for user-supplied input of predefined regions of interest, keyboard 50, and memory 52. Memory 52 may be large enough to retain several video images and store the border delineation method 54 of the present invention. CPU 46 thus analyzes the video images according to stored border delineation method 54.
After a given video image is processed by image processor 44, the video image is output in digital form to D/A converter 56. D/A converter thereby supplies color monitor 58 with an analog signal capable of rendering on the monitor. It will be appreciated that the present invention could alternatively use a digital color monitor, in which case D/A converter 56 would be optional.
Having described a current embodiment of the present invention, the border delineation method of the present invention will now be described. Figures 3-7 are flowcharts describing the border delineation method as currently embodied. The method starts at step 100 with the operator selecting a point of interest in the cardiac cycle where
the set of images to be processed will always occur. The same point in the cycle is
primarily used to image the heart at the same point in its contraction and to reduce the amount of heart distortion and drift from frame-to-frame because the heart is presumably in the same place at the same point in the cardiac cycle.
Of all the points in the cardiac cycle, the most frequently used are the end-systolic
and the end-diastolic points. These points are particularly useful in imaging the heart because they represent the point of maximum contraction and maximum relaxation of the heart in the cardiac cycle. These cardiac points are useful because they are used to measure the contractile ability of the heart, i.e., the ejection fraction of the heart.
Once having decided the point (or points) of the cardiac cycle to capture images,
grey scale (not contrast-enhanced) ultrasound imaging is started at step 102. As images are being generated, a decision is made as to whether to process the current image. If the image is at the point of interest in the cardiac cycle, then the image is processed at steps 104 thru 108. Otherwise, it is not processed. Noncontrast enhanced imaging is continued until a sufficient number of initial baseline images have been taken at step 110. These initial images, together with later images taken after the contrast agent has "washed out", form the basis of the entirety of the baseline images.
Once the requisite number of initial baseline frames have been taken, a contrast agent is administered to the patient at step 114; it "washes into" the chambers of the heart first, then slowly perfuses into the tissues of the heart muscles themselves. The images are then captured at the selected point(s) in the cardiac cycle until the contrast agent is no longer present in the heart's chamber at steps 116 thru 122. This could be determined by selecting a "trigger" region of interest (T-ROI) that is used to identify whether the contrast agent is in the heart chamber. A most advantageous T-ROI would be somewhere in the heart chamber, because the heart chamber receives the contrast agent prior to perfusion into the heart muscle.
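The trigger-region test described above can be sketched as a simple comparison of the T-ROI's mean intensity against its baseline level. This is only an illustrative sketch: the three-sigma criterion and the function and parameter names are our assumptions, not taken from the specification.

```python
def agent_in_chamber(t_roi_mean, baseline_mean, baseline_sigma, n_sigma=3.0):
    """Return True while the mean intensity in the trigger region of
    interest (T-ROI) remains well above its baseline level, i.e. while
    contrast agent is still present in the heart chamber.
    The n_sigma criterion is an illustrative assumption."""
    return t_roi_mean > baseline_mean + n_sigma * baseline_sigma
```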
After the contrast agent has "washed out" of the heart, several post-contrast, baseline image frames are taken, and the ultrasound imaging is terminated at steps 124 and 126. It should be appreciated that the step of obtaining the post-contrast baseline values could be omitted in order for real time processing implementation. After obtaining
baseline frames, image motion correction is performed to improve the quality of the images at step 128. This may be done either manually or in an automated fashion. If done manually, for example, the operator would indicate on each image to what extent and in what direction one image would need to move to register with a reference image. Such a manual method is described in "Digital Subtraction Myocardial Contrast Echocardiography: Design and Application of a New Analysis Program for Myocardial Perfusion Imaging," M. Halmann et al., J. Am. Soc. Echocardiogr. 7:355-362 (1994).
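An automated motion correction of the kind cited above can be sketched with a frequency-domain cross-correlation peak search. This is a minimal illustration assuming whole-pixel translations only, and is not the method of the cited papers.

```python
import numpy as np

def register_translation(reference, frame):
    """Estimate the integer (dy, dx) shift that best aligns `frame`
    with `reference`, using the peak of the FFT cross-correlation.
    Returns the shift and the realigned frame (edges wrap around)."""
    # Cross-correlate via the frequency domain: ifft(F(ref) * conj(F(frame)))
    corr = np.fft.ifft2(np.fft.fft2(reference) * np.conj(np.fft.fft2(frame))).real
    dy, dx = np.unravel_index(np.argmax(corr), corr.shape)
    # Interpret shifts larger than half the image as negative offsets
    if dy > reference.shape[0] // 2:
        dy -= reference.shape[0]
    if dx > reference.shape[1] // 2:
        dx -= reference.shape[1]
    aligned = np.roll(frame, (dy, dx), axis=(0, 1))
    return (dy, dx), aligned
```

In practice, sub-pixel registration or feature-based correction may be preferred; the sketch only shows the principle of registering each frame to a reference image.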
Examples of automated methods are described in, for example, "Quantification of Images Obtained During Myocardial Contrast Echocardiography," A.R. Jayaweera et al., Echocardiography 11:385-396 (1994) and "Color Coding of Digitized Echocardiograms: Description of a New Technique and Application in Detecting and Correcting for Cardiac Translation," J.R. Bates et al., J. Am. Soc. Echocardiogr. 7:363-369 (1994).
After motion correction is performed, the operator then preselects a general region of interest on a given frame in order to give the process an initial region in which to locate the border at step 130. This may be accomplished by having the operator circle the region of interest with a light pen on an interactive video screen or by
drawing with a mouse or using keys. This selected region is used only to restrict the search area for the endocardium border in order to reduce the processing time. A properly selected region should include the left ventricle surrounded by myocardial tissue. The analysis then begins on each pixel within the ROI.
The set of true baseline frames are selected from the set of initial, pre-contrast frames and the set of post-contrast frames. Steps 134, 136, and 138 depict three different ways in which this set may be formed. First, the operator could manually select all of the baseline frames. Second, the operator identifies an area clearly within the left ventricle
and the mean pixel intensity is calculated as a function of time. The operator can then
identify baseline frames from a plot of intensity versus time. Lastly, the system could automatically determine the baseline for each pixel starting at step 138. A linear regression is performed on all of the data points and the standard deviation of the fit is calculated at steps 140 and 142. The analysis may be
performed with a varying number of frames at the beginning and a varying number of frames at the end of the sequence, until the best fit is determined. It will be appreciated that such regression analysis is well known to those skilled in the art.
After the linear regression analysis has been performed, the standard deviation of the pixel intensity is calculated. For any given pixel, the data points over time are compared against the computed standard deviation in step 144. If the pixel intensity is within the standard deviation for a putative baseline value, then the pixel data point is considered a baseline value.
Otherwise, the pixel data point is outside the standard deviation, and the data point is removed from any further consideration at step 146. The linear regression analysis is then re-calculated, including the standard deviation. This defines an iterative process for each pixel over time.
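The iterative per-pixel baseline search of steps 138 through 146 can be sketched as follows. The one-sigma cutoff follows the text; the small absolute tolerance, iteration cap, and function names are our illustrative assumptions.

```python
import numpy as np

def baseline_points(times, intensities, n_sigma=1.0, max_iter=20):
    """Iteratively fit a line to one pixel's intensity-vs-time data and
    discard points lying more than n_sigma standard deviations from the
    fit; the surviving points are taken as that pixel's baseline samples.
    The tiny absolute tolerance guards against a numerically perfect fit."""
    t = np.asarray(times, float)
    y = np.asarray(intensities, float)
    keep = np.ones(len(t), dtype=bool)
    for _ in range(max_iter):
        slope, intercept = np.polyfit(t[keep], y[keep], 1)
        resid = y - (slope * t + intercept)
        sigma = resid[keep].std()
        outliers = keep & (np.abs(resid) > n_sigma * sigma + 1e-9)
        if not outliers.any() or keep.sum() - outliers.sum() < 2:
            break
        keep[outliers] = False
    return keep, (slope, intercept)
```

For a pixel whose intensity is flat except during contrast enhancement, the bright frames fall well outside the fitted line and are removed, leaving the pre- and post-contrast frames as the baseline.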
After all the baseline pixel data have been identified, then the pixels of the chamber are determined, at step 148. By clearly identifying the pixels of the chamber, the method may then discard these pixels from further consideration in delineating the border pixels.
The first step to accomplishing this goal is baseline subtraction. For each pixel in the ROI, another linear regression analysis is performed on the baseline pixel intensity over time at step 152. This provides a linear best-fit curve having a derived slope and intercept at step 154. For each non-baseline frame occurring at a given time, ti, the baseline intensity is derived from the linear curve as occurring for that particular time. The baseline value is then subtracted from the non-baseline pixel intensity at step 156.
Once this estimated baseline intensity is subtracted from the observed pixel intensity, the signal derived solely from the contrast, Si, is determined. In instances where attenuation causes shadowing in the image, the observed pixel intensity may have decreased to the point of being less than the estimated baseline intensity. In such a case, Si is taken to be zero.
For each non-baseline or contrast frame, k, a composite signal-to-noise ratio (S/N) is determined from the signal, Sk, and the signals from the temporally adjacent heart cycles, Sk-1 and Sk+1. A peak signal may arise from spurious noise, so that the signals are weighted according to the equation:

(S/N)k = (w1·Sk-1 + w2·Sk + w3·Sk+1) / σ

where σ is the calculated standard deviation of the baseline data and wj (j = 1, 2, 3) are the weights of the signals. It will be appreciated that more than three addends could be used to form the signal-to-noise ratio. The purpose of the weighting terms in the calculation of the signal-to-noise ratio is to reduce the influence of noise by performing a smoothing within a small time region. It is difficult to determine, a priori, what the optimal values for the weighting terms will be
in these calculations. The optimal values can be determined by a "receiver operating
characteristic" (ROC) analysis where, for each variation of the weighting factors, the sensitivity and specificity is determined by comparison to a "gold standard" method (e.g. where the opinion of a group of human experts form the gold standard in a given case). It will be appreciated that the methods of ROC analysis are well known to those in the art of
biomedical analysis. An exposition of ROC analysis is provided in RECEIVER OPERATING CHARACTERISTIC CURVES: A BASIC UNDERSTANDING, by Vining et al., published in RadioGraphics, Vol. 12, No. 6 (November 1992), and herein incorporated by reference.
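The baseline subtraction and weighted signal-to-noise computation described above can be sketched for a single pixel as follows. The weights (0.25, 0.5, 0.25) are purely illustrative, since the specification leaves them to ROC analysis, and signals outside the sequence are taken as zero.

```python
import numpy as np

def contrast_snr(observed, baseline_fit, sigma, weights=(0.25, 0.5, 0.25)):
    """For one pixel, subtract the per-frame baseline estimate from the
    observed intensities, clip negative differences (shadowing) to zero,
    and form the weighted signal-to-noise ratio over adjacent cycles."""
    s = np.clip(np.asarray(observed, float) - np.asarray(baseline_fit, float), 0.0, None)
    w1, w2, w3 = weights
    snr = np.empty_like(s)
    for k in range(len(s)):
        s_prev = s[k - 1] if k > 0 else 0.0
        s_next = s[k + 1] if k < len(s) - 1 else 0.0
        snr[k] = (w1 * s_prev + w2 * s[k] + w3 * s_next) / sigma
    return s, snr
```

Note that the last observed value below, 5, falls under its baseline of 10 (shadowing) and is clipped to a signal of zero.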
The signal-to-noise ratio is then treated as a standardized, normal variable, and the probability of obtaining the observed (S/N)k from random noise fluctuations, P[(S/N)k], may be calculated as the upper tail of the standard normal distribution:

P[(S/N)k] = (1/√(2π)) ∫ from (S/N)k to ∞ of exp(−x²/2) dx
As will be appreciated, as the signal-to-noise ratio increases, the probability that the signal results from random noise decreases. For each pixel, that probability is
determined for each non-baseline frame and the minimum probability for that pixel is taken at step 162. In order to determine which pixels are then in the heart chamber, the maximum signal-to-noise ratio over the non-baseline frames is determined. Because there is a
greater degree of brightening in the ventricle than the myocardium resulting from contrast
agent enhancement, a probability threshold may be established distinguishing the two regions, with probabilities above the threshold identifying pixels in the myocardium and probabilities below the threshold identifying pixels in the left ventricle. This comparison is accomplished at step 164 and continues until all the pixels in the ROI have been analyzed.
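The tail-probability computation and threshold comparison can be sketched via the complementary error function, which gives the standard normal upper tail exactly. The 0.05 cutoff and the function names are illustrative assumptions; the specification does not fix a threshold value.

```python
import math

def noise_probability(snr):
    """Upper-tail probability that a standard normal variable exceeds
    the observed signal-to-noise ratio, i.e. the chance the signal
    arises from random noise fluctuations."""
    return 0.5 * math.erfc(snr / math.sqrt(2.0))

def in_myocardium(min_probability, threshold=0.05):
    """True if the pixel's minimum noise probability stays above the
    threshold (myocardium); False if it falls below (left ventricle).
    The threshold value is an illustrative assumption."""
    return min_probability >= threshold
```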
After every pixel in the ROI has been determined as part of the heart chamber or not, it is now possible to determine, among all the pixels not in the heart chamber, which are border pixels. This can be done by any suitable technique which is known in the art. For example, see "A Novel Algorithm for the Edge Detection and Edge Enhancement of Medical Images," I. Crooks et al., Med. Phys. 20:993-998 (1993) and "Multilevel Nonlinear Filters for Edge Detection and Noise Suppression," H. Hwang et al., IEEE Trans on Signal Processing 42:249-258 (1994).
In a preferred embodiment, cost weighting is used. In that case, if a small area within the chamber near the border is misclassified, a simple edge detection method would make the chamber area smaller than it should be; with cost weighting, the cost for those misclassified points can be made high enough that the border is still correctly placed. To aid in this final determination, a binary image is made, with pixels above the brightening threshold given an intensity of zero and pixels below the threshold an intensity
of one. The center of mass of the ventricle pixels, (x̄, ȳ), is then determined at step 172 and referred to as the center of the left ventricle:

x̄ = (1/m) Σ xi,  ȳ = (1/m) Σ yi,  summing over i = 1 to m

where m is the number of ventricle pixels.
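The center-of-mass computation at step 172 can be sketched directly from the binary image, taking the ventricle pixels as the nonzero entries:

```python
import numpy as np

def ventricle_center(binary):
    """Center of mass (x-bar, y-bar) of the ventricle pixels in the
    thresholded binary image; m is the number of ventricle pixels."""
    ys, xs = np.nonzero(binary)
    m = len(xs)  # number of ventricle pixels
    return xs.sum() / m, ys.sum() / m
```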
The envelope (or border) of the ventricle pixels is now determined from the binary image. The ventricular pixels are searched to find the points that have the minimum and maximum y value and the minimum and maximum x value - thus defining a maximum of four points. It should be appreciated that the orientation of the images is not important. At each of these four locations, there may be one or more points; it is most convenient to pick a location with only one point, but this is not necessary. In the case of all four locations having multiple points, any one of the points at any of the locations will suffice as the reference point of step 178. The point is identified as the first point belonging to the border and it serves as the starting point of the envelope tracing method.

ENVELOPE TRACING METHOD
Generally speaking, the envelope is traced by determining which adjacent point,
among all the adjacent points of the reference point, is most likely to be a border point. This process continues with the most recently selected adjacent point as the new reference point and repeats until the border is completely traced.
For the identification of the next border point, the starting point is referred to as the reference point. The angle, θ1, of the reference point, (x2, y2), relative to the center of the left ventricle, (x̄, ȳ), is determined as follows:

θ1 = arctan((y2 − ȳ)/(x2 − x̄))
From the reference point, a set of potential "adjacent border points" is established by putting out radial lines from the reference point. Figures 8A and 8B depict the selection of candidate border points in the myocardium. Figure 8A shows a color picture of a heart chamber (colored red in the Figure) surrounded by the dark myocardium. Figure 8B shows an enlarged view of the region in Figure 8A that is bordered by the white box. As depicted in Figure 8B, as the method of the present invention advances, the border is gradually and automatically filled out (depicted as the white solid curve). The last border point selected is depicted as the white circle. From this last border point, the radial lines are sent out to help determine the next border point. A candidate border point is found along each radial line, with the ventricular pixel nearest to the reference point chosen. Radial lines are radiated out over 180 degrees, from θ1 to θ1 + 180 degrees. The cost function is then calculated for each candidate point. If the cost of all points is above a threshold cost, then the angular range of radial lines is increased. The candidate point with the lowest cost is chosen as the adjacent border point and
becomes the reference point as the tracing continues until the border forms a closed loop.
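The radial candidate search and lowest-cost selection described above can be sketched as follows. The ray count, search radius, set-based image representation, and function names are our assumptions; the cost function is passed in rather than fixed, standing in for the cost factors discussed below.

```python
import math

def next_border_point(ref, theta, ventricle, cost_fn, n_rays=32, max_r=40):
    """From the reference point `ref`, cast radial lines over a half-plane
    (theta to theta + 180 degrees) and take, on each ray, the ventricle
    pixel nearest the reference as a candidate; return the candidate
    with the lowest cost. `ventricle` is a set of (x, y) pixel
    coordinates standing in for the binary image."""
    candidates = []
    for i in range(n_rays):
        ang = theta + math.pi * i / (n_rays - 1)
        for r in range(1, max_r):
            x = int(round(ref[0] + r * math.cos(ang)))
            y = int(round(ref[1] + r * math.sin(ang)))
            if (x, y) in ventricle:
                candidates.append((x, y))
                break  # nearest ventricle pixel on this ray
    return min(candidates, key=cost_fn)
```

The sketch omits the widening of the angular range when every candidate's cost exceeds the threshold, which a full implementation would add around the call.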
The cost function may have global and local factors. Global factors, for example, may emphasize a smoothness in the change of the area of the left ventricle over the cardiac cycle. Local factors emphasize regional border characteristics. The cost factors
are independent and weighted as follows:
Cj = Σi wi·cij

where Cj is the total cost associated with candidate pixel j; cij is the cost factor i for candidate pixel j; and wi is the weighting factor for cost factor i. As with the weighting factors mentioned above for the signal-to-noise probability computations, these weights may also be determined by the well known methods of receiver operating characteristics. Individual cost factors may, for example, include the following (which correspond to steps 180, 182, and 184):
1. Contour definition.
The distance between adjacent border points is inversely proportional to how well the contour is defined - large distances between points will make the endocardial border appear jagged. For the candidate point, this cost factor, c1, is given in terms of the distances between successive border points:

c1 = √((xc − xr)² + (yc − yr)²) + √((xr − xR)² + (yr − yR)²)

where (xc, yc) is the candidate point; (xr, yr) is the reference point; and (xR, yR) is the previous reference point.
2. Border Sharpness.
The magnitude of the first derivative, or gradient, of the pixel intensity about the candidate point is a measure of the change from ventricular pixels to myocardial pixels about this point. The magnitude of the gradient, G(p5), may be determined using the Sobel operators, defined as follows:

G(p5)x = (p7 + 2p8 + p9) − (p1 + 2p2 + p3)
G(p5)y = (p3 + 2p6 + p9) − (p1 + 2p4 + p7)
G(p5) = √(G(p5)x² + G(p5)y²)

where p5 is the candidate point and p1 through p9 are the neighboring pixels in a matrix format. The cost factor, c2, for the candidate point is:
c2 = G1/G2 (taken to be large when G2 = 0)

where G1 is the magnitude of the gradient for the reference point and G2 is the magnitude of the gradient for the candidate point.
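The Sobel computation above can be sketched per pixel; the 3x3 neighbourhood is passed row-major as p1 through p9 (indices 0 through 8 here), with p5 the candidate point:

```python
def sobel_gradient(p):
    """Gradient magnitude at the candidate point p5, given its 3x3
    neighbourhood p as a row-major list [p1, ..., p9]."""
    gx = (p[6] + 2 * p[7] + p[8]) - (p[0] + 2 * p[1] + p[2])
    gy = (p[2] + 2 * p[5] + p[8]) - (p[0] + 2 * p[3] + p[6])
    return (gx ** 2 + gy ** 2) ** 0.5
```

A uniform neighbourhood yields a zero gradient, while a step from ventricular to myocardial intensity across the neighbourhood yields a large one.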
3. Contour Regularity.
The angle of the gradient of the pixel intensity about the border should be slowly changing for a smooth contour. The cost factor, c3, for the candidate point is given as:
c3 = 1 + 10·|sin(φc − φ)|

where φr is the angle of the gradient at the reference point; φc is the angle of the gradient at the candidate point; and φ is the angle between a line from the reference point to the center of the ventricle and the candidate point. The angle of the gradient is given by:

φ = arctan(G(p5)y/G(p5)x)
It will be appreciated that although only three cost functions are herein discussed, many other different cost functions may be employed to identify potential border points. Thus, the present invention should not be limited to the use of these particular cost functions. Indeed, the present invention encompasses any cost method that aids in the automatic determination of a border point. Moreover, the present invention encompasses the use of any subcombination of cost functions described herein.
After the endocardial border is fully identified in this manner, a summary image
may be presented. The background of the summary image may consist of an average of the baseline frames. Superimposed upon this background is the border, which may be highlighted in a different color. A possible format to display the border is depicted in Figure 8B as the solid white border line. The border is thus shown as the continuous broad white band that encloses the left ventricle chamber.
There has thus been shown and described a novel system and method for the delineation of a border region of a patient tissue or organ which meets the objects and advantages sought. As stated above, many changes, modifications, variations and other uses and applications of the subject invention will, however, become apparent to those skilled in the art after considering this specification and accompanying drawings which disclose preferred embodiments thereof. All such changes, modifications, variations and other uses and applications which do not depart from the spirit and scope of the invention are deemed to be covered by the invention which is limited only by the claims which follow.

Claims

IN THE CLAIMS:
1. A method for automatically determining the border of a patient's tissue found in an operator-selected region of interest, said border determined from a set of contrast-enhanced, grey scale images, the steps of said method comprising:
A) obtaining a set of grey scale images of the patient's tissue, some of which
contain contrast agent for image enhancement;
B) identifying a region of interest in which the patient tissue is located;
C) from the set of grey scale images collected from step (A), obtaining a baseline intensity value;
D) subtracting the baseline intensity value from the contrast enhanced images;
E) establishing a threshold based on signal-to-noise ratio;
F) establishing a reference point as a first border point in the region of interest;
G) from a set of candidate points adjacent to said reference point established in step (F), automatically selecting which candidate point is most likely to be a border point; and
H) substituting the candidate point selected in step (G) as the new reference point and continuing with step (G) until the entire border is determined.
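To make steps (C) through (E) of claim 1 concrete, the baseline subtraction and signal-to-noise thresholding can be sketched as follows. The threshold factor `k`, the noise estimate, and the 4-neighbour definition of a border pixel are illustrative assumptions; the claim does not prescribe any particular choice.

```python
import numpy as np

def snr_threshold_mask(frame, baseline, k=2.0):
    """Steps (C)-(E) sketch: subtract the per-pixel baseline from a
    contrast-enhanced frame and keep pixels whose enhancement exceeds
    k times the baseline noise (here, the baseline standard deviation)."""
    diff = frame.astype(float) - baseline
    noise = baseline.std()
    return diff > k * max(noise, 1e-6)

def border_of(mask):
    """A border pixel is an in-region pixel with at least one
    4-neighbour outside the region (or off the image edge)."""
    h, w = mask.shape
    border = set()
    for y in range(h):
        for x in range(w):
            if not mask[y, x]:
                continue
            for dy, dx in ((1, 0), (-1, 0), (0, 1), (0, -1)):
                ny, nx = y + dy, x + dx
                if not (0 <= ny < h and 0 <= nx < w) or not mask[ny, nx]:
                    border.add((y, x))
                    break
    return border
```

On a synthetic frame with a uniformly enhanced square region, the mask recovers the region and `border_of` returns its one-pixel-wide outline, from which the walk of steps (F) through (H) would proceed.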
2. The method as recited in claim 1 wherein said patient tissue is the endocardium.
3. The method as recited in claim 2 wherein step (A) further comprises:
(A)(i) selecting a point in the cardiac cycle at which to obtain a set of grey scale images; and
(A)(ii) obtaining a set of grey scale images, some of which contain contrast agent for image enhancement.
4. The method as recited in claim 3 wherein step (A)(ii) further comprises:
(A)(ii)(a) obtaining a set of grey scale images prior to the introduction of contrast agent;
(A)(ii)(b) obtaining a set of grey scale images during the introduction of contrast agent; and
(A)(ii)(c) obtaining a set of grey scale images after the contrast agent has been introduced.
5. The method as recited in claim 1 wherein step (A) further comprises:
(A)(i) obtaining a set of grey scale images of the patient's tissue, some of which contain contrast agent for image enhancement; and
(A)(ii) correcting for motion of the patient's tissue in the set of grey scale images obtained in step (A)(i).
6. The method as recited in claim 1 wherein the identifying step of step (B) is operator-selected.
7. The method as recited in claim 1 wherein the baseline intensity value of step (C) is obtained on a pixel-by-pixel basis.
8. The method as recited in claim 1 wherein the baseline intensity value of step (C) is obtained by an operator selecting baseline image frames by visual inspection.
9. The method as recited in claim 7 wherein the baseline intensity value of step (C) is obtained by an operator selecting baseline image frames from a graph of mean pixel intensity within the region of interest over time.
10. The method as recited in claim 7 wherein the baseline intensity value of step (C) is automatically obtained by performing linear regression analysis on pixel intensity over time.
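Claim 10's per-pixel linear regression can be sketched as below. Treating the regression intercept as the baseline value is one plausible reading (the claim states only that linear regression on pixel intensity over time is used), and the frame stack here is synthetic.

```python
import numpy as np

def baseline_by_regression(frames):
    """Fit intensity(t) = slope * t + intercept independently for every
    pixel across the (pre-contrast) frames, vectorized via polyfit,
    and return the per-pixel intercept as the baseline image."""
    n, h, w = frames.shape
    t = np.arange(n)
    flat = frames.reshape(n, -1)               # shape: (time, pixels)
    slope, intercept = np.polyfit(t, flat, 1)  # one fit per pixel column
    return intercept.reshape(h, w)
```

`np.polyfit` accepts a 2-D `y`, so all pixels are fitted in a single call; for noise-free linear drift the recovered intercept equals the true pre-drift intensity.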
11. The method as recited in claim 2 wherein step (D) further comprises:
(D)(i) subtracting the baseline intensity value from the contrast enhanced images on a pixel-by-pixel basis;
(D)(ii) determining, on a pixel-by-pixel basis, whether a given pixel is in the heart chamber; and
(D)(iii) determining the center of mass of the heart chamber.
12. The method as recited in claim 11 wherein step (F) further comprises:
(F)(i) locating the set of points defined by the maximum and minimum x and y coordinates of the set of points in the heart chamber; and
(F)(ii) picking one of the points located in step (F)(i) as a reference point.
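The chamber landmarks of claims 11 and 12, namely the center of mass of the in-chamber pixels and the extreme (minimum/maximum x and y) points from which a starting reference point is picked, might look like this in outline. The unweighted centroid is an assumption; the claims do not say whether pixel intensities are used as weights.

```python
import numpy as np

def chamber_landmarks(mask):
    """Center of mass of the chamber pixels (claim 11, step (D)(iii))
    and the four extreme points (claim 12, step (F)(i)), any of which
    may serve as the starting reference point for the border walk."""
    ys, xs = np.nonzero(mask)
    centroid = (ys.mean(), xs.mean())
    extremes = [
        (int(ys[np.argmin(xs)]), int(xs.min())),  # leftmost
        (int(ys[np.argmax(xs)]), int(xs.max())),  # rightmost
        (int(ys.min()), int(xs[np.argmin(ys)])),  # topmost
        (int(ys.max()), int(xs[np.argmax(ys)])),  # bottommost
    ]
    return centroid, extremes
```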
13. The method as recited in claim 2 wherein step (G) further comprises:
(G)(i) selecting a set of neighboring points to the reference point;
(G)(ii) calculating a cost function for each of the neighboring points selected in step (G)(i); and
(G)(iii) selecting a new reference point likely to be a border point based on the cost values generated in step (G)(ii).
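Claim 13's cost-driven selection can be sketched generically. The claim leaves the cost function open, so the intensity-similarity cost below is purely illustrative.

```python
def select_next_border_point(ref, neighbors, cost):
    """Steps (G)(i)-(G)(iii): score every neighboring candidate with a
    cost function and take the minimum-cost candidate as the next
    border (reference) point."""
    return min(neighbors, key=lambda p: cost(ref, p))

def intensity_cost(image):
    """Illustrative cost (an assumption, not from the patent): prefer
    the candidate whose intensity is closest to the reference pixel's,
    on the theory that adjacent border pixels share similar enhancement."""
    def cost(ref, cand):
        return abs(image[cand] - image[ref])
    return cost
```

Any cost (intensity gradient, distance from the chamber centroid, smoothness of the emerging contour) can be dropped in through the same `cost` callable without changing the selection loop.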
14. The method of claim 1 wherein end systole and end diastole points are used to determine regional wall motion.
15. The method of claim 1 wherein end systole and end diastole points are used to determine ejection fraction.
16. The method of claim 1 wherein end systole and end diastole points are used to determine fractional shortening.
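Claims 14 through 16 use the end-systole and end-diastole borders to derive standard cardiac indices. The textbook definitions, which the claims assume rather than state, are:

```python
def ejection_fraction(edv, esv):
    """EF = (end-diastolic volume - end-systolic volume) / EDV."""
    return (edv - esv) / edv

def fractional_shortening(edd, esd):
    """FS = (end-diastolic dimension - end-systolic dimension) / EDD."""
    return (edd - esd) / edd
```

The volumes and dimensions would in practice come from the delineated end-diastole and end-systole borders; the numeric inputs here are illustrative.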
17. The method of claim 1 wherein the imaging is performed from a view selected from the group consisting of sagittal, transverse, longitudinal, parasternal short axis, apical long axis, parasternal long axis, suprasternal long axis, subcostal short axis, subcostal four chamber, apical two chamber, and apical four chamber.
18. The method of claim 1 wherein the border delineates the left ventricle.
19. The method of claim 1 wherein the border delineates a venous thrombus.
20. The method of claim 1 wherein the processing is performed in real time.
PCT/US1996/008257 1995-05-31 1996-05-30 Automatic border delineation and dimensioning of regions using contrast enhanced imaging WO1996038815A1 (en)

Priority Applications (3)

Application Number Priority Date Filing Date Title
JP8536746A JPH11506950A (en) 1995-05-31 1996-05-30 Automatic Boundary Drawing and Part Dimensioning Using Contrast-Enhanced Imaging
AU59629/96A AU5962996A (en) 1995-05-31 1996-05-30 Automatic border delineation and dimensioning of regions usi ng contrast enhanced imaging
EP96916909A EP0829068A1 (en) 1995-05-31 1996-05-30 Automatic border delineation and dimensioning of regions using contrast enhanced imaging

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
US45583595A 1995-05-31 1995-05-31
US08/455,835 1995-05-31

Publications (1)

Publication Number Publication Date
WO1996038815A1 true WO1996038815A1 (en) 1996-12-05

Family

ID=23810460


Country Status (5)

Country Link
EP (1) EP0829068A1 (en)
JP (1) JPH11506950A (en)
AU (1) AU5962996A (en)
CA (1) CA2220177A1 (en)
WO (1) WO1996038815A1 (en)

Cited By (10)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
GB2329313A (en) * 1997-07-02 1999-03-17 Yumi Tomaru Selecting region of interest in renal scintigraphy
GB2370441A (en) * 2000-08-21 2002-06-26 Leica Microsystems Automatic identification of a specimen region in a microscope image.
US7376253B2 (en) 2001-02-13 2008-05-20 Koninklijke Philips Electronics N.V. Analysis of successive data sets
US7630529B2 (en) 2000-04-07 2009-12-08 The General Hospital Corporation Methods for digital bowel subtraction and polyp detection
US20100316270A1 (en) * 2007-12-20 2010-12-16 Koninklijke Philips Electronics N.V. 3d reconstruction of a body and of a body contour
US8157742B2 (en) 2010-08-12 2012-04-17 Heartflow, Inc. Method and system for patient-specific modeling of blood flow
US8200466B2 (en) 2008-07-21 2012-06-12 The Board Of Trustees Of The Leland Stanford Junior University Method for tuning patient-specific cardiovascular simulations
US8249815B2 (en) 2010-08-12 2012-08-21 Heartflow, Inc. Method and system for patient-specific modeling of blood flow
US8548778B1 (en) 2012-05-14 2013-10-01 Heartflow, Inc. Method and system for providing information from a patient-specific model of blood flow
US10354050B2 (en) 2009-03-17 2019-07-16 The Board Of Trustees Of Leland Stanford Junior University Image processing method for determining patient-specific cardiovascular information

Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US4802093A (en) * 1985-11-22 1989-01-31 Kabushiki Kaisha Toshiba X-ray image-processing apparatus utilizing grayscale transformation
WO1991019457A1 (en) * 1990-06-12 1991-12-26 University Of Florida Automated method for digital image quantitation
EP0521559A1 (en) * 1991-07-03 1993-01-07 Koninklijke Philips Electronics N.V. Contour extraction in multi-phase, multislice cardiac MRI studies by propagation of seed contours between images


Non-Patent Citations (3)

* Cited by examiner, † Cited by third party
Title
MAES L ET AL: "Automated contour detection of the left ventricle in short axis view and long axis view on 2D echocardiograms", PROCEEDINGS. COMPUTERS IN CARDIOLOGY (CAT. NO.90CH3011-4), CHICAGO, IL, USA, 23-26 SEPT. 1990, ISBN 0-8186-2225-3, 1991, LOS ALAMITOS, CA, USA, IEEE COMPUT. SOC. PRESS, USA, pages 603 - 606, XP000222135 *
SEBASTIANI G ET AL: "Analysis of dynamic magnetic resonance images", IEEE TRANSACTIONS ON MEDICAL IMAGING, JUNE 1996, IEEE, USA, vol. 15, no. 3, ISSN 0278-0062, pages 268 - 277, XP000600099 *
UNSER M ET AL: "Automated extraction of serial myocardial borders from M-mode echocardiograms", IEEE TRANSACTIONS ON MEDICAL IMAGING, MARCH 1989, USA, vol. 8, no. 1, ISSN 0278-0062, pages 96 - 103, XP000117589 *


Also Published As

Publication number Publication date
EP0829068A1 (en) 1998-03-18
AU5962996A (en) 1996-12-18
JPH11506950A (en) 1999-06-22
CA2220177A1 (en) 1996-12-05


Legal Events

Date Code Title Description
AK Designated states: Kind code of ref document: A1; Designated state(s): AL AM AT AU AZ BB BG BR BY CA CH CN CZ DE DK EE ES FI GB GE HU IS JP KE KG KP KR KZ LK LR LS LT LU LV MG MK MN MW MX NO NZ PL PT RO RU SD SE SG SI SK TJ TM TR TT UA UG US UZ VN AM AZ BY KG KZ MD RU TJ TM
AL Designated countries for regional patents: Kind code of ref document: A1; Designated state(s): KE LS MW SD SZ UG AT BE CH DE DK ES FI FR GB GR IE IT LU MC NL PT SE BF BJ CF CG CI CM GA GN ML
DFPE Request for preliminary examination filed prior to expiration of 19th month from priority date (pct application filed before 20040101)
121 Ep: the epo has been informed by wipo that ep was designated in this application
ENP Entry into the national phase: Ref document number: 2220177; Country of ref document: CA; Kind code of ref document: A; Format of ref document f/p: F
ENP Entry into the national phase: Ref country code: JP; Ref document number: 1996 536746; Kind code of ref document: A; Format of ref document f/p: F
WWE Wipo information: entry into national phase: Ref document number: 1996916909; Country of ref document: EP
WWP Wipo information: published in national office: Ref document number: 1996916909; Country of ref document: EP
REG Reference to national code: Ref country code: DE; Ref legal event code: 8642
WWW Wipo information: withdrawn in national office: Ref document number: 1996916909; Country of ref document: EP