WO2005031635A1 - System and method for three-dimensional reconstruction of a tubular organ

Info

Publication number
WO2005031635A1
Authority
WO
WIPO (PCT)
Prior art keywords
vessel
interest
image
points
point
Prior art date
Application number
PCT/US2004/031594
Other languages
French (fr)
Inventor
Moshe Klaiman
Michael Zarkh
Original Assignee
Paieon, Inc.
Priority date
Filing date
Publication date
Application filed by Paieon, Inc. filed Critical Paieon, Inc.
Priority to JP2006528281A priority Critical patent/JP5129480B2/en
Priority to US10/573,464 priority patent/US7742629B2/en
Priority to EP04785100A priority patent/EP1665130A4/en
Publication of WO2005031635A1 publication Critical patent/WO2005031635A1/en
Priority to IL174514A priority patent/IL174514A/en

Classifications

    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06T - IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 7/00 - Image analysis
    • G06T 7/50 - Depth or shape recovery
    • G06T 7/55 - Depth or shape recovery from multiple images
    • G06T 7/564 - Depth or shape recovery from multiple images from contours

Definitions

  • the present invention relates to medical imaging systems, and more specifically to medical imaging systems for use in angiography, for example.
  • a stenosis in a blood vessel refers to narrowing of the artery lumen due to plaque formation on the interior wall of the artery.
  • the severity of the narrowing depends upon the amount of cross-sectional area of the lumen that is occluded by plaque. While narrowing of the arteries may occur in any artery of the body (e.g., carotid arteries), particular concern has been placed on investigating the narrowing of arteries of the heart, the coronary arteries (coronary heart disease), since narrowing of these arteries is one of the primary causes of heart attacks. Accordingly, coronary angiography refers to the process (and associated systems) of investigating coronary arteries to determine the severity of any narrowing (i.e., to find stenotic arteries) that may exist.
  • a catheter is inserted into an artery of the arm or leg of a patient, where it is eventually advanced into the coronary arteries.
  • a radio-opaque substance is injected therein, so that the arteries may be imaged, using, for example, an X-ray angiography system.
  • the system takes "snapshots" (i.e., angiographic cine-runs) of the arteries at several different perspectives, to obtain complete views of the one or more arterial networks being investigated. Also, since narrowing is often asymmetrical about the axis of the artery, it is necessary to obtain at least two images, and preferably more, preferably perpendicular to an artery's axis from different perspectives (preferably orthogonal perspectives) to assess the severity of a stenosis. However, it is generally very difficult to obtain purely perpendicular perspectives of the vessels.
  • determination of the perspective positions is partially arbitrary and partially a process of trial and error (once a stenosis has been observed).
  • the overall number of images that can be obtained is limited by time, safety and cost. Usually four to seven projections for the left coronary arterial system and two to four images for the right arterial system are obtained.
  • An operator of an angiography system assesses the severity of a stenosis in the coronary arteries either on the basis of visual examination of a plurality of images or on the basis of quantitative coronary analysis (QCA).
  • the 2D QCA systems basically implement the following steps: import of a specific image, vessel extraction for this image and then QCA for the vessel of interest.
  • 2D QCA systems usually provide diameter based analysis of the lesion and not densitometry analysis. In some cases, densitometry analysis is provided via the usage of DSA, but not for scenes that include motion, like the coronaries.
  • the 3D QCA methods generally include the following steps: image acquisition, vessel extraction from the 2D projections.
  • the 3D QCA systems additionally include imaging geometry recovery, point-by-point matching (between images) and, of course, 3DR.
  • the QCA of the 3D system generally includes morphology assessment (including vessel foreshortening, overlapping, angulation, tortuosity) and, in some cases, measurements, usually true length and diameter information.
  • cross-section area measurements are rarely addressed, although attempts have been made to achieve a precise representation of cross section profile along the vessels.
  • a method based on some heuristics in the framework of an algebraic reconstruction approach was suggested in U.S. patent no. 6,301,498 to Greenberg.
  • the number of control points needed for geometry recovery depends on the type of transformation that is found and assumptions on unknown parameters. Accordingly, the number of control points can range anywhere from five (5) (see, for example, U.S. patent nos. 6,047,080 and 6,501,848) to eight (8) (see, for example, U.S. patent no. 4,875,165) for perspective transformation.
  • the confident and accurate identification of at least five corresponding points on multiple images is a burdensome procedure, if at all possible, since, for example, the right coronary artery system often lacks adequate branching points.
  • point-by-point matching utilizes (e.g., for multiple images) the epi-polar principle.
  • Epi-polar geometry is premised on the statement that for an imaged 3D point, its projections on a pair of images and two (2) associated focal points belong to a common (epi-polar) plane. Accordingly, for any given point on one image, the search for the corresponding point on another image may be found on the epi-polar line (intersection of the epipolar plane with the image plane).
  • this approach yields sufficient results only if: (i) the imaging geometry model adequately relates the organ and its 2D image, and (ii) the imaged vessel does not change its shape between the image acquisitions. This is why, in clinical practice, the restrictions of the straightforward epi-polar geometry approach are very limiting in terms of accuracy and quality of the 3D model.
  • embodiments of the present invention overcome the drawbacks and problems associated with the prior art systems, and present simple-to-use and straightforward systems and methods for accurately imaging and creating a 3DR of a tubular organ which may be used with conventional X-ray angiography systems. Specifically, some embodiments of the present invention present methods and systems for 3DR of a single vascular structure of interest, using two (and in some embodiments, more than two) 2D X-ray images.
  • some embodiments may include one or more (and, in some embodiments all) of the following: acquisition of cine-runs, projection angulation and ECG information (e.g., via Analog and/or DICOM), system calibration to process images (e.g., catheter calibration), marking of two or more images, edge tracing, with pre and post processing to eliminate potential incorrect distortions of the edge, detection of centerline, densitometry, including background subtraction, point-to-point matching and 3DR, fusion of diameter and densitometry data to obtain precise vessel cross-section area measurements, determination and visualization of healthy vessel proportions (in 2D and/or 3D), and display of data associated with the system, vessel of interest, and other related data.
  • the output of coronary angiography is improved by presenting a three-dimensional reconstruction of, for example, a stenotic vessel, as well as quantitative cross-section information.
  • a three-dimensional reconstruction may be integrated into one display with information about the imaged vessel that is available from angiography. Moreover, the 3D reconstruction as presented by such embodiments may reveal the complete morphology of the vessel, including details that are unseen in the 2D images due to foreshortened and curved segments. In addition, a display of 2D or 3DR of a vessel of interest can be focused on, zoomed and rotated.
  • the tubular organ and vessel of interest may be any one of an artery, a vein, a coronary artery, a carotid artery, a pulmonary artery, a renal artery, a hepatic artery, a femoral artery, a mesenteric artery and the like (e.g., any other tubular organ).
  • a method for three-dimensional reconstruction (3DR) of a single tubular organ using a plurality of two-dimensional images may include one or more of the following steps: displaying a first image of a vascular network, receiving input for identifying on the first image a vessel of interest, tracing the edges of the vessel of interest including eliminating false edges of objects visually adjacent to the vessel of interest and determining substantially precise radius and densitometry values along the vessel.
  • the method may also include one or more of the following steps: displaying at least a second image of the vascular network, receiving input for identifying on the second image the vessel of interest, tracing the edges of the vessel of interest in the second image, including eliminating false edges of objects visually adjacent to the vessel of interest, determining substantially precise radius and densitometry values along the vessel of interest in the second image, determining a three dimensional reconstruction of the vessel of interest and determining fused area measurements along the vessel.
  • This embodiment may also include determining a centerline, which includes a plurality of centerline points.
  • the determination of the fused area may include determining a plurality of healthy diameters (and preferably all healthy diameters) along the vessel of interest to be used as a physical reference, normalizing a majority of the data (and preferably substantially all the data, and most preferably, all the data), e.g., diameters and cross-section values into physical units, using the physical reference, fusing a majority of the data (preferably all or substantially all) into single area measurements and weighting each source of data according to the reliability of the data.
  • the weighting may be computed as a function of the views geometry and/or 3D vessel geometry.
  • the input for identifying the vessel of interest may include three points: a first point to mark the stenosis general location, a second point proximal to the stenosis, and a third point distal to the stenosis.
  • the input may also comprise markers for two (2) points for at least one of the first and second images, where one of the two points is anywhere proximal to the stenosis and the other point is anywhere distal to the stenosis.
  • the markers may also comprise two (2) points for the first image and one (1) point for the second image, where one of the two points is anywhere proximal to the stenosis and the other point is anywhere distal to the stenosis and one point may be an anchor point identified automatically on the first image.
  • a novel embodiment for detecting such bubbles may include defining a region of interest substantially parallel to a primary centerline, detecting at least one cluster of pixel data, adjacent to the vessel of interest, wherein each cluster of pixel data having a predetermined brightness level greater than a brightness level of surrounding pixel data, selecting an arbitrary pixel within each cluster, selecting a second pixel provided on a lane bounding the region of interest for each arbitrary pixel of each cluster, and establishing a barrier line to define an edge for the vessel of interest by connecting a plurality of arbitrary pixels with a corresponding second pixel.
  • Upon tracing each edge of the vessel of interest, the traced edge avoids each barrier line.
  • the elimination of false edges may also include detecting and/or eliminating (e.g., ignoring) one or more "bumps" along the vessel of interest.
  • the elimination of false edges, with regard to bumps, may include establishing a list of suspect points: establishing a plurality of first distances between each of a plurality of originating points on at least one preliminary traced edge and a corresponding closest point positioned along the primary centerline, establishing a plurality of second distances between each of a plurality of second points on the primary centerline and a corresponding closest point positioned on the at least one edge, and determining deviation from the centerline as the absolute difference between the second distance and the first distance.
  • the method may also include determining a gradient cost function, inversely proportional to a gradient magnitude at each edge point, determining a combined function aggregating deviation from the centerline and the gradient cost function, where upon the combined function being greater than a predefined value, the corresponding edge point is determined to be a bump point in a bump.
  • the method may further include determining a bump area defined by a plurality of connected bump points and a cutting line adjacent the vessel of interest, where the cutting line comprises a line which substantially maximizes a ratio between the bump area and a power of a cutting line length, and cutting the bump from the edge at the cutting line to establish a final edge.
  • the centerline of the vessel of interest may be determined using one or more of the following steps: determining final traced edges of the vessel of interest, determining pairs of anchor points, wherein each pair comprises one point on each edge, determining a cross-sectional line by searching for pairs of anchor points which, when connected, establish the cross-sectional line substantially orthogonal to the centerline, dividing each edge into a plurality of segments using the anchor points, where for each segment, correspondence between the edges is established in that every point of each edge includes at least one pair of points on an opposite edge and a total sum of distances between adjacent points is minimal.
  • the method may also include connecting the centers of the plurality of segments to determine the centerline.
  • Densitometry may comprise properly subtracting a background influence.
  • determining densitometry values may include one or more of the following steps: establishing a plurality of profile lines substantially parallel to at least one edge of the vessel of interest, establishing a parametric grid covering the vessel of interest and a neighboring region, where the parametric grid includes a first parameter of the vessel of interest along the length thereof and a second parameter for controlling a cross-wise change of the vessel of interest, and sampling the image using the grid to obtain a plurality of corresponding gray values - the gray values are investigated as functions on the profile lines.
  • the method may also include substantially eliminating detected occluding structures on the outside of the vessel of interest, the structures being detected as prominent minima of the parameters, substantially eliminating prominent minima detected on the inside of the vessel of interest, averaging gray values in a direction across the vessel of interest separately for each side of the vessel of interest, determining a linear background estimation on the grid inside the vessel of interest and determining cross-sectional area using the eliminated prominent minima.
  • Embodiments of the invention may include determining healthy vessel dimensions using an iterative regression over a healthy portion of the vessel of interest.
  • iteration comprises a compromise between a pre-defined slope and a line that follows healthy data.
  • the compromise is toward the line that follows the healthy data if the line corresponds to actual data over a plurality of clusters.
  • the determined healthy dimensions of the vessel of interest may be displayed, either in 2D and/or in 3D.
  • An epi-polar indicator and associated means (e.g., application program/computer instructions for a processor), may be included with various embodiments of the present invention. Accordingly, after receiving input for identifying the vessel of interest in the second image, the epi-polar indicator may be displayed for indicating a concurrence between the first image and second image for producing a "good" three-dimensional reconstruction of the vessel of interest.
  • Data in some embodiments of the present invention, may be cross-referenced among other data.
  • Other embodiments of the present invention are directed to a system for three-dimensional reconstruction (3DR) of a single blood vessel using a plurality of two-dimensional images.
  • a system may include a display for displaying a first image of a vascular network and a second image of a vascular network, and a three-dimensional reconstruction of a vessel, input means for receiving input for identifying a vessel of interest on the first image and for identifying the vessel of interest on the second image, and a processor arranged to operate one or more application programs and/or computer instructions.
  • the computer instructions may include instructions for allowing the processor to perform one or more of the following: tracing the edges of the vessel of interest including eliminating false edges of objects visually adjacent to the vessel of interest, determining substantially precise radius and densitometry values along the vessel, tracing the edges of the vessel of interest in the second image, including eliminating false edges of objects visually adjacent to the vessel of interest, determining substantially precise radius and densitometry values along the vessel of interest in the second image, determining a three dimensional reconstruction of the vessel of interest and determining fused area measurements along the vessel.
  • Other computer instructions may be included for accomplishing any of the foregoing not explicitly included herein.
  • Still other embodiments of the present invention are directed to a system for three-dimensional reconstruction (3DR) of a single blood vessel using a plurality of two-dimensional images.
  • the system may include display means for displaying a first image of a vascular network, and a second image of the vascular network and the 3DR, input means for identifying a vessel of interest on the first image and the second image, tracing means for tracing the edges of the vessel of interest in each image including elimination means for eliminating false edges of objects visually adjacent to the vessel of interest in each image and a processor.
  • the processor may be used for determining a centerline, comprising a plurality of centerline points, determining substantially precise radius and densitometry values along the vessel, determining substantially precise radius and densitometry values along the vessel of interest in the second image, determining a three dimensional reconstruction of the vessel of interest, determining fused area (cross- section) measurements along the vessel and establishing the 3DR of the vessel of interest.
  • Still other embodiments may include a system for three-dimensional reconstruction (3DR) of a single blood vessel using a plurality of two-dimensional images (according to any of the foregoing, for example), which may also include an angiography system comprising a platform for scanning a patient, a C-ARM X-ray system including an X-ray source, a detector, a step motor for moving the C-ARM, and a workstation for performing QCA.
  • the workstation may include display means for displaying a first image of a vascular network, and a second image of the vascular network and the 3DR, input means for identifying a vessel of interest on the first image and the second image, tracing means for tracing the edges of the vessel of interest in each image including elimination means for eliminating false edges of objects visually adjacent to the vessel of interest in each image and a processor. Still other embodiments of the invention are directed to computer readable media
  • Fig. 1 illustrates a schematic of a system and interface to a C-ARM according to an embodiment of the present invention.
  • Fig. 2 illustrates a three (3) point marking of a stenotic vessel.
  • Fig. 3 is an image from a cine-angio run comprising a vascular network.
  • Fig. 4 is the image from Fig. 3 having an incorrect traced edge.
  • Fig. 5 is a schematic of a vessel having a bubble area.
  • Fig. 6 is an image from a cine-angio run which includes references to items used in edge correction.
  • Fig. 7 is an image illustrating a detected bubble of a vessel of interest.
  • Fig. 8 is the image of Fig. 7 illustrating final traced edges.
  • Fig. 9 is an image from a cine-angio run for illustrating bump detection.
  • Fig. 10 is the image of Fig. 9 with an incorrect edge tracing (having a bump).
  • Fig. 11 is the image of Fig. 9 with corrected edges.
  • Fig. 12 is a schematic of a bump detection and elimination process
  • Fig. 13 is a further schematic of the bump detection and elimination process.
  • Fig. 14 is a further schematic of the bump detection and elimination process.
  • Fig. 15 is an image of a vessel of interest depicting centerline definition.
  • Fig. 16 is a schematic illustrating a typical cross section of a vessel.
  • Fig. 17 is an image of a vessel of interest illustrating an approach of computing densitometry of a vessel.
  • Fig. 18 is a schematic illustrating a principle of densitometry according to some embodiments of the present invention.
  • Fig. 19A is an image illustrating profile lines of a vessel of interest for computing densitometry.
  • Fig. 19B is a graph of densitometry values associated with the image of Fig. 19A.
  • Figs. 20A and 20B represent first and second images of a vascular network for illustrating point-to-point matching.
  • Fig. 21 A is an image of a stenotic vessel for illustration of healthy artery computation.
  • Fig. 21B is a graph illustrating the healthy artery computation of the stenotic vessel of Fig. 21A.
  • Fig. 22A is another image of a stenotic vessel for further illustration of healthy artery computation.
  • Fig. 22B is a graph illustrating the healthy artery computation of the stenotic vessel of Fig. 22A.
  • Figs. 23-28 are images of a stenotic vessel of interest, with reference to determining a healthy display of the vessel.
  • Fig. 29 is a screenshot for a 3DR system according to the present invention illustrating a 2D image related display (including healthy artery display in 2D).
  • Fig. 30 is a 3DR of a vessel of interest.
  • Fig. 31 is a screenshot of a 3DR, including display of information associated with the 3DR.
  • Fig. 32 illustrates an example of a pop-list that appears on screenshots of a system according to the present invention (also illustrating a 3DR of a vessel of interest).
  • Fig. 33 is a screenshot for a 3DR system according to the present invention illustrating a calibration technique.
  • Fig. 34 is a screenshot for a 3DR system according to the present invention illustrating graphic data presentation.
  • Fig. 35A is a first image illustrating traced edges of a vessel of interest.
  • Fig. 35B is a second image for the vessel of interest, which includes an epi-polar bar and lines for indicating the applicability of the second image as a good candidate for 3DR, with relationship to the vessel imaged in Fig. 35A.
  • Fig. 36 illustrates a 3D cylinder representation of a tubular organ segment according to some embodiments of the present invention.
  • Fig. 37 illustrates a cross-section area through the segment illustrated in Fig. 36.
  • Fig. 1 illustrates one exemplary system constructed in accordance with some embodiments of the present invention useful for producing either two-dimensional angiographs and/or 3DRs of a patient's vascular system.
  • a system may include a horizontal support such as a table 2 for a patient 3 under examination, and a gantry C-arm 4 which encloses the patient's body.
  • the C-arm supports a radiation source 5 at one side of the patient's body, and a radiation detector 6 at the opposite side and in alignment with the radiation source.
  • the radiation source 5 may be an X-ray point source which produces, for example, a conical X-ray beam.
  • The radiation detector may consist of a CCD camera having a plurality of radiation detector elements.
  • the apparatus may further include a step motor 7 for changing the angular position of the radiation source and radiation detector with respect to the body under examination.
  • the step motor 7 is capable of rotating the radiation source and the radiation detector about the Z-axis, which is the longitudinal axis of the patient's body, and also about the X-axis, which defines, with the Z-axis, the plane of the horizontal body support.
  • the electronics which may be included with the system of Fig. 1 may include an angiography system controller 10 which controls the radiation source and also the step motor to successively produce the exposures of the body from a plurality of different angular positions with respect to the body.
  • the controller may also receive the electronic outputs from the radiation detector elements in the CCD camera.
  • a computer workstation 11 may be included which controls the angiography system controller 10 to produce two-dimensional images 12 of blood vessel projections for any selected play (cine-runs), as well as 3DR images 13.
  • Control is preferably synchronized with cardiac and/or respiratory gating signals produced by an ECG sensor and/or a respiration sensor (not shown), so that images of the blood vessels may be obtained at the same point in a cardiac cycle or respiration cycle.
  • the workstation may include the application programs and/or hardware for enabling the operation of the systems and methods of the embodiments of the invention for 2D and 3DR, as well as the associated QCA. Also, the systems and methods according to embodiments of the present invention may be an add-on component to the above-described configuration for a catheterization room. In some embodiments, another workstation, including hardware and software, may be interfaced to the catheterization room, for receiving cine-runs, and optionally, C-ARM angulation and ECG to process and present 3DR.
  • Two-dimensional (2D) X-ray images of a plurality of cine-angio runs are captured and presented on the monitor substantially in real-time during catheterization of a patient.
  • C-arm angulation data and ECG data may also be acquired.
  • an ECG gating process may be used to present the optimal ("best") image (End Diastolic Frame) from the captured images of each cine-angio run.
  • DICOM is an acronym for "Digital Imaging and Communications in Medicine".
  • catheter calibration may be accomplished by identifying the catheter edges (3310), as shown in Fig. 33. In this way, knowing the size of the catheter, one can determine distances (e.g., pixel to mm transformation) in each image.
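  • For illustration only, the catheter-based calibration can be expressed as a simple scale factor; the function names and the 2.0 mm (6F) catheter diameter below are assumptions, not values taken from the patent:

```python
def mm_per_pixel(catheter_diameter_mm, catheter_width_px):
    """Derive an image scale factor from a catheter of known physical size.

    catheter_diameter_mm: known catheter diameter (e.g., a 6F catheter is
    about 2.0 mm); catheter_width_px: mean pixel distance between the traced
    catheter edges (reference 3310 in Fig. 33).
    """
    return catheter_diameter_mm / catheter_width_px

def to_mm(length_px, scale):
    """Convert any pixel length measured in the same image to millimetres."""
    return length_px * scale

# Example: a 2.0 mm catheter imaged 11.5 pixels wide.
scale = mm_per_pixel(2.0, 11.5)
vessel_diameter_mm = to_mm(17.3, scale)
```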
  • an operator of the system may mark a stenosis of a vessel of interest on either a manually selected (by the operator) or a system-selected (e.g., via ECG gating) image, from at least a first image and a second image, each selected from a separate cine-angio run.
  • the marking includes at least three (3) points, but in other embodiments, fewer than three points may be used (see "Limited Marking 3DR" below).
  • the three (3) points may include (Fig. 2) a first point to mark the stenosis general location, a second point proximal to the stenosis, and a third point distal to the stenosis.
  • a primary centerline may be extracted using known algorithms (such as Dijkstra optimization or a wave propagation method).
  • the only property the primary centerline should possess is that it be a path inside the marked vessel.
  • the user marking points, which can be located outside the vessel due to imprecise pointing by the user, may be automatically checked and moved, if necessary, into the vessel. Accordingly, the tracing algorithm may use these properly located marking points to extract a primary centerline.
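  • As an illustration of the kind of path optimization mentioned above (the patent names Dijkstra optimization or wave propagation without prescribing an implementation), a minimal Dijkstra sketch over the pixel grid might look as follows; the gray-level cost, which keeps the path inside the dark, contrast-filled lumen, and all names are assumptions:

```python
import heapq
import numpy as np

def darkest_path(image, start, goal):
    """Minimum-cost 4-connected path between two pixels, cost = gray level.

    In an angiogram the contrast-filled vessel is darker than the background,
    so the cheapest path tends to stay inside the lumen and can serve as a
    primary centerline between marked points (assumes goal is reachable).
    """
    h, w = image.shape
    dist = np.full((h, w), np.inf)
    prev = {}
    dist[start] = float(image[start])
    heap = [(dist[start], start)]
    while heap:
        d, (r, c) = heapq.heappop(heap)
        if (r, c) == goal:
            break
        if d > dist[r, c]:
            continue
        for dr, dc in ((1, 0), (-1, 0), (0, 1), (0, -1)):
            nr, nc = r + dr, c + dc
            if 0 <= nr < h and 0 <= nc < w:
                nd = d + float(image[nr, nc])
                if nd < dist[nr, nc]:
                    dist[nr, nc] = nd
                    prev[(nr, nc)] = (r, c)
                    heapq.heappush(heap, (nd, (nr, nc)))
    path, node = [], goal
    while node != start:          # walk back from goal to start
        path.append(node)
        node = prev[node]
    path.append(start)
    return path[::-1]
```

A primary centerline through the three marked points could then be obtained by concatenating the paths between consecutive (corrected) marking points.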
  • edge detection (i.e., edge tracing) is then performed.
  • edge tracing may be accomplished via known methods using known algorithms (see, for example, Gradient Field Transform, "A New Approach For The Quantification of Complex Lesion Morphology: The Gradient Field Transform", van der Zwet & Reiber, JACC vol. 24; "Single Source Shortest Path", Introduction To Algorithms, Cormen, Leiserson & Rivest, p. 527; each of which is herein incorporated by reference in its entirety).
  • edge detection in angiography poses many difficulties, which embodiments of the present invention address.
  • Such difficulties relate to a detected edge of a vessel of interest "detouring" off the actual edge of the vessel onto an edge of a visually adjacent vessel (or other feature/object) from the complex vascular structure which may surround the vessel of interest (of which the vessel of interest may be a part) (see Figs. 3-4, illustrating a complex network of vessels and an incorrect edge trace 410).
  • a similar phenomenon is recognized where the end-point of the vessel of interest is marked, near which lies another parallel (or substantially parallel) vessel. Accordingly, prior to detecting the edges of the vessel of interest (using, for example, the above edge detecting methodologies, or modified versions thereof), embodiments of the invention conduct preprocessing to substantially reduce and preferably eliminate such detours from appearing as the final edge.
  • a bubble comprises a relatively bright spot surrounded by darker areas (e.g., another vessel 520) near the vessel of interest 530 and may be detected using a pixel map of the image.
  • one or more bubbles may be detected and substantially eliminated as a problem for edge detection in the following manner.
  • a region of interest, as shown in Fig. 6, for tracing of each edge is defined.
  • It is bounded by a primary centerline 610, a "lane" 620 (which is a line positioned a sufficient distance away from the primary centerline, for example, twice the maximal possible healthy radius from the primary centerline) and two lines (Source 630 and Target 635) closing the gap between the primary centerline and the lane.
  • the region of interest is bounded by four of the above-mentioned lines for one of the two edges.
  • the edge tracing is a process of finding an optimal path connecting the Source and the Target lines, and the path does not leave the region of interest.
  • a bubble, cluster 640 (see also, Fig. 7, cluster 740) in the region of interest is then detected as a bright spot within a darker surrounding area.
  • a line drawn from cluster 640 to the lane defines a preventing path 650.
  • a plurality of such preventing paths are constructed.
  • when edge tracing is performed, no edge line which detours around the bubble is possible - the edge tracing process ignores the bubble (e.g., by being prevented from crossing a preventing path), and thus yields a correct edge.
  • Fig. 8 illustrates final edges (810 and 820) which by-pass the bubble.
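  • A minimal sketch of the "preventing path" idea, assuming the region of interest is given as a pixel mask and the lane as a point list; the brightness test and all names are illustrative, not the patent's exact procedure:

```python
import numpy as np

def preventing_path_mask(image, roi_mask, lane_points, brightness_thresh):
    """Mark a 'preventing path' that a later edge trace may not cross.

    roi_mask: boolean mask of the region between primary centerline and lane.
    lane_points: list of (row, col) pixels on the bounding lane.
    A bright cluster inside the ROI (a 'bubble' of background showing between
    the vessel of interest and a neighbouring vessel) is connected to its
    nearest lane pixel by a straight barrier line.
    """
    forbidden = np.zeros(image.shape, dtype=bool)
    bright = roi_mask & (image > brightness_thresh)
    if not bright.any():
        return forbidden
    # An arbitrary pixel inside the bright cluster.
    rows, cols = np.nonzero(bright)
    r0, c0 = int(rows[0]), int(cols[0])
    # The nearest pixel on the lane.
    lane = np.asarray(lane_points)
    d2 = (lane[:, 0] - r0) ** 2 + (lane[:, 1] - c0) ** 2
    r1, c1 = lane[int(np.argmin(d2))]
    # Rasterise the barrier line between the two pixels.
    n = int(max(abs(r1 - r0), abs(c1 - c0))) + 1
    rr = np.round(np.linspace(r0, r1, n)).astype(int)
    cc = np.round(np.linspace(c0, c1, n)).astype(int)
    forbidden[rr, cc] = True
    return forbidden
```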
  • Another problem exists (see Fig. 9) where a detected edge detours off the edge of the vessel of interest as a result of a side branch on the vessel of interest, yielding a "bump" appearance.
  • a side branch vessel 920 off of the vessel of interest 910 creates an incorrect edge 1010 (Fig. 10).
  • Embodiments of the present invention address this concern preferably after bubble detection and after the primary edges have first been detected, which, with reference to Fig. 11, presents a correct edge (lines 1110 and 1120).
  • a bump may be characterized (Figs. 12-13) by an increase in distance between opposite edges (manifested as increase in distance between edge and centerline), and low gradients on suspected bump edges.
  • the bump process includes two steps: bump detection and bump correction. Accordingly, after a primary edge 1200 (Fig. 12) is found, bumps are sought out. Starting from a point on the edge 1300, the closest point on the primary centerline (or opposite edge or any line substantially parallel to the vessel) is found and a distance between the two is found (arrow 1310). Then, from a point on the centerline, the closest point on the edge is found and a distance between the two is found, which is denoted by arrow 1320. Deviation from the centerline is defined as an absolute difference between distances 1320 and 1310.
  • all edge points are checked as suspected bump points.
  • a combined function is calculated. This function aggregates two components: deviation from centerline and gradient cost function (for example, a condition on the gradient value can be expressed via a gradient cost function which may be inversely proportional to the gradient magnitude).
  • a suspected bump point having a large deviation from the centerline and/or a low gradient may be considered an actual bump point.
  • the combined function, in particular, can be a product of the deviation from the centerline and the gradient cost function.
  • a bump comprises a plurality of bump points.
  • the detected bumps are corrected by "cutting" the bumps from the edge.
  • bump points which may include one or more neighboring edge points
  • an area 1405 of the bump is then determined, using the outer border of the bump 1400 and a cutting line 1410 as the inner border.
  • the appropriate cutting line is finally determined by a line which maximizes the ratio between the bump area and a function of cutting line length, for example, a power of the cutting line length, and which is also the correct edge of the vessel of interest. This "cuts" the bump from the imaging of the vessel and establishes the correct edge of the vessel.
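  • The bump criteria described above might be sketched as follows; the particular combined function (deviation divided by gradient magnitude, i.e., deviation multiplied by a gradient cost) and the default power of the cut length are assumptions:

```python
import numpy as np

def bump_points(edge, centerline, grad_mag, threshold, eps=1e-6):
    """Flag suspected bump points on a traced edge.

    edge, centerline: (N, 2) and (M, 2) point arrays; grad_mag: gradient
    magnitude sampled at each edge point, shape (N,).  The combined score
    aggregates deviation from the centerline with a gradient cost that is
    inversely proportional to the gradient magnitude.
    """
    edge = np.asarray(edge, float)
    centerline = np.asarray(centerline, float)
    grad_mag = np.asarray(grad_mag, float)
    d_edge_to_cl = np.array(
        [np.min(np.linalg.norm(centerline - p, axis=1)) for p in edge])
    d_cl_to_edge = np.array(
        [np.min(np.linalg.norm(edge - q, axis=1)) for q in centerline])
    nearest = np.array(
        [np.argmin(np.linalg.norm(centerline - p, axis=1)) for p in edge])
    # Deviation: absolute difference between the two distances, taken at the
    # nearest centerline point.
    deviation = np.abs(d_edge_to_cl - d_cl_to_edge[nearest])
    combined = deviation / (grad_mag + eps)
    return combined > threshold

def cutting_score(bump_area, cut_length, power=2.0):
    """Score a candidate cutting line: the preferred cut removes a large bump
    area through a short cut (ratio of area to a power of the cut length)."""
    return bump_area / (cut_length ** power)
```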
  • the centerline definition, being an input for determining radius and densitometry values, ultimately determines stenosis measurements, and thus is very important.
  • the centerline is a line passing inside the vessel, between the edges. Every point in the centerline should be equally distant from the edges (i.e., the center). This is referred to as a "problematic step" in the art and several methods are currently in use to determine such centerlines. Accordingly, any of the prior art centerline detection techniques may be used with the present invention. However, some embodiments of the present invention present a novel approach as disclosed below. Accordingly, in one embodiment of the invention, a centerline is detected by seeking out pairs of anchor points (one on each edge) (see Fig. 15, item 1510).
  • anchor points (Pi, Cj) are found according to the following definition: the pair (Pi, Cj) is a base pair if distance (Pi, Cj) is less than the distance (Pi, C), and distance (Pi, Cj) is less than the distance (P, Cj) for all points (P, C) from the edges.
  • the anchor pairs are situated at bottleneck positions between the edges. This results in the cross-sectional line being substantially orthogonal to the resulting centerline at anchor points, which is a natural property of a tubular body.
  • edges are then divided into segments 1520 between the anchor points.
  • correspondence between edges may be established according to the following principles: every point of each edge must have at least one matched point on an opposite edge; and total sum of distances between matched pairs is minimal.
  • the centerline is defined by connecting the centers of the lines connecting each pair. Diameter values along the vessel may simply be the lengths of those lines.
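  • A compact, array-based sketch of the anchor-pair definition and the resulting centerline samples (names chosen for illustration):

```python
import numpy as np

def anchor_pairs(edge_p, edge_c):
    """Base pairs (Pi, Cj): mutually nearest points on opposite edges.

    (Pi, Cj) is a base pair when Cj is the closest C-point to Pi and Pi is
    the closest P-point to Cj, i.e. the pair sits at a local bottleneck.
    """
    edge_p = np.asarray(edge_p, float)
    edge_c = np.asarray(edge_c, float)
    d = np.linalg.norm(edge_p[:, None, :] - edge_c[None, :, :], axis=2)
    nearest_c = d.argmin(axis=1)          # for every Pi, the closest Cj
    nearest_p = d.argmin(axis=0)          # for every Cj, the closest Pi
    return [(i, int(nearest_c[i])) for i in range(len(edge_p))
            if nearest_p[nearest_c[i]] == i]

def centerline_anchors(edge_p, edge_c):
    """Midpoints of the anchor pairs; connecting them (and the per-segment
    midpoints in between) yields the centerline, while the pair distances
    give local diameter values."""
    edge_p = np.asarray(edge_p, float)
    edge_c = np.asarray(edge_c, float)
    return [(edge_p[i] + edge_c[j]) / 2.0
            for i, j in anchor_pairs(edge_p, edge_c)]
```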
  • Densitometry is the task of determining gray-levels along a vessel's cross-section to estimate the cross-section area of a vessel. While diameter measurement is view dependent, the cross-section area is not (being, theoretically, identical under every view).
  • Fig. 16 describes a cross section of the artery. A different diameter is measured (D1 and D2), depending on the view direction, while the cross-section area possesses a property of directional invariance.
  • the classical approach to compute densitometry is to compute the background gray-levels along segments perpendicular to the centerline (for example, black lines 1720, 1730) and to "subtract" those values of the background (e.g., outside vessel boundaries/edges) from the vessel's gray-value. If indeed the perpendicular segments pass through background that is common to the artery (for example, the left segment passes through the catheter), such a method may work. The vessel of interest also "goes over" the catheter, thus, subtracting the catheter gray-level values is justified.
  • one embodiment of the invention presents a novel algorithm to "subtract" the background influence in a vessel.
  • the background analysis is much more global and may account for many things the classic approach cannot.
  • a two-parametric grid covering the vessel and a neighboring region is applied.
  • One parameter controls the change of the vessel along its length and the second parameter controls the change of the vessel cross-wise.
  • the image is then sampled on the grid.
  • Obtained gray values are investigated as functions on the lines parallel to the vessel (lines 1810, Fig. 18).
  • the crossing vessels and other occluding structures are detected as prominent minima of the functions and preferably eliminated.
  • a similar minima elimination is also performed on the grid inside the artery.
  • the values of the grid outside the vessel are averaged in a direction across the vessel on both sides separately, and linear background estimation is calculated on the grid inside the artery.
  • cross-section area is calculated using subtracted background.
  • the continuous line 1910 is the centerline of the vessel of interest.
  • parallel profile lines 1920 are drawn “outside" the vessel.
  • the graph (Fig. 19B) represents gray-level along one such profile line. One can see that the branches, being much darker, are expressed as minima points within this graph. Each of these functions passes the procedure of cutting the downward peaks. As a result of this procedure, the vessels branching from the artery or crossing over the artery are neglected.
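  • The profile-line background handling can be illustrated with the following simplified sketch; the rolling-median "cutting of downward peaks" and the linear interpolation across the lumen are one plausible reading of the text, not the patent's exact procedure:

```python
import numpy as np

def cut_downward_peaks(profile, window=15):
    """Suppress prominent minima (crossing vessels, branches) on a gray-value
    profile running parallel to the vessel of interest."""
    profile = np.asarray(profile, float)
    pad = window // 2
    padded = np.pad(profile, pad, mode='edge')
    local_median = np.array([np.median(padded[i:i + window])
                             for i in range(len(profile))])
    # Downward peaks are raised to the local median level.
    return np.maximum(profile, local_median)

def linear_background(left_mean, right_mean, n_across):
    """Linear estimate of the background under the vessel, interpolated across
    the lumen between the averaged left and right outside profiles.

    left_mean, right_mean: averaged background gray values on each side of
    the vessel, one value per centerline position (shape (L,)).
    Returns an (L, n_across) array of estimated background values.
    """
    left_mean = np.asarray(left_mean, float)
    right_mean = np.asarray(right_mean, float)
    t = np.linspace(0.0, 1.0, n_across)
    return left_mean[:, None] * (1.0 - t) + right_mean[:, None] * t
```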
  • any prior art 3DR method may be used to accomplish 3DR with any of the embodiments of the present invention (e.g., based either on orthographic or perspective imaging geometry model).
  • the following is a method for improved 3DR, according to some embodiments of the present invention, which overcomes problems, for example, such as geometric distortions. Because of the presence of geometric distortions caused by scene changes between acquisitions, neither orthographic nor perspective transformation may be able to determine a substantially exact match between the images. The existence of such distortions and their influences on 3DR results are well known in the art: (i) errors in the 3D centerline reconstruction, and (ii) the fusion of mis-matched data for cross-section estimation.
  • some embodiments of the invention include a method to obtain a substantially exact match between images using a more suitable approach than the prior art methods (see, for example, U.S. patent nos. 4,875,165; 6,047,080; and 6,501,848), using local error corrections. Moreover, embodiments of the present invention automatically find and/or match landmark points between images.
  • the principle underlying obtaining a substantially exact match of points between images is to allow a continuous deviation from the epi-polar constraint in order to minimize discrepancies along the vessel (e.g., branching points or other prominent landmark features).
  • This approach may be used to obtain additional types of landmark points in order to improve reconstruction process.
  • the epi-polar principle prescribes the corresponding points to be at equal distance (epi-polar distance p; see Figs. 20A-20B) to a reference epi-polar line.
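  • For reference, the conventional epi-polar distance can be computed from a fundamental matrix relating the two views; the patent does not mention a fundamental matrix explicitly, so this is only one standard way of expressing the imaging geometry:

```python
import numpy as np

def epipolar_distance(F, x_ref, y):
    """Distance of image-2 point y from the epi-polar line of image-1 point x_ref.

    F: 3x3 fundamental matrix (image-1 -> image-2); x_ref, y: (u, v) pixel
    coordinates.  The conventional epi-polar distance p used for
    point-by-point matching is this point-to-line distance.
    """
    a, b, c = np.asarray(F, float) @ np.array([x_ref[0], x_ref[1], 1.0])
    return abs(a * y[0] + b * y[1] + c) / np.hypot(a, b)
```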
  • Reference points can be marked by an operator in all images or a reference point marked by the operator in one image may then be refined in order for it to be accurately located by a local correlation algorithm (for example) in other images or the reference points can be found automatically in all the images.
  • landmark feature points may be utilized for calculation of improved epi-polar distance on an image: branching points (B); prominent features of diameter function (C1,C2); local extremes of epi-polar distance (D) as a function of centerline point; and points of extreme curvatures (E).
  • a vessel's centerline points are preferably matched according to the match of improved epi-polar distances.
  • a conventional epi-polar distance p is calculated for artery centerline points of the reference image (p1 in Fig. 20A) and for the artery centerline of the second image (p2 in Fig. 20B).
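  • One way to realize a match that tolerates continuous deviation from the strict epi-polar constraint is a dynamic-programming alignment over the epi-polar distances together with the radius invariant; this DTW-style formulation and its weights are assumptions, not the patent's prescribed algorithm:

```python
import numpy as np

def match_centerlines(p1, p2, r1, r2, w_epi=1.0, w_rad=1.0):
    """Monotone point-by-point match between two centerlines.

    p1, p2: epi-polar distances of the centerline points in images 1 and 2.
    r1, r2: corresponding diameter (radius) functions, used as an additional
    invariant so the match may deviate from the strict epi-polar constraint.
    Returns a list of (i, j) index pairs.
    """
    p1, p2 = np.asarray(p1, float), np.asarray(p2, float)
    r1, r2 = np.asarray(r1, float), np.asarray(r2, float)
    n, m = len(p1), len(p2)
    cost = (w_epi * np.abs(p1[:, None] - p2[None, :]) +
            w_rad * np.abs(r1[:, None] - r2[None, :]))
    acc = np.full((n, m), np.inf)
    acc[0, 0] = cost[0, 0]
    for i in range(n):
        for j in range(m):
            if i == 0 and j == 0:
                continue
            best = min(acc[i - 1, j] if i else np.inf,
                       acc[i, j - 1] if j else np.inf,
                       acc[i - 1, j - 1] if i and j else np.inf)
            acc[i, j] = cost[i, j] + best
    # Backtrack from the end to recover the matched pairs.
    i, j, pairs = n - 1, m - 1, []
    while i > 0 or j > 0:
        pairs.append((i, j))
        steps = [(i - 1, j - 1), (i - 1, j), (i, j - 1)]
        i, j = min((s for s in steps if s[0] >= 0 and s[1] >= 0),
                   key=lambda s: acc[s])
    pairs.append((0, 0))
    return pairs[::-1]
```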
  • Embodiments of the present invention obtain a graph of measures: diameter or cross-section area, along the artery.
  • the values of a healthy vessel need to be extrapolated (for example).
  • An iterative "Regression” function is aimed to calculate the regression line of the "healthy" portion of the incoming values.
  • the Iterative Regression function calculates a regression line, which "ignores" extreme values (in most calling cases extremes are stenosis values or aneurysm values).
  • the method is an iterative computation of regression lines, while removing extreme values (which are far from the line, using a function of the standard deviation, for example), until the error (between prediction and line) is less than a predefined error, or the number of points participating in the "creation" of the "regression" line - i.e., points which were not identified as stenosis or aneurysm - is too low (for example, less than between about 5-50% of the total number of points in some embodiments, less than between about 15-30% in other embodiments, and less than about 20% in preferred embodiments).
  • the classic model is further expanded in some embodiments of the present invention in at least the following ways: a default slope is "forced" into the iterative regression (this is motivated by the anatomical fact that vessels are typically tapered); and the algorithm searches for "clusters" of data (it is hypothesized that the use of more separate consistent clusters yields better results than a single long cluster, again based on the anatomical characteristics of the vessel).
  • the algorithm may solve a dilemma on whether to follow the prescribed default slope or to maintain the slope from the previous iteration.
  • the measure of confidence about the slope seen on a previous iteration depends on the distribution of data points supporting the current regression line. If the data points supporting the current regression line are distributed evenly over the argument interval, more weight is given to the calculated slope. In the opposite situation, when the data points supporting the current regression line are clustered as one block, more weight is given to the default slope.
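  • A sketch of such an iterative regression, where the blend between the fitted slope and the default slope is driven by how widely the surviving ("healthy") points are spread along the vessel; the span-based confidence measure and all thresholds are assumptions:

```python
import numpy as np

def healthy_regression(x, y, default_slope, k=2.0, min_frac=0.2, max_iter=50):
    """Iterative regression of the 'healthy' portion of a radius/area graph.

    x: position along the vessel; y: measured radius (or derived diameter).
    Points far from the current line (more than k standard deviations) are
    treated as lesion/aneurysm values and removed.  The fitted slope is
    blended toward the default (tapering) slope when the surviving points
    are clustered in one block rather than spread along the vessel.
    """
    x, y = np.asarray(x, float), np.asarray(y, float)
    keep = np.ones(len(x), dtype=bool)
    slope = default_slope
    intercept = np.median(y) - default_slope * np.median(x)
    for _ in range(max_iter):
        if keep.sum() < max(2, min_frac * len(x)):
            break                              # too few healthy points remain
        fit_slope, _ = np.polyfit(x[keep], y[keep], 1)
        # Confidence in the fitted slope: fraction of the vessel length that
        # the supporting points span (one block of points -> low confidence).
        span = (x[keep].max() - x[keep].min()) / (x.max() - x.min() + 1e-9)
        slope = span * fit_slope + (1.0 - span) * default_slope
        intercept = np.mean(y[keep]) - slope * np.mean(x[keep])
        resid = y - (slope * x + intercept)
        new_keep = np.abs(resid) < k * resid[keep].std()
        if np.array_equal(new_keep, keep):
            break
        keep = new_keep
    return slope, intercept
```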
  • Fig. 21A illustrates an example of a "normal" stenotic vessel, with both proximal 2110 and distal 2120 healthy portions. In a representative iteration of the healthy artery computations, Fig. 21B, there will be two clusters of points: one in the proximal part and one in the distal part (the bulleted points in the figure), for which the values of the radius line (2130) are relatively close to the values of the "regression" line (2140). Since there are two clusters distributed along the vessel, the new line (which strives to get closer to the data) will be accepted, rather than striving to stay closer to the predefined slope (2150).
  • Figs. 22A-22B represent another example.
  • the vessel of interest presents an ostial lesion (or diffuse disease).
  • the vessel has a healthy proximal portion 2210, but is stenotic through all its distal part 2220.
  • This, of course, is manifested in the radius graph of Fig. 22B.
  • the "regression line" 2230 includes one cluster of points, in which the radius value is close to the regression value.
  • the result of the iteration will be closer to the default slope 2240 than to the regression line. It is worth noting that this step of healthy artery computation is described in two contexts: computation and 2D display (see below).
  • the above computation is preferably performed first, and then it serves as an input for the 2D display procedure.
  • the difference between the two steps is that the computation step is generally related to the healthy values, while the second step (of display) may also be related to the "symmetry" of these values versus the lesion (for example, how to locate a value of 5mm healthy "around" a 3mm lesion).
  • Healthy artery display is an excellent tool for image presentation in QCA systems, and helps a physician to analyze a stenotic area (e.g., in terms of symmetry, etc.). Since this information of the healthy vessel is not part of an angiogram, some embodiments of the invention establish such information based on an extrapolation of existing data (preferably lumen edges). Accordingly, Fig. 23 presents an image of a vessel network, Fig. 24 represents the detected edges of the lumen, and Fig. 25 represents the display (which may be extrapolated) of what the vessel would look like if it were healthy.
  • This process includes connecting each edge's end-points to each other by a straight line, producing two lines 2610 and 2620 (Fig. 26).
  • the lines are preferably produced to be distant from each other according to a measure of "healthy radius" (see above). If the vessel lumen is entirely inside these two lines, the computation of the healthy artery is complete, as these lines may then represent the healthy artery. If the vessel lumen is not entirely inside these two lines, the most distant point of any lumen edge 2701, 2705 from those lines is found (point 2710, Fig. 27). This point (and the corresponding point at the second edge) divides each edge into two (see Fig. 28). This process is continued recursively.
  • the recursive procedure starts with the first segment defined as the whole artery, i.e. from is the start threshold of the artery and to is the end threshold of the artery.
  • a segment of the artery limited by two couples of the previously found anchor points is received.
  • Each couple contains two points from different edges. For example, let P and C be edges of the vessel of interest; the two couples of points can be denoted by (Pfrom, Cfrom) and (Pto, Cto).
  • the new point is the point from the segment of interest that maximally deviates from the line connecting the centers of the limiting segments from and to. Accordingly, if the deviation is less than the correspondent healthy radius, then the new point is discarded and recursion branch terminates. If the deviation is greater than the correspondent healthy radius and this healthy radius in turn is greater than the input radius at this point, a new couple of anchor points are found.
  • One point of the new couple is the new point.
  • the second point constituting the new couple is determined via the healthy radius and a point from the opposite edge corresponding to the new one. Namely, the second point constituting the new couple lies at twice the healthy-radius distance on the straight line connecting the new point with its counterpart. If the deviation is greater than the correspondent healthy radius and this healthy radius is less than the input radius at this point (e.g., an aneurysm), then a new couple of anchor points also exists.
  • the points of the new couple lie on the straight line connecting the new point and a point from the opposite edge corresponding to the new one.
  • the distance between the points of the new couple is equal to twice the healthy radius.
  • a result of the termination of recursion is a list of anchor points.
  • the healthy edge is finalized via interpolation between anchor points (for example, spline interpolation). See Fig. 29 showing, at center, two-dimensional, healthy artery display.
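  • A simplified sketch of the recursive anchor-point selection on one edge (the full procedure also places a counterpart anchor on the opposite edge at twice the healthy radius and handles the aneurysm case); the names and the chord-deviation test are illustrative:

```python
import numpy as np

def healthy_anchors(edge, healthy_radius, start, end, anchors):
    """Recursively collect healthy-edge anchor indices on one lumen edge.

    edge: (N, 2) array of lumen-edge points; healthy_radius: per-point
    healthy radius; start, end: index range of the current segment; anchors:
    list accumulating the anchor indices.  A new anchor is added where the
    edge deviates from the chord between the segment end-points by more than
    the healthy radius; otherwise the recursion branch terminates.
    """
    if end - start < 2:
        return
    a, b = edge[start], edge[end]
    chord = b - a
    length = np.linalg.norm(chord) + 1e-9
    rel = edge[start + 1:end] - a
    # Perpendicular distance of every in-between point to the chord.
    dev = np.abs(rel[:, 0] * chord[1] - rel[:, 1] * chord[0]) / length
    k = int(np.argmax(dev)) + start + 1
    if dev[k - start - 1] <= healthy_radius[k]:
        return                        # the segment is already "healthy"
    anchors.append(k)
    healthy_anchors(edge, healthy_radius, start, k, anchors)
    healthy_anchors(edge, healthy_radius, k, end, anchors)
```

The recursion would be seeded with the end-points of the traced edge, and the healthy edge finalized by interpolating (for example, with a spline) through the collected anchor points, as described above.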
  • the healthy 3D artery is defined by 3D healthy centerline and 3D healthy diameters.
  • For the 3D healthy centerline, the known point-by-point matching of 2D centerlines may be utilized, applying it to the healthy 2D centerline points nearest to the available matched pair.
  • the 3D healthy diameters may then be taken as a diameter corresponding to the healthy (reference) diameter.
  • the cross-section area may be a result of the fusion algorithm described below, and the healthy diameter is the (iterative) regression line for sqrt(cross-section area / π).

Fusion
  • Diameter values are view dependent and both diameter and cross-section area values may be corrupted by noise.
  • Embodiments related to this implementation may also be based on assigning "quality" weight tags for every source of information based on the relation between the projection geometry and the artery's 3D geometry.
  • a 2D image participating in a 3D vessel reconstruction supplies 2D centerline, diameter and non-physical area value.
  • 2D centerlines may be linked to the 3D centerline (i.e., every 3D centerline point linked to originating 2D centerline points).
  • For every 3D centerline point there exist references to at least one set of measured 2D diameters and area values (preferably two sets for the at least two images).
  • Diameter graphs and cross section graphs are configured to a common coordinate system (e.g., in mm) using (for example) an adjustment of the found regression lines. More specifically, the healthy line of the average diameter may be used as a reference line. In that regard, substantially all (and preferably all) data may be transformed (radius and densitometry per run) using (for example) the ratio between the data's healthy line and the reference healthy line.
  • RadsNorm = RadAvReg / RadsReg * Rads
  • RadensNorm = RadAvReg / RadensReg * Radens
  • RadsNorm are Normalized Radius values
  • RadensNorm are Normalized Densitometry-derived-Radius values
  • RadAvReg are healthy (regression line) values derived from average radius graph
  • RadsReg are healthy (regression line) values derived from specific radius graph
  • Rads are specific radius graph values
  • RadensReg are healthy (regression line) values derived from specific densitometry- derived-radius graph
  • Radens are specific densitometry-derived-radius graph values
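  • With purely illustrative numbers (not taken from the patent), the normalization reads:

```python
# Healthy (regression-line) values per centerline position.
RadAvReg = [1.50, 1.48, 1.46]   # from the average radius graph (reference line)
RadsReg  = [1.40, 1.39, 1.38]   # from one specific radius graph
Rads     = [1.35, 0.70, 1.30]   # measured radii in that run (lesion in the middle)

# RadsNorm = RadAvReg / RadsReg * Rads, applied per centerline position.
RadsNorm = [av / sp * r for r, sp, av in zip(Rads, RadsReg, RadAvReg)]
```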
  • the fused area may be calculated as a weighted sum of densitometry areas and areas calculated via product of diameters (for example).
  • the weights may be determined locally and may depend on viewing directions and/or local 3D centerline direction.
  • the weight of densitometry area may be maximal if the corresponding view is orthogonal to the centerline direction, while weight of product of diameters may be maximal if both views are orthogonal to the centerline direction and in addition mutually orthogonal.
  • An ellipse area may be used to express the area as product of diameters and a circle area may be used to express the area as the power of the cross-section-derived diameters.
  • the combined (fused) area function may be determined as a weighted sum of Sellipse and Scircle:
  • the weighting coefficients W(ij) and W(k) express a fidelity of every particular measurement Sellipse(i,j) and Scircle(k) of area.
  • the weight W(k) may be the absolute value of the sine of the angle between the artery direction and the line of sight vector; it becomes 1 when the line of sight is orthogonal to the artery and 0 when the line of sight is parallel to the artery.
  • the weight W(ij) expresses a quality of mutual orientation of the two views and the artery and may reach its maximal value when the artery direction and the two line-of-sight vectors build an orthogonal basis, i.e., every two of the vectors are orthogonal. Note, upon two view vectors being orthogonal to each other, the calculation of the cross-section area using the ellipse area formula may be maximally justified. Alternatively, if radii have been acquired from images with close view directions, then the use of the elliptic cross-section formula may be inconsistent.
  • the area value Sellipse(i,j) may reach a maximal fidelity W(ij) = 1. It is worth noting an additional consideration.
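  • A sketch of the fused-area computation consistent with the weighting described above; the exact weight formulas (the absolute sine of the view/artery angle for a single view, the scalar triple product for a view pair) and the normalization by the weight sum are assumptions:

```python
import numpy as np

def fused_area(artery_dir, views, radii, dens_diams):
    """Weighted fusion of cross-section area estimates at one centerline point.

    artery_dir: local 3D centerline direction (unit vector).
    views: list of line-of-sight unit vectors, one per image.
    radii: measured 2D radii per view; dens_diams: densitometry-derived
    diameters per view.
    """
    artery_dir = np.asarray(artery_dir, float)
    views = [np.asarray(v, float) for v in views]
    areas, weights = [], []
    # Densitometry ("circle") estimates: best when a view is orthogonal to the artery.
    for v, d in zip(views, dens_diams):
        w = float(np.linalg.norm(np.cross(artery_dir, v)))   # |sin(angle)|
        areas.append(np.pi * (d / 2.0) ** 2)
        weights.append(w)
    # Diameter-product ("ellipse") estimates from view pairs: best when the
    # artery direction and both lines of sight form an orthogonal basis.
    for i in range(len(views)):
        for j in range(i + 1, len(views)):
            w = abs(float(np.dot(np.cross(views[i], views[j]), artery_dir)))
            areas.append(np.pi * radii[i] * radii[j])
            weights.append(w)
    return float(np.dot(weights, areas) / (np.sum(weights) + 1e-9))
```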
  • While the above embodiments disclose (generally) the use of three (3) marking points per image from at least two different cine-angio runs, other embodiments of the invention may utilize fewer marking points.
  • the operator may simply mark two (2) points per image for two cine-angio runs, or, in other embodiments, the operator may mark two (2) points for a first image from one cine-angio run and one (1) point for one or more additional images (from other cine-angio runs).
  • an operator may mark two (2) points proximal and distal (to a stenosis) on one image of one cine-angio run. This run may be referred to as a "Master” run, and the selected corresponding image to a "Master” image.
  • the system calculates centerline and edges and a "stenosis” point on the artery (this point is not required to be the actual stenosis point, but, rather, a reference point).
  • the operator selects images from two additional runs (the "Slave" runs) and marks the location of the "stenosis" point on the images from the Slave runs. After reception of each stenosis point on the images from the Slave runs, the system performs tracing on the images of the Slave runs and then presents the results of the trace and the 3DR.
  • This reduction of markings may be enabled in embodiments of the present invention using path optimization algorithms including, for example, a generic Dijkstra algorithm or a wave propagation algorithm (WPA) adapted for the image trace.
  • a path is found connecting the source point and target point with minimal cost (e.g., sum of image gray levels to some power).
  • the target set is an epi-polar line, instead of a point, and the result is a path connecting the source point ("stenosis" or anchor point) to the target line.
  • a tree of alternative paths may be produced and the optimal branch path may be chosen. This process is further explained as follows:
  • proximal and distal points of a vessel of interest are input by an operator, and a centerline of the vessel of interest is produced (output).
  • the system traces the edges of the vessel of interest in the Master image and determines the stenosis point (which may be accomplished, for example, by determining the location of the minimum diameter of the vessel of interest).
  • the centerline is split into proximal and distal portions.
  • the images from the Slave runs are traced, separately for the proximal and distal portions, from the marked "stenosis" point to the epi-polar line, for both slave images.
  • additional candidate branches may be added to the main branch.
  • two proximal and two distal candidate trees for the Slave images are obtained.
  • the optimal combination of proximal candidates and the optimal combination of distal candidates comprise three (3) primary centerlines.
  • a 3D match is performed with 3D deviation error attribution.
  • the errors express a quality of the match and are sensitive to scale distortion.
  • Additional criterion of the match quality is a match of 2D centerline directions at corresponding points of the three centerlines. This match criterion may be insensitive to the scale change between the images.
  • the optimal combination is then selected based on aggregate criteria: the combination of deviation errors, the direction match between the centerlines, and an additional preference for combinations utilizing more points from the centerline of the Master image.
  • this group of embodiments of the present invention provides a method and system for three-dimensional reconstruction of a tubular organ from angiographic projections.
  • The second group of embodiments improves the epi-polar geometry approach for 3DR by adding further considerations to the three-dimensional reconstruction process, thus providing accurate correspondences between different projections and an accurate 3D model even in the presence of the mentioned geometrical distortions and the epi-polar ambiguity.
  • The suggested reconstruction method is based on epi-polar geometry enhanced by integrating other considerations into the reconstruction process. These other considerations include, for example, the tubular organ's parameters derived from the image, such as radius and densitometry (gray-level) values along the tubular organ's centerline, and local centerline directions. Other considerations, derived from the tubular organ's characteristics, can also be incorporated.
  • The current group of embodiments provides a method for three-dimensional reconstruction from two two-dimensional angiographic images and a method for three-dimensional reconstruction from three or more two-dimensional angiographic images.
  • Some of the embodiments according to this second group include a method for establishing correspondence between projections of a tubular organ visible in angiographic images, comprising the following considerations:
  • A 3D point could be found by either: (a) "averaging" the 3D points that result from every pair of projection lines, or (b) using three or more projection lines to determine a 3D point, for example a point that minimizes the sum of distances from those lines (a least-squares sketch of option (b) appears below).
  • a direction correspondence criterion is incorporated into the optimization process.
  • the process of finding the correlation could be performed prior to optimization or as part of the optimization.
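  • The following is a minimal sketch of option (b), under the simplifying assumptions that each projective line is represented by a point on the line and a unit direction, and that the sum of squared (rather than plain) distances is minimized, which yields a closed-form linear solution.

```python
import numpy as np

def point_nearest_to_lines(points, directions):
    """points, directions: (N, 3) arrays; each row defines one projective line.
    Returns the 3D point minimizing the sum of squared distances to all lines."""
    A = np.zeros((3, 3))
    b = np.zeros(3)
    for p, d in zip(points, directions):
        d = np.asarray(d, dtype=float)
        d = d / np.linalg.norm(d)
        P = np.eye(3) - np.outer(d, d)    # projector onto the plane orthogonal to d
        A += P
        b += P @ np.asarray(p, dtype=float)
    return np.linalg.solve(A, b)
```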
  • The epi-polar principle defines that, given two 2D projections, every point on the first image defines an epi-polar line on the second image (and vice versa); the 2D point on the second image that corresponds to the 2D point on the first image is restricted to this epi-polar line.
  • Three-dimensional reconstruction using epi-polar geometry could be described as follows:
  • Each 2D point in the first centerline defines an epi-polar line that intersects the centerline of the second image; this intersection point is the 2D point on the second image that corresponds to the 2D point on the first image (a sketch of the epi-polar line computation appears below).
  • each of these 2D points defines a projective line (meaning a line from the source 3D point to this projected 2D point).
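  • As an illustration of the epi-polar principle just described, the following minimal sketch assumes a standard pinhole model with known 3x4 projection matrices P1 and P2 for the two views (an assumption; the patent works from C-arm angulation and calibration data). It builds the fundamental matrix from the two projection matrices and returns the epi-polar line in the second image for a point in the first image.

```python
import numpy as np

def fundamental_from_projections(P1, P2):
    """Fundamental matrix F such that x2^T F x1 = 0 for corresponding points."""
    C1 = np.append(-np.linalg.inv(P1[:, :3]) @ P1[:, 3], 1.0)   # camera-1 centre (homogeneous)
    e2 = P2 @ C1                                                  # epipole in image 2
    e2x = np.array([[0.0, -e2[2], e2[1]],
                    [e2[2], 0.0, -e2[0]],
                    [-e2[1], e2[0], 0.0]])                        # skew-symmetric [e2]x
    return e2x @ P2 @ np.linalg.pinv(P1)

def epipolar_line(F, x1):
    """x1: pixel (u, v) in image 1 -> line coefficients (a, b, c) in image 2,
    i.e. a*u + b*v + c = 0 for every candidate corresponding point."""
    return F @ np.array([x1[0], x1[1], 1.0])
```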
  • The second group of embodiments addresses these insufficiencies by exploiting additional invariants, apart from the epi-polar distance, for obtaining an accurate match between the 2D projections of a tubular organ and an accurate 3D reconstruction.
  • One invariant is the radius function behavior along the projected arteries.
  • A general relation also is established between the projected density of the arteries and epi-polar geometry. The relation allows calculation of values invariant for different projections. The invariance property is utilized for matching tubular organs in different projections. It is known that even in the case where there are no distortions, there exist situations when the epi-polar principle does not supply a unique solution (the epi-polar ambiguity).
  • the new approach according to the second group of embodiments aids in solving the ambiguity in such situations.
  • the relation will be proved under the assumption of local cylinder structure of the tubular organ.
  • Fig. 36 shows a 3D cylinder representation of a tubular organ segment.
  • Let D be the 3D direction of the tubular organ and S be the area of the cross section (Fig. 36) orthogonal to D, with |D| = 1. Let V1 and V2 be the C-Arm directions in which two images of the tubular organ have been taken, with |V1| = 1 and |V2| = 1.
  • The cross-sectional area is equal to S only in the case when the view direction is orthogonal to the tubular organ (V1 orthogonal to D).
  • In general, the cross-sectional area is inversely proportional to the cosine of the angle α between the view direction V and the plane of the orthogonal cross section S. This implies that the apparent cross-sectional area is S/cos(α) (Fig. 37). Here V·D denotes the dot product of the two vectors.
  • Let V12 be the unit vector orthogonal to the two view vectors: V12 = V1 × V2 / |V1 × V2|. The vector V12 is the vector orthogonal to the epi-polar planes of the two images.
  • The measure of the projected orientation of the tubular organ relative to the epi-polar plane E is, by definition, the scalar product of V12 and the tubular organ's direction, V12·D.
  • Theorem: the ratio of the projected area and the visible epi-polar orientation is invariant for every pair of views.
  • The projected directions D1 and D2 can be calculated as directions tangential to the extracted 2D centerlines from the images (see the sketch below).
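  • The following minimal sketch only transcribes the geometric quantities defined in the preceding passage: the apparent (projected) cross-sectional area S/cos(α) for a view direction V, and the unit vector V12 orthogonal to the epi-polar planes of a pair of views. Vectors are assumed to be given in a common 3D frame; this is an illustration of the stated relations, not the proof of the theorem.

```python
import numpy as np

def apparent_area(S, V, D):
    """S: true cross-section area; V: view direction; D: organ direction.
    alpha is the angle between V and the cross-section plane (the plane
    orthogonal to D), so cos(alpha) equals the length of V's component
    lying in that plane."""
    V = V / np.linalg.norm(V)
    D = D / np.linalg.norm(D)
    cos_alpha = np.linalg.norm(V - np.dot(V, D) * D)   # in-plane component of V
    return S / cos_alpha

def epipolar_normal(V1, V2):
    """V12 = V1 x V2 / |V1 x V2|: the unit vector orthogonal to the
    epi-polar planes of the two views."""
    n = np.cross(V1, V2)
    return n / np.linalg.norm(n)
```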
  • The optimization target function F(i,j) combines terms F1, F2, and F3 with weight coefficients C2 and C3, where F1, F2, and F3 are functions with the following properties: F1, F2, F3 are monotonically increasing functions; F1(∞) = ∞; and 0 < F2, F3 < 1.
  • The terms "continuous" and "monotonic" mean that there are three possible increments of the index pair (i, j): (0,1), (1,0), and (1,1).
  • The optimization problem can be solved by a dynamic programming method or a Dijkstra-type algorithm (a dynamic-programming sketch appears below).
  • The first term F1(·) of the target function is a soft epi-polar constraint that penalizes strong deviations from the epi-polar condition; a penalty function of the form F1(d) = (d/T)^2, with T a tolerance, is tolerant of small discrepancies d.
  • The second term F2(·) of the target function encourages similarity of radii along the optimal path.
  • The third term F3(·) expresses the invariance property stated in the theorem above.
  • the optimization's target function F(i,j) is defined over all possible correspondences between the two centerline points; the solution of the optimization problem is a correspondence map between 2D points on one centerline and 2D points on the other centerline.
  • every matched set of 2D points defines a 3D point, for example as a point that minimizes distance from projective lines.
  • the sequence of these 3D points is the three-dimensional reconstruction of the tubular organ.
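  • The following is a minimal sketch of the dynamic-programming solution of the correspondence problem described above: it finds a continuous, monotonic map between the points of two 2D centerlines (index steps restricted to (0,1), (1,0), (1,1)) that minimizes an aggregate target function. `pair_cost(i, j)` is a hypothetical placeholder for the combined epi-polar / radius / densitometry cost.

```python
import numpy as np

def match_centerlines(n1, n2, pair_cost):
    """n1, n2: numbers of points on the two centerlines.
    Returns the optimal list of index pairs (i, j)."""
    INF = np.inf
    total = np.full((n1, n2), INF)
    back = np.zeros((n1, n2, 2), dtype=int)
    total[0, 0] = pair_cost(0, 0)
    for i in range(n1):
        for j in range(n2):
            if i == 0 and j == 0:
                continue
            best, step = INF, (0, 0)
            for di, dj in ((0, 1), (1, 0), (1, 1)):      # allowed index increments
                pi, pj = i - di, j - dj
                if pi >= 0 and pj >= 0 and total[pi, pj] < best:
                    best, step = total[pi, pj], (pi, pj)
            if best < INF:
                total[i, j] = best + pair_cost(i, j)
                back[i, j] = step
    # Trace the optimal correspondence path back from the last pair of points.
    path, ij = [], (n1 - 1, n2 - 1)
    while ij != (0, 0):
        path.append(ij)
        ij = tuple(back[ij])
    path.append((0, 0))
    return path[::-1]
```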
  • a direction correspondence criterion is incorporated into the optimization process, described above as "defining a novel constraint for the process of 3D reconstruction from three or more views”.
  • The difference of epi-polar distances can be described as a one-parameter family of functions depending on a shift along the epi-polar direction.
  • The invention also contemplates a system for imaging a tubular organ, comprising a microprocessor configured to generate a three-dimensional reconstruction of the tubular organ from two or more angiographic images of the tubular organ obtained from different perspectives, using the three-dimensional reconstruction method described above.
  • the invention is particularly applicable to imaging an artery contained in an arterial tree.
  • A three-dimensional reconstruction of a tubular organ, such as an artery, from two projections is available via methods known in the prior art. Usually, this requires some user interaction to identify the organ of interest in the first two views.
  • The third group of embodiments provides a method of an automated update on the basis of one or more additional projections. When a 3D model reconstructed from two projections is available, it is projected onto an additional image plane according to the specific viewing geometry of that additional existing projection.
  • The third group of embodiments may determine this shift by implementing a correlation technique. After the shift is calculated, the organ tracing and analysis in the third image is carried out with the use of the projected model as a first approximation. The newly detected and traced projection of the organ is then used for recalculation of the three-dimensional reconstruction to a better approximation.
  • the three- dimensional reconstruction made from two views is exploited to determine local weights for a refined reconstruction incorporating additional projections.
  • a projection of an organ is most informative for the purpose of three-dimensional reconstruction when the viewing direction is orthogonal to the organ.
  • a pair of projections is more informative when the viewing directions are sufficiently separated.
  • the third group of embodiments proposes to determine local weights of combination of two 2D image sources for refined 3D reconstruction. Local weights are determined according to the angle between the primary 3D model (reconstructed from the first two projections) centerline and view vectors of the projections and angle between the view vectors.
  • the third group of embodiments relates to a novel method and system for an automated three-dimensional reconstruction of an organ from three or more projections.
  • the present group of embodiments provides a method and system that performs an automatic identification of the reconstructed organ in the 2D image of an additional projection, performs an automatic trace and analysis of the organ in the 2D image (in the same manner as was performed on the first and second images), and finally incorporates the new projection into the three-dimensional reconstruction, improving the accuracy of the three-dimensional reconstruction.
  • Such an approach is particularly applicable to imaging an artery contained in an arterial tree.
  • a projection of an organ is most informative for the purpose of three- dimensional reconstruction when the viewing direction is orthogonal to the organ, and substantially different views produce more accurate 3D reconstruction than not substantially different (close) views.
  • some of the embodiments of this third group are directed to a novel method and system for exploiting the three-dimensional reconstruction made from two views to determine local weights for a refined reconstruction incorporating additional projections.
  • the present group of embodiments relates to two aspects of making a better three-dimensional reconstruction of an organ when additional angiographic projections to a first two projections are available.
  • the first aspect refers to an automated procedure of identifying, tracing and incorporating an additional projection into the reconstruction.
  • the second aspect presents a novel method of weighted reconstruction process, where weights express the local optimal combination of projections to reconstruct the 3D model from, as a function of viewing angles.
  • Let A be a 3D model of an organ segment reconstructed from two marked images.
  • A is a generalized cylinder model that consists of a three-dimensional centerline and circular orthogonal cross sections specified by radii. This model could be expressed as A = {(Xi, Yi, Zi, Ri)}, where i is the index for skeleton points along the three-dimensional centerline.
  • Let G be the known geometry of image I.
  • the geometry data G includes angles and rough estimation of magnification factor but does not include C-Arm patient bed shift.
  • Projection of the model A using geometry data G into the image I plane could be done in two ways - binary or realistic.
  • the "realistic” projection will set the gray value of a pixel as a function of the length of intersection between the ray and the model.
  • the “binary” projection will simply set pixels as zeros and ones, where “one” means that there was an intersection between the ray and the model.
  • For finding the shift between the projected image and the angiographic image I, the two should be correlated, using correlation methods known in the literature; correlation could be performed between I and either a "realistic" projected image or a "binary" projected image (a sketch of this projection-and-correlation step appears below).
  • the shift defines a region of interest on image I, and the three-dimensional model projection provides a first approximation of the organ's centerline.
  • The organ's parameters (radii, gray-levels) are then extracted from the new 2D projection.
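  • The following is a minimal sketch of the "binary" projection and correlation step described above, under simplified assumptions: `project` is a hypothetical function mapping 3D model points to pixel coordinates using the known geometry G (angles and an approximate magnification), and the residual C-arm / patient-bed shift is recovered by cross-correlating the binary projection with the inverted angiographic image.

```python
import numpy as np

def binary_projection(model_points, project, shape):
    """Render the projected 3D centerline as a binary mask of size `shape`."""
    mask = np.zeros(shape, dtype=float)
    for p3d in model_points:
        u, v = project(p3d)                       # geometry G applied here
        r, c = int(round(v)), int(round(u))
        if 0 <= r < shape[0] and 0 <= c < shape[1]:
            mask[r, c] = 1.0
    return mask

def find_shift(mask, image):
    """Return the (row, col) shift that best aligns the projected mask with
    the (inverted) image; vessels are dark in X-ray, so the image is inverted
    before the FFT-based circular cross-correlation."""
    target = image.max() - image.astype(float)
    corr = np.fft.ifft2(np.fft.fft2(target) * np.conj(np.fft.fft2(mask))).real
    r, c = np.unravel_index(np.argmax(corr), corr.shape)
    h, w = image.shape
    return (r if r <= h // 2 else r - h, c if c <= w // 2 else c - w)
```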
  • Reconstruction of a 3D line from multiple projections can be posed as an optimization problem whose elementary step is reconstruction of a single point.
  • reconstruction of a single point using multiple projections can be done by means of intersection of projective lines corresponding to the 2D projections; in practice, projective lines do not intersect.
  • One natural definition for a 3D reconstructed point, resulting from intersecting two lines, could be defining the 3D point as the middle of the shortest segment connecting the projective lines.
  • Three-dimensional reconstruction from three or more projections expands the above-mentioned idea and determines a 3D point in a similar way.
  • One example is a direct expansion of taking the 3D point that minimizes distance from (three or more) projective lines.
  • Another method is to take the 3D points that result from all pairs of projections and set the final reconstructed point as a geometrical function of these points.
  • The present group of embodiments suggests a novel approach, in which the results from all pairs of projections are indeed used, but, rather than setting the 3D reconstructed result as a function of only the points, the relationship between the viewing angles and the 3D model is utilized to determine weights for each pair result.
  • Let A be a 3D model of an organ segment reconstructed from two projections with indexes 1 and 2.
  • Let T be a local tangent direction of the 3D model A at the region to which points P1, P2 refer.
  • Let Rij be the middle of the shortest segment between the projective lines Li and Lj.
  • Let Wij = det(Vi, Vj, T), the determinant of the 3-by-3 matrix composed of unit vectors Vi, Vj, and T.
  • The intersection point for each pair of projective lines is Rij, and the quality of the intersection is expressed through the weight Wij (a sketch of this weighted combination appears below).
  • the reconstructed 3D point is defined as a weighted sum of the intersection points per each pair of projective lines.
  • the weights reflect the mutual geometry of two views and local orientation of the primary 3D model in such a way that maximal weight (1) is achieved by the combination of two orthogonal views, which are also both orthogonal to the organ.
  • the weight is near zero in the case when the two views are close to each other or if one of the views is too oblique. It is to be noted that the nature of the weights is local; the same pair of views can maximally contribute at one segment of the organ and minimally contribute at another segment.
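  • The following is a minimal sketch of the weighted combination described above. Each pair of views (i, j) contributes the midpoint Rij of the shortest segment between its projective lines, weighted here by the magnitude of Wij = det(Vi, Vj, T), where T is the local tangent of the primary 3D model; the reconstructed point is the weighted mean. The line representation (point plus unit direction) and the use of |Wij| directly as the weight are illustrative assumptions.

```python
import numpy as np

def closest_midpoint(p1, d1, p2, d2):
    """Middle of the shortest segment between two 3D lines p + t*d
    (assumes the lines are not parallel)."""
    a, b, c = d1 @ d1, d1 @ d2, d2 @ d2
    w = p1 - p2
    denom = a * c - b * b
    t1 = (b * (d2 @ w) - c * (d1 @ w)) / denom
    t2 = (a * (d2 @ w) - b * (d1 @ w)) / denom
    return 0.5 * ((p1 + t1 * d1) + (p2 + t2 * d2))

def weighted_point(lines, views, T):
    """lines: list of (point, unit direction) projective lines, one per view;
    views: list of unit view vectors Vi; T: local 3D model tangent."""
    num, den = np.zeros(3), 0.0
    for i in range(len(lines)):
        for j in range(i + 1, len(lines)):
            w = abs(np.linalg.det(np.column_stack((views[i], views[j], T))))
            R = closest_midpoint(*lines[i], *lines[j])
            num, den = num + w * R, den + w
    return num / den
```

  • Because the weight is the determinant of the two view vectors and the local tangent, it is maximal for mutually orthogonal views that are also orthogonal to the organ, and near zero for close or oblique views, matching the behavior described in the preceding passage.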
  • This third group of embodiments also contemplates a system for imaging a tubular organ, comprising a processor configured to generate a three-dimensional reconstruction of the tubular organ from two or more angiographic images of the tubular organ obtained from different perspectives, using the three-dimensional reconstruction method described above.
  • Multiple data: the 3DR, the image page, the cross-section area graph and lesion analysis measurements (e.g., diameter data, C-ARM position, other reference data) may be displayed simultaneously to deliver the maximum relevant information in an optimal manner (see, for example, Fig. 31).
  • Pop-up menu for various projections: a pop-up list (3210, Fig. 32) is presented with various projections (e.g., 2D projections, ONP and 0,0). Selection of any projection may rotate the 3D model to that projection, enabling the operator to appreciate it in comparison to the 2D images (for example).
  • Color coding of the 3D model and/or graphs and other data can be implemented to denote narrowing severity, angulation, etc. (or a combination of parameters), to draw the physician's attention to problematic segments.
  • Correlated data: a cross-reference of data from a 2D trace of the vessel to the 3D model to graphs; every point can be located simultaneously on all of them. Cues are presented, for example, to enable the operator to investigate the data either specifically or simultaneously.
  • One or more graphs may be presented including a graph for representing the cross-section area (fusion output) data and one for diameter information, or a combined graph.
  • A diameter data graph may be referred to as "eccentricity", as it presents the maximum and minimum diameter values for every point along the vessel.
  • Epi-polar warnings/bars/lines: epi-polar geometry is well-known and extensively documented, and is used for 3DR in the present invention. However, a 3DR is only as good as the images used to prepare it. Accordingly, to determine whether a second image, in combination with the first image, is adequate to aid in 3DR, embodiments of the present invention provide an operator with a visual indicator. As shown in Figs. 35A-35B, once the operator completes marking of a first image (Fig. 35A) and the vessel of interest is traced, as soon as the operator clicks on or about the stenosis on the second image (Fig. 35B), the system presents, on the second image, epi-lines (lines 3510 and 3520) that are in the vicinity of the first image's markings and epi-bar 3530.
  • the bar indicates the conditions for 3DR.
  • the bar is color coded to indicate whether the second image is a good combination with the first image.
  • a "redder” bar would indicate poorer conditions for 3DR.
  • Many of the figures represent screen-shots of a preferred embodiment of the system. Specifically, preferred embodiments are presented of catheter calibration (Fig. 33), display of 2D image related data (Fig. 29), including edge tracing and healthy artery display, display of 3DR results (Fig. 31) of the vessel of interest and the 3D healthy vessel, and quantitative analysis of the vessel of interest (Fig. 34) in the form of graphs and specific measurements, such as percent narrowing (diameter and area), length, plaque volume, minimal lumen diameter and area, reference (healthy) area and diameter measures, eccentricity index and angulation.

Abstract

Embodiments of the present invention include methods and systems for three-dimensional reconstruction of a tubular organ (for example, coronary artery) using a plurality of two-dimensional images. Some of the embodiments may include displaying a first image of a vascular network, receiving input for identifying on the first image a vessel of interest, tracing the edges of the vessel of interest including eliminating false edges of objects visually adjacent to the vessel of interest, determining substantially precise radius and densitometry values along the vessel, displaying at least a second image of the vascular network, receiving input for identifying on the second image the vessel of interest, tracing the edges of the vessel of interest in the second image, including eliminating false edges of objects visually adjacent to the vessel of interest, determining substantially precise radius and densitometry values along the vessel in the second image, determining a three dimensional reconstruction of the vessel of interest and determining fused area (cross-section) measurements along the vessel and computing and presenting quantitative measurements, including, but not limited to, true length, percent narrowing (diameter and area), and the like.

Description

SYSTEM AND METHOD FOR THREE-DIMENSIONAL RECONSTRUCTION OF A TUBULAR ORGAN
CLAIMS TO PRIORITY AND RELATED APPLICATIONS
The present application claims priority under 35 U.S.C. §119(e) of U.S. provisional patent application nos. 60/505,430, filed September 25, 2003, 60/506,178, filed September 29, 2003, and 60/577,981, filed June 7, 2004, each disclosure of which, in its entirety, is incorporated herein by reference.
BACKGROUND OF THE INVENTION
Field Of The Invention
The present invention relates to medical imaging systems, and more specifically to medical imaging systems for use in angiography, for example.
Background Of The Invention
A stenosis in a blood vessel, for example, an artery refers to narrowing of the artery lumen due to plaque formation on the interior wall of the artery. The severity of the narrowing depends upon the amount of cross-sectional area of the lumen that is occluded by plaque. While narrowing of the arteries may occur in any artery of the body (e.g., carotid arteries), particular concern has been placed on investigating the narrowing of arteries of the heart, the coronary arteries (coronary heart disease), since narrowing of these arteries is one of the primary causes of heart attacks. Accordingly, coronary angiography refers to the process (and associated systems) of investigating coronary arteries to determine the severity of any narrowing (i.e., to find stenotic arteries) that may exist.
To image the arteries, a catheter is inserted into an artery of the arm or leg of a patient, where it is eventually advanced into the coronary arteries. Upon arriving at the coronary arteries, a radio-opaque substance is injected therein, so that the arteries may be imaged, using, for example, an X-ray angiography system.
The system takes "snapshots" (i.e., angiographic cine-runs) of the arteries at several different perspectives, to obtain complete views of the one or more arterial networks being investigated. Also, since narrowing is often asymmetrical about the axis of the artery, it is necessary to obtain at least two images, and preferably more, preferably perpendicular to an artery's axis from different perspectives (preferably orthogonal perspectives) to assess the severity of a stenosis. However, it is generally very difficult to obtain purely perpendicular perspectives of the vessels.
Accordingly, determination of the perspective positions is partially arbitrary and partially a process of trial and error (once a stenosis has been observed). However, the overall number of images that can be obtained is limited by time, safety and cost. Usually four to seven projections for the left coronary arterial system and two to four images for the right arterial system are obtained.
An operator of an angiography system assesses the severity of a stenosis in the coronary arteries either on the basis of visual examination of a plurality of images
(projections) or by computer analysis of a single image. As indicated above, since most of the images are, in general, not purely perpendicular to the arterial axis, estimation of stenosis severity is usually not accurate by either means.
Currently, there exist two-dimensional (2D) Quantitative Coronary Angiography (QCA) systems, which create 2D images of vessels for the investigation of stenoses, as well as three-dimensional (3D) QCA methods which also create a 3D reconstruction (3DR) of an entire arterial tree for investigation of stenotic vessels.
The 2D QCA systems basically implement the following steps: import of a specific image, vessel extraction for this image and then QCA for the vessel of interest. 2D QCA systems usually provide diameter based analysis of the lesion and not densitometry analysis. In some cases, densitometry analysis is provided via the usage of DSA, but not for scenes that include motion, like the coronaries.
The 3D QCA methods generally include the following steps: image acquisition and vessel extraction from the 2D projections. The 3D QCA systems additionally include imaging geometry recovery, point-by-point matching (between images) and, of course, 3DR. The QCA of the 3D system generally includes morphology assessment (including vessel foreshortening, overlapping, angulation, tortuosity), and in some cases measurements, usually true length and diameter information. However, cross-section area measurements are rarely addressed, although attempts have been made to achieve a precise representation of the cross section profile along the vessels. A method based on some heuristics in a framework of an algebraic reconstruction approach was suggested in U.S. patent no. 6,301,498 to Greenberg. However, this method requires a special arrangement of at least four (4) acquisitions from different directions orthogonal to the artery. Also, in both 2D and 3D QCA systems and methods, one important aspect of measurements and stenosis severity is the establishment of healthy vessel measurements. Systems and methods that present healthy vessel (or related) measurements use, for example, the interpolation of values based on measured diameters at proximal and at distal portions. This step is critical, since it is a basis for many measurements. At the same time, this step is very sensitive and could easily produce incorrect measurements.
Other problems exist with reference to the methods for the existing 3D imaging systems. For example, with image acquisition, prior art systems utilize either bi-plane acquisition, rotational acquisition, or single projection (image) acquisition, the most general approach (see U.S. patent nos. 6,047,080 and 6,169,917). Although bi-plane acquisition minimizes distortions due to cardiac cycle phase, the technique is insufficient in some situations of epi-polar geometry ambiguity. With regard to rotational acquisition systems, although close in time, these systems do not solve either the cardiac phase problem or the epi-polar geometry ambiguity.
With regard to imaging geometry recovery, the number of control points needed for geometry recovery depends on the type of transformation that is found and assumptions on unknown parameters. Accordingly, the number of control points can range anywhere from five (5) (see, for example, U.S. patent nos. 6,047,080 and 6,501,848) to eight (8) (see, for example, U.S. patent no. 4,875,165) for perspective transformation. However, the confident and accurate identification of at least five corresponding points on multiple images is a burdensome procedure, if at all possible, since, for example, the right coronary artery system often lacks adequate branching points.
Moreover, whether non-linear or linear optimization is used, both solutions suffer from an instability problem. Specifically, the natural candidate points to serve as control points are the branching points in the arterial tree. However, it is very often the case that the precise location of a branching point is difficult to identify due to that area of the arterial tree overlapping another vessel or itself. Moreover, as usual in computational geometry, not every required set of points is useful to produce the transformation. For example, if all the points lie on a common line in an image, the points cannot serve for transformation calculation. Finally, transformation to 3DR from a family of perspective transformations, in general, cannot compensate for local distortions in each image caused by the image being taken at a different phase of either the heart cycle or patient respiration (for example).
There also exist a variety of techniques for vessel extraction in prior art imaging systems from 2D X-ray angiographic images. However, the ability to perform vessel extraction in clinical practice relates to the degree of automation and robustness of a particular process. For example, in U.S. patent no. 6,047,080, an operator must input six (6) marking points to identify six (6) branches of an artery tree in each image, and make several clicks per branch to define an initial centerline of each branch in every image. In addition, in order to stabilize the solution, the operator is recommended to add control points of high curvature and add stenosis points.
When the centerlines representing the entire vascular tree (including various branches) in 2D projections have been extracted, point-by-point matching utilizes (e.g., for multiple images) the epi-polar principle. Epi-polar geometry is premised on the statement that for an imaged 3D point, its projections on a pair of images and two (2) associated focal points belong to a common (epi-polar) plane. Accordingly, for any given point on one image, the search for the corresponding point on another image may be found on the epi-polar line (intersection of the epipolar plane with the image plane). However, this approach yields sufficient results only if: (i) the imaging geometry model adequately relates the organ and its 2D image, and (ii) the imaged vessel does not change its shape between the image acquisitions. This is why, in clinical practice, the restrictions of the straightforward epi-polar geometry approach are very limiting in terms of accuracy and quality of the 3D model.
In view of the above-mentioned shortcomings of the prior art, current 2D QCA systems do not deliver sufficient support for coronary angiography (for example) and current 3D QCA systems are not in clinical use since these systems either deliver incorrect results or are too cumbersome to use.
Thus, there exists a need for a 3DR system which can be used in clinical procedures (e.g., angiography) and that delivers a practical, intuitive, easy-to-use, robust solution to overcome at least one and preferably all of the above-mentioned disadvantages of the prior art systems and methods.
SUMMARY OF THE INVENTION
Accordingly, embodiments of the present invention overcome the drawbacks and problems associated with the prior art systems, and present simple-to-use and straightforward systems and methods for accurately imaging and creating a 3DR of a tubular organ which may be used with conventional X-ray angiography systems. Specifically, some embodiments of the present invention present methods and systems for 3DR of a single vascular structure of interest, using two (and in some embodiments, more than two) 2D X-ray images. Briefly, some embodiments may include one or more (and, in some embodiments, all) of the following: acquisition of cine-runs, projection angulation and ECG information (e.g., via Analog and/or DICOM), system calibration to process images (e.g., catheter calibration), marking of two or more images, edge tracing, with pre- and post-processing to eliminate potential incorrect distortions of the edge, detection of the centerline, densitometry, including background subtraction, point-to-point matching and 3DR, fusion of diameter and densitometry data to obtain precise vessel cross-section area measurements, determination and visualization of healthy vessel proportions (in 2D and/or 3D), and display of data associated with the system, vessel of interest, and other related data. With the present invention, the output of coronary angiography is improved by presenting a three-dimensional reconstruction of, for example, a stenotic vessel, as well as quantitative cross-section information.
In some embodiments, a three-dimensional reconstruction may be integrated into one display with information about the imaged vessel that is available from angiography. Moreover, the 3D reconstruction as presented by such embodiments may reveal the complete morphology of the vessel, including details that are unseen in the 2D images due to foreshortened and curved segments. In addition, a display of 2D or 3DR of a vessel of interest can be focused on, zoomed and rotated.
The tubular organ and vessel of interest may be any one of an artery, a vein, a coronary artery, a carotid artery, a pulmonary artery, a renal artery, a hepatic artery, a femoral artery, a mesenteric artery and the like (e.g., any other tubular organ).
Accordingly, in a first embodiment, a method for three-dimensional reconstruction (3DR) of a single tubular organ using a plurality of two-dimensional images is provided and may include one or more of the following steps: displaying a first image of a vascular network, receiving input for identifying on the first image a vessel of interest, tracing the edges of the vessel of interest including eliminating false edges of objects visually adjacent to the vessel of interest and determining substantially precise radius and densitometry values along the vessel. The method may also include one or more of the following steps: displaying at least a second image of the vascular network, receiving input for identifying on the second image the vessel of interest, tracing the edges of the vessel of interest in the second image, including eliminating false edges of objects visually adjacent to the vessel of interest, determining substantially precise radius and densitometry values along the vessel of interest in the second image, determining a three dimensional reconstruction of the vessel of interest and determining fused area measurements along the vessel. This embodiment may also include determining a centerline, which includes a plurality of centerline points. The determination of the fused area may include determining a plurality of healthy diameters (and preferably all healthy diameters) along the vessel of interest to be used as a physical reference, normalizing a majority of the data (and preferably substantially all the data, and most preferably, all the data), e.g., diameters and cross-section values into physical units, using the physical reference, fusing a majority of the data (preferably all or substantially all) into single area measurements and weighting each source of data according to the reliability of the data. The weighting may be computed as a function of the views geometry and/or 3D vessel geometry.
The input for identifying the vessel of interest may include three points: a first point to mark the stenosis general location, a second point proximal to the stenosis, and a third point distal to the stenosis.
However, the input may also comprise markers for two (2) points for at least one of the first and second images, where one of the two points is anywhere proximal to the stenosis and the other point is anywhere distal to the stenosis. The markers may also comprise two (2) points for the first image and one (1) point for the second image, where one of the two points is anywhere proximal to the stenosis and the other point is anywhere distal to the stenosis, and one point may be an anchor point identified automatically on the first image.
The elimination of false edges may comprise detecting one or more "bubbles" (see description below) adjacent the vessel of interest. A novel embodiment for detecting such bubbles (e.g., false edges) may include defining a region of interest substantially parallel to a primary centerline, detecting at least one cluster of pixel data adjacent to the vessel of interest, wherein each cluster of pixel data has a predetermined brightness level greater than a brightness level of surrounding pixel data, selecting an arbitrary pixel within each cluster, selecting a second pixel provided on a lane bounding the region of interest for each arbitrary pixel of each cluster, and establishing a barrier line to define an edge for the vessel of interest by connecting a plurality of arbitrary pixels with a corresponding second pixel. Upon tracing each edge of the vessel of interest, the traced edge avoids each barrier line. The elimination of false edges may also include detecting and/or eliminating (e.g., ignoring) one or more "bumps" along the vessel of interest. In particular, the elimination of false edges, with regard to bumps, for example, may include establishing a list of suspect points: establishing a plurality of first distances between each of a plurality of originating points on at least one preliminary traced edge and a corresponding closest point positioned along the primary centerline, establishing a plurality of second distances between each of a plurality of second centerline points on the primary centerline and a corresponding closest point positioned on the at least one edge, and determining, as the deviation from the centerline, an absolute difference of the second distance and the first distance. The method may also include determining a gradient cost function, inversely proportional to a gradient magnitude at each edge point, and determining a combined function aggregating the deviation from the centerline and the gradient cost function, where, upon the combined function being greater than a predefined value, the corresponding edge point is determined to be a bump point in a bump. The method may further include determining a bump area defined by a plurality of connected bump points and a cutting line adjacent the vessel of interest, where the cutting line comprises a line which substantially maximizes a ratio between the bump area and a power of a cutting line length, and cutting the bump from the edge at the cutting line to establish a final edge.
The centerline of the vessel of interest may be determined using one or more of the following steps: determining final traced edges of the vessel of interest, determining pairs of anchor points, wherein each pair comprises one point on each edge, determining a cross-sectional line by searching for pairs of anchor points which, when connected, establish the cross-sectional line substantially orthogonal to the centerline, and dividing each edge into a plurality of segments using the anchor points, where for each segment, correspondence between the edges is established in that every point of each edge includes at least one pair of points on an opposite edge and a total sum of distances between adjacent points is minimal. The method may also include connecting the centers of the plurality of segments to determine the centerline.
Densitometry, according to embodiments of the invention, may comprise properly subtracting a background influence. In particular, determining densitometry values may include one or more of the following steps: establishing a plurality of profile lines substantially parallel to at least one edge of the vessel of interest, establishing a parametric grid covering the vessel of interest and a neighboring region, where the parametric grid includes a first parameter of the vessel of interest along the length thereof and a second parameter for controlling a cross-wise change of the vessel of interest, and sampling the image using the grid to obtain a plurality of corresponding gray values - the gray values are investigated as functions on the profile lines. The method may also include substantially eliminating detected occluding structures on the outside of the vessel of interest, the structures being detected as prominent minima of the parameters, substantially eliminating prominent minima detected on the inside of the vessel of interest, averaging gray values in a direction across the vessel of interest separately for each side of the vessel of interest, determining a linear background estimation on the grid inside the vessel of interest and determining cross-sectional area using the eliminated prominent minima.
Embodiments of the invention may include determining healthy vessel dimensions using an iterative regression over a healthy portion of the vessel of interest. In particular, iteration comprises a compromise between a pre-defined slope and a line that follows healthy data. In one embodiment, the compromise is toward the line that follows the healthy data if the line corresponds to actual data over a plurality of clusters. The determined healthy dimensions of the vessel of interest may be displayed, either in 2D and/or in 3D.
Three-dimensional reconstruction of the vessel of interest may include: determining a conventional epi-polar distance p1 for the plurality of centerline points in the first image, determining a conventional epi-polar distance p2 for the plurality of centerline points in the second image, and re-determining p2 substantially in accordance with p2new = p2 + δ, where δ is a smooth compensatory function establishing correspondence of one or more landmark points.
An epi-polar indicator, and associated means (e.g., application program/computer instructions for a processor), may be included with various embodiments of the present invention. Accordingly, after receiving input for identifying the vessel of interest in the second image, the epi-polar indicator may be displayed for indicating a concurrence between the first image and second image for producing a "good" three-dimensional reconstruction of the vessel of interest.
Data, in some embodiments of the present invention, may be cross-referenced among other data. Other embodiments of the present invention are directed to a system for three- dimensional reconstruction (3DR) of a single blood vessel using a plurality of two- dimensional images. Such a system may include a display for displaying a first image of a vascular network and a second image of a vascular network, and a three-dimensional reconstruction of a vessel, input means for receiving input for identifying a vessel of interest on the first image and for identifying the vessel of interest on the second image, and a processor arranged to operate one or more application programs and/or computer instructions. The computer instructions may include instructions for allowing the processor to perform one or more of the following: tracing the edges of the vessel of interest including eliminating false edges of objects visually adjacent to the vessel of interest, determining substantially precise radius and densitometry values along the vessel, tracing the edges of the vessel of interest in the second image, including eliminating false edges of objects visually adjacent to the vessel of interest, determining substantially precise radius and densitometry values along the vessel of interest in the second image, determining a three dimensional reconstruction of the vessel of interest and determining fused area measurements along the vessel. Other computer instructions may be included for accomplishing any of the foregoing not explicitly included herein.
Yet other embodiments of the present invention are directed to a system for three- dimensional reconstruction (3DR) of a single blood vessel using a plurality of two- dimensional images. The system may include display means for displaying a first image of a vascular network, and a second image of the vascular network and the 3DR, input means for identifying a vessel of interest on the first image and the second image, tracing means for tracing the edges of the vessel of interest in each image including elimination means for eliminating false edges of objects visually adjacent to the vessel of interest in each image and a processor. The processor may be used for determining a centerline, comprising a plurality of centerline points, determining substantially precise radius and densitometry values along the vessel, determining substantially precise radius and densitometry values along the vessel of interest in the second image, determining a three dimensional reconstruction of the vessel of interest, determining fused area (cross- section) measurements along the vessel and establishing the 3DR of the vessel of interest.
Other embodiments of the present invention may include a system for three- dimensional reconstruction (3DR) of a single blood vessel using a plurality of two- dimensional images is provided (according to any of the foregoing, for example) and may also include an angiography system comprising a platform for scanning a patient, a C- ARM X-ray system including an x-ray source, a detector, a step motor for moving the C- ARM, and a workstation for doing QCA. The workstation may include display means for displaying a first image of a vascular network, and a second image of the vascular network and the 3DR, input means for identifying a vessel of interest on the first image and the second image, tracing means for tracing the edges of the vessel of interest in each image including elimination means for eliminating false edges of objects visually adjacent to the vessel of interest in each image and a processor. Still other embodiments of the invention are directed to computer readable media
(e.g., floppy discs, hard-drives, CDs, DVDs, smart media and other flash storage), whether permanent or temporary, for storing one or more application programs made up of computer instructions (or just computer instructions) for enabling a computer (e.g., processor, and/or a workstation/network) to perform the methods according to the various embodiments of the present invention.
Any of the embodiments of the invention may also be used with existing angiography systems, or other vessel imaging systems. The relation of the present invention to such systems is readily apparent to one of ordinary skill in the art in view of the present disclosure. Other embodiments, as well as objects and advantages of the present invention, will become clearer with reference to the following detailed description and attached figures as briefly described below.
BRIEF DESCRIPTION OF THE DRAWINGS
Fig. 1 illustrates a schematic of a system and interface to a C-ARM according to an embodiment of the present invention.
Fig. 2 illustrates a three (3) point marking of a stenotic vessel.
Fig. 3 is an image from a cine-angio run comprising a vascular network.
Fig. 4 is the image from Fig. 3 having an incorrect traced edge.
Fig. 5 is a schematic of a vessel having a bubble area.
Fig. 6 is an image from a cine-angio run which includes references to items used in edge correction.
Fig. 7 is an image illustrating a detected bubble of a vessel of interest.
Fig. 8 is the image of Fig. 7 illustrating final traced edges.
Fig. 9 is an image from a cine-angio run for illustrating bump detection.
Fig. 10 is the image of Fig. 9 with an incorrect edge tracing (having a bump).
Fig. 11 is the image of Fig. 9 with corrected edges.
Fig. 12 is a schematic of a bump detection and elimination process.
Fig. 13 is a further schematic of the bump detection and elimination process.
Fig. 14 is a further schematic of the bump detection and elimination process.
Fig. 15 is an image of a vessel of interest depicting centerline definition.
Fig. 16 is a schematic illustrating a typical cross section of a vessel.
Fig. 17 is an image of a vessel of interest illustrating an approach of computing densitometry of a vessel.
Fig. 18 is a schematic illustrating a principle of densitometry according to some embodiments of the present invention.
Fig. 19A is an image illustrating profile lines of a vessel of interest for computing densitometry.
Fig. 19B is a graph of densitometry values associated with the image of Fig. 19A.
Fig. 20A and 20B represent first and second images of a vascular network for illustrating point-to-point matching.
Fig. 21A is an image of a stenotic vessel for illustration of healthy artery computation.
Fig. 21B is a graph illustrating healthy artery computation of the stenotic vessel of Fig. 21A.
Fig. 22A is another image of a stenotic vessel for further illustration of healthy artery computation.
Fig. 22B is a graph illustrating the healthy artery computation of the stenotic vessel of Fig. 22A.
Figs. 23-28 are images of a stenotic vessel of interest, with reference to determining a healthy display of the vessel.
Fig. 29 is a screenshot for a 3DR system according to the present invention illustrating a 2D image related display (including healthy artery display in 2D).
Fig. 30 is a 3DR of a vessel of interest.
Fig. 31 is a screenshot of a 3DR, including display of information associated with the 3DR.
Fig. 32 illustrates an example of a pop-up list that appears on screenshots of a system according to the present invention (also illustrating a 3DR of a vessel of interest).
Fig. 33 is a screenshot for a 3DR system according to the present invention illustrating a calibration technique.
Fig. 34 is a screenshot for a 3DR system according to the present invention illustrating graphic data presentation.
Fig. 35A is a first image illustrating traced edges of a vessel of interest.
Fig. 35B is a second image for the vessel of interest, which includes an epi-polar bar and lines for indicating the applicability of the second image as a good candidate for 3DR, with relationship to the image/vessel imaged in Fig. 35A.
Fig. 36 illustrates a 3D cylinder representation of a tubular organ segment according to some embodiments of the present invention.
Fig. 37 illustrates a cross-section area through the segment illustrated in Fig. 36.
DETAILED DESCRIPTION OF THE EMBODIMENTS
The embodiments of the present invention may be integrated into existing catheterization systems to produce both 2D and 3DR images. Fig. 1, for example, illustrates one exemplary system constructed in accordance with some embodiments of the present invention useful for producing either two-dimensional angiographs and/or 3DRs of a patient's vascular system. Such a system may include a horizontal support such as a table 2 for a patient 3 under examination, and a gantry C-arm 4 which encloses the patient's body. The C-arm supports a radiation source 5 at one side of the patient's body, and a radiation detector 6 at the opposite side and in alignment with the radiation source. The radiation source 5 may be an X-ray point source which produces, for example, a conical X-ray beam. The radiation detector may consist of a CCD camera having a plurality of radiation detector elements. The apparatus may further include a step motor 7 for changing the angular position of the radiation source and radiation detector with respect to the body under examination. In a preferred embodiment of the invention described below, the step motor 7 is capable of rotating the radiation source and the radiation detector about the Z-axis, which is the longitudinal axis of the patient's body, and also about the X-axis, which defines, with the Z-axis, the plane of the horizontal body support.
The electronics which may be included with the system of Fig. 1 may include an angiography system controller 10 which controls the radiation source and also the step motor to successively produce the exposures of the body from a plurality of different angular positions with respect to the body. The controller may also receive the electronic outputs from the radiation detector elements in the CCD camera. A computer work station 11 may be included which controls the angiography system controller 10 to produce two-dimensional images 12 of blood vessel projections at any selected plane (cine-runs), as well as 3DR images 13. Control is preferably synchronized with cardiac and/or respiratory gating signals produced by an ECG sensor and/or a respiration sensor (not shown), so that images of the blood vessels may be obtained at the same point during a cardiac cycle or respiration cycle. The workstation may include the application programs and/or hardware for enabling the operation of the systems and methods of the embodiments of the invention for 2D and 3DR, as well as the associated QCA. Also, the systems and methods according to embodiments of the present invention may be an add-on component to the above-described configuration for a catheterization room. In some embodiments, another workstation, including hardware and software, may be interfaced to the catheterization room, for receiving cine-runs, and optionally, C-ARM angulation and ECG, to process and present 3DR.
FIRST GROUP OF EMBODIMENTS
Image Acquisition
Two-dimensional (2D) X-ray images of a plurality of cine-angio runs are captured and presented on the monitor substantially in real-time during catheterization of a patient. In addition to the images, C-arm angulation data and ECG data may also be acquired. Using the ECG sensor, an ECG gating process may be used to present the optimal ("best") image (End Diastolic Frame) from the captured images of each cine-angio run.
The capturing of cine-runs may be accomplished in analog (using, for example, a frame grabber) or via a standard DICOM connection (preferred). DICOM is an acronym for "Digital Imaging and Communications in Medicine", and is a file format and digital communications protocol that allows medical equipment and software from different manufacturers to communicate with one another, so that medical data can be easily shared.
After image capture, an operator may perform catheter calibration on an image according to known methods. Examples of such known methods can be found in U.S. patent no. 5,042,486 and PCT patent publication WO 94/04938, whose disclosures are incorporated herein by reference. Other calibration devices are described in U.S. Pat. Nos. 3,644,825, 3,868,565, 4,017,858, 4,054,881 and 4,849,692, the disclosures of which are also incorporated herein by reference. Some embodiments of the present invention may utilize an automatic calibration using the DICOM data. In other embodiments, catheter calibration may be accomplished by identifying the catheter edges (3310), as shown in Fig. 33. In this way, knowing the size of the catheter, one can determine distances (e.g., pixel to mm transformation) in each image.
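By way of illustration only, the following is a minimal sketch, assuming the two catheter edges have already been identified as point lists (hypothetical inputs), of deriving the pixel-to-millimetre scale from the known catheter size; the conversion 1 French = 1/3 mm of diameter is the standard catheter convention.

```python
import numpy as np

def mm_per_pixel(edge_a, edge_b, catheter_french):
    """edge_a, edge_b: (N, 2) arrays of pixel points on the two catheter edges;
    catheter_french: catheter size in French units (1 Fr = 1/3 mm diameter)."""
    # mean apparent catheter width in pixels, measured edge-to-edge
    widths = [np.min(np.linalg.norm(edge_b - p, axis=1)) for p in edge_a]
    mean_width_px = float(np.mean(widths))
    catheter_mm = catheter_french / 3.0
    return catheter_mm / mean_width_px

# Example: a 6F catheter (2 mm) imaged ~14 px wide gives ~0.14 mm per pixel.
```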
Vessel of Interest Identification
With the images obtained from the cine-angio runs, and preferably after calibration, an operator of the system may mark a stenosis of a vessel of interest, by either manual selection (by the operator) or system selection (e.g., via ECG gating) of an image, from at least a first image and a second image each selected from a separate cine-angio run. In one embodiment, the marking includes at least three (3) points, but in other embodiments, fewer than three points may be used (see "Limited Marking 3DR" below). The three (3) points may include (Fig. 2): a first point 210 to mark the stenosis general location, a second point 220 proximal to the stenosis, and a third point 230 distal to the stenosis. After an image is marked, edge detection and centerline definition for that image may be established.
Edge Detection (edge tracing)
Initially, a primary centerline may be extracted using known algorithms (such as Dijkstra optimization or a wave propagation method). The only property the primary centerline should possess is that it be a path inside the marked vessel. In this regard, the user marking points, which can be located outside the vessel due to an imprecise user's pointing, may be automatically checked and moved, if necessary, into the vessel. Accordingly, the tracing algorithm may use these properly located marking points to extract a primary centerline.
For each image, the edges of the marked vessel of interest are traced. Edge detection (i.e., edge tracing) may be accomplished via known methods using known algorithms (see, for example, Gradient Field Transform, "A New Approach For The Quantification of Complex Lesion Morphology: The Gradient Field Transform;...", Zweit & Reiber, JACC vol. 24; "Single Source Shortest Path"; Introduction To Algorithms; Cormen, Leiserson & Rivest, p. 527; each of which is herein incorporated by reference in its entirety). However, using these known methods, edge detection in angiography poses many difficulties, which embodiments of the present invention address.
Such difficulties relate to a detected edge of a vessel of interest "detouring" off the actual edge of the vessel onto an edge of a visually adjacent vessel (or other feature/object) from the complex vascular structure which may surround the vessel of interest (of which the vessel of interest may be a part) (see Figs. 3-4, illustrating a complex network of vessels and an incorrect edge trace 410). Moreover, a similar phenomenon is recognized where the end-point of the vessel of interest is marked, near which lies another parallel (or substantially parallel) vessel. Accordingly, prior to detecting the edges of the vessel of interest (using, for example, the above edge detecting methodologies, or modified versions thereof), embodiments of the invention conduct preprocessing to substantially reduce and preferably eliminate such detours from appearing as the final edge.
The phenomenon is shown in Fig. 5 and is addressed by seeking out what is referred to as a "bubble" 510, adjacent the vessel of interest, which results in an incorrect edge 515. A bubble comprises a (relatively) bright spot surrounded by darker areas (e.g., another vessel 520) near the vessel of interest 530 and may be detected using a pixel map of the image. As shown in Figs. 6-8, one or more bubbles may be detected and substantially eliminated as a problem for edge detection in the following manner. A region of interest, as shown in Fig. 6, for tracing of each edge is defined. It is bounded by a primary centerline 610, a "lane" 620 (which is a line positioned a sufficient distance away from the primary centerline, for example, twice the maximal possible healthy radius from the primary centerline) and two lines (Source 630 and Target 635), closing the gap between the primary centerline and the lane. Thus, the region of interest is bounded by four of the above-mentioned lines for one of the two edges. The edge tracing is a process of finding an optimal path connecting the Source and the Target lines, and the path does not leave the region of interest. A bubble, cluster 640 (see also Fig. 7, cluster 740), in the region of interest is then detected as a bright spot within a darker surrounding area. Then, starting with an arbitrary pixel within the bubble, neighboring pixels successively most distant (preferably) from the primary centerline are sought, until a border (lane) is reached. In this manner, a line drawn from cluster 640 to the lane defines a preventing path 650. A plurality of such preventing paths are constructed. Thereafter, when edge tracing is performed, no edge line which bypasses the bubble is possible - the edge tracing process ignores the bubble (e.g., by being prevented from crossing a preventing path), and thus yields a correct edge. Fig. 8 illustrates final edges (810 and 820) which by-pass the bubble.
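The following is a minimal sketch of constructing one such preventing path, under the assumptions that a distance map from every region-of-interest pixel to the primary centerline and a mask of the lane are already available (hypothetical inputs); it walks from a pixel inside the bubble toward the lane, always stepping to the neighbor farthest from the centerline, as described above.

```python
import numpy as np

def preventing_path(start, dist_to_centerline, lane_mask, roi_mask):
    """Walk from `start` (a pixel inside the bubble) to the lane, choosing at
    each step the neighbour most distant from the primary centerline; the
    visited pixels form a barrier the edge trace may not cross."""
    path = [start]
    r, c = start
    h, w = roi_mask.shape
    while not lane_mask[r, c]:
        best, best_d = None, -1.0
        for dr, dc in ((-1, 0), (1, 0), (0, -1), (0, 1),
                       (-1, -1), (-1, 1), (1, -1), (1, 1)):
            nr, nc = r + dr, c + dc
            if (0 <= nr < h and 0 <= nc < w and roi_mask[nr, nc]
                    and (nr, nc) not in path
                    and dist_to_centerline[nr, nc] > best_d):
                best, best_d = (nr, nc), dist_to_centerline[nr, nc]
        if best is None:          # dead end inside the ROI: stop the barrier here
            break
        path.append(best)
        r, c = best
    return path
```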
Another problem exists where a detected edge detours off the edge of the vessel of interest as a result of a side branch on the vessel of interest, yielding a "bump" appearance. As shown in Figs. 9 and 10, a side branch vessel 920 off of the vessel of interest 910 creates an incorrect edge 1010 (Fig. 10). Embodiments of the present invention address this concern, preferably after bubble detection and after the primary edges have first been detected, yielding the correct edge shown in Fig. 11 (lines 1110 and 1120). A bump may be characterized (Figs. 12-13) by an increase in distance between opposite edges (manifested as an increase in distance between edge and centerline), and by low gradients on suspected bump edges. The bump process (in some embodiments) includes two steps: bump detection and bump correction. Accordingly, after a primary edge 1200 (Fig. 12) is found, bumps are sought out. Starting from a point on the edge 1300, the closest point on the primary centerline (or opposite edge, or any line substantially parallel to the vessel) is found and the distance between the two is computed (arrow 1310). Then, from a point on the centerline, the closest point on the edge is found and the distance between the two is computed, denoted by arrow 1320. Deviation from the centerline is defined as the absolute difference between distances 1320 and 1310.
Preferably, all edge points are checked as candidate bump points. For every point on the primary edge, a combined function is calculated. This function aggregates two components: deviation from the centerline and a gradient cost function (for example, a condition on the gradient value can be expressed via a gradient cost function which may be inversely proportional to the gradient magnitude). Specifically, a suspected bump point having a large deviation from the centerline and/or a low gradient may be considered an actual bump point. The combined function, in particular, can be the product of the deviation from the centerline and the gradient cost. A bump comprises a plurality of bump points.
The detected bumps are corrected by "cutting" the bumps from the edge. After the bump points have been determined (which may include one or more neighboring edge points), an area 1405 of the bump is determined, using the outer border of the bump 1400 and a cutting line 1410 as the inner border. The appropriate cutting line is finally determined as the line which maximizes the ratio between the bump area and a function of the cutting line length (for example, a power of the cutting line length), and which is also the correct edge of the vessel of interest. This "cuts" the bump from the imaging of the vessel and establishes the correct edge of the vessel.
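The two bump steps may be sketched as follows; the particular combined score (deviation divided by gradient magnitude) and the shoelace-based bump area are illustrative assumptions consistent with, but not dictated by, the description above.

```python
import numpy as np

def bump_scores(edge_pts, centerline_pts, grad_mag, eps=1e-6):
    """Combined bump score per primary-edge point: deviation from the centerline
    times an inverse-gradient cost, so that a large deviation and/or a weak
    gradient yields a high score (one concrete choice, assumed for this sketch)."""
    scores = []
    for k, p in enumerate(edge_pts):
        d_pc = np.linalg.norm(centerline_pts - p, axis=1)
        c = centerline_pts[int(np.argmin(d_pc))]          # closest centerline point
        d_edge_to_cl = float(np.min(d_pc))                 # distance 1310
        d_cl_to_edge = float(np.min(np.linalg.norm(edge_pts - c, axis=1)))  # distance 1320
        deviation = abs(d_cl_to_edge - d_edge_to_cl)
        scores.append(deviation / (grad_mag[k] + eps))
    return np.array(scores)

def best_cut(bump_pts, power=2.0):
    """Pick the cutting chord that maximizes (bump area) / (cut length ** power)."""
    best, best_ratio = None, -np.inf
    n = len(bump_pts)
    for a in range(n):
        for b in range(a + 2, n):
            poly = bump_pts[a:b + 1]
            # Shoelace area of the region bounded by the bump arc and the chord a-b.
            x, y = poly[:, 0], poly[:, 1]
            area = 0.5 * abs(np.dot(x, np.roll(y, -1)) - np.dot(y, np.roll(x, -1)))
            length = float(np.linalg.norm(bump_pts[b] - bump_pts[a]))
            ratio = area / (length ** power + 1e-9)
            if ratio > best_ratio:
                best, best_ratio = (a, b), ratio
    return best            # indexes of the chord end-points on the primary edge
```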
Centerline Definition
The centerline definition, being an input for determining radius and densitometry values, ultimately determines stenosis measurements, and thus is very important. By definition, the centerline is a line passing inside the vessel, between the edges. Every point on the centerline should be equally distant from the edges (i.e., at the center). This is regarded as a problematic step in the art, and several methods are currently in use to determine such centerlines. Accordingly, any of the prior art centerline detection techniques may be used with the present invention. However, some embodiments of the present invention present a novel approach as disclosed below. Accordingly, in one embodiment of the invention, a centerline is detected by seeking out pairs of anchor points (one on each edge) (see Fig. 15, item 1510). Specifically, if P and C are arrays of edge points (i.e., edge P and edge C), anchor points (Pi, Cj) are found according to the following definition: the pair (Pi, Cj) is a base pair if distance (Pi, Cj) is less than the distance (Pi, C), and distance (Pi, Cj) is less than the distance (P, Cj), for all points (P, C) from the edges. The anchor pairs are situated at bottleneck positions between the edges. This results in the cross-sectional line being substantially orthogonal to the resulting centerline at anchor points, which is a natural property of a tubular body.
The edges are then divided into segments 1520 between the anchor points. For each segment, correspondence between edges may be established according to the following principles: every point of each edge must have at least one matched point on the opposite edge; and the total sum of distances between matched pairs is minimal. Thereafter, the centerline is defined as connecting the centers of the lines connecting each matched pair. Diameter values along the vessel can simply be the lengths of those lines.
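A minimal sketch of the anchor-pair (mutual nearest neighbour) rule and of the centerline/diameter derivation described above; the in-between segment matching is omitted, and the brute-force distance matrix is used only for clarity.

```python
import numpy as np

def anchor_pairs(P, C):
    """Base (anchor) pairs: (Pi, Cj) such that Cj is the closest point of edge C
    to Pi and Pi is the closest point of edge P to Cj, i.e. mutual nearest
    neighbours sitting at bottleneck positions between the edges.
    P and C are (n, 2) and (m, 2) arrays of edge points."""
    d = np.linalg.norm(P[:, None, :] - C[None, :, :], axis=2)   # pairwise distances
    pairs = []
    for i in range(len(P)):
        j = int(np.argmin(d[i]))
        if int(np.argmin(d[:, j])) == i:
            pairs.append((i, j))
    return pairs

def centerline_from_pairs(P, C, pairs):
    """Centerline points as midpoints of the matched cross-sectional segments;
    the segment lengths double as local diameter values."""
    mids = np.array([(P[i] + C[j]) / 2.0 for i, j in pairs])
    diameters = np.array([np.linalg.norm(P[i] - C[j]) for i, j in pairs])
    return mids, diameters
```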
Densitometry and Background Subtraction
Densitometry is the task of determining gray-levels along a vessel's cross-section to estimate the cross-section area of the vessel. While a diameter measurement is view dependent, the cross-section area is not (being, theoretically, identical under every view). Fig. 16 depicts a cross section of the artery. A different diameter is measured (D1 and D2), depending on the view direction, while the cross-section area possesses the property of directional invariance.
The art of computing/determining this area (which is generally a function of gray levels along the cross section) lies in "subtracting" the background influence. There are many prior art methods regarding DSA (Digital Subtraction Angiography), which are very useful for static objects, but are hard to implement for a moving coronary vessel. Other described methods try different approaches to "subtract" the background; these methods are very problematic, since they are very local (see Fig. 17). Specifically, as shown, dashed line 1710 represents a centerline of the vessel of interest. As briefly mentioned above, the classical approach to compute densitometry is to compute the background gray-levels along segments perpendicular to the centerline (for example, black lines 1720, 1730) and to "subtract" those values of the background (e.g., outside vessel boundaries/edges) from the vessel's gray-value. If indeed the perpendicular segments pass through background that is common to the artery (for example, the left segment passes through the catheter), such a method may work. The vessel of interest also "goes over" the catheter; thus, subtracting the catheter gray-level values is justified.
On the other hand, if the right segment 1730 goes through a branching vessel, the gray level values for the vessel of interest along this segment are not influenced by the branching vessel (unlike the previous example of the catheter). Thus, it is erroneous to subtract the "background" (actually the branching vessel's) gray-values from those of the vessel of interest.
Accordingly, one embodiment of the invention presents a novel algorithm to "subtract" the background influence in a vessel. Initially, profile lines 1810 (Fig. 18) along the background are drawn, parallel to the edge. This way, the background analysis is much more global and may account for many things the classic approach cannot.
In order to consistently evaluate the background, a two-parameter grid covering the vessel and the neighboring region is applied. One parameter controls the change along the vessel's length and the second parameter controls the change across the vessel. The image is then sampled on the grid. The obtained gray values are investigated as functions on the lines parallel to the vessel (lines 1810, Fig. 18). Crossing vessels and other occluding structures are detected as prominent minima of the functions and are preferably eliminated. A similar minima elimination is also performed on the grid inside the artery. The values of the grid outside the vessel are averaged in a direction across the vessel, on both sides separately, and a linear background estimation is calculated on the grid inside the artery. Next, the cross-section area is calculated using the subtracted background.
As shown in Fig. 19A, the continuous line 1910 is the centerline of the vessel of interest. As described, parallel profile lines 1920 are drawn "outside" the vessel. The graph (Fig. 19B) represents the gray-level along one such profile line. One can see that the branches, being much darker, are expressed as minima points within this graph. Each of these functions passes through the procedure of cutting the downward peaks. As a result of this procedure, the vessels branching from the artery or crossing over the artery are neglected.
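The profile-line background subtraction may be sketched roughly as follows; the running-median peak cutting and the two-sided averaging are simplifications of the grid-based linear estimation described above, and all parameter names are assumptions.

```python
import numpy as np

def cut_downward_peaks(profile, window=15):
    """Suppress prominent minima (crossing or branching vessels appear as dark dips)
    by clipping each sample to a local running median; the window size is an assumption."""
    out = profile.astype(float).copy()
    n = len(out)
    for k in range(n):
        lo, hi = max(0, k - window), min(n, k + window + 1)
        local = np.median(profile[lo:hi])
        if out[k] < local:
            out[k] = local
    return out

def background_under_vessel(left_profiles, right_profiles):
    """Background gray level under the vessel estimated from profile lines drawn
    parallel to the vessel on each side (rows = profile lines, columns = positions
    along the vessel). Averaging each side and then taking the mean of the two sides
    is a simple stand-in for the linear estimation described above."""
    left = np.mean([cut_downward_peaks(p) for p in left_profiles], axis=0)
    right = np.mean([cut_downward_peaks(p) for p in right_profiles], axis=0)
    return 0.5 * (left + right)

def densitometric_area(cross_gray, background_level):
    """Background-subtracted gray-level sum across one cross-section (the vessel is
    darker than the background), proportional to the cross-section area up to a
    constant factor."""
    return float(np.sum(np.maximum(background_level - np.asarray(cross_gray, float), 0.0)))
```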
Point-To-Point Matching For 3DR
One of skill in the art will appreciate that any prior art 3DR method may be used to accomplish 3DR with any of the embodiments of the present invention (e.g., based on either an orthographic or a perspective imaging geometry model). However, the following is a method for improved 3DR, according to some embodiments of the present invention, which overcomes problems such as geometric distortions. Because of the presence of geometric distortions caused by scene changes between acquisitions, neither an orthographic nor a perspective transformation may be able to determine a substantially exact match between the images. The existence of such distortions and their influences on 3DR results are well known in the art: (i) errors in the 3D centerline reconstruction, and (ii) the fusion of mis-matched data for cross-section estimation.
Accordingly, some embodiments of the invention include a method to obtain a substantially exact match between images using a more suitable approach than the prior art methods (see, for example, U.S. patent nos. 4,875,165; 6,047,080; and 6,501,848), using local error corrections. Moreover, embodiments of the present invention automatically find and/or match landmark points between images.
The principle underlying obtaining a substantially exact match of points between images is to allow a continuous deviation from the epi-polar constraint in order to minimize discrepancies along the vessel (e.g., at branching points or other prominent landmark features). This approach may be used to obtain additional types of landmark points in order to improve the reconstruction process. Specifically, in the framework of an orthographic projection, the epi-polar principle prescribes that corresponding points be at an equal distance (epi-polar distance p; see Figs. 20A-20B) to a reference epi-polar line. Reference points can be marked by an operator in all images; alternatively, a reference point marked by the operator in one image may be refined in order for it to be accurately located (for example, by a local correlation algorithm) in the other images, or the reference points can be found automatically in all the images.
The following types of landmark feature points may be utilized for calculation of improved epi-polar distance on an image: branching points (B); prominent features of diameter function (C1,C2); local extremes of epi-polar distance (D) as a function of centerline point; and points of extreme curvatures (E).
A vessel's centerline points are preferably matched according to the match of improved epi-polar distances. Specifically, a conventional epi-polar distance p is calculated for the artery centerline points of the reference image (p1 in Fig. 20A) and for the artery centerline of the second image (p2 in Fig. 20B). Then the second epi-polar distances p2 are re-calculated in the form p2new = p2 + δ in order to provide equal epi-polar distances at landmark points, where δ may be a smooth compensatory function establishing correspondence of the landmark points. If p1(LM) and p2(LM) are the epi-polar distances of a landmark point, then the compensatory function includes the value δ(LM) = p1(LM) - p2(LM) at this landmark point. See the illustration of the value δ for landmark point E. It is worthwhile to note that the compensatory function δ is calculated per specific vessel. This approach has a straightforward extension for the case of reconstruction from three (3) images. The two compensatory functions for the second and third images, δ2 and δ3, have values δ2(LM) = p1(LM) - p2(LM) and δ3(LM) = p1(LM) - p3(LM) at landmark points.
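A small sketch of the compensatory function δ: given landmark correspondences, δ is realized here as a piecewise-linear interpolation between its landmark values (one possible "smooth" choice, assumed for illustration), and the second image's epi-polar distances are shifted accordingly.

```python
import numpy as np

def compensated_epipolar_distances(p1, p2, landmarks):
    """Re-compute the second image's epi-polar distances so that they agree with the
    first image at landmark points (p2new = p2 + delta).

    p1, p2    -- arrays of epi-polar distances along the two 2D centerlines
    landmarks -- (i, j) index pairs, ordered along the vessel: point p1[i]
                 corresponds to point p2[j]
    The smooth compensatory function delta is realized here as a piecewise-linear
    interpolation between its landmark values, which is only one possible choice."""
    lm_j = np.array([j for _, j in landmarks], dtype=float)
    lm_delta = np.array([p1[i] - p2[j] for i, j in landmarks])
    idx = np.arange(len(p2), dtype=float)
    delta = np.interp(idx, lm_j, lm_delta)   # exact at landmarks, smooth in between
    return p2 + delta
```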
Healthy Artery Computation
Embodiments of the present invention obtain a graph of measures: diameter or cross-section area, along the artery. In order to perform lesion analysis to compute measurements such as percent narrowing, the values of a healthy vessel need to be extrapolated (for example).
An iterative "Regression" function is aimed to calculate the regression line of the "healthy" portion of the incoming values. The Iterative Regression function calculates a regression line, which "ignores" extreme values (in most calling cases extremes are stenosis values or aneurysm values). Thus, the method is an iterative computation of regression lines, while removing extreme values (which are far from the line using a function of the standard deviation, for example), until the error (between predication and line) is less than a predefined error or number of points participating in a "creation" of the "regression" line - i.e., points which were not identified as stenosis or aneurysm - are too low (for example - less than between about 5-50% of the total number of points, in some embodiments, less than between about 15-30% in other embodiments, and less than between about 20% in preferred embodiments).
The classic model is further expanded in some embodiments of the present invention in at least the following ways: a default slope is "forced" into the iterative regression, motivated by the anatomical fact that vessels are typically tapered; and the algorithm searches for "clusters" of data, on the hypothesis that the use of more separate consistent clusters yields better results than a single long cluster (again, based on the anatomical characteristics of the vessel).
Accordingly, on preferably every iteration, the algorithm may resolve a dilemma between following the prescribed default slope and maintaining the slope from the previous iteration. The measure of confidence in the slope seen on a previous iteration depends on the distribution of the data points supporting the current regression line. If the data points supporting the current regression line are distributed evenly over the argument interval, more weight is given to the calculated slope. In the opposite situation, where the data points supporting the current regression line are clustered as one block, more weight is given to the default slope. These improvements are significant relative to the classic method, and provide not only better and more robust results, but also enable a system to consider more complicated cases, such as ostial lesions (which are lesions without a proximal or distal healthy portion of the vessel). For example, Fig. 21A illustrates an example of a "normal" stenotic vessel, with both proximal 2110 and distal 2120 healthy portions. In a representative iteration of the healthy artery computations, Fig. 21B, there will be two clusters of points: one in the proximal part and one in the distal part (the bulleted points in the figure), for which the values of the radius line (2130) are relatively close to the values of the "regression" line (2140). Since there are two clusters distributed along the vessel, the new line (which strives to get closer to the data) will be accepted (rather than striving to stay closer to the predefined slope (2150)).
Figs. 22A-22B represent another example. In this example, however, the vessel of interest presents an ostial lesion (or diffuse disease). As can be seen, the vessel has a healthy proximal portion 2210, but is stenotic through all of its distal part 2220. This, of course, is manifested in the radius graph of Fig. 22B. In this case, the "regression line" 2230 includes one cluster of points in which the radius value is close to the regression value. Thus, in this case, the result of the iteration will be closer to the default slope 2240 than to the regression line. It is worth noting that this step of healthy artery computation is described in two contexts: computation and 2D display (see below). Accordingly, the above computation is preferably performed first, and then it serves as an input for the 2D display procedure. The difference between the two steps is that the computation step is generally related to the healthy values, while the second step (of display) may also be related to the "symmetry" of these values versus the lesion (for example, how to locate a value of 5mm healthy "around" a 3mm lesion).
Two Dimensional Healthy Artery Display - Figs. 23-29
Healthy artery display is an excellent tool for image presentation in QCA systems, and helps a physician to analyze a stenotic area (e.g., in terms of symmetry, etc.). Since this information about the healthy vessel is not part of an angiogram, some embodiments of the invention establish such information based on an extrapolation of existing data (preferably lumen edges). Accordingly, Fig. 23 presents an image of a vessel network, Fig. 24 represents the detected edges of the lumen, and Fig. 25 represents the display (which may be extrapolated) of what the vessel would look like if it were healthy.
This process, according to one embodiment, includes connecting each edge's end-points to each other by a straight line, producing two lines 2610 and 2620 (Fig. 26). The lines are preferably placed at a distance from each other determined using the measure of "healthy radius" (see above). If the vessel lumen is entirely inside these two lines, the computation of the healthy artery is complete, as these lines may then represent the healthy artery. If the vessel lumen is not entirely inside these two lines, the point of either lumen edge 2701, 2705 most distant from those lines is found (point 2710, Fig. 27). This point (and the corresponding point at the second edge) divides each edge into two (see Fig. 28). This process is continued recursively.
The recursive procedure starts with the first segment defined as the whole artery, i.e., "from" is the start threshold of the artery and "to" is the end threshold of the artery. At each step of the recursion, a segment of the artery limited by two couples of the previously found anchor points is received. Each couple contains two points from different edges. For example, let P and C be the edges of the vessel of interest; the two couples of points can be denoted by (Pfrom, Cfrom) and (Pto, Cto). If, in the current step, a new point couple (Pnew, Cnew) is found between (Pfrom, Cfrom) and (Pto, Cto), then the procedure is called recursively twice with the two artery segments: (Pfrom, Cfrom), (Pnew, Cnew) and (Pnew, Cnew), (Pto, Cto). If no new couple is found, that branch of the recursion terminates.
The new point is the point of the segment of interest that maximally deviates from the line connecting the centers of the limiting segments "from" and "to". Accordingly, if the deviation is less than the corresponding healthy radius, then the new point is discarded and the recursion branch terminates. If the deviation is greater than the corresponding healthy radius, and this healthy radius in turn is greater than the input radius at this point, a new couple of anchor points is found.
One point of the new couple is the new point. The second point constituting the new couple is determined via the healthy radius and a point from the opposite edge corresponding to the new one. Namely, the second point constituting the new couple lies at twice the healthy-radius distance on the straight line connecting the new point with its counterpart. If the deviation is greater than the corresponding healthy radius and this healthy radius is less than the input radius at this point (e.g., an aneurysm), then a new couple of anchor points also exists. The points of the new couple lie on the straight line connecting the new point and a point from the opposite edge corresponding to the new one. The distance between the points of the new couple, as in the previous case, is equal to twice the healthy radius. But, contrary to the previous case, the points of the new couple are located symmetrically relative to the corresponding centerline point. The result upon termination of the recursion is a list of anchor points. The healthy edge is finalized via interpolation between anchor points (for example, spline interpolation). See Fig. 29 showing, at center, a two-dimensional healthy artery display.
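A compact sketch of the recursion, under simplifying assumptions (a one-to-one index correspondence between the two edges, scanning only edge P for the maximally deviating point, and omitting the symmetric aneurysm case):

```python
import numpy as np

def healthy_anchors(P, C, healthy_radius, i_from, i_to, anchors):
    """Recursively collect anchor-point couples for the healthy-edge display.

    P, C           -- (n, 2) arrays of corresponding lumen edge points
    healthy_radius -- healthy radius value per index
    i_from, i_to   -- index range of the current artery segment
    anchors        -- list of (index, P_point, C_point) couples collected so far"""
    if i_to - i_from < 2:
        return
    # Straight line connecting the centers of the two limiting segments.
    c0 = (P[i_from] + C[i_from]) / 2.0
    c1 = (P[i_to] + C[i_to]) / 2.0
    direction = (c1 - c0) / (np.linalg.norm(c1 - c0) + 1e-9)
    normal = np.array([-direction[1], direction[0]])
    # Edge point of the segment that deviates the most from that line.
    dev = np.abs((P[i_from:i_to + 1] - c0) @ normal)
    k = int(np.argmax(dev)) + i_from
    if k == i_from or k == i_to:
        return
    if dev[k - i_from] <= healthy_radius[k]:
        return                                   # segment already inside the healthy band
    # New couple: the new point plus its counterpart at twice the healthy radius.
    u = (C[k] - P[k]) / (np.linalg.norm(C[k] - P[k]) + 1e-9)
    anchors.append((k, P[k], P[k] + 2.0 * healthy_radius[k] * u))
    healthy_anchors(P, C, healthy_radius, i_from, k, anchors)
    healthy_anchors(P, C, healthy_radius, k, i_to, anchors)
```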
Three-Dimensional Healthy Artery Display
The same notion of a healthy vessel used for 2D is applied for 3D. As shown in Figs. 30-32, transparent area 3010 visualizes an approximation of the healthy vessel. Similarly to the 3D vessel reconstruction, the healthy 3D artery is defined by a 3D healthy centerline and 3D healthy diameters. For the 3D healthy centerline calculation, the known point-by-point matching of 2D centerlines may be utilized, applying it to the healthy 2D centerline points nearest to the available matched pair. The 3D healthy diameters may then be taken as a diameter corresponding to the healthy (reference) diameter. The cross section area may be a result of the fusion algorithm described below, and the healthy diameter is the (iterative) regression line for sqrt(cross section area/π).
Fusion
At this point, diameter measurements and cross-section area measurements have been obtained along the artery from various (at least 2) projections. Diameter values are view dependent and both diameter and cross-section area values may be corrupted by noise. Thus, it is preferable to combine all the data (diameter and area values) for better computation of the cross section area. Embodiments related to this implementation may also be based on assigning "quality" weight tags for every source of information based on the relation between the projection geometry and the artery's 3D geometry.
Accordingly, a 2D image participating in a 3D vessel reconstruction supplies a 2D centerline, diameter and non-physical area value. After the 3D centerline reconstruction, 2D centerlines may be linked to the 3D centerline (i.e., every 3D centerline point is linked to the originating 2D centerline points). In other words, for every 3D centerline point there exist references to at least one set of measured 2D diameters and area values (preferably two sets for the at least two images). The fusion process may comprise the following steps: the area (cross-section\densitometry) values may be corrected according to a local angle between the view vector and the 3D centerline direction, yielding measurements of orthogonal cross section areas; the healthy diameter is calculated by applying an iterative regression algorithm to an average diameter function, yielding a reference physical measure (which is possible only from diameter values, which are in mm); the average diameter may be used to average errors (which, in some embodiments, minimizes errors as well); and healthy regression lines for the square root of area are calculated. Since densitometry is an area measurement up to a constant factor, the function Radens = sqrt(Densitometry) (a densitometry-derived radius) may be calculated in order to be comparable with Radius.
Diameter graphs and cross section graphs are configured to a common coordinate system (e.g., in mm) using (for example) an adjustment of the found regression lines. More specifically, the healthy line of the average diameter may be used as a reference line. In that regard, substantially all (and preferably all) data may be transformed (radius and densitometry per run) using (for example) the ratio between the data's healthy line and the reference healthy line.
RadsNorm = RadAvReg / RadsReg * Rads, RadensNorm = RadAvReg / RadensReg * Radens,
Where: RadsNorm are Normalized Radius values,
RadensNorm are Normalized Densitometry-derived-Radius values,
RadAvReg are healthy (regression line) values derived from average radius graph,
RadsReg are healthy (regression line) values derived from specific radius graph,
Rads are specific radius graph values,
RadensReg are healthy (regression line) values derived from specific densitometry-derived-radius graph,
Radens are specific densitometry-derived-radius graph values.
The fused area may be calculated as a weighted sum of densitometry areas and areas calculated via products of diameters (for example). The weights may be determined locally and may depend on the viewing directions and/or the local 3D centerline direction. The weight of the densitometry area may be maximal if the corresponding view is orthogonal to the centerline direction, while the weight of the product of diameters may be maximal if both views are orthogonal to the centerline direction and, in addition, mutually orthogonal. An ellipse area may be used to express the area as a product of diameters, and a circle area may be used to express the area as the square of the densitometry-derived radius.
Sellipse(i,j) = pi * RadsNorm(i) * RadsNorm(j), i ≠ j, i,j = 1,2, ..., NumberOfViews; Scircle(k) = pi * RadensNorm(k)^2, k = 1,2, ..., NumberOfViews. Some embodiments of the above described fusion approach utilize the assumption that the circular cross section in the healthy part of a vessel is represented by the regression lines (of diameters and square roots of cross sections). On the other hand, in the stenosis region, the lumen cross section may be highly eccentric and the use of densitometry may be capable of improving the area estimation. Incorporation of the densitometry area in such a situation may improve the cross section estimation.
The combined (fused) area function may be determined as a weighted sum of Sellipse and Scircle:
Sfused = (SUMij(W(i,j)*Sellipse(i,j)) + SUMk(W(k)*Scircle(k))) / (SUMij W(i,j) + SUMk W(k))
The weighting coefficients W(i,j) and W(k) express a fidelity of every particular area measurement Sellipse(i,j) and Scircle(k). In some embodiments, the weighting coefficients may be defined using the local orientation of the artery relative to the camera direction (line of sight vector). Specifically, let ViewVectors(k), k=1, ..., NumberOfViews, be the camera line of sight unit vectors and ArtDir be the 3D artery direction unit vector calculated at each artery point. Accordingly, the geometrical meaning of the weighting coefficients may be as follows. The weight W(k) may be the absolute value of the sine of the angle between the artery direction and the line of sight vector; it becomes 1 when the line of sight is orthogonal to the artery and 0 if the line of sight is parallel to the artery. The weight W(i,j) expresses a quality of the mutual orientation of two views and the artery and may reach its maximal value when the artery direction and the two line of sight vectors build an orthogonal basis, i.e., each two of the vectors are orthogonal. Note, when two view vectors are orthogonal to each other, the calculation of the cross section area using the ellipse area formula may be maximally justified. Alternatively, if radii have been acquired from images with close view directions, then the use of the elliptic cross section formula may be inconsistent. Accordingly, if, in addition to orthogonal views, the plane of the vectors ViewVectors(i) and ViewVectors(j) is orthogonal to the artery, then the area value Sellipse(i,j) may reach a maximal fidelity W(i,j)=1. It is worth noting an additional consideration. In some embodiments, the above definition gives some priority to the area measurements originating from densitometry, since W(i,j) < W(k) for k=i and k=j. While the elliptical area assumption may suffer from probable inconsistency, the area evaluation via densitometry does not bear such a defect (as mentioned above), and the priority to densitometry may be reasonable.
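One concrete way to realize these weights is sketched below; the determinant form for W(i,j) is an assumption chosen because it reaches 1 exactly when the two view vectors and the artery direction form an orthogonal basis, as required above.

```python
import numpy as np

def fusion_weights(view_vectors, artery_dir):
    """Local fidelity weights:  W[k]   = |sin(angle(view k, artery))|
                                W[i,j] = |det(Vi, Vj, ArtDir)|
    The determinant form for W[i,j] is one concrete choice consistent with the
    description above; it is not the only possible definition."""
    v = [np.asarray(vv, float) / np.linalg.norm(vv) for vv in view_vectors]
    t = np.asarray(artery_dir, float) / np.linalg.norm(artery_dir)
    Wk = np.array([np.linalg.norm(np.cross(vk, t)) for vk in v])     # |sin|
    n = len(v)
    Wij = np.zeros((n, n))
    for i in range(n):
        for j in range(n):
            if i != j:
                Wij[i, j] = abs(np.linalg.det(np.stack([v[i], v[j], t])))
    return Wk, Wij

def fused_area(rads_norm, radens_norm, view_vectors, artery_dir):
    """Weighted sum of ellipse areas (pairs of normalized radii) and circle areas
    (densitometry-derived radii) at one centerline point."""
    Wk, Wij = fusion_weights(view_vectors, artery_dir)
    num = sum(Wk[k] * np.pi * radens_norm[k] ** 2 for k in range(len(Wk)))
    den = float(np.sum(Wk))
    for i in range(len(Wk)):
        for j in range(i + 1, len(Wk)):
            num += Wij[i, j] * np.pi * rads_norm[i] * rads_norm[j]
            den += Wij[i, j]
    return num / (den + 1e-12)
```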
Limited Marking 3DR
While the above embodiments disclose (generally) the use of three (3) marking points per image from at least two different cine-angio runs, other embodiments of the invention may utilize fewer marking points. For example, in some embodiments, the operator may simply mark two (2) points per image for two cine-angio runs, or, in other embodiments, the operator may mark two (2) points on a first image from one cine-angio run and one (1) point on one or more additional images (from other cine-angio runs).
For example, an operator may mark two (2) points, proximal and distal (to a stenosis), on one image of one cine-angio run. This run may be referred to as a "Master" run, and the selected corresponding image as a "Master" image. The system then calculates the centerline and edges and a "stenosis" point on the artery (this point is not required to be the actual stenosis point, but, rather, a reference point). The operator then selects images from two additional runs (the "Slave" runs) and marks the location of the "stenosis" point on the images from the Slave runs. After reception of each stenosis point on the images from the Slave runs, the system performs tracing on the images of the Slave runs, then presents the results of the trace and the 3DR. This reduction of markings may be enabled in embodiments of the present invention using path optimization algorithms including, for example, a generic Dijkstra algorithm or a wave propagation algorithm (WPA) adapted for the image trace.
Thus, using a WPA, for example, after the image, source point and target point are input, with respect to the Master image a path is found connecting the source point and target point with minimal cost (e.g., a sum of image gray levels raised to some power). For the images from the Slave runs, the target set is an epi-polar line, instead of a point, and the result is a path connecting the source point (the "stenosis" or anchor point) to the target line. In addition, a tree of alternative paths may be produced and the optimal branch path may be chosen. This process is further explained as follows:
Trace Master image: proximal and distal points of a vessel of interest are input by an operator, and a centerline of the vessel of interest is produced (output). At this time, the system traces the edges of the vessel of interest in the Master image and determines the stenosis point (which may be accomplished, for example, by determining the location of the minimum diameter of the vessel of interest). The centerline is split into proximal and distal portions.
Thereafter, the images from the Slave runs are traced, separately for the proximal and distal portions, from the marked "stenosis" point to the epi-polar line, for both Slave images. This produces four traces, with the output comprising one proximal and one distal path for every Slave image (main branches). Using the queue state at the end of the Dijkstra/WPA, additional candidate branches (secondary branches) may be added to the main branch. As a result, two proximal and two distal candidate trees for the Slave images are obtained.
A choice is then made of the optimal candidate combination: the optimal combination of proximal candidates and the optimal combination of distal candidates comprise three (3) primary centerlines. For every candidate branch combination (a branch from the Master image and two candidates from the Slave images), a 3D match is performed with 3D deviation error attribution. The errors express a quality of the match and are sensitive to scale distortion. An additional criterion of the match quality is a match of 2D centerline directions at corresponding points of the three centerlines. This match criterion may be insensitive to the scale change between the images. Accordingly, the optimal combination is then selected based on aggregate criteria combining the deviation errors, the direction match between the centerlines, and an additional consideration of a preference for combinations utilizing more points from the centerline of the Master image.
SECOND GROUP OF EMBODIMENTS
It is an objective of this group of embodiments of the present invention to provide a method and system for three-dimensional reconstruction of a tubular organ from angiographic projections. Specifically, the second group of embodiments improves the epi-polar geometry approach for 3DR by providing additional considerations to the three-dimensional reconstruction process, thus providing accurate correspondences between different projections and an accurate 3D model even in the presence of the mentioned geometrical distortions and the epi-polar problem condition.
The suggested reconstruction method, according to the second group of embodiments, is based on epi-polar geometry enhanced by integrating other considerations into the reconstruction process. These other considerations include, for example, the tubular organ's parameters derived from the image, such as Radius and Densitometry (Gray-level) values along the tubular organ's centerline and local centerline directions. Other considerations, which are derived from the tubular organ's characteristics, can also be incorporated. The current group of embodiments provides a method for three-dimensional reconstruction from two two-dimensional angiographic images and a method for three-dimensional reconstruction from three or more two-dimensional angiographic images. These embodiments further provide a solution for three-dimensional reconstruction for the case where a common reference point between all two-dimensional images is given, as well as for the case where the reference point is not given; in the latter case, a novel approach is provided for obtaining a reference point by means of correlating the invariant functions.
Accordingly, some of the embodiments according to this second group include a method for establishing correspondence between projections of a tubular organ visible in angiographic images, which comprises:
(a) extracting centerlines of the tubular organ on two angiographic images,
(b) computing features along centerline points: radius of the tubular organ, centerline direction, projected cross section area of the tubular organ (densitometry); these features compose invariant functions, which are used to match between centerlines,
(c) constructing an optimization target function that comprises a penalty function expressing a soft epi-polar constraint and discrepancies between invariant functions; the optimization's target function being defined over all possible correspondences between the two centerlines' points,
(d) solving the optimization target function, to generate a map between 2D points on one centerline and the 2D points on the other centerline,
(e) if a reference point is given, then optimizing the solution so that the map includes the match of the reference point,
(f) when a reference point is not given, finding it either by obeying the condition E1(i)=0 and E2(j)=0, where E is dP/dL, P is epi-polar distance and L is centerline length, or by means of finding the correlation of the functions S1/E1 and S2/E2 expressed as functions of epi-polar distance to an arbitrary temporary reference point, or via correlation of the functions R1 and R2; whereby every matched set of 2D points defines a 3D point, for example as a point that minimizes the distance from the projective lines, and the sequence of these 3D points is the three-dimensional reconstruction of the tubular organ.
In the case of three or more projections the optimization process is similar, and a 3D point could be found by either: a. "averaging" the 3D points that result from every pair of projection lines, or, b. using three or more projection lines to determine a 3D point; for example, a point that minimizes the sum of distances from those lines.
Also, in the case of three or more projections, a direction correspondence criterion is incorporated into the optimization process. The process of finding the correlation could be performed prior to the optimization or as part of the optimization. The epi-polar principle dictates that, given two 2D projections, every point on the first image defines an epi-polar line on the second image (and vice versa); the 2D point on the second image that corresponds to the 2D point on the first image is restricted to this epi-polar line. Three-dimensional reconstruction using epi-polar geometry could be described as follows:
(a) given 2D centerlines in two projections, each 2D point on the first centerline defines an epi-polar line that intersects the centerline of the second image; this intersection point is the 2D point on the second image that corresponds to the 2D point on the first image,
(b) each of these 2D points defines a projective line (meaning a line from the source 3D point to this projected 2D point). Thus, the intersection of the two projective lines finds the corresponding 3D source point (ideally the two lines intersect, but in practice they do not, so a criterion such as the minimal-distance point should be defined).
(c) the sequence of resultant 3D points is the three-dimensional reconstruction of the tubular organ.
This described process of three-dimensional reconstruction using epi-polar geometry has many insufficiencies. Accordingly, the second group of embodiments brings answers to these insufficiencies, based on the exploitation of additional invariants, apart from the epi-polar distance, for obtaining an accurate match between 2D projections of a tubular organ and its 3D reconstruction. One invariant is the radius function behavior along the projected arteries. In the invention, a general relation is also established between the projected density of the arteries and the epi-polar geometry. The relation allows calculation of values invariant for different projections. The invariance property is utilized for matching tubular organs in different projections. It is known that even in the case where there are no distortions, there exist situations when the epi-polar principle does not supply a unique solution (the epi-polar ambiguity). The new approach according to the second group of embodiments aids in solving the ambiguity in such situations. The relation will be proved under the assumption of a local cylinder structure of the tubular organ.
Defining an invariant function using epi-polar conditions and projected area (densitometry)
Fig. 36 shows a 3D cylinder representation of a tubular organ segment. Let D be the 3D direction of the tubular organ and S be the area of the cross section orthogonal to D (Fig. 36), |D|=1. Let V1 and V2 be the C-Arm directions in which two images of the tubular organ have been taken, |V1|=1, |V2|=1. The cross sectional area is equal to S only in the case when the view direction is orthogonal to the tubular organ (V1 orthogonal to D). In the general case, the cross sectional area is inversely proportional to the cosine of the angle between the view direction V and the plane of the orthogonal cross section S. This implies that the cross sectional area is S/cos(α) (Fig. 37).
The above-mentioned cosine is equal to the sine of the angle between the vectors V and D. So, the cross sectional area is:
(1) Si = S / sqrt(1 - (ViTD)^2), i = 1, 2,
where ViTD is the dot product of the two vectors.
Let D1 and D2 be the projections of the tubular organ's direction onto the image planes. The directions D1 and D2 coincide with the projected tube directions. We have (2) Di = D - (ViTD)Vi, i = 1, 2.
Note, the vectors D1 and D2 in (2) are not normalized.
Denote by V12 the unit vector orthogonal to the two view vectors: V12 = V1 × V2 / |V1 × V2|. The vector V12 is orthogonal to the epi-polar planes of the two images. The measure E of the projected orientation of the tubular organ relative to the epi-polar plane is, by definition, the scalar product of V12 and the tubular organ's projected direction:
(3) E1 = D1TV12 / |D1| and E2 = D2TV12 / |D2|.
Theorem: the ratio of the projected area and the visible epi-polar orientation is invariant for every pair of views, i.e.
(4) S1/E1 = S2/E2.
Proof:
Using (1) and (2), we obtain:
(5) 1 - (ViTD)^2 = S^2 / Si^2, i = 1, 2, and
(6) D1 + (V1TD)V1 = D2 + (V2TD)V2 (= D).
Multiplying (6) by V12 we obtain D1TV12 = D2TV12 and, using notation (3),
(7) |D1| E1 = |D2| E2.
Raising (7) to the second power, we can rewrite it in the form |D1|^2 E1^2 = |D2|^2 E2^2. From (2) we obtain |Di|^2 = (D - (ViTD)Vi)T(D - (ViTD)Vi) = 1 - (ViTD)^2. Hence,
(8) (1 - (V1TD)^2) E1^2 = (1 - (V2TD)^2) E2^2.
Using equations (8) and (5), we reach equation (4); thus, the theorem is proven.
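The invariant of equation (4) can be checked numerically from equations (1)-(3); the following short sketch does so for an arbitrary tube direction and two views (the specific vectors are illustrative only).

```python
import numpy as np

def projected_quantities(D, S, V):
    """Projected tube direction (2) and visible cross-section area (1) for a cylinder
    of 3D direction D and orthogonal cross-section S seen along view direction V."""
    D = D / np.linalg.norm(D)
    V = V / np.linalg.norm(V)
    Di = D - (V @ D) * V                        # equation (2), not normalized
    Si = S / np.sqrt(1.0 - (V @ D) ** 2)        # equation (1)
    return Di, Si

def invariant_ratios(D, S, V1, V2):
    """Return S1/E1 and S2/E2; by equation (4) the two values should coincide."""
    V12 = np.cross(V1, V2)
    V12 = V12 / np.linalg.norm(V12)
    D1, S1 = projected_quantities(D, S, V1)
    D2, S2 = projected_quantities(D, S, V2)
    E1 = (D1 @ V12) / np.linalg.norm(D1)        # equation (3)
    E2 = (D2 @ V12) / np.linalg.norm(D2)
    return S1 / E1, S2 / E2

# Arbitrary tube direction, unit cross-section, two non-parallel views.
r1, r2 = invariant_ratios(np.array([0.3, 0.8, 0.5]), 1.0,
                          np.array([1.0, 0.1, 0.0]), np.array([0.2, 1.0, 0.3]))
print(abs(r1 - r2) < 1e-9)    # True
```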
All the measures appearing in equation (4) are calculated from the images and do not require 3D reconstruction. Si is known from densitometry - determining projected cross section area values using gray level values in the image. As mentioned, the directions D1 and D2 can be calculated as directions tangential to the 2D centerlines extracted from the images.
Defining a novel constraint for the process of 3D reconstruction from three or more views
In the case of three-dimensional reconstruction from three or more projections we can incorporate a direction correspondence constraint. Let D1, D2, ..., DN be the vectors tangential to the 2D tubular organ's centerlines, expressed as 3D vectors. The following condition is necessary for a point match: for matched points, the rank of the matrix composed of the vectors D1, D2, ..., DN is less than 3, i.e. Rank(D1, D2, ..., DN) < 3.
For three projections (N=3) the equivalent statement is a zero determinant of the matrix composed of the vectors D1, D2, D3:
det(D1, D2, D3) = 0.
Method for 3D reconstruction from 2D projections
For simplicity's sake, the process will be described for two 2D projections. The process for three or more 2D projections is a simple generalization of the described one. Parallel projection geometry is assumed, and an image plane passing through a 3D origin point that coincides with an identified or given reference point in every image is considered; thus, every point and direction found in the image plane can and will be expressed as 3D entities using the reference point and the known orientation.
Let L1(1), L1(2), L1(3), ... be a sequence of points representing the tubular organ's centerline in the first image and L2(1), L2(2), L2(3), ... be a sequence of points representing the tubular organ's centerline in the second image. Using the previous notation, V1 and V2 are the projection directions and V12 is the epi-polar direction orthogonal to the two view vectors. Index i is used as the index of a point on line L1 and j as the index of a point on line L2. Let R1(i), R2(j) be the corresponding measures of radii from the two projections, D1(i), D2(j) the measures of normalized centerline direction vectors and S1(i), S2(j) the measures of projected cross section areas based on the densitometry calculation. Denote
P1(i) = dot(L1(i), V12) and P2(j) = dot(L2(j), V12)
as epi-polar distances, and
E1(i) = dot(D1(i), V12), E2(j) = dot(D2(j), V12).
An equivalent definition of E can be given via increments of epi-polar distance and line length E=dP/dL .
Consider a function F of two variables i and j defined on the rectangular domain of indexes 1 ≤ i ≤ N, 1 ≤ j ≤ M, where N and M are the numbers of points in the centerlines.
F(i,j) = F1(|P1(i) - P2(j)|) + C2F2(|R1(i) - R2(j)|) + C3F3(|S1(i)E2(j) - S2(j)E1(i)|) + F4(E1(i)E2(j)).
Here F1, F2, F3, F4 are functions with the following properties: F1(0) = F2(0) = F3(0) = 0; F1, F2, F3 are monotonically increasing functions; F1(∞) = ∞; 0 ≤ F2, F3 < 1;
F4(x) = 0 if x ≥ 0, and F4(x) = ∞ if x < 0.
C2 and C3 are weight coefficients. The matching problem is formulated as a solution to the minimal path finding problem for the function F. Namely, find a continuous and monotonic path starting on side i = 1 or j = 1 and ending on side i = N or j = M with the minimal sum of values of F on its way. The terms "continuous" and "monotonic" mean that there are three possible increments of the indexes (i,j): (0,1), (1,0), (1,1). The optimization problem can be solved by a dynamic programming method or a Dijkstra-type algorithm.
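A standard dynamic-programming solution of this minimal path problem is sketched below; for brevity the path is assumed to enter the grid at a single point on a starting side, and F is assumed to be pre-computed as an N-by-M array.

```python
import numpy as np

def match_centerlines(F):
    """Monotonic minimal-path solution for a pre-computed N-by-M cost array F(i,j).

    Allowed index increments are (0,1), (1,0) and (1,1); the path starts on side
    i=0 or j=0 and ends on side i=N-1 or j=M-1."""
    N, M = F.shape
    acc = np.full((N, M), np.inf)
    acc[0, :] = F[0, :]                     # possible starting points
    acc[:, 0] = F[:, 0]
    back = np.zeros((N, M, 2), dtype=int)
    for i in range(1, N):
        for j in range(1, M):
            prev = [(i - 1, j), (i, j - 1), (i - 1, j - 1)]
            costs = [acc[p] for p in prev]
            k = int(np.argmin(costs))
            acc[i, j] = F[i, j] + costs[k]
            back[i, j] = prev[k]
    # End anywhere on the last sides; pick the cheapest terminal point.
    ends = [(N - 1, j) for j in range(M)] + [(i, M - 1) for i in range(N)]
    i, j = min(ends, key=lambda p: acc[p])
    path = [(i, j)]
    while i > 0 and j > 0:                  # backtrack until a starting side is hit
        i, j = back[i, j]
        path.append((int(i), int(j)))
    return path[::-1]                       # list of matched index pairs (i, j)
```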
The first term F1(|P1(i) - P2(j)|) of the target function is a soft epi-polar constraint that penalizes strong deviations from the epi-polar condition
P1(i) = P2(j).
For example, the penalty function F1 = (|P1 - P2|/T)^2 is tolerant of the discrepancies |P1 - P2| < T and is severe for gross discrepancies |P1 - P2| > T. The second term C2F2(|R1(i) - R2(j)|) of the target function encourages similarity of radii along the optimal path. The third term C3F3(|S1(i)E2(j) - S2(j)E1(i)|) expresses the invariance property stated in the theorem; here it is written in a form eliminating singularities associated with division by zero. The fourth term F4(E1(i)E2(j)) imposes a stiff constraint that does not allow segments with opposite orientation to be matched even though the epi-polar distance allows the match. This term often helps to resolve the matching in situations of ambiguity (the epi-polar problem). Formally, the requirement for the epi-polar orientation measures E1(i), E2(j) to have the same sign follows from the equality stated in the theorem and the fact that the measured cross sectional area is always positive. Thus, the optimization's target function F(i,j) is defined over all possible correspondences between the two centerlines' points; the solution of the optimization problem is a correspondence map between 2D points on one centerline and 2D points on the other centerline. Now, obtaining the three-dimensional reconstruction continues as known in the literature: every matched set of 2D points defines a 3D point, for example as the point that minimizes the distance from the projective lines. The sequence of these 3D points is the three-dimensional reconstruction of the tubular organ.
In the case of three or more projections the optimization process is similar and a 3D point could be found by either:
(a) "averaging" the 3D points that result from every pair of projection lines, or, (b) using three or more projection lines to determine a 3D point; for example, a point that minimizes sum of distances from those lines.
Also, in the case of three or more projections, a direction correspondence criterion is incorporated into the optimization process, described above as "defining a novel constraint for the process of 3D reconstruction from three or more views".
If the reference point is one of the skeleton points, i.e. L1(i0) and L2(j0), an additional constraint forcing the optimization algorithm to pass through the reference (i0, j0) is imposed into the target function:
Fref(i,j) = ∞ if (i = i0 and j ≠ j0) or (i ≠ i0 and j = j0), and 0 otherwise.
Note that only one term in the target function depends on the reference point - the penalty term F1(|P1(i) - P2(j)|).
When the reference point is not known, the difference of epi-polar distances can be described as a one-parameter family of functions depending on a shift along the epi-polar direction. The reference point (or shift) can be found in different ways: • the reference point can be chosen among the points obeying the condition E1(i)=0 and E2(j)=0, where E is dP/dL, P is epi-polar distance and L is centerline length.
• the shift, and therefore the reference point, can be found via correlation of the functions S1/E1 and S2/E2 expressed as functions of epi-polar distance to an arbitrary temporary reference point, or via correlation of the functions R1 and R2.
• the reference point can be found in the process of solving the optimization problem if the classical penalty term F1(|P1(i) - P2(j)|) is substituted with the expression F1(|P1(i) - P2(j) - (P1(istart) - P2(jstart))|), where P1, P2 are distances to an arbitrary temporary reference point and istart, jstart are the indexes of the first point of the currently optimal matching segment at the point (i,j). In addition to the method described above, the invention also contemplates a system for imaging a tubular organ, comprising a microprocessor configured to generate a three-dimensional reconstruction of the tubular organ from two or more angiographic images of the tubular organ obtained from different perspectives, using the three-dimensional reconstruction method described above. The invention is particularly applicable to imaging an artery contained in an arterial tree.
THIRD GROUP OF EMBODIMENTS
It is an object of the third group of embodiments of the present invention to provide a method and system for three-dimensional organ reconstruction from more than two angiographic projections in an automated manner, meaning without additional user interaction, i.e., without requiring the user to identify the tubular organ on additional angiograms.
A three-dimensional reconstruction of a tubular organ, such as an artery, from two projections is available via methods known in the prior art. Usually, this requires some user interaction to identify the organ of interest in the first two views. Once this reconstruction is available, the third group of embodiments provides a method of automated update on the basis of one or more additional projections. When a 3D model reconstructed from two projections is available, it is projected onto an additional image plane according to the specific viewing geometry of that additional existing projection.
This gives rise to a major geometric distortion that manifests itself as an unknown shift between the actual X-Ray image and the projected model. The third group of embodiments may determine this shift by implementing a correlation technique. After the shift is calculated, the organ tracing and analysis in the third image is carried out with the use of the projected model as a first approximation. The newly detected and traced projection of the organ is then used for recalculation of the three-dimensional reconstruction to a better approximation.
According to a second aspect of this group of embodiments, the three- dimensional reconstruction made from two views is exploited to determine local weights for a refined reconstruction incorporating additional projections. A projection of an organ is most informative for the purpose of three-dimensional reconstruction when the viewing direction is orthogonal to the organ. In addition, a pair of projections is more informative when the viewing directions are sufficiently separated. These properties are obviously local, per segment of the organ. Thus, one combination of projections is preferable for one segment of the organ, while another combination of projections is preferable for another.
The third group of embodiments proposes to determine local weights for combinations of two 2D image sources for a refined 3D reconstruction. The local weights are determined according to the angle between the primary 3D model centerline (reconstructed from the first two projections) and the view vectors of the projections, and the angle between the view vectors.
Accordingly, the third group of embodiments relates to a novel method and system for automated three-dimensional reconstruction of an organ from three or more projections. Once a three-dimensional reconstruction of the organ from two projections is available, the present group of embodiments provides a method and system that performs an automatic identification of the reconstructed organ in the 2D image of an additional projection, performs an automatic trace and analysis of the organ in the 2D image (in the same manner as was performed on the first and second images), and finally incorporates the new projection into the three-dimensional reconstruction, improving the accuracy of the three-dimensional reconstruction. Such an approach is particularly applicable to imaging an artery contained in an arterial tree.
A projection of an organ is most informative for the purpose of three- dimensional reconstruction when the viewing direction is orthogonal to the organ, and substantially different views produce more accurate 3D reconstruction than not substantially different (close) views. To implement these two notions, some of the embodiments of this third group are directed to a novel method and system for exploiting the three-dimensional reconstruction made from two views to determine local weights for a refined reconstruction incorporating additional projections.
Specifically, the present group of embodiments relates to two aspects of making a better three-dimensional reconstruction of an organ when additional angiographic projections to a first two projections are available. The first aspect refers to an automated procedure of identifying, tracing and incorporating an additional projection into the reconstruction. The second aspect presents a novel method of weighted reconstruction process, where weights express the local optimal combination of projections to reconstruct the 3D model from, as a function of viewing angles.
Method for automatic detection of the organ in additional projection
Let A be a 3D model of an organ segment reconstructed from two marked images. For example, we can use a generalized cylinder model that consists of a three-dimensional centerline and circular orthogonal cross sections specified by radii. This model could be expressed as A ≡ (Xi, Yi, Zi, Ri), where i is the index for skeleton points along the three-dimensional centerline. Let I be an image that did not participate in the reconstruction of A. Let G be the known geometry of image I. The geometry data G includes angles and a rough estimation of the magnification factor, but does not include the C-Arm patient bed shift. We refer to the 3D model A as floating, meaning that it represents the true organ in terms of dimensions and morphology but not in terms of location in space.
Projection of the model A using geometry data G onto the image I plane could be done in two ways - binary or realistic. The "realistic" projection will set the gray value of a pixel as a function of the length of the intersection between the ray and the model. The "binary" projection will simply set pixels as zeros and ones, where "one" means that there was an intersection between the ray and the model. For finding the shift between the projected image and the angiographic image I, the two should be correlated, using correlation methods known in the literature; the correlation could be performed between I and either a "realistic" projected image or a "binary" projected image. The shift defines a region of interest on image I, and the three-dimensional model projection provides a first approximation of the organ's centerline. Thus, the process continues by tracing the organ in image I, as known in the prior art; the organ's parameters (radii, gray-levels...) are computed, as known in the prior art, and eventually the data from this additional projection is incorporated into the three-dimensional reconstruction.
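A rough sketch of the "binary" projection and of the correlation search for the unknown shift; the geometry callable, the disc stamping and the exhaustive shift search are crude stand-ins, assumed only for illustration, for the real C-arm projection and the correlation methods referred to above.

```python
import numpy as np

def binary_projection(model_pts_3d, radii, project, shape):
    """Binary projection of a floating 3D model (centerline points + radii) onto an
    image plane. 'project' is assumed to be a callable mapping a 3D point to 2D pixel
    coordinates for the additional view; stamping a disc of the local radius is a
    crude substitute for true ray/model intersection."""
    mask = np.zeros(shape, dtype=np.uint8)
    ys, xs = np.mgrid[0:shape[0], 0:shape[1]]
    for p, r in zip(model_pts_3d, radii):
        cx, cy = project(p)
        mask[(xs - cx) ** 2 + (ys - cy) ** 2 <= r ** 2] = 1
    return mask

def find_shift(projected, angiogram, max_shift=60, step=2):
    """Exhaustive correlation search for the unknown in-plane shift between the
    projected model and the angiogram (vessels are dark, so the angiogram is
    inverted before correlating)."""
    target = float(angiogram.max()) - angiogram.astype(float)
    best, best_score = (0, 0), -np.inf
    for dy in range(-max_shift, max_shift + 1, step):
        for dx in range(-max_shift, max_shift + 1, step):
            shifted = np.roll(np.roll(projected, dy, axis=0), dx, axis=1)
            score = float(np.sum(shifted * target))
            if score > best_score:
                best, best_score = (dy, dx), score
    return best
```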
Method for three-dimensional reconstruction from N (N>2) 2D projections
Reconstruction of a 3D line from multiple projections can be posed as an optimization problem whose elementary step is the reconstruction of a single point. Theoretically, reconstruction of a single point using multiple projections can be done by means of intersection of the projective lines corresponding to the 2D projections; in practice, projective lines do not intersect. One natural definition for a 3D reconstructed point resulting from intersecting two lines is the middle of the shortest segment connecting the projective lines. Three-dimensional reconstruction from three or more projections expands the above-mentioned idea and determines a 3D point in a similar way. One example is a direct expansion taking the 3D point that minimizes the distance from the (three or more) projective lines. Another method is to take the 3D points that result from all pairs of projections and set the final reconstructed point as a geometrical function of these points. The present group of embodiments suggests a novel approach, in which the results from all pairs of projections are indeed used, but rather than setting the 3D reconstructed result as a function of only the points, the relationship between the viewing angles and the 3D model is utilized to determine weights for each pair result.
Let V1, V2, ..., VN be the viewing directions and L1, L2, ..., LN be the projective lines, Li = Pi + λVi, where P1, P2, ..., PN are points from the 2D centerlines (i indexing a projection). Let A be a 3D model of an organ segment reconstructed from the two projections with indexes 1 and 2. We hold, as a result of a primary reconstruction from these two projections 1 and 2, a reference from the points P1, P2 to the centerline points of the model A. Let T be the local tangent direction of the 3D model A at the region which the points P1, P2 reference. Let Rij be the middle of the shortest segment between the projective lines Li and Lj. Denote by Wij = det(Vi, Vj, T) the determinant of the 3-by-3 matrix composed of the unit vectors Vi, Vj, and T. The intersection point is given by the expression:
R = Σ Wij Rij / Σ Wij
The quality of intersection is:
D = Σ Wij Dij / Σ Wij, where Dij is the distance between the lines Li, Lj.
The reconstructed 3D point is defined as a weighted sum of the intersection points per each pair of projective lines. The weights reflect the mutual geometry of the two views and the local orientation of the primary 3D model, in such a way that the maximal weight (1) is achieved by the combination of two orthogonal views which are also both orthogonal to the organ. The weight is near zero in the case when the two views are close to each other or if one of the views is too oblique. It is to be noted that the nature of the weights is local; the same pair of views can maximally contribute at one segment of the organ and minimally contribute at another segment. Note also that this suggested definition is quite efficient, since it does not require the calculation of 3D reconstructed points during the optimization process; the distance Dij between two projective lines is simply the discrepancy between the epi-polar distances, calculated as an absolute difference of two dot products.
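A sketch of the weighted combination of pairwise intersection points; the closest-point formula is the standard skew-lines solution, and taking the absolute value of det(Vi, Vj, T) is an assumption made here so that the weights are non-negative.

```python
import numpy as np

def midpoint_between_lines(p1, v1, p2, v2):
    """Middle of the shortest segment between two (generally skew) 3D lines
    p1 + s*v1 and p2 + t*v2, plus the segment length (the pair quality Dij)."""
    v1, v2 = v1 / np.linalg.norm(v1), v2 / np.linalg.norm(v2)
    w0 = p1 - p2
    a, b, c = v1 @ v1, v1 @ v2, v2 @ v2
    d, e = v1 @ w0, v2 @ w0
    denom = a * c - b * b + 1e-12
    s = (b * e - c * d) / denom
    t = (a * e - b * d) / denom
    q1, q2 = p1 + s * v1, p2 + t * v2
    return 0.5 * (q1 + q2), float(np.linalg.norm(q1 - q2))

def weighted_point(points, view_dirs, tangent):
    """Weighted multi-view reconstruction of one centerline point.

    points    -- 2D centerline points expressed as 3D points in their image planes
    view_dirs -- corresponding view (projection) direction unit vectors
    tangent   -- local tangent T of the primary 3D model at this point"""
    n = len(points)
    num, den = np.zeros(3), 0.0
    for i in range(n):
        for j in range(i + 1, n):
            Rij, _ = midpoint_between_lines(points[i], view_dirs[i],
                                            points[j], view_dirs[j])
            Wij = abs(np.linalg.det(np.stack([view_dirs[i], view_dirs[j], tangent])))
            num += Wij * Rij
            den += Wij
    return num / (den + 1e-12)
```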
In addition to the method described above, this third group of embodiments also contemplates a system for imaging a tubular organ, comprising a processor configured to generate a three-dimensional reconstruction of the tubular organ from two or more angiographic images of the tubular organ obtained from different perspectives, using the three-dimensional reconstruction method described above.
OTHER IMPROVEMENTS
The above-described embodiments of the present invention (groups one, two and/or three) may include one or more of the following features, although each feature, on its own, may also comprise an independent embodiment.
Multiple data on the 3DR image page: a cross-section area graph and lesion-analysis measurements (e.g., diameter data, C-arm position, and other reference data) may be displayed simultaneously to deliver the maximum relevant information in an optimal manner (see, for example, Fig. 31).
Pop-up menu for various projections. A pop-up list (3210, Fig. 32) presents various projections (e.g., the 2D projections, the ONP and 0,0). Selecting any projection may rotate the 3D model to that projection, enabling the operator to compare it with the 2D images, for example.
Color coding of the 3D model and/or graphs and other data. Color coding can be implemented to denote narrowing severity, angulation, etc. (or a combination of parameters), to draw the physician's attention to problematic segments.
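As one simple illustration of such color coding (the green-to-red mapping and the thresholds below are our own example values, not taken from the disclosure):

```python
# Illustrative sketch only: map percent diameter narrowing along the centerline
# to an (r, g, b) color, interpolating green -> yellow -> red.
def severity_color(percent_narrowing):
    """Return an (r, g, b) triple in [0, 1] for a narrowing value in [0, 100]."""
    s = min(max(percent_narrowing / 100.0, 0.0), 1.0)
    if s < 0.5:
        return (2.0 * s, 1.0, 0.0)            # green -> yellow over 0..50 %
    return (1.0, 2.0 * (1.0 - s), 0.0)        # yellow -> red over 50..100 %

colors = [severity_color(p) for p in (10, 45, 80)]   # e.g. per-centerline-point coloring
```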
Correlated data. Data may be cross-referenced from the 2D trace of the vessel to the 3D model and to the graphs, so that every point can be located simultaneously on all of them. Cues are presented, for example, to enable the operator to investigate the data either individually or simultaneously.
One or more graphs (see, for example, the screen shot in Fig. 34) may be presented, including a graph representing the cross-section area (fusion output) data and one for diameter information, or a combined graph. A diameter-data graph may be referred to as "eccentricity", as it presents the maximum and minimum diameter values for every point along the vessel.
Epi-polar warnings/bars/lines. Epi-polar geometry is well known and extensively documented, and is used for 3DR in the present invention. However, a 3DR is only as good as the images used to prepare it. Accordingly, to determine whether a second image, in combination with the first image, is adequate to aid in 3DR, embodiments of the present invention provide the operator with a visual indicator. As shown in Figs. 35A-35B, once the operator completes marking of the first image (Fig. 35A) and the vessel of interest is traced, as soon as the operator clicks near the stenosis on the second image (Fig. 35B), the system presents, on the second image, epi-lines (lines 3510 and 3520) that are in the vicinity of the first image's markings, and epi-bar 3530.
The bar indicates the conditions for 3DR. In the present illustration (Fig. 35B), the bar is color coded to indicate whether the second image is a good combination with the first image. Here, the "whiter" the bar, the better the conditions for 3DR. Accordingly, since the bar in Fig. 35B is quite white, conditions are good for 3DR (a "redder" bar would indicate poorer conditions). Many of the figures represent screen shots of a preferred embodiment of the system. Specifically, preferred embodiments are presented for catheter calibration (Fig. 33); display of 2D image-related data (Fig. 29), including edge tracing and healthy-artery display; display of the 3DR results (Fig. 31) for the vessel of interest and the 3D healthy vessel; and quantitative analysis of the vessel of interest (Fig. 34) in the form of graphs and specific measurements, such as percent narrowing (diameter and area), length, plaque volume, minimal lumen diameter and area, reference (healthy) area and diameter measures, eccentricity index, and angulation.
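The epi-lines drawn on the second image follow from standard epipolar geometry. Purely as an illustration (this is the textbook construction in Python/NumPy under our own function names, not code from the disclosure), one way to derive such a line from the two acquisitions' 3x4 projection matrices is:

```python
# Standard epipolar-geometry sketch: build the fundamental matrix from two
# projection matrices and obtain the epi-line in the second image for a pixel
# marked on the first image.
import numpy as np

def fundamental_from_projections(P1, P2):
    """F such that a homogeneous point x1 in image 1 maps to the epipolar line F @ x1 in image 2."""
    _, _, vt = np.linalg.svd(P1)
    C1 = vt[-1]                                   # camera centre of view 1 (null vector of P1)
    e2 = P2 @ C1                                  # epipole in image 2
    e2_cross = np.array([[0.0, -e2[2], e2[1]],
                         [e2[2], 0.0, -e2[0]],
                         [-e2[1], e2[0], 0.0]])   # skew-symmetric cross-product matrix [e2]x
    return e2_cross @ P2 @ np.linalg.pinv(P1)

def epi_line(F, u, v):
    """Coefficients (a, b, c) of the line a*x + b*y + c = 0 in the second image."""
    return tuple(F @ np.array([u, v, 1.0]))
```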
Having now described a number of embodiments of the present invention, it is apparent to those skilled in the art that the present disclosure is not limited to those embodiments, and that the above embodiments may be used in combination. Moreover, numerous other embodiments and modifications of the disclosed embodiments are contemplated as falling within the scope of the present invention.

Claims

WHAT IS CLAIMED IS:
1. A method for three-dimensional reconstruction (3DR) of a single tubular organ using a plurality of two-dimensional images comprising: displaying a first image of a vascular network; receiving input for identifying on the first image a vessel of interest; tracing the edges of the vessel of interest, including eliminating false edges of objects visually adjacent to the vessel of interest; determining substantially precise radius and densitometry values along the vessel; displaying at least a second image of the vascular network; receiving input for identifying on the second image the vessel of interest; tracing the edges of the vessel of interest in the second image, including eliminating false edges of objects visually adjacent to the vessel of interest; determining substantially precise radius and densitometry values along the vessel of interest in the second image; determining a three-dimensional reconstruction of the vessel of interest; and determining fused area measurements along the vessel.
2. The method according to claim 1, wherein the vessel of interest is selected from the group comprising: an artery, a vein, a coronary artery, a carotid artery, a pulmonary artery, a renal artery, a hepatic artery, a femoral artery, a mesenteric artery, and any other tubular organ.
3. The method according to claim 1, further comprising determining a centerline, comprising a plurality of centerline points.
4. The method according to claim 1, wherein the fused area measurements are obtained using a fusion of diameter-derived and cross-section densitometry-derived measurements.
5. The method according to claims 1 or 4, wherein determining the fused area comprises: determining a plurality of healthy diameters along the vessel of interest to be used as a physical reference; normalizing a majority of the data, diameters and cross-section values, to physical units, using the above physical reference; and fusing a majority of the data into single area measurements, weighting each source of data according to the reliability of the data.
6. The method according to claim 5, wherein the weighting is computed as a function of the views' geometry and/or the 3D vessel geometry.
7. The method according to claim 1, wherein the input for identifying the vessel of interest comprises three points: a first point to mark the general location of the stenosis, a second point proximal to the stenosis, and a third point distal to the stenosis.
8. The method according to claim 1, wherein the input for identifying the vessel of interest comprises markers for two (2) points for at least one of the first and second images, wherein one of the two points is anywhere proximal to the stenosis and the other point is anywhere distal to the stenosis.
9. The method according to claim 1, wherein the markers comprise two (2) points for the first image and one (1) point for the second image, wherein one of the two points is anywhere proximal to the stenosis and the other point is anywhere distal to the stenosis and wherein one point is an anchor point identified automatically on the first image.
10. The method according to claim 1, wherein elimination of false edges comprises ignoring one or more bubbles adjacent the vessel of interest.
11. The method according to claim 1 or 10, wherein elimination of false edges comprises: defining a region of interest substantially parallel to a primary centerline; detecting at least one cluster of pixel data adjacent to the vessel of interest, wherein each cluster of pixel data has a predetermined brightness level greater than a brightness level of surrounding pixel data; selecting an arbitrary pixel within each cluster; selecting a second pixel provided on a lane bounding the region of interest for each arbitrary pixel of each cluster; and establishing a barrier line to define an edge for the vessel of interest by connecting a plurality of arbitrary pixels with a corresponding second pixel, wherein, upon tracing each edge of the vessel of interest, the traced edge avoids each barrier line.
12. The method according to claim 1, wherein elimination of false edges comprises detecting and/or eliminating one or more bumps along the vessel of interest.
13. The method according to claim 1 or 12, wherein elimination of false edges includes: establishing a list of suspect points, comprising: establishing a plurality of first distances between each of a plurality of originating points on at least one preliminary traced edge and a corresponding closest point positioned along the primary centerline; establishing a plurality of second distances between each of a plurality of second centerline points on the primary centerline and a corresponding closest point positioned on the at least one edge; and determining a deviation from the centerline as an absolute difference of the second distance and the first distance; determining a gradient cost function, inversely proportional to a gradient magnitude at each edge point; determining a combined function aggregating the deviation from the centerline and the gradient cost function, wherein upon the combined function being greater than a predefined value, the corresponding edge point is determined to be a bump point in a bump; determining a bump area defined by a plurality of connected bump points and a cutting line adjacent the vessel of interest, wherein the cutting line comprises a line which substantially maximizes a ratio between the bump area and a power of a cutting line length; and cutting the bump from the edge at the cutting line to establish a final edge.
14. The method according to claim 3, wherein defining a centerline of the vessel of interest comprises: determining final traced edges of the vessel of interest; determining pairs of anchor points, wherein each pair comprises one point on each edge; determining a cross-sectional line by searching for pairs of anchor points which, when connected, establish the cross-sectional line substantially orthogonal to the centerline; dividing each edge into a plurality of segments using the anchor points, wherein for each segment, correspondence between the edges is established in that every point of each edge includes at least one pair of points on an opposite edge and a total sum of distances between adjacent points is minimal; and connecting the centers of the plurality of segments to determine the centerline.
15. The method according to claim 1, wherein determining densitometry values comprises subtracting a background influence.
16. The method according to claim 1 or 15, wherein determining densitometry values comprises: establishing a plurality of profile lines substantially parallel to at least one edge of the vessel of interest; establishing a parametric grid covering the vessel of interest and a neighboring region, wherein the parametric grid includes a first parameter of the vessel of interest along the length thereof and a second parameter for controlling a cross-wise change of the vessel of interest; sampling the image using the grid to obtain a plurality of corresponding gray values, wherein: the gray values are investigated as functions on the profile lines; substantially eliminating detected occluding structures on the outside of the vessel of interest, the structures being detected as prominent minima of the parameters; substantially eliminating prominent minima detected on the inside of the vessel of interest; averaging gray values in a direction across the vessel of interest separately for each side of the vessel of interest; determining a linear background estimation on the grid inside the vessel of interest; and determining cross-sectional area using the eliminated prominent minima.
17. The method according to claim 1, further comprising determining healthy vessel dimensions using an iterative regression over a healthy portion of the vessel of interest.
18. The method according to claim 17, wherein each iteration comprises a compromise between a pre-defined slope and a line that follows healthy data.
19. The method according to claim 18, wherein the compromise is toward the line that follows the healthy data if the line corresponds to actual data over a plurality of clusters.
20. The method according to claim 1, further comprising displaying, either in 2D and/or in 3D, healthy vessel dimensions of the vessel of interest.
21. The method according to claim 3, wherein determining the three-dimensional reconstruction of the vessel of interest includes: determining a conventional epi-polar distance p1 for the plurality of centerline points in the first image; determining a conventional epi-polar distance p2 for the plurality of centerline points in the second image; and re-determining p2 substantially in accordance with p2new = p2 + δ, where δ is a smooth compensatory function establishing correspondence of one or more landmark points.
22. The method according to claim 1, further comprising displaying color coded data relating to the vessel of interest in any display of data.
23. The method according to claim 1, wherein after receiving input for identifying the vessel of interest in the second image, displaying an epi-polar indicator for indicating a concurrence between the first image and second image for producing a three-dimensional reconstruction of the vessel of interest.
24. The method according to claim 1, further comprising displaying quantitative analysis of the vessel of interest including cross-section area graph and/or lesion analysis measurements.
25. The method according to claim 1, further comprising cross-referencing data among at least a pair of data items related to the two-dimensional trace of the vessel of interest, the three-dimensional reconstruction of the vessel of interest, and graphical data.
26. A system for three-dimensional reconstruction (3DR) of a single blood vessel using a plurality of two-dimensional images comprising: a display for displaying a first image of a vascular network and a second image of a vascular network, and a three-dimensional reconstruction of a vessel; input means for receiving input for identifying a vessel of interest on the first image and for identifying the vessel of interest on the second image; a processor arranged to operate one or more application programs comprising computer instructions for: tracing the edges of the vessel of interest, including eliminating false edges of objects visually adjacent to the vessel of interest; determining substantially precise radius and densitometry values along the vessel; tracing the edges of the vessel of interest in the second image, including eliminating false edges of objects visually adjacent to the vessel of interest; determining substantially precise radius and densitometry values along the vessel of interest in the second image; determining a three-dimensional reconstruction of the vessel of interest; and determining fused area measurements along the vessel.
27. The system according to claim 26, wherein the vessel of interest is selected from the group comprising: an artery, a vein, a coronary artery, a carotid artery, a pulmonary artery, a renal artery, a hepatic artery, a femoral artery, and a mesenteric artery.
28. The system according to claim 26, wherein the application programs further include computer instructions for determining a centerline, comprising a plurality of centerline points.
29. The system according to claim 26, wherein the fused area measurements are obtained using a fusion of diameter-derived and cross-section densitometry-derived measurements.
30. The system according to claim 26 or 29, wherein determining the fused area comprises: determining a plurality of healthy diameters along the vessel of interest to be used as a physical reference; normalizing a majority of the data, diameters and cross-section values, to physical units, using the above physical reference; and fusing a majority of the data into single area measurements, weighting each source of data according to the reliability of the data.
31. The system according to claim 30, wherein the weighting is computed as a function of the views' geometry and/or the 3D vessel geometry.
32. The system according to claim 26, wherein the input for identifying the vessel of interest comprises three points: a first point to mark the general location of the stenosis, a second point proximal to the stenosis, and a third point distal to the stenosis.
33. The system according to claim 26, wherein the input for identifying the vessel of interest comprises markers for two (2) points for at least one of the first and second images, wherein one of the two points is anywhere proximal to the stenosis and the other point is anywhere distal to the stenosis.
34. The system according to claim 26, wherein the markers comprise two (2) points for the first image and one (1) point for the second image, wherein one of the two points is anywhere proximal to the stenosis and the other point is anywhere distal to the stenosis and wherein one point is an anchor point identified automatically on the first image.
35. The system according to claim 26, wherein elimination of false edges comprises ignoring one or more bubbles adjacent the vessel of interest.
36. The system according to claim 26 or 35, wherein elimination of false edges comprises: defining a region of interest substantially parallel to a primary centerline; detecting at least one cluster of pixel data adjacent to the vessel of interest, wherein each cluster of pixel data has a predetermined brightness level greater than a brightness level of surrounding pixel data; selecting an arbitrary pixel within each cluster; selecting a second pixel provided on a lane bounding the region of interest for each arbitrary pixel of each cluster; and establishing a barrier line to define an edge for the vessel of interest by connecting a plurality of arbitrary pixels with a corresponding second pixel, wherein, upon tracing each edge of the vessel of interest, the traced edge avoids each barrier line.
37. The system according to claim 26, wherein elimination of false edges comprises detecting and/or eliminating one or more bumps along the vessel of interest.
38. The system according to claim 26 or 37, wherein elimination of false edges includes: establishing a list of suspect points, comprising: establishing a plurality of first distances between each of a plurality of originating points on at least one preliminary traced edge and a corresponding closest point positioned along the primary centerline; establishing a plurality of second distances between each of a plurality of second centerline points on the primary centerline and a corresponding closest point positioned on the at least one edge; and determining a deviation from the centerline as an absolute difference of the second distance and the first distance; determining a gradient cost function, inversely proportional to a gradient magnitude at each edge point; determining a combined function aggregating the deviation from the centerline and the gradient cost function, wherein upon the combined function being greater than a predefined value, the corresponding edge point is determined to be a bump point in a bump; determining a bump area defined by a plurality of connected bump points and a cutting line adjacent the vessel of interest, wherein the cutting line comprises a line which substantially maximizes a ratio between the bump area and a power of a cutting line length; and cutting the bump from the edge at the cutting line to establish a final edge.
39. The system according to claim 26, wherein the application programs also include computer instructions for displaying an epi-polar indicator for indicating a concurrence between the first image and second image for producing a three-dimensional reconstruction of the vessel of interest.
40. The system according to claim 28, wherein defining a centerline of the vessel of interest comprises: determining final traced edges of the vessel of interest; determining pairs of anchor points, wherein each pair comprises one point on each edge; determining a cross-sectional line by searching for pairs of anchor points which, when connected, establish the cross-sectional line substantially orthogonal to the centerline; dividing each edge into a plurality of segments using the anchor points, wherein for each segment, correspondence between the edges is established in that every point of each edge includes at least one pair of points on an opposite edge and a total sum of distances between adjacent points is minimal; and connecting the centers of the plurality of segments to determine the centerline.
41. The system according to claim 26, wherein determining densitometry values comprises subtracting a background influence.
42. The system according to claim 26 or 41, wherein determining densitometry values comprises: establishing a plurality of profile lines substantially parallel to at least one edge of the vessel of interest; establishing a parametric grid covering the vessel of interest and a neighboring region, wherein the parametric grid includes a first parameter of the vessel of interest along the length thereof and a second parameter for controlling a cross-wise change of the vessel of interest; sampling the image using the grid to obtain a plurality of corresponding gray values, wherein: the gray values are investigated as functions on the profile lines; substantially eliminating detected occluding structures on the outside of the vessel of interest, the structures being detected as prominent minima of the parameters; substantially eliminating prominent minima detected on the inside of the vessel of interest; averaging gray values in a direction across the vessel of interest separately for each side of the vessel of interest; determining a linear background estimation on the grid inside the vessel of interest; and determining cross-sectional area using the eliminated prominent minima.
43. The system according to claim 26, further comprising determining healthy vessel dimensions using an iterative regression over a healthy portion of the vessel of interest.
44. The system according to claim 43, wherein each iteration comprises a compromise between a pre-defined slope and a line that follows healthy data.
45. The system according to claim 44, wherein the compromise is toward the line that follows the healthy data if the line corresponds to actual data over a plurality of clusters.
46. The system according to claim 28, wherein determining the three-dimensional reconstruction of the vessel of interest includes: determining a conventional epi-polar distance p1 for the plurality of centerline points in the first image; determining a conventional epi-polar distance p2 for the plurality of centerline points in the second image; and re-determining p2 substantially in accordance with p2new = p2 + δ, where δ is a smooth compensatory function establishing correspondence of one or more landmark points.
47. The system according to claim 26, wherein the application programs include computer instructions for displaying color-coded data relating to the vessel of interest in any display of data.
48. The system according to claim 26, further comprising epi-polar indicator means for indicating a concurrence between the first image and second image for producing a three-dimensional reconstruction of the vessel of interest.
49. The system according to claim 26, further comprising quantitative analysis means for rendering quantitative analysis of the vessel of interest including a cross-section area graph and/or lesion analysis measurements.
50. The system according to claim 26, further comprising cross-referencing means for cross-referencing data among at least a pair of data items related to the two-dimensional trace of the vessel of interest, the three-dimensional reconstruction of the vessel of interest, and graphical data.
51. A system for three-dimensional reconstruction (3DR) of a single blood vessel using a plurality of two-dimensional images comprising: display means for displaying a first image of a vascular network, and a second image of the vascular network and the 3DR; input means for identifying a vessel of interest on the first image and the second image; tracing means for tracing the edges of the vessel of interest in each image including elimination means for eliminating false edges of objects visually adjacent to the vessel of interest in each image; a processor for: determining a centerline, comprising a plurality of centerline points, determining substantially precise radius and densitometry values along the vessel, determining substantially precise radius and densitometry values along the vessel of interest in the second image, determining a three dimensional reconstruction of the vessel of interest, determining fused area (cross-section) measurements along the vessel and establishing the 3DR of the vessel of interest.
52. A method for three-dimensional reconstruction of a tubular organ, the tubular organ being imaged on two angiographic images, comprising: extracting centerlines of the tubular organ on the two angiographic images; obtaining invariant functions of the two images; constructing an optimization target function comprised of a penalty function expressing a soft epi-polar constraint and discrepancies between the invariant functions, the optimization target function being defined over all possible correspondences between the two sets of centerline points; solving the optimization target function to generate a map between 2D points on one centerline and 2D points on the other centerline; if a reference point is given, optimizing the solution so that the map includes the match of the reference point; when a reference point is not given, finding it either by obeying the condition E1(i)=0 and E2(j)=0, where E is dP/dL, P is epi-polar distance and L is centerline length, or by finding the correlation of the functions S1/E1 and S2/E2, expressed as functions of epi-polar distance to an arbitrary temporary reference point, or via correlation of the functions R1 and R2; wherein every matched set of 2D points defines a 3D point, for example as a point that minimizes distance from the projective lines, and the sequence of these 3D points is the three-dimensional reconstruction of the tubular organ.
53. The method according to claim 52, wherein the invariant functions comprise the radius or projected cross-section area or centerline direction of the tubular organ along the centerline points, or an invariant function obtained from a tubular organ characteristic, the tubular organ being imaged in angiography, the invariant function being the equivalence between the ratio of projected areas and the visible epi-polar orientation for every pair of views.
54. The method according to claim 52, wherein a reference point is not given, and is found either by obeying the condition E1(i) = 0 and E2(j) = 0, where E is dP/dL, P is epi-polar distance and L is centerline length, or by finding the correlation of the functions S1/E1 and S2/E2, expressed as functions of epi-polar distance to an arbitrary temporary reference point, or via correlation of the functions R1 and R2.
55. The method according to claim 54, wherein the process of finding the correlation is performed prior to the optimization.
56. The method according to claim 54, wherein the process of finding the correlation is performed as part of the optimization.
57. A method for three-dimensional reconstruction of a tubular organ, the tubular organ being imaged on three or more angiographic images, the method comprising forming a three-dimensional reconstruction of the tubular organ, the tubular organ being imaged on two angiographic images, using the method according to claim 56 and incorporating a direction correspondence criterion into the optimization process of said method.
58. The method according to claim 57, wherein determining a 3D point includes "averaging" the 3D points that result from every pair of projection lines.
59. The method according to claim 57, wherein determining a 3D point includes using three or more projection lines to determine a 3D point, for example, a point that minimizes the sum of distances from those lines.
60. A method for automated three-dimensional reconstruction of a tubular organ from at least first, second and third angiographic projections, comprising: obtaining a three-dimensional (3D) reconstruction of the tubular organ from the first and second angiographic projections; projecting the 3D reconstruction onto an image plane according to a specific viewing geometry of the third angiographic projection; determining a shift between the third angiographic projection and the projected 3D reconstruction on said image plane so as to identify the tubular organ within the third angiographic projection; tracing and analyzing the tubular organ in the third angiographic projection using the projected 3D reconstruction on said image plane as a first approximation so as to derive properties of the tubular organ; and using said properties for re-determining the three-dimensional reconstruction to a better approximation.
61. The method according to claim 60, wherein projecting the 3D reconstruction onto an image plane produces a binary projected image.
62. The method according to claim 60, wherein projecting the 3D reconstruction onto an image plane produces a realistic projected image, in which a pixel's gray-level is a function of the length of intersection between the ray and the model.
63. A method for three-dimensional reconstruction from N (N>2) 2D projections, comprising: obtaining a three-dimensional reconstruction for every pair of projections; assigning a respective weight per 3D point, for each of said pairs of projections, that reflects a mutual geometry of two views and local orientation of a primary 3D model in such a way that maximal weight (1) is achieved by the combination of two orthogonal views, which are also both orthogonal to the organ; and such that the respective weight is near zero in the case when the two views are close to each other or if one of the views is too oblique; and defining the reconstructed 3D point as a weighted sum of the intersection points per each pair of projective lines.
64. The method according to claim 63, wherein assigning a respective weight per 3D point is performed by a weighting mechanism that utilizes the 3D model and viewing directions.
PCT/US2004/031594 2003-09-25 2004-09-24 System and method for three-dimensional reconstruction of a tubular organ WO2005031635A1 (en)

Priority Applications (4)

Application Number Priority Date Filing Date Title
JP2006528281A JP5129480B2 (en) 2003-09-25 2004-09-24 System for performing three-dimensional reconstruction of tubular organ and method for operating blood vessel imaging device
US10/573,464 US7742629B2 (en) 2003-09-25 2004-09-24 System and method for three-dimensional reconstruction of a tubular organ
EP04785100A EP1665130A4 (en) 2003-09-25 2004-09-24 System and method for three-dimensional reconstruction of a tubular organ
IL174514A IL174514A (en) 2003-09-25 2006-03-23 System and method for three-dimensional reconstruction of a tubular organ

Applications Claiming Priority (6)

Application Number Priority Date Filing Date Title
US50543003P 2003-09-25 2003-09-25
US60/505,430 2003-09-25
US50617803P 2003-09-29 2003-09-29
US60/506,178 2003-09-29
US57798104P 2004-06-07 2004-06-07
US60/577,981 2004-06-07

Publications (1)

Publication Number Publication Date
WO2005031635A1 true WO2005031635A1 (en) 2005-04-07

Family

ID=34397020

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/US2004/031594 WO2005031635A1 (en) 2003-09-25 2004-09-24 System and method for three-dimensional reconstruction of a tubular organ

Country Status (5)

Country Link
US (1) US7742629B2 (en)
EP (1) EP1665130A4 (en)
JP (1) JP5129480B2 (en)
IL (1) IL174514A (en)
WO (1) WO2005031635A1 (en)

Cited By (28)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2006117773A1 (en) * 2005-05-03 2006-11-09 Paieon Inc. Method and apparatus for positioning a biventrivular pacemaker lead and electrode
WO2006137016A1 (en) * 2005-06-21 2006-12-28 Koninklijke Philips Electronics N. V. Method and device for imaging a blood vessel
WO2007002562A2 (en) * 2005-06-24 2007-01-04 Edda Technology, Inc. Methods for interactive liver disease diagnosis
JP2007069000A (en) * 2005-09-06 2007-03-22 Pie Medical Imaging Bv Method, apparatus and computer program for detecting contour of blood vessel using x-ray density measuring method
JP2007083048A (en) * 2005-09-23 2007-04-05 Mediguide Ltd Method and system for determining three dimensional representation of tubular organ
WO2008051841A2 (en) * 2006-10-20 2008-05-02 Stereotaxis, Inc. Location and display of occluded portions of vessels on 3-d angiographic images
US7587074B2 (en) 2003-07-21 2009-09-08 Paieon Inc. Method and system for identifying optimal image within a series of images that depict a moving organ
JP2010504794A (en) * 2006-09-29 2010-02-18 コーニンクレッカ フィリップス エレクトロニクス エヌ ヴィ Protrusion detection method, system, and computer program
US7742629B2 (en) 2003-09-25 2010-06-22 Paieon Inc. System and method for three-dimensional reconstruction of a tubular organ
US7778685B2 (en) 2000-10-18 2010-08-17 Paieon Inc. Method and system for positioning a device in a tubular organ
US20100309198A1 (en) * 2007-05-15 2010-12-09 Claude Kauffmann method for tracking 3d anatomical and pathological changes in tubular-shaped anatomical structures
US8126241B2 (en) 2001-10-15 2012-02-28 Michael Zarkh Method and apparatus for positioning a device in a tubular organ
WO2012028190A1 (en) 2010-09-02 2012-03-08 Pie Medical Imaging Bv Method and apparatus for quantitative analysis of a tree of recursively splitting tubular organs
US8157742B2 (en) 2010-08-12 2012-04-17 Heartflow, Inc. Method and system for patient-specific modeling of blood flow
US8200466B2 (en) 2008-07-21 2012-06-12 The Board Of Trustees Of The Leland Stanford Junior University Method for tuning patient-specific cardiovascular simulations
US8249815B2 (en) 2010-08-12 2012-08-21 Heartflow, Inc. Method and system for patient-specific modeling of blood flow
US8295577B2 (en) 2005-03-31 2012-10-23 Michael Zarkh Method and apparatus for guiding a device in a totally occluded or partly occluded tubular organ
WO2013038290A1 (en) * 2011-09-13 2013-03-21 Koninklijke Philips Electronics N.V. Vessel annotator.
CN103295262A (en) * 2013-05-21 2013-09-11 东软集团股份有限公司 Rotating multi-angle surface reconstruction method and device for tubular cavity tissue
US8548778B1 (en) 2012-05-14 2013-10-01 Heartflow, Inc. Method and system for providing information from a patient-specific model of blood flow
US20140369583A1 (en) * 2013-06-18 2014-12-18 Konica Minolta, Inc. Ultrasound diagnostic device, ultrasound diagnostic method, and computer-readable medium having recorded program therein
CN104408408A (en) * 2014-11-10 2015-03-11 杭州保迪自动化设备有限公司 Extraction method and extraction device for robot spraying track based on curve three-dimensional reconstruction
US9984456B2 (en) 2004-04-14 2018-05-29 Edda Technology, Inc. Method and system for labeling hepatic vascular structure in interactive liver disease diagnosis
CN109366446A (en) * 2018-10-30 2019-02-22 中国冶集团有限公司 The setting out method of electricity driving displacement line-plotting device and abnormal-shaped screw body intersection
WO2019052709A1 (en) * 2017-09-13 2019-03-21 Siemens Healthcare Gmbh Improved 3-d vessel tree surface reconstruction
US10354050B2 (en) 2009-03-17 2019-07-16 The Board Of Trustees Of Leland Stanford Junior University Image processing method for determining patient-specific cardiovascular information
CN115984301A (en) * 2022-12-13 2023-04-18 徐州医科大学 Three-dimensional reconstruction precision comparison method based on Bujia syndrome blood vessel image
KR102542972B1 (en) * 2022-07-04 2023-06-15 재단법인 아산사회복지재단 Method and apparatus for generating three-dimensional blood vessel structure

Families Citing this family (167)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2005055496A2 (en) * 2003-11-26 2005-06-16 Viatronix Incorporated System and method for optimization of vessel centerlines
JP4990143B2 (en) * 2004-09-21 2012-08-01 イメドース ゲーエムベーハー Method and apparatus for analyzing retinal blood vessels in digital images
US7787683B2 (en) * 2004-12-20 2010-08-31 Siemens Medical Solutions Usa, Inc. Tree structure based 2D to 3D registration
US8218836B2 (en) * 2005-09-12 2012-07-10 Rutgers, The State University Of New Jersey System and methods for generating three-dimensional images from two-dimensional bioluminescence images and visualizing tumor shapes and locations
CN100501566C (en) * 2006-01-05 2009-06-17 李明 Curved-surface film projection system and method therefor
DE102006014882A1 (en) * 2006-03-30 2007-11-15 Siemens Ag A method for imaging the myocardium of an infarct patient and methods for aiding a therapeutic intervention on the heart
JP5086563B2 (en) * 2006-05-26 2012-11-28 オリンパス株式会社 Image processing apparatus and image processing program
DE102007030960A1 (en) * 2006-07-25 2008-01-31 Siemens Ag Three dimensional-structure representing method, involves presenting three dimensional-structure as individual volumetric grey scale value and as result of volumetric scans with set of cutting planes
EP1923836A1 (en) * 2006-11-20 2008-05-21 Agfa HealthCare NV Picking on fused 3D volume rendered images and updating corresponding views according to a picking action.
EP1959391A1 (en) * 2007-02-13 2008-08-20 BrainLAB AG Determination of the three dimensional contour path of an anatomical structure
US11197651B2 (en) 2007-03-08 2021-12-14 Sync-Rx, Ltd. Identification and presentation of device-to-vessel relative motion
US11064964B2 (en) 2007-03-08 2021-07-20 Sync-Rx, Ltd Determining a characteristic of a lumen by measuring velocity of a contrast agent
US9968256B2 (en) 2007-03-08 2018-05-15 Sync-Rx Ltd. Automatic identification of a tool
US8781193B2 (en) 2007-03-08 2014-07-15 Sync-Rx, Ltd. Automatic quantitative vessel analysis
US9375164B2 (en) 2007-03-08 2016-06-28 Sync-Rx, Ltd. Co-use of endoluminal data and extraluminal imaging
US9629571B2 (en) 2007-03-08 2017-04-25 Sync-Rx, Ltd. Co-use of endoluminal data and extraluminal imaging
US9305334B2 (en) 2007-03-08 2016-04-05 Sync-Rx, Ltd. Luminal background cleaning
EP2129284A4 (en) 2007-03-08 2012-11-28 Sync Rx Ltd Imaging and tools for use with moving organs
US10716528B2 (en) 2007-03-08 2020-07-21 Sync-Rx, Ltd. Automatic display of previously-acquired endoluminal images
ATE544132T1 (en) * 2007-05-30 2012-02-15 Cleveland Clinic Foundation AUTOMATED CENTERLINE EXTRACTION METHOD AND GENERATION OF A CORRESPONDING ANALYTICAL EXPRESSION AND USE THEREOF
US20100189337A1 (en) * 2007-07-11 2010-07-29 Koninklijke Philips Electronics N.V. Method for acquiring 3-dimensional images of coronary vessels, particularly of coronary veins
JP5110295B2 (en) * 2008-04-16 2012-12-26 株式会社島津製作所 X-ray diagnostic equipment
US8073221B2 (en) * 2008-05-12 2011-12-06 Markus Kukuk System for three-dimensional medical instrument navigation
ES2450391T3 (en) 2008-06-19 2014-03-24 Sync-Rx, Ltd. Progressive progress of a medical instrument
JP4541434B2 (en) * 2008-07-14 2010-09-08 ザイオソフト株式会社 Image processing apparatus and image processing program
US8155411B2 (en) * 2008-07-22 2012-04-10 Pie Medical Imaging B.V. Method, apparatus and computer program for quantitative bifurcation analysis in 3D using multiple 2D angiographic images
US8077954B2 (en) * 2008-08-04 2011-12-13 Siemens Aktiengesellschaft System and method for image processing
RU2508907C2 (en) * 2008-08-11 2014-03-10 Конинклейке Филипс Электроникс Н.В. Detecting perspective optimal representation maps with provision for cross-sectional shape of vessel in cardiovascular x-ray systems
JP5399407B2 (en) * 2008-11-13 2014-01-29 株式会社日立メディコ Image processing apparatus and image processing method
US9101286B2 (en) 2008-11-18 2015-08-11 Sync-Rx, Ltd. Apparatus and methods for determining a dimension of a portion of a stack of endoluminal data points
US9974509B2 (en) 2008-11-18 2018-05-22 Sync-Rx Ltd. Image super enhancement
US11064903B2 (en) 2008-11-18 2021-07-20 Sync-Rx, Ltd Apparatus and methods for mapping a sequence of images to a roadmap image
US10362962B2 (en) 2008-11-18 2019-07-30 Synx-Rx, Ltd. Accounting for skipped imaging locations during movement of an endoluminal imaging probe
US8855744B2 (en) 2008-11-18 2014-10-07 Sync-Rx, Ltd. Displaying a device within an endoluminal image stack
US9095313B2 (en) 2008-11-18 2015-08-04 Sync-Rx, Ltd. Accounting for non-uniform longitudinal motion during movement of an endoluminal imaging probe
US9144394B2 (en) 2008-11-18 2015-09-29 Sync-Rx, Ltd. Apparatus and methods for determining a plurality of local calibration factors for an image
DE102009006636B4 (en) * 2008-12-30 2016-02-18 Siemens Aktiengesellschaft Method for determining a 2D contour of a vessel structure depicted in 3D image data
TWI385544B (en) * 2009-09-01 2013-02-11 Univ Nat Pingtung Sci & Tech Density-based data clustering method
TWI391837B (en) * 2009-09-23 2013-04-01 Univ Nat Pingtung Sci & Tech Data clustering method based on density
WO2011099992A1 (en) 2010-02-12 2011-08-18 Brigham And Women's Hospital, Inc. System and method for automated adjustment of cardiac resynchronization therapy control parameters
FR2958434B1 (en) * 2010-04-02 2012-05-11 Gen Electric METHOD FOR PROCESSING RADIOLOGICAL IMAGES
MY152058A (en) * 2010-06-21 2014-08-15 Univ Putra Malaysia A method of constructing at least one 3 dimensional image
WO2012001623A1 (en) * 2010-06-30 2012-01-05 Koninklijke Philips Electronics N.V. Interactive image analysis
KR101826331B1 (en) 2010-09-15 2018-03-22 삼성전자주식회사 Apparatus and method for encoding and decoding for high frequency bandwidth extension
EP2453408B1 (en) 2010-11-12 2013-06-05 General Electric Company Method for processing radiographic images for stenosis detection
US8737713B2 (en) 2010-11-30 2014-05-27 Siemens Medical Solutions Usa, Inc. System for frame selection for optimal registration of a multi-frame dataset
US9510763B2 (en) 2011-05-03 2016-12-06 Medtronic, Inc. Assessing intra-cardiac activation patterns and electrical dyssynchrony
CN103930036B (en) * 2011-08-26 2016-03-09 Ebm株式会社 For the system of blood flow character diagnosis
EP2600315B1 (en) * 2011-11-29 2019-04-10 Dassault Systèmes Creating a surface from a plurality of 3D curves
CN102520721B (en) * 2011-12-08 2015-05-27 北京控制工程研究所 Autonomous obstacle-avoiding planning method of tour detector based on binocular stereo vision
KR101881924B1 (en) * 2011-12-29 2018-08-27 삼성전자주식회사 Apparatus and Method for processing ultrasound image
CN103218797B (en) * 2012-01-19 2016-01-27 中国科学院上海生命科学研究院 The method and system of blood-vessel image treatment and analyses
US9216008B2 (en) * 2012-01-30 2015-12-22 Technion Research & Development Foundation Limited Quantitative assessment of neovascularization
US9053551B2 (en) 2012-05-23 2015-06-09 International Business Machines Corporation Vessel identification using shape and motion mapping for coronary angiogram sequences
JP6134789B2 (en) 2012-06-26 2017-05-24 シンク−アールエックス,リミティド Image processing related to flow in luminal organs
US9858387B2 (en) * 2013-01-15 2018-01-02 CathWorks, LTD. Vascular flow assessment
US9278219B2 (en) 2013-03-15 2016-03-08 Medtronic, Inc. Closed loop optimization of control parameters during cardiac pacing
WO2014143974A1 (en) * 2013-03-15 2014-09-18 Bio-Tree Systems, Inc. Methods and system for linking geometry obtained from images
JP6066197B2 (en) * 2013-03-25 2017-01-25 富士フイルム株式会社 Surgery support apparatus, method and program
US9924884B2 (en) 2013-04-30 2018-03-27 Medtronic, Inc. Systems, methods, and interfaces for identifying effective electrodes
US10064567B2 (en) 2013-04-30 2018-09-04 Medtronic, Inc. Systems, methods, and interfaces for identifying optimal electrical vectors
US9474457B2 (en) 2013-06-12 2016-10-25 Medtronic, Inc. Metrics of electrical dyssynchrony and electrical activation patterns from surface ECG electrodes
US10251555B2 (en) 2013-06-12 2019-04-09 Medtronic, Inc. Implantable electrode location selection
US9877789B2 (en) 2013-06-12 2018-01-30 Medtronic, Inc. Implantable electrode location selection
US9974442B2 (en) 2013-06-24 2018-05-22 Toshiba Medical Systems Corporation Method of, and apparatus for, processing volumetric image data
US9278220B2 (en) 2013-07-23 2016-03-08 Medtronic, Inc. Identification of healthy versus unhealthy substrate for pacing from a multipolar lead
US9282907B2 (en) 2013-07-23 2016-03-15 Medtronic, Inc. Identification of healthy versus unhealthy substrate for pacing from a multipolar lead
US9265954B2 (en) 2013-07-26 2016-02-23 Medtronic, Inc. Method and system for improved estimation of time of left ventricular pacing with respect to intrinsic right ventricular activation in cardiac resynchronization therapy
US9265955B2 (en) 2013-07-26 2016-02-23 Medtronic, Inc. Method and system for improved estimation of time of left ventricular pacing with respect to intrinsic right ventricular activation in cardiac resynchronization therapy
US9626757B2 (en) 2013-08-08 2017-04-18 Washington University System and method for the validation and quality assurance of computerized contours of human anatomy
US10376715B2 (en) 2013-08-08 2019-08-13 Washington University System and method for the validation and quality assurance of computerized contours of human anatomy
US9547894B2 (en) * 2013-10-08 2017-01-17 Toshiba Medical Systems Corporation Apparatus for, and method of, processing volumetric medical image data
US9406129B2 (en) 2013-10-10 2016-08-02 Medtronic, Inc. Method and system for ranking instruments
KR101578770B1 (en) 2013-11-21 2015-12-18 삼성전자주식회사 Apparatus por processing a medical image and method for processing a medical image
US9320446B2 (en) 2013-12-09 2016-04-26 Medtronic, Inc. Bioelectric sensor device and methods
US10206601B2 (en) 2013-12-09 2019-02-19 Medtronic, Inc. Noninvasive cardiac therapy evaluation
DE102013226858A1 (en) * 2013-12-20 2015-06-25 Siemens Aktiengesellschaft Method for generating an at least three-dimensional display data record, X-ray device and computer program
US9776009B2 (en) 2014-03-20 2017-10-03 Medtronic, Inc. Non-invasive detection of phrenic nerve stimulation
EP2930691A1 (en) 2014-04-10 2015-10-14 Dassault Systèmes Fitting sample points with an isovalue surface
EP2930692A1 (en) 2014-04-10 2015-10-14 Dassault Systèmes Fitting sample points of 3D curves sketched by a user with an isovalue surface
US10004467B2 (en) 2014-04-25 2018-06-26 Medtronic, Inc. Guidance system for localization and cannulation of the coronary sinus
JP6653667B2 (en) 2014-05-06 2020-02-26 コーニンクレッカ フィリップス エヌ ヴェKoninklijke Philips N.V. Devices, systems and methods for vascular evaluation
US9633431B2 (en) 2014-07-02 2017-04-25 Covidien Lp Fluoroscopic pose estimation
US9591982B2 (en) 2014-07-31 2017-03-14 Medtronic, Inc. Systems and methods for evaluating cardiac therapy
US9586050B2 (en) 2014-08-15 2017-03-07 Medtronic, Inc. Systems and methods for configuration of atrioventricular interval
US9586052B2 (en) 2014-08-15 2017-03-07 Medtronic, Inc. Systems and methods for evaluating cardiac therapy
US9764143B2 (en) 2014-08-15 2017-09-19 Medtronic, Inc. Systems and methods for configuration of interventricular interval
US9707400B2 (en) 2014-08-15 2017-07-18 Medtronic, Inc. Systems, methods, and interfaces for configuring cardiac therapy
US9668818B2 (en) 2014-10-15 2017-06-06 Medtronic, Inc. Method and system to select an instrument for lead stabilization
KR102367446B1 (en) 2014-12-11 2022-02-25 삼성메디슨 주식회사 Ultrasonic diagnostic apparatus and operating method for the same
US11253178B2 (en) 2015-01-29 2022-02-22 Medtronic, Inc. Noninvasive assessment of cardiac resynchronization therapy
JP2016158916A (en) * 2015-03-03 2016-09-05 キヤノンマーケティングジャパン株式会社 Medical image processing apparatus, program that can be incorporated into medical image processing apparatus, and medical image processing method
DE102015224176A1 (en) * 2015-12-03 2017-06-08 Siemens Healthcare Gmbh Tomography system and method for generating a sequence of volume images of a vascular system
US11172895B2 (en) 2015-12-07 2021-11-16 Covidien Lp Visualization, navigation, and planning with electromagnetic navigation bronchoscopy and cone beam computed tomography integrated
US11219769B2 (en) 2016-02-26 2022-01-11 Medtronic, Inc. Noninvasive methods and systems of determining the extent of tissue capture from cardiac pacing
US10780279B2 (en) 2016-02-26 2020-09-22 Medtronic, Inc. Methods and systems of optimizing right ventricular only pacing for patients with respect to an atrial event and left ventricular event
ES2963345T3 (en) * 2016-03-31 2024-03-26 Eyes Ltd System and methods for analysis of diagnostic images and evaluation of image quality
EP3555851B1 (en) * 2016-12-14 2021-09-22 Eyes Ltd Edge detection in digitized images
KR20180082114A (en) 2017-01-10 2018-07-18 삼성메디슨 주식회사 Apparatus and method for displaying an ultrasound image of the object
JP6871007B2 (en) * 2017-02-13 2021-05-12 キヤノンメディカルシステムズ株式会社 Medical image processing equipment and medical diagnostic imaging system
US10532213B2 (en) 2017-03-03 2020-01-14 Medtronic, Inc. Criteria for determination of local tissue latency near pacing electrode
US10987517B2 (en) 2017-03-15 2021-04-27 Medtronic, Inc. Detection of noise signals in cardiac signals
EP3382641A1 (en) * 2017-03-30 2018-10-03 Koninklijke Philips N.V. Contrast injection imaging
CN107233106A (en) * 2017-06-01 2017-10-10 上海交通大学 Blood vessel correspondence position relation search method and system under multi-angle radiography
WO2019023472A1 (en) 2017-07-28 2019-01-31 Medtronic, Inc. Generating activation times
WO2019023478A1 (en) 2017-07-28 2019-01-31 Medtronic, Inc. Cardiac cycle selection
US10709399B2 (en) * 2017-11-30 2020-07-14 Shenzhen Keya Medical Technology Corporation Methods and devices for performing three-dimensional blood vessel reconstruction using angiographic images
US11419539B2 (en) 2017-12-22 2022-08-23 Regents Of The University Of Minnesota QRS onset and offset times and cycle selection using anterior and posterior electrode signals
US10433746B2 (en) 2017-12-22 2019-10-08 Regents Of The University Of Minnesota Systems and methods for anterior and posterior electrode signal analysis
US10799703B2 (en) 2017-12-22 2020-10-13 Medtronic, Inc. Evaluation of his bundle pacing therapy
US10786167B2 (en) 2017-12-22 2020-09-29 Medtronic, Inc. Ectopic beat-compensated electrical heterogeneity information
US10492705B2 (en) 2017-12-22 2019-12-03 Regents Of The University Of Minnesota Anterior and posterior electrode signals
WO2019152850A1 (en) 2018-02-02 2019-08-08 Centerline Biomedical, Inc. Segmentation of anatomic structures
US10617318B2 (en) 2018-02-27 2020-04-14 Medtronic, Inc. Mapping electrical activity on a model heart
US10668290B2 (en) 2018-03-01 2020-06-02 Medtronic, Inc. Delivery of pacing therapy by a cardiac pacing device
US10918870B2 (en) 2018-03-07 2021-02-16 Medtronic, Inc. Atrial lead placement for treatment of atrial dyssynchrony
EP3768369A1 (en) 2018-03-23 2021-01-27 Medtronic, Inc. Av synchronous vfa cardiac therapy
US10780281B2 (en) 2018-03-23 2020-09-22 Medtronic, Inc. Evaluation of ventricle from atrium pacing therapy
CN111902187A (en) 2018-03-23 2020-11-06 美敦力公司 VFA cardiac resynchronization therapy
WO2019183514A1 (en) 2018-03-23 2019-09-26 Medtronic, Inc. Vfa cardiac therapy for tachycardia
CN111902082A (en) 2018-03-29 2020-11-06 美敦力公司 Left ventricular assist device adjustment and evaluation
US10940321B2 (en) 2018-06-01 2021-03-09 Medtronic, Inc. Systems, methods, and interfaces for use in cardiac evaluation
US11304641B2 (en) 2018-06-01 2022-04-19 Medtronic, Inc. Systems, methods, and interfaces for use in cardiac evaluation
JP2020017097A (en) * 2018-07-26 2020-01-30 株式会社Subaru Design support system, design support method, and design support program
JP2021535765A (en) 2018-08-31 2021-12-23 メドトロニック,インコーポレイテッド Adaptive VFA cardiac therapy
EP3856331A1 (en) 2018-09-26 2021-08-04 Medtronic, Inc. Capture in ventricle-from-atrium cardiac therapy
EP3880289B1 (en) 2018-11-17 2023-07-05 Medtronic, Inc. Vfa delivery systems
WO2020106702A1 (en) * 2018-11-19 2020-05-28 Cardioinsight Technologies, Inc. Graph total variation for ecgi
US20200196892A1 (en) 2018-12-20 2020-06-25 Medtronic, Inc. Propagation patterns method and related systems and devices
US20200197705A1 (en) 2018-12-20 2020-06-25 Medtronic, Inc. Implantable medical device delivery for cardiac therapy
CN113226444A (en) 2018-12-21 2021-08-06 美敦力公司 Delivery system and method for left ventricular pacing
US11679265B2 (en) 2019-02-14 2023-06-20 Medtronic, Inc. Lead-in-lead systems and methods for cardiac therapy
US11701517B2 (en) 2019-03-11 2023-07-18 Medtronic, Inc. Cardiac resynchronization therapy using accelerometer
US11547858B2 (en) 2019-03-29 2023-01-10 Medtronic, Inc. Systems, methods, and devices for adaptive cardiac therapy
US11697025B2 (en) 2019-03-29 2023-07-11 Medtronic, Inc. Cardiac conduction system capture
US11213676B2 (en) 2019-04-01 2022-01-04 Medtronic, Inc. Delivery systems for VfA cardiac therapy
JP7434008B2 (en) 2019-04-01 2024-02-20 キヤノンメディカルシステムズ株式会社 Medical image processing equipment and programs
EP3946061A4 (en) 2019-04-04 2022-12-14 Centerline Biomedical, Inc. Modeling regions of interest of an anatomic structure
US11071500B2 (en) 2019-05-02 2021-07-27 Medtronic, Inc. Identification of false asystole detection
US11712188B2 (en) 2019-05-07 2023-08-01 Medtronic, Inc. Posterior left bundle branch engagement
US11229367B2 (en) 2019-07-18 2022-01-25 Ischemaview, Inc. Systems and methods for analytical comparison and monitoring of aneurysms
US11328413B2 (en) 2019-07-18 2022-05-10 Ischemaview, Inc. Systems and methods for analytical detection of aneurysms
US11633607B2 (en) 2019-07-24 2023-04-25 Medtronic, Inc. AV synchronous septal pacing
US11305127B2 (en) 2019-08-26 2022-04-19 Medtronic Inc. VfA delivery and implant region detection
EP3792870A1 (en) * 2019-09-12 2021-03-17 Siemens Healthcare GmbH Method and device for automatic determination of the change of a hollow organ
US20210106832A1 (en) 2019-10-09 2021-04-15 Medtronic, Inc. Synchronizing external electrical activity
US20210106227A1 (en) 2019-10-09 2021-04-15 Medtronic, Inc. Systems, methods, and devices for determining cardiac condition
US11497431B2 (en) 2019-10-09 2022-11-15 Medtronic, Inc. Systems and methods for configuring cardiac therapy
US11642533B2 (en) 2019-11-04 2023-05-09 Medtronic, Inc. Systems and methods for evaluating cardiac therapy
CN114746008A (en) 2019-12-02 2022-07-12 美敦力公司 Generating representative cardiac information
CN111009032B (en) * 2019-12-04 2023-09-19 浙江理工大学 Vascular three-dimensional reconstruction method based on improved epipolar line constraint matching
CN111161342B (en) * 2019-12-09 2023-08-29 杭州脉流科技有限公司 Method, apparatus, device, system and readable storage medium for obtaining fractional flow reserve based on coronary angiography image
CN112508965B (en) * 2019-12-10 2023-08-22 广州柏视医疗科技有限公司 Automatic outline sketching system for normal organs in medical image
US11642032B2 (en) 2019-12-31 2023-05-09 Medtronic, Inc. Model-based therapy parameters for heart failure
US11813466B2 (en) 2020-01-27 2023-11-14 Medtronic, Inc. Atrioventricular nodal stimulation
US20210236038A1 (en) 2020-01-30 2021-08-05 Medtronic, Inc. Disturbance detection and removal in cardiac signals
US20210298658A1 (en) 2020-03-30 2021-09-30 Medtronic, Inc. Pacing efficacy determination using a representative morphology of external cardiac signals
US11911168B2 (en) 2020-04-03 2024-02-27 Medtronic, Inc. Cardiac conduction system therapy benefit determination
US20210308458A1 (en) 2020-04-03 2021-10-07 Medtronic, Inc. Cardiac conduction system engagement
US20210361219A1 (en) 2020-05-21 2021-11-25 Medtronic, Inc. Qrs detection and bracketing
US20220032069A1 (en) 2020-07-30 2022-02-03 Medtronic, Inc. Ecg belt systems to interoperate with imds
US20220031221A1 (en) 2020-07-30 2022-02-03 Medtronic, Inc. Patient screening and ecg belt for brady therapy tuning
US11813464B2 (en) 2020-07-31 2023-11-14 Medtronic, Inc. Cardiac conduction system evaluation
CN112419484B (en) * 2020-11-25 2024-03-22 苏州润迈德医疗科技有限公司 Three-dimensional vascular synthesis method, system, coronary artery analysis system and storage medium
WO2022270152A1 (en) * 2021-06-25 2022-12-29 FUJIFILM Corporation Image processing device, method, and program
WO2023021367A1 (en) 2021-08-19 2023-02-23 Medtronic, Inc. Pacing artifact mitigation
WO2023105316A1 (en) 2021-12-07 2023-06-15 Medtronic, Inc. Determination of cardiac conduction system therapy benefit

Family Cites Families (88)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US3357550A (en) * 1966-06-23 1967-12-12 American Cyanamid Co Combination reel and label for surgical sutures
US4263916A (en) * 1978-03-27 1981-04-28 University Of Southern California Image averaging for angiography by registration and combination of serial images
US4889128A (en) * 1985-09-13 1989-12-26 Pfizer Hospital Products Doppler catheter
FR2615619B1 (en) 1987-05-21 1991-07-19 Commissariat Energie Atomique METHOD AND DEVICE FOR THREE-DIMENSIONAL IMAGING FROM TWO-DIMENSIONAL MEASUREMENTS OF RADIATION MITIGATION
US5309356A (en) 1987-09-29 1994-05-03 Kabushiki Kaisha Toshiba Three-dimensional reprojected image forming apparatus
US4875165A (en) 1987-11-27 1989-10-17 University Of Chicago Method for determination of 3-D structure in biplane angiography
FR2636451A1 (en) * 1988-09-13 1990-03-16 Gen Electric Cgr METHOD FOR RECONSTRUCTION OF THREE-DIMENSIONAL TREE BY LABELING
JPH0357081A (en) 1989-07-26 1991-03-12 Canon Inc Picture processor
US5207226A (en) * 1991-01-25 1993-05-04 Regents Of The University Of Minnesota Device and method for measurement of blood flow
JP3167367B2 (en) 1991-09-09 2001-05-21 株式会社東芝 Cardiovascular diagnostic device
US5734384A (en) * 1991-11-29 1998-03-31 Picker International, Inc. Cross-referenced sectioning and reprojection of diagnostic image volumes
JPH05253215A (en) * 1992-03-12 1993-10-05 Toshiba Corp Image processing device
US5203777A (en) * 1992-03-19 1993-04-20 Lee Peter Y Radiopaque marker system for a tubular device
JP3305774B2 (en) * 1992-11-27 2002-07-24 東芝医用システムエンジニアリング株式会社 Image processing device
US5391199A (en) * 1993-07-20 1995-02-21 Biosense, Inc. Apparatus and method for treating cardiac arrhythmias
FR2708166A1 (en) 1993-07-22 1995-01-27 Philips Laboratoire Electroniq A method of processing digitized images for the automatic detection of stenoses.
JP2978035B2 (en) * 1993-08-03 1999-11-15 浜松ホトニクス株式会社 Inside diameter measuring device for cylindrical objects
JPH0779955A (en) * 1993-09-14 1995-03-28 Toshiba Corp Radiographic apparatus
US5609627A (en) * 1994-02-09 1997-03-11 Boston Scientific Technology, Inc. Method for delivering a bifurcated endoluminal prosthesis
AU2432795A (en) * 1994-05-03 1995-11-29 Molecular Biosystems, Inc. Composition for ultrasonically quantitating myocardial perfusion
US5446800A (en) * 1994-06-13 1995-08-29 Diasonics Ultrasound, Inc. Method and apparatus for displaying angiographic data in a topographic format
JPH08131429A (en) * 1994-11-11 1996-05-28 Toshiba Corp Method and device for reproducing tubular body image
WO1996025882A1 (en) 1995-02-22 1996-08-29 Groenningsaeter Aage Method for ultrasound guidance during clinical procedures
RU2119765C1 (en) 1995-02-23 1998-10-10 Пермская государственная медицинская академия Method for estimating coronary and myocardial reserves of the heart in tachycardia cases under two-chamber electric cardiac stimulation
US6246898B1 (en) * 1995-03-28 2001-06-12 Sonometrics Corporation Method for carrying out a medical procedure using a three-dimensional tracking and imaging system
US5729129A (en) * 1995-06-07 1998-03-17 Biosense, Inc. Magnetic location system with feedback adjustment of magnetic field generator
US5647360A (en) 1995-06-30 1997-07-15 Siemens Corporate Research, Inc. Digital subtraction angiography for 3D diagnostic imaging
US5671265A (en) 1995-07-14 1997-09-23 Siemens Corporate Research, Inc. Evidential reconstruction of vessel trees from X-ray angiograms with a dynamic contrast bolus
US6027460A (en) * 1995-09-14 2000-02-22 Shturman Cardiology Systems, Inc. Rotatable intravascular apparatus
US5583902A (en) * 1995-10-06 1996-12-10 Bhb General Partnership Method of and apparatus for predicting computed tomography contrast enhancement
JPH11514269A (en) * 1995-10-13 1999-12-07 トランスバスキュラー インコーポレイテッド Methods and apparatus for bypassing arterial occlusion and / or performing other transvascular approaches
US6709444B1 (en) * 1996-02-02 2004-03-23 Transvascular, Inc. Methods for bypassing total or near-total obstructions in arteries or other anatomical conduits
US5833607A (en) 1996-03-25 1998-11-10 Siemens Corporate Research, Inc. Automatic full-leg mosaic and display for peripheral angiography
US5699799A (en) * 1996-03-26 1997-12-23 Siemens Corporate Research, Inc. Automatic determination of the curved axis of a 3-D tube-shaped object in image volume
US6047080A (en) * 1996-06-19 2000-04-04 Arch Development Corporation Method and apparatus for three-dimensional reconstruction of coronary vessels from angiographic images
JPH105203A (en) * 1996-06-21 1998-01-13 Toshiba Corp Diagnostic system, diagnostic information producing method and three dimensional image reconfiguration method
US6167296A (en) * 1996-06-28 2000-12-26 The Board Of Trustees Of The Leland Stanford Junior University Method for volumetric image navigation
DE19705599A1 (en) * 1997-02-14 1998-08-20 Philips Patentverwaltung X-ray imaging process with a series of exposures from different perspectives
US6095976A (en) 1997-06-19 2000-08-01 Medinol Ltd. Method for enhancing an image derived from reflected ultrasound signals produced by an ultrasound transmitter and detector inserted in a bodily lumen
US5912945A (en) * 1997-06-23 1999-06-15 Regents Of The University Of California X-ray compass for determining device orientation
US6148095A (en) 1997-09-08 2000-11-14 University Of Iowa Research Foundation Apparatus and method for determining three-dimensional representations of tortuous vessels
US6195450B1 (en) 1997-09-18 2001-02-27 Siemens Corporate Research, Inc. Methods and apparatus for controlling X-ray angiographic image acquisition
US6249695B1 (en) * 1997-11-21 2001-06-19 Fonar Corporation Patient movement during image guided surgery
IT1297396B1 (en) 1997-12-30 1999-09-01 Francesco Buzzigoli METHOD AND DEVICE FOR THE RECONSTRUCTION OF THREE-DIMENSIONAL IMAGES OF BLOOD VESSELS, IN PARTICULAR OF CORONARY ARTERIES, OR OF OTHER
IL138667A0 (en) * 1998-03-31 2001-10-31 Transvascular Inc Tissue penetrating catheters having integral imaging transducers and methods of their use
US6094591A (en) * 1998-04-10 2000-07-25 Sunnybrook Health Science Centre Measurement of coronary flow reserve with MR oximetry
US6088488A (en) 1998-04-17 2000-07-11 General Electric Company Vascular imaging with adaptive averaging
US6301498B1 (en) 1998-04-17 2001-10-09 Cornell Research Foundation, Inc. Method of determining carotid artery stenosis using X-ray imagery
FR2779853B1 (en) 1998-06-11 2000-08-11 Ge Medical Syst Sa PROCESS FOR RECONSTRUCTING A THREE-DIMENSIONAL IMAGE OF AN OBJECT, IN PARTICULAR AN ANGIOGRAPHIC THREE-DIMENSIONAL IMAGE
US6081577A (en) 1998-07-24 2000-06-27 Wake Forest University Method and system for creating task-dependent three-dimensional images
US6195577B1 (en) * 1998-10-08 2001-02-27 Regents Of The University Of Minnesota Method and apparatus for positioning a device in a body
DE19853964C1 (en) 1998-11-23 2000-05-18 Siemens Ag X-ray image recording for examination of rhythmically moving vessel or organ
SE9804147D0 (en) 1998-12-01 1998-12-01 Siemens Elema Ab System for three-dimensional imaging of an internal organ or body structure
US20030032886A1 (en) 1999-03-09 2003-02-13 Elhanan Dgany System for determining coronary flow reserve (CFR) value for a stenosed blood vessel, CFR processor therefor, and method therefor
JP2000271110A (en) 1999-03-26 2000-10-03 Hitachi Medical Corp Medical x-ray system
DE19919907C2 (en) * 1999-04-30 2003-10-16 Siemens Ag Method and device for catheter navigation in three-dimensional vascular tree images
US6233476B1 (en) * 1999-05-18 2001-05-15 Mediguide Ltd. Medical positioning system
US6290673B1 (en) * 1999-05-20 2001-09-18 Conor Medsystems, Inc. Expandable medical device delivery system and method
DE19944982A1 (en) 1999-09-20 2001-09-27 Siemens Ag X-ray diagnostic device, especially suited to cardangiography, employs a method of moving both X-ray source and X-ray image capture device in a cyclical manner that is used to compensate for heart beat effects
US6711433B1 (en) 1999-09-30 2004-03-23 Siemens Corporate Research, Inc. Method for providing a virtual contrast agent for augmented angioscopy
US6546271B1 (en) 1999-10-01 2003-04-08 Bioscience, Inc. Vascular reconstruction
US6515657B1 (en) 2000-02-11 2003-02-04 Claudio I. Zanelli Ultrasonic imager
US6535756B1 (en) * 2000-04-07 2003-03-18 Surgical Navigation Technologies, Inc. Trajectory storage apparatus and method for surgical navigation system
AU2001235964A1 (en) 2000-05-09 2001-11-20 Paieon Inc. System and method for three-dimensional reconstruction of an artery
US6334864B1 (en) * 2000-05-17 2002-01-01 Aga Medical Corp. Alignment member for delivering a non-symmetric device with a predefined orientation
US6748259B1 (en) * 2000-06-15 2004-06-08 Spectros Corporation Optical imaging of induced signals in vivo under ambient light conditions
US6351513B1 (en) * 2000-06-30 2002-02-26 Siemens Corporate Research, Inc. Fluoroscopy based 3-D neural navigation based on co-registration of other modalities with 3-D angiography reconstruction data
US6389104B1 (en) * 2000-06-30 2002-05-14 Siemens Corporate Research, Inc. Fluoroscopy based 3-D neural navigation based on 3-D angiography reconstruction data
US6505064B1 (en) * 2000-08-22 2003-01-07 Koninklijke Philips Electronics, N.V. Diagnostic imaging systems and methods employing temporally resolved intensity tracing
US6980675B2 (en) * 2000-10-18 2005-12-27 Paieon, Inc. Method for processing images of coronary arteries
ATE493070T1 (en) 2000-10-18 2011-01-15 Paieon Inc SYSTEM FOR POSITIONING A DEVICE IN A TUBULAR ORGAN
JP2002207992A (en) * 2001-01-12 2002-07-26 Hitachi Ltd Method and device for processing image
US6503203B1 (en) * 2001-01-16 2003-01-07 Koninklijke Philips Electronics N.V. Automated ultrasound system for performing imaging studies utilizing ultrasound contrast agents
US6754522B2 (en) 2001-09-05 2004-06-22 Medimag C.V.I., Inc. Imaging methods and apparatus particularly useful for two and three-dimensional angiography
US6669481B2 (en) * 2001-11-08 2003-12-30 The United States Of America As Represented By The Secretary Of The Army Neurocognitive assessment apparatus and method
US7289652B2 (en) * 2001-11-21 2007-10-30 Koninklijke Philips Electronics, N. V. Medical viewing system and method for detecting and enhancing structures in noisy images
US6990368B2 (en) * 2002-04-04 2006-01-24 Surgical Navigation Technologies, Inc. Method and apparatus for virtual digital subtraction angiography
US20030199759A1 (en) * 2002-04-18 2003-10-23 Richard Merwin F. Coronary catheter with radiopaque length markers
AU2003247379A1 (en) 2002-05-17 2003-12-02 Case Western Reserve University Systems and methods for assessing blood flow in a target tissue
EP1514239A2 (en) 2002-06-05 2005-03-16 Koninklijke Philips Electronics N.V. Analysis of a multi-dimensional structure
WO2004008967A1 (en) * 2002-07-23 2004-01-29 Ge Medical Systems Global Technology Company, Llc Methods and systems for detecting components of plaque
US7324675B2 (en) * 2002-11-27 2008-01-29 The Board Of Trustees Of The Leland Stanford Junior University Quantification of aortoiliac endoluminal irregularity
CA2533538A1 (en) 2003-07-21 2005-01-27 Paieon Inc. Method and system for identifying an optimal image within a series of images that depict a moving organ
WO2005020148A1 (en) 2003-08-21 2005-03-03 Philips Intellectual Property & Standards Gmbh Device and method for combined display of angiograms and current x-ray images
JP5129480B2 (en) 2003-09-25 2013-01-30 Paieon Inc. System for performing three-dimensional reconstruction of tubular organ and method for operating blood vessel imaging device
US8014849B2 (en) * 2003-11-21 2011-09-06 Stryker Corporation Rotational markers
US20060074285A1 (en) 2004-09-24 2006-04-06 Paieon Inc. Apparatus and method for fusion and in-operating-room presentation of volumetric data and 3-D angiographic data
EP1830701A1 (en) 2004-12-08 2007-09-12 Paieon Inc. Method and apparatus for blood vessel parameter determinations

Patent Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US6332034B1 (en) * 1998-03-24 2001-12-18 U.S. Philips Corporation Image processing method including steps for the segmentation of a multidimensional image, and medical imaging apparatus utilizing this method
US6385332B1 (en) * 1999-02-19 2002-05-07 The John P. Roberts Research Institute Automated segmentation method for 3-dimensional ultrasound
US6381350B1 (en) * 1999-07-02 2002-04-30 The Cleveland Clinic Foundation Intravascular ultrasonic analysis using active contour method and system
US6463309B1 (en) * 2000-05-11 2002-10-08 Hanna Ilia Apparatus and method for locating vessels in a living body

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
See also references of EP1665130A4 *

Cited By (110)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US7778685B2 (en) 2000-10-18 2010-08-17 Paieon Inc. Method and system for positioning a device in a tubular organ
US8126241B2 (en) 2001-10-15 2012-02-28 Michael Zarkh Method and apparatus for positioning a device in a tubular organ
US7587074B2 (en) 2003-07-21 2009-09-08 Paieon Inc. Method and system for identifying optimal image within a series of images that depict a moving organ
US7742629B2 (en) 2003-09-25 2010-06-22 Paieon Inc. System and method for three-dimensional reconstruction of a tubular organ
US9984456B2 (en) 2004-04-14 2018-05-29 Edda Technology, Inc. Method and system for labeling hepatic vascular structure in interactive liver disease diagnosis
US8295577B2 (en) 2005-03-31 2012-10-23 Michael Zarkh Method and apparatus for guiding a device in a totally occluded or partly occluded tubular organ
WO2006117773A1 (en) * 2005-05-03 2006-11-09 Paieon Inc. Method and apparatus for positioning a biventrivular pacemaker lead and electrode
WO2006137016A1 (en) * 2005-06-21 2006-12-28 Koninklijke Philips Electronics N. V. Method and device for imaging a blood vessel
WO2007002562A3 (en) * 2005-06-24 2007-09-20 Edda Technology Inc Methods for interactive liver disease diagnosis
WO2007002562A2 (en) * 2005-06-24 2007-01-04 Edda Technology, Inc. Methods for interactive liver disease diagnosis
JP2007069000A (en) * 2005-09-06 2007-03-22 Pie Medical Imaging Bv Method, apparatus and computer program for detecting contour of blood vessel using x-ray density measuring method
JP2007083048A (en) * 2005-09-23 2007-04-05 Mediguide Ltd Method and system for determining three dimensional representation of tubular organ
JP2010504794A (en) * 2006-09-29 2010-02-18 Koninklijke Philips Electronics N.V. Protrusion detection method, system, and computer program
WO2008051841A3 (en) * 2006-10-20 2008-07-03 Stereotaxis Inc Location and display of occluded portions of vessels on 3-d angiographic images
WO2008051841A2 (en) * 2006-10-20 2008-05-02 Stereotaxis, Inc. Location and display of occluded portions of vessels on 3-d angiographic images
US8135185B2 (en) 2006-10-20 2012-03-13 Stereotaxis, Inc. Location and display of occluded portions of vessels on 3-D angiographic images
US20100309198A1 (en) * 2007-05-15 2010-12-09 Claude Kauffmann method for tracking 3d anatomical and pathological changes in tubular-shaped anatomical structures
US11107587B2 (en) 2008-07-21 2021-08-31 The Board Of Trustees Of The Leland Stanford Junior University Method for tuning patient-specific cardiovascular simulations
US8200466B2 (en) 2008-07-21 2012-06-12 The Board Of Trustees Of The Leland Stanford Junior University Method for tuning patient-specific cardiovascular simulations
US10354050B2 (en) 2009-03-17 2019-07-16 The Board Of Trustees Of Leland Stanford Junior University Image processing method for determining patient-specific cardiovascular information
US9226672B2 (en) 2010-08-12 2016-01-05 Heartflow, Inc. Method and system for patient-specific modeling of blood flow
US10441361B2 (en) 2010-08-12 2019-10-15 Heartflow, Inc. Method and system for image processing and patient-specific modeling of blood flow
US8311750B2 (en) 2010-08-12 2012-11-13 Heartflow, Inc. Method and system for patient-specific modeling of blood flow
US8315813B2 (en) 2010-08-12 2012-11-20 Heartflow, Inc. Method and system for patient-specific modeling of blood flow
US8315814B2 (en) 2010-08-12 2012-11-20 Heartflow, Inc. Method and system for patient-specific modeling of blood flow
US8315812B2 (en) 2010-08-12 2012-11-20 Heartflow, Inc. Method and system for patient-specific modeling of blood flow
US8321150B2 (en) 2010-08-12 2012-11-27 Heartflow, Inc. Method and system for patient-specific modeling of blood flow
US8386188B2 (en) 2010-08-12 2013-02-26 Heartflow, Inc. Method and system for patient-specific modeling of blood flow
US11793575B2 (en) 2010-08-12 2023-10-24 Heartflow, Inc. Method and system for image processing to determine blood flow
US8496594B2 (en) 2010-08-12 2013-07-30 Heartflow, Inc. Method and system for patient-specific modeling of blood flow
US8523779B2 (en) 2010-08-12 2013-09-03 Heartflow, Inc. Method and system for patient-specific modeling of blood flow
US11583340B2 (en) 2010-08-12 2023-02-21 Heartflow, Inc. Method and system for image processing to determine blood flow
US11298187B2 (en) 2010-08-12 2022-04-12 Heartflow, Inc. Method and system for image processing to determine patient-specific blood flow characteristics
US8594950B2 (en) 2010-08-12 2013-11-26 Heartflow, Inc. Method and system for patient-specific modeling of blood flow
US8606530B2 (en) 2010-08-12 2013-12-10 Heartflow, Inc. Method and system for patient-specific modeling of blood flow
US8630812B2 (en) 2010-08-12 2014-01-14 Heartflow, Inc. Method and system for patient-specific modeling of blood flow
US11154361B2 (en) 2010-08-12 2021-10-26 Heartflow, Inc. Method and system for image processing to determine blood flow
US11135012B2 (en) 2010-08-12 2021-10-05 Heartflow, Inc. Method and system for image processing to determine patient-specific blood flow characteristics
US8734356B2 (en) 2010-08-12 2014-05-27 Heartflow, Inc. Method and system for patient-specific modeling of blood flow
US8734357B2 (en) 2010-08-12 2014-05-27 Heartflow, Inc. Method and system for patient-specific modeling of blood flow
US8812245B2 (en) 2010-08-12 2014-08-19 Heartflow, Inc. Method and system for patient-specific modeling of blood flow
US8812246B2 (en) 2010-08-12 2014-08-19 Heartflow, Inc. Method and system for patient-specific modeling of blood flow
US11116575B2 (en) 2010-08-12 2021-09-14 Heartflow, Inc. Method and system for image processing to determine blood flow
US11090118B2 (en) 2010-08-12 2021-08-17 Heartflow, Inc. Method and system for image processing and patient-specific modeling of blood flow
US11083524B2 (en) 2010-08-12 2021-08-10 Heartflow, Inc. Method and system for patient-specific modeling of blood flow
US11033332B2 (en) 2010-08-12 2021-06-15 Heartflow, Inc. Method and system for image processing to determine blood flow
US10702340B2 (en) 2010-08-12 2020-07-07 Heartflow, Inc. Image processing and patient-specific modeling of blood flow
US10702339B2 (en) 2010-08-12 2020-07-07 Heartflow, Inc. Method and system for patient-specific modeling of blood flow
US10682180B2 (en) 2010-08-12 2020-06-16 Heartflow, Inc. Method and system for patient-specific modeling of blood flow
US9081882B2 (en) 2010-08-12 2015-07-14 HeartFlow, Inc Method and system for patient-specific modeling of blood flow
US9078564B2 (en) 2010-08-12 2015-07-14 Heartflow, Inc. Method and system for patient-specific modeling of blood flow
US9152757B2 (en) 2010-08-12 2015-10-06 Heartflow, Inc. Method and system for patient-specific modeling of blood flow
US9149197B2 (en) 2010-08-12 2015-10-06 Heartflow, Inc. Method and system for patient-specific modeling of blood flow
US9167974B2 (en) 2010-08-12 2015-10-27 Heartflow, Inc. Method and system for patient-specific modeling of blood flow
US10531923B2 (en) 2010-08-12 2020-01-14 Heartflow, Inc. Method and system for image processing to determine blood flow
US8311747B2 (en) 2010-08-12 2012-11-13 Heartflow, Inc. Method and system for patient-specific modeling of blood flow
US9235679B2 (en) 2010-08-12 2016-01-12 Heartflow, Inc. Method and system for patient-specific modeling of blood flow
US9268902B2 (en) 2010-08-12 2016-02-23 Heartflow, Inc. Method and system for patient-specific modeling of blood flow
US9271657B2 (en) 2010-08-12 2016-03-01 Heartflow, Inc. Method and system for patient-specific modeling of blood flow
US10492866B2 (en) 2010-08-12 2019-12-03 Heartflow, Inc. Method and system for image processing to determine blood flow
US9449147B2 (en) 2010-08-12 2016-09-20 Heartflow, Inc. Method and system for patient-specific modeling of blood flow
US10478252B2 (en) 2010-08-12 2019-11-19 Heartflow, Inc. Method and system for patient-specific modeling of blood flow
US9585723B2 (en) 2010-08-12 2017-03-07 Heartflow, Inc. Method and system for image processing to determine patient-specific blood flow characteristics
US9697330B2 (en) 2010-08-12 2017-07-04 Heartflow, Inc. Method and system for image processing to determine patient-specific blood flow characteristics
US9706925B2 (en) 2010-08-12 2017-07-18 Heartflow, Inc. Method and system for image processing to determine patient-specific blood flow characteristics
US9743835B2 (en) 2010-08-12 2017-08-29 Heartflow, Inc. Method and system for image processing to determine patient-specific blood flow characteristics
US9801689B2 (en) 2010-08-12 2017-10-31 Heartflow, Inc. Method and system for patient-specific modeling of blood flow
US9839484B2 (en) 2010-08-12 2017-12-12 Heartflow, Inc. Method and system for image processing and patient-specific modeling of blood flow
US9855105B2 (en) 2010-08-12 2018-01-02 Heartflow, Inc. Method and system for image processing to determine patient-specific blood flow characteristics
US9861284B2 (en) 2010-08-12 2018-01-09 Heartflow, Inc. Method and system for image processing to determine patient-specific blood flow characteristics
US9888971B2 (en) 2010-08-12 2018-02-13 Heartflow, Inc. Method and system for image processing to determine patient-specific blood flow characteristics
US8249815B2 (en) 2010-08-12 2012-08-21 Heartflow, Inc. Method and system for patient-specific modeling of blood flow
US10052158B2 (en) 2010-08-12 2018-08-21 Heartflow, Inc. Method and system for image processing to determine patient-specific blood flow characteristics
US10080613B2 (en) 2010-08-12 2018-09-25 Heartflow, Inc. Systems and methods for determining and visualizing perfusion of myocardial muscle
US10080614B2 (en) 2010-08-12 2018-09-25 Heartflow, Inc. Method and system for image processing to determine patient-specific blood flow characteristics
US10092360B2 (en) 2010-08-12 2018-10-09 Heartflow, Inc. Method and system for image processing and patient-specific modeling of blood flow
US10149723B2 (en) 2010-08-12 2018-12-11 Heartflow, Inc. Method and system for image processing and patient-specific modeling of blood flow
US10154883B2 (en) 2010-08-12 2018-12-18 Heartflow, Inc. Method and system for image processing and patient-specific modeling of blood flow
US10159529B2 (en) 2010-08-12 2018-12-25 Heartflow, Inc. Method and system for patient-specific modeling of blood flow
US10166077B2 (en) 2010-08-12 2019-01-01 Heartflow, Inc. Method and system for image processing to determine patient-specific blood flow characteristics
US10179030B2 (en) 2010-08-12 2019-01-15 Heartflow, Inc. Method and system for patient-specific modeling of blood flow
US8311748B2 (en) 2010-08-12 2012-11-13 Heartflow, Inc. Method and system for patient-specific modeling of blood flow
US10376317B2 (en) 2010-08-12 2019-08-13 Heartflow, Inc. Method and system for image processing and patient-specific modeling of blood flow
US10321958B2 (en) 2010-08-12 2019-06-18 Heartflow, Inc. Method and system for image processing to determine patient-specific blood flow characteristics
US10327847B2 (en) 2010-08-12 2019-06-25 Heartflow, Inc. Method and system for patient-specific modeling of blood flow
US8157742B2 (en) 2010-08-12 2012-04-17 Heartflow, Inc. Method and system for patient-specific modeling of blood flow
WO2012028190A1 (en) 2010-09-02 2012-03-08 Pie Medical Imaging Bv Method and apparatus for quantitative analysis of a tree of recursively splitting tubular organs
US11810661B2 (en) 2011-09-13 2023-11-07 Koninklijke Philips N.V. Vessel annotator
WO2013038290A1 (en) * 2011-09-13 2013-03-21 Koninklijke Philips Electronics N.V. Vessel annotator.
RU2595805C2 (en) * 2011-09-13 2016-08-27 Конинклейке Филипс Н.В. Vessel annotator
CN103814379A (en) * 2011-09-13 2014-05-21 皇家飞利浦有限公司 Vessel annotator
US9063635B2 (en) 2012-05-14 2015-06-23 Heartflow, Inc. Method and system for providing information from a patient-specific model of blood flow
US9002690B2 (en) 2012-05-14 2015-04-07 Heartflow, Inc. Method and system for providing information from a patient-specific model of blood flow
US10842568B2 (en) 2012-05-14 2020-11-24 Heartflow, Inc. Method and system for providing information from a patient-specific model of blood flow
US11826106B2 (en) 2012-05-14 2023-11-28 Heartflow, Inc. Method and system for providing information from a patient-specific model of blood flow
US9517040B2 (en) 2012-05-14 2016-12-13 Heartflow, Inc. Method and system for providing information from a patient-specific model of blood flow
US8914264B1 (en) 2012-05-14 2014-12-16 Heartflow, Inc. Method and system for providing information from a patient-specific model of blood flow
US9063634B2 (en) 2012-05-14 2015-06-23 Heartflow, Inc. Method and system for providing information from a patient-specific model of blood flow
US8855984B2 (en) 2012-05-14 2014-10-07 Heartflow, Inc. Method and system for providing information from a patient-specific model of blood flow
US9168012B2 (en) 2012-05-14 2015-10-27 Heartflow, Inc. Method and system for providing information from a patient-specific model of blood flow
US8706457B2 (en) 2012-05-14 2014-04-22 Heartflow, Inc. Method and system for providing information from a patient-specific model of blood flow
US8548778B1 (en) 2012-05-14 2013-10-01 Heartflow, Inc. Method and system for providing information from a patient-specific model of blood flow
CN103295262A (en) * 2013-05-21 2013-09-11 东软集团股份有限公司 Rotating multi-angle surface reconstruction method and device for tubular cavity tissue
US20140369583A1 (en) * 2013-06-18 2014-12-18 Konica Minolta, Inc. Ultrasound diagnostic device, ultrasound diagnostic method, and computer-readable medium having recorded program therein
CN104408408A (en) * 2014-11-10 2015-03-11 杭州保迪自动化设备有限公司 Extraction method and extraction device for robot spraying track based on curve three-dimensional reconstruction
WO2019052709A1 (en) * 2017-09-13 2019-03-21 Siemens Healthcare Gmbh Improved 3-d vessel tree surface reconstruction
CN109366446A (en) * 2018-10-30 2019-02-22 中国冶集团有限公司 The setting out method of electricity driving displacement line-plotting device and abnormal-shaped screw body intersection
KR102542972B1 (en) * 2022-07-04 2023-06-15 재단법인 아산사회복지재단 Method and apparatus for generating three-dimensional blood vessel structure
CN115984301A (en) * 2022-12-13 2023-04-18 Xuzhou Medical University Three-dimensional reconstruction precision comparison method based on Budd-Chiari syndrome blood vessel images
CN115984301B (en) * 2022-12-13 2023-07-11 Xuzhou Medical University Three-dimensional reconstruction precision comparison method based on Budd-Chiari syndrome blood vessel images

Also Published As

Publication number Publication date
US20070116342A1 (en) 2007-05-24
EP1665130A4 (en) 2009-11-18
EP1665130A1 (en) 2006-06-07
IL174514A0 (en) 2006-08-01
US7742629B2 (en) 2010-06-22
IL174514A (en) 2010-12-30
JP5129480B2 (en) 2013-01-30
JP2007506531A (en) 2007-03-22

Similar Documents

Publication Publication Date Title
US7742629B2 (en) System and method for three-dimensional reconstruction of a tubular organ
US6501848B1 (en) Method and apparatus for three-dimensional reconstruction of coronary vessels from angiographic images and analytical techniques applied thereto
JP6894896B2 (en) X-ray image feature detection and alignment systems and methods
EP1595228B1 (en) Method for the 3d modeling of a tubular structure
US8965074B2 (en) Image processing apparatus
EP2099378B1 (en) Apparatus for determining a position of a first object within a second object
Baert et al. Three-dimensional guide-wire reconstruction from biplane image sequences for integrated display in 3-D vasculature
JP4891541B2 (en) Vascular stenosis rate analysis system
US20060074285A1 (en) Apparatus and method for fusion and in-operating-room presentation of volumetric data and 3-D angiographic data
US20060036167A1 (en) Vascular image processing
RU2594811C2 (en) Visualisation for navigation instruction
JP2006246941A (en) Image processing apparatus and vessel tracking method
JP2008253753A (en) Heart function display device and its program
CN111009032B (en) Vascular three-dimensional reconstruction method based on improved epipolar line constraint matching
EP3494548A1 (en) System and method of generating and updating a three dimensional model of a luminal network
CN100378750C (en) System and method for three-dimensional reconstruction of a tubular organ
WO2017038300A1 (en) Ultrasonic imaging device, and image processing device and method
US7706589B2 (en) Analysis of a multi-dimensional structure
US20220254131A1 (en) Methods, apparatus, and system for synchronization between a three-dimensional vascular model and an imaging device
CN111784751B (en) 3D/2D registration-based guide wire 3D simulation tracking method and device
JPWO2005011499A1 (en) Tomographic image constructing apparatus and method
US20200297292A1 (en) Catheter tip detection in fluoroscopic video using deep learning
CN115131508B (en) DSA modeling point cloud data fusion processing method based on data processing
Sen et al. Determination and evaluation of 3D biplane imaging geometries without a calibration object
CN117256006A (en) Stitching multiple images to create a panoramic image

Legal Events

Date Code Title Description
WWE Wipo information: entry into national phase

Ref document number: 200480033739.7

Country of ref document: CN

AK Designated states

Kind code of ref document: A1

Designated state(s): AE AG AL AM AT AU AZ BA BB BG BR BW BY BZ CA CH CN CO CR CU CZ DE DK DM DZ EC EE EG ES FI GB GD GE GH GM HR HU ID IL IN IS JP KE KG KP KR KZ LC LK LR LS LT LU LV MA MD MG MK MN MW MX MZ NA NI NO NZ OM PG PH PL PT RO RU SC SD SE SG SK SL SY TJ TM TN TR TT TZ UA UG US UZ VC VN YU ZA ZM ZW

AL Designated countries for regional patents

Kind code of ref document: A1

Designated state(s): BW GH GM KE LS MW MZ NA SD SL SZ TZ UG ZM ZW AM AZ BY KG KZ MD RU TJ TM AT BE BG CH CY CZ DE DK EE ES FI FR GB GR HU IE IT LU MC NL PL PT RO SE SI SK TR BF BJ CF CG CI CM GA GN GQ GW ML MR NE SN TD TG

121 Ep: the epo has been informed by wipo that ep was designated in this application
WWE Wipo information: entry into national phase

Ref document number: 174514

Country of ref document: IL

WWE Wipo information: entry into national phase

Ref document number: 2007116342

Country of ref document: US

Ref document number: 10573464

Country of ref document: US

WWE Wipo information: entry into national phase

Ref document number: 2006528281

Country of ref document: JP

REEP Request for entry into the european phase

Ref document number: 2004785100

Country of ref document: EP

WWE Wipo information: entry into national phase

Ref document number: 2004785100

Country of ref document: EP

WWE Wipo information: entry into national phase

Ref document number: 1784/DELNP/2006

Country of ref document: IN

WWP Wipo information: published in national office

Ref document number: 2004785100

Country of ref document: EP

WWP Wipo information: published in national office

Ref document number: 10573464

Country of ref document: US