US20090281452A1 - System and method for a medical procedure using computed tomography - Google Patents

System and method for a medical procedure using computed tomography

Info

Publication number
US20090281452A1
US20090281452A1 (application US12/429,546)
Authority
US
United States
Prior art keywords
interventional
patient
target
bull
image
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Abandoned
Application number
US12/429,546
Inventor
Marcus Pfister
Norbert Strobel
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Siemens AG
Original Assignee
Siemens AG
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Siemens AG filed Critical Siemens AG
Priority to US12/429,546
Assigned to SIEMENS AKTIENGESELLSCHAFT. Assignment of assignors interest (see document for details). Assignors: PFISTER, MARCUS; STROBEL, NORBERT
Publication of US20090281452A1
Legal status: Abandoned

Classifications

    • A: HUMAN NECESSITIES
    • A61: MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B: DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B6/00: Apparatus for radiation diagnosis, e.g. combined with radiation therapy equipment
    • A61B6/12: Devices for detecting or locating foreign bodies
    • A: HUMAN NECESSITIES
    • A61: MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B: DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B6/00: Apparatus for radiation diagnosis, e.g. combined with radiation therapy equipment
    • A61B6/44: Constructional features of apparatus for radiation diagnosis
    • A61B6/4429: Constructional features of apparatus for radiation diagnosis related to the mounting of source units and detector units
    • A61B6/4435: Constructional features of apparatus for radiation diagnosis related to the mounting of source units and detector units, the source unit and the detector unit being coupled by a rigid structure
    • A61B6/4441: Constructional features of apparatus for radiation diagnosis related to the mounting of source units and detector units, the source unit and the detector unit being coupled by a rigid structure, the rigid structure being a C-arm or U-arm
    • A: HUMAN NECESSITIES
    • A61: MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B: DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B6/00: Apparatus for radiation diagnosis, e.g. combined with radiation therapy equipment
    • A61B6/52: Devices using data or image processing specially adapted for radiation diagnosis
    • A61B6/5211: Devices using data or image processing specially adapted for radiation diagnosis involving processing of medical diagnostic data
    • A61B6/5229: Devices using data or image processing specially adapted for radiation diagnosis involving processing of medical diagnostic data, combining image data of a patient, e.g. combining a functional image with an anatomical image
    • A: HUMAN NECESSITIES
    • A61: MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B: DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B90/00: Instruments, implements or accessories specially adapted for surgery or diagnosis and not covered by any of the groups A61B1/00 - A61B50/00, e.g. for luxation treatment or for protecting wound edges
    • A61B90/10: Instruments, implements or accessories specially adapted for surgery or diagnosis and not covered by any of the groups A61B1/00 - A61B50/00, for stereotaxic surgery, e.g. frame-based stereotaxis
    • A61B90/11: Instruments, implements or accessories specially adapted for surgery or diagnosis and not covered by any of the groups A61B1/00 - A61B50/00, for stereotaxic surgery, with guides for needles or instruments, e.g. arcuate slides or ball joints
    • A: HUMAN NECESSITIES
    • A61: MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B: DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B90/00: Instruments, implements or accessories specially adapted for surgery or diagnosis and not covered by any of the groups A61B1/00 - A61B50/00, e.g. for luxation treatment or for protecting wound edges
    • A61B90/10: Instruments, implements or accessories specially adapted for surgery or diagnosis and not covered by any of the groups A61B1/00 - A61B50/00, for stereotaxic surgery, e.g. frame-based stereotaxis
    • A61B90/11: Instruments, implements or accessories specially adapted for surgery or diagnosis and not covered by any of the groups A61B1/00 - A61B50/00, for stereotaxic surgery, with guides for needles or instruments, e.g. arcuate slides or ball joints
    • A61B90/13: Instruments, implements or accessories specially adapted for surgery or diagnosis and not covered by any of the groups A61B1/00 - A61B50/00, for stereotaxic surgery, with guides for needles or instruments, guided by light, e.g. laser pointers
    • A: HUMAN NECESSITIES
    • A61: MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B: DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B34/00: Computer-aided surgery; Manipulators or robots specially adapted for use in surgery
    • A61B34/10: Computer-aided planning, simulation or modelling of surgical operations
    • A61B2034/107: Visualisation of planned trajectories or target regions
    • A: HUMAN NECESSITIES
    • A61: MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B: DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B90/00: Instruments, implements or accessories specially adapted for surgery or diagnosis and not covered by any of the groups A61B1/00 - A61B50/00, e.g. for luxation treatment or for protecting wound edges
    • A61B90/36: Image-producing devices or illumination devices not otherwise provided for
    • A61B2090/364: Correlation of different images or relation of image positions in respect to the body
    • A: HUMAN NECESSITIES
    • A61: MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B: DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B90/00: Instruments, implements or accessories specially adapted for surgery or diagnosis and not covered by any of the groups A61B1/00 - A61B50/00, e.g. for luxation treatment or for protecting wound edges
    • A61B90/36: Image-producing devices or illumination devices not otherwise provided for
    • A61B90/37: Surgical systems with images on a monitor during operation
    • A61B2090/376: Surgical systems with images on a monitor during operation using X-rays, e.g. fluoroscopy
    • A: HUMAN NECESSITIES
    • A61: MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61N: ELECTROTHERAPY; MAGNETOTHERAPY; RADIATION THERAPY; ULTRASOUND THERAPY
    • A61N5/00: Radiation therapy
    • A61N5/10: X-ray therapy; Gamma-ray therapy; Particle-irradiation therapy
    • A61N5/1048: Monitoring, verifying, controlling systems and methods
    • A61N5/1049: Monitoring, verifying, controlling systems and methods for verifying the position of the patient with respect to the radiation beam
    • A61N2005/1061: Monitoring, verifying, controlling systems and methods for verifying the position of the patient with respect to the radiation beam using an x-ray imaging system having a separate imaging source

Definitions

  • the present application may relate to a system and method for performing percutaneous medical procedures using imaging modalities for instrument alignment and guidance.
  • German Patent application DE 10200729119 filed on Jul. 25, 2007, and published as DE 10 2007 029 199 on Jan. 2, 2009, having a common inventor, entitled “Method for Aligning a Target Guidance System for a Puncture, and an X-ray Angiographic System” describes a method of orienting a biopsy needle using a laser aligned along the planned puncture path.
  • the aiming of the laser includes manual alignments (such as the table position), in order to achieve a match between the needle and the planned path. This is attained by moving the C-arm and/or the patient support table, or both, so that certain markings on a display screen match.
  • the needle can be aligned without further irradiation of the patient (and doctor); however, the approach is limited to the positions that are feasible for orientation of the C-arm or the patient support table, which may also limit the orientation of the central (optical) axis of the radiation beam with respect to the patient.
  • a method of planning or performing a percutaneous medical procedure includes the steps of obtaining three-dimensional image data of a patient; and orienting a representation of the patient, or the patient, on a patient support table of an interventional apparatus.
  • the three-dimensional image data is registered with respect to the coordinate system of the oriented patient and a percutaneous medical procedure is planned using the three dimensional image data.
  • the planned path is such that a guiding axis for an interventional device passes through a skin entry point and a target.
  • the computed angulation of the interventional apparatus is adjusted such that one of a bull's eye or a generalized bull's eye image is obtained in a projection view.
  • a computer program product includes instructions for configuring a computer to accept three-dimensional image data of a patient and register the three-dimensional image data with respect to the patient. Images suitable for planning a guiding path between a skin entry point and a target in the patient are produced, and a planned path is one whose axis passes through the skin entry point and the target. The selected angulation of the planned path is one which meets one of a bull's eye or generalized bull's eye criterion. For the selected angulation, the relation between a volumetric representation of the interventional apparatus and the patient support table is computed so as to determine whether there would be a collision between the interventional apparatus and the patient support table.
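  • As a hedged illustration only (not the computation prescribed by the application), the Python sketch below shows one way a bull's eye viewing direction could be derived from the planned skin entry point and target and converted into angulation angles; the coordinate and angle conventions are assumptions, since real C-arm systems use vendor-specific conventions.

```python
import numpy as np

def bulls_eye_direction(entry_point, target):
    """Unit vector from the target to the skin entry point.  Viewing along
    this direction projects the two points onto each other, which is the
    bull's eye condition described above."""
    d = np.asarray(entry_point, dtype=float) - np.asarray(target, dtype=float)
    norm = np.linalg.norm(d)
    if norm == 0.0:
        raise ValueError("skin entry point and target coincide")
    return d / norm

def direction_to_angulation(direction):
    """Convert a viewing direction into illustrative LAO/RAO and
    cranial/caudal angles (degrees).  The axis convention (x: patient
    left, y: posterior, z: cranial) and the angle definitions are
    assumptions; real C-arm systems use vendor-specific conventions."""
    x, y, z = direction
    lao_rao = np.degrees(np.arctan2(x, y))                     # rotation about the table's long axis
    cran_caud = np.degrees(np.arcsin(np.clip(z, -1.0, 1.0)))   # tilt toward head or feet
    return lao_rao, cran_caud

# Example with entry point and target given in millimetres.
entry = [120.0, -40.0, 300.0]
target = [100.0, 10.0, 280.0]
print(direction_to_angulation(bulls_eye_direction(entry, target)))
```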
  • FIG. 1 is a block diagram showing the component equipment of a treatment unit for planning and performing a percutaneous medical procedure
  • FIG. 2 illustrates schematically the orientation of a skin entry point and a target in a patient body on a radiograph for (A) a bull's eye view; (B) a generalized bull's eye view; and, (C) no alignment, where the representations of the skin entry point and the target are shown in a relative size for clarity in illustration; and
  • FIG. 3 is a workflow for a method of planning and performing the percutaneous procedure
  • the system may comprise an imaging system having a movable arm, an X-ray source and an X-ray detector; and, a display and a system controller connected to and in communication with the imaging system and display, and include a machine-readable storage medium encoded with a computer program code such that, when the computer program code is executed by a processor, the processor performs a method as described herein.
  • a “patient 3-dimensional image data set” is a three dimensional numerical array whose elements hold the values of specific physical properties at small volumes in space inside the patient's body; each volume is known as a “voxel” and may be considered to be comparable to a “pixel” in a 2-dimensional image.
  • a “multiplanar reformation image (MPR)” is a planar cross-section of the patient 3-dimensional image data set generated by cutting through the three-dimensional data set at some orientation (e.g., axial, coronal, sagittal, or oblique), which may be a slice of CT or CT-like data.
  • a “fluoroscopic image” or “radiographic image” is a two-dimensional x-ray projection image showing internal tissues of a region of the body. This image may be either live or synthesized (rendered) from previously obtained three dimensional voxel data. Two-dimensional representations of the 3D volume data set generated using volume rendering techniques may also be referred to as fluoroscopy overlay images. Fluoroscopy overlay images may include graphical representations of instrument trajectories or of instruments together with the surrounding anatomy.
  • a “live fluoroscopic image” is a sequence of x-ray images taken successively showing live movement of internal tissues of a region of the body.
  • a “combined image” is an image in which an X-ray image is combined with a 2D representation of a 3D data set computed using volume rendering techniques. Such images may also be overlaid with icons or other representations of a target, a skin entry point, or a planned or estimated position of an interventional device.
  • “Registering” or “co-registration” means aligning an X-ray image, such as a fluoroscopic image, with the patient 3-dimensional image data set such that associated features of the X-ray image and a two-dimensional overlay image generated from the previously obtained patient 3-dimensional image data set appear at the same location on a display when the X-ray image and the overlay image are shown together.
  • Registration can be point-based or gray-level based. In point-based registration, a transform is applied to the 3-dimensional image data set such that points in the resulting overlay image line up with their counterparts in the X-ray image so as to meet a metric, which may be analytic or visual.
  • Gray-level based registration techniques determine the transform not by minimizing the distance between associated points in the overlay image and X-ray image, but by minimizing an error metric based on the resulting overlay image gray levels and the X-ray image gray levels.
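  • As an illustration of the point-based registration just described, the sketch below uses the standard Kabsch/SVD method to compute a rigid transform and a root-mean-square error metric; the application does not prescribe this particular algorithm, so it should be read as one plausible realization.

```python
import numpy as np

def rigid_point_registration(moving, fixed):
    """Least-squares rigid transform (rotation R, translation t) mapping
    'moving' points onto 'fixed' points via the Kabsch/SVD method.

    One standard realization of the point-based registration described
    above; the text does not prescribe a particular algorithm.
    """
    moving = np.asarray(moving, dtype=float)
    fixed = np.asarray(fixed, dtype=float)
    mu_m, mu_f = moving.mean(axis=0), fixed.mean(axis=0)
    H = (moving - mu_m).T @ (fixed - mu_f)            # 3x3 cross-covariance
    U, _, Vt = np.linalg.svd(H)
    d = np.sign(np.linalg.det(Vt.T @ U.T))            # guard against reflections
    R = Vt.T @ np.diag([1.0, 1.0, d]) @ U.T
    t = mu_f - R @ mu_m
    return R, t

def rms_registration_error(moving, fixed, R, t):
    """Root-mean-square distance between transformed and fixed points,
    usable as the analytic metric mentioned above."""
    moving = np.asarray(moving, dtype=float)
    fixed = np.asarray(fixed, dtype=float)
    residuals = (moving @ R.T + t) - fixed
    return float(np.sqrt((residuals ** 2).sum(axis=1).mean()))
```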
  • An "interventional device" or "interventional instrument" refers to any object which may pierce tissue of a patient, a non-limiting listing of which includes needles and other biopsy devices, screws, implants, cannula, endoscopes, and anything else that can be inserted into a patient's body percutaneously.
  • An “Interventional apparatus” refers to an imaging modality such as a C-arm X-ray device for obtaining internal image data of a patient, or other device performing a similar function.
  • a “skin entry point” is the position on a patient's skin at which an interventional device is inserted.
  • Skin entry point data is data representative of the skin entry point within the patient 3-dimensional image data set or within two X-ray views taken under different view orientations using a triangulation technique.
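  • The triangulation mentioned above can be illustrated with the following hedged sketch, which estimates a 3D point as the midpoint of closest approach between two projection rays from two different view orientations; the ray parametrization (source position plus unit direction) is an assumption made for the example.

```python
import numpy as np

def triangulate_from_two_views(src_a, dir_a, src_b, dir_b):
    """Estimate a 3D point from two X-ray projection rays.

    Each ray is given by the X-ray source position and a unit direction
    through the marked image point (an assumed parametrization).  The two
    rays rarely intersect exactly, so the midpoint of their closest
    approach is returned.
    """
    src_a, dir_a = np.asarray(src_a, float), np.asarray(dir_a, float)
    src_b, dir_b = np.asarray(src_b, float), np.asarray(dir_b, float)
    w0 = src_a - src_b
    a, b, c = dir_a @ dir_a, dir_a @ dir_b, dir_b @ dir_b
    d, e = dir_a @ w0, dir_b @ w0
    denom = a * c - b * b
    if abs(denom) < 1e-12:
        raise ValueError("rays are nearly parallel; choose more different views")
    s = (b * e - c * d) / denom          # parameter along ray A
    t = (a * e - b * d) / denom          # parameter along ray B
    closest_a = src_a + s * dir_a
    closest_b = src_b + t * dir_b
    return (closest_a + closest_b) / 2.0
```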
  • a “target” or “target point” is a point within the body of a patient that is the object of a percutaneous procedure.
  • An "interventional path" is a line between the skin entry point and the target point.
  • “Instrument trajectory” is a desired motion of the instrument along the interventional path.
  • a “progression view” is an x-ray image taken at an oblique angle with respect to a line joining the skin entry point and the target.
  • a “collimator” is a device used to narrow the radiation field to a size needed for the examination at hand.
  • the collimator may have sets of lead plates providing either a round or a square-shaped radiation field.
  • the aperture of the collimator may be adjusted either automatically or manually, depending on the system.
  • Many collimators are axially symmetric and are oriented so that the axis of symmetry of the collimator is aligned with the optical axis of the X-ray system.
  • optical axis (or “central axis”) of a C-arm X-ray device is defined by a line orthogonal to a detector and passing through the center of radiation of an X-ray source.
  • the optical axis is usually positioned so as to pass through a central point of the detector, and is aligned with the C-arm X-ray device's central ray of the X-ray radiation cone.
  • a “Bull's Eye View” is an x-ray view under which a target point and another point along the instrument trajectory are projected onto each other.
  • the other point along the instrument trajectory may be the skin entry point.
  • the view direction of the imaging modality can be visualized using a graphical overlay in which the target point and skin entry point, forward-projected from 3-dimensions to 2-dimensions, are displayed as individual circles. If the Bull's Eye View has been reached, these two circles are projected at the same 2-dimensional position (i.e., they appear concentrically aligned). Further, the skin entry point and the target point lie on the optical axis of the X-ray device.
  • a "Generalized Bull's Eye View" has the same properties as a "Bull's Eye View"; however, a ray different from the central ray of the X-ray cone is used to project the target and another point along the instrument trajectory onto each other. As a consequence, the concentrically aligned circles are not located on the optical axis of the X-ray device.
  • the bull's eye view places the aligned circles at the central point, whereas the generalized bulls-eye view places the aligned circles at any other point on the flat panel detector.
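  • The distinction between the two views can be made concrete with the simplified sketch below, which forward-projects the target and skin entry point onto a flat-panel detector using an ideal pinhole model and classifies the result as a bull's eye view, a generalized bull's eye view, or neither; the geometry (orthogonal detector, millimetre tolerances) is an assumption made for illustration.

```python
import numpy as np

def project_to_detector(point, source, detector_center, u_axis, v_axis):
    """Pinhole projection of a 3D point onto a flat-panel detector.

    source          : X-ray focal spot position (mm)
    detector_center : point where the optical axis meets the detector
    u_axis, v_axis  : orthonormal in-plane detector axes
    Returns 2D detector coordinates (mm), assuming ideal geometry.
    """
    point = np.asarray(point, float)
    source = np.asarray(source, float)
    center = np.asarray(detector_center, float)
    u = np.asarray(u_axis, float)
    v = np.asarray(v_axis, float)
    normal = np.cross(u, v)                      # detector normal (optical axis)
    ray = point - source
    denom = ray @ normal
    if abs(denom) < 1e-12:
        raise ValueError("ray is parallel to the detector plane")
    lam = ((center - source) @ normal) / denom
    hit = source + lam * ray                     # intersection with the detector plane
    rel = hit - center
    return np.array([rel @ u, rel @ v])

def classify_view(entry_2d, target_2d, tol_mm=1.0, center_tol_mm=1.0):
    """'bulls_eye' if the two projections coincide at the detector centre,
    'generalized_bulls_eye' if they coincide elsewhere, otherwise 'none'.
    The millimetre tolerances are illustrative assumptions."""
    if np.linalg.norm(entry_2d - target_2d) > tol_mm:
        return "none"
    if np.linalg.norm(target_2d) <= center_tol_mm:
        return "bulls_eye"
    return "generalized_bulls_eye"
```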
  • the "bull's eye view" and "generalized bull's eye view" are conceptually adopted from the sport of target shooting, where a target having concentric rings is placed at a distance and a firearm is aimed at the central ring and discharged.
  • the central, smallest ring which is typically a solid circle, is known as the “bull's eye”.
  • Aiming the firearm, whether using “iron sights” or a telescopic sight, includes the steps of generally pointing the firearm at the target, and aligning longitudinally displaced front and rear sights of the firearm so that the sights and the bulls eye of the target coincide. This is considered to be the alignment that will place the bullet in the central ring.
  • a reticle or crosshair is provided to be aligned with the target bulls eye. It might also be visualized as a “down-the-gunbarrel view,” or “down-the-barrel view”.
  • aiming refers to an angulation of the C-arm of the X-ray device such that a needle appears as a circle in a radiographic image, rather than as a line. That is, the near end of an object such as a needle and the far end of the object appear to be collapsed into a circle when projected onto the flat panel detector using the radiated X-rays, or in a computer simulation of the same orientation.
  • when a target, such as a bodily structure or organ to be treated or investigated (for example, by biopsy), also coincides with the circle, bull's eye aiming has been performed.
  • this also means that a long axis of the needle is aligned along the optical axis of the C-arm device, else the bulls-eye circle will not be obtained.
  • This circumstance also places the bull's eye in the center of the flat panel detector, providing that the flat-panel detector is disposed so as to be orthogonal to the optical axis and symmetrically arranged with respect thereto.
  • the bull's eye situation may be optimum from the viewpoint of minimizing the radiation exposure of the patient, as the fluoroscopic images that may be needed during an alignment procedure for the needle may be obtained with the optimum adjustment of the collimator.
  • collimator devices are typically axially symmetric with the X-ray beam axis.
  • the C-arm has an angulation direction such that the start point of an object, such as a needle, is projected onto its end point on a 2D detector.
  • the position at which the X-ray through object start point and object end point hits the detector can be anywhere on the detector. In this circumstance, the entry point and the target are not on the optical axis of the X-ray system.
  • a 3D (volumetric) image data set of a patient may be acquired using a C-arm X-ray device, a computerized tomographic unit (CT), a Magnetic Resonance Imager (MRI), or any other suitable imaging modality.
  • This 3D data may be obtained contemporaneously with the planning and execution of a medical procedure, or at an earlier time.
  • a C-arm X-ray device 10 as shown in FIG. 1 may be representative of imaging modalities which may be used in the system and method described herein.
  • Other imaging modalities such as a MRI, ultrasound (US), CT, and the like may be used for all or part of the procedure.
  • a C-arm 26 has a radiation source 11 and a detector 14 attached thereto, and the C-arm may be moved in arcuate or other three-dimensional paths, around, or partially around the patient 20 .
  • the patient may be positioned on a patient support table 25 .
  • a collimator (not shown) may be associated with the radiation source 11 so as to limit the angular field and direction of the radiation 6 .
  • the C-arm X-ray device 10 may be rotated such that a sequence of projection X-ray images is obtained by an X-ray detector 14 , which may be a flat panel solid state two-dimensional detector positioned on an opposite side of the patient 20 from the X-ray source 11 so as to obtain data for 3-dimensional imaging and produce CT-like data.
  • the X-ray detector 14 may be amorphous Silicon (a-Si), amorphous Selenium (a-Se), PbI2, CdTe or HgI2 detectors, or the like, using direct detection or TFT technology, or indirect detectors as is known in the art, or may be subsequently developed, to provide high resolution, high-dynamic-range essentially real-time X-ray detection.
  • a C-arm X-ray device does not completely surround a patient, as does a conventional CT apparatus.
  • the C-arm X-ray device may provide more convenient access to the patient during interventional treatment, without moving the patient with respect to the imaging modality, or disconnecting monitoring and therapy equipment.
  • the CT-like data and corresponding fluoroscopic data may be obtained with the same device. Images can be reconstructed so as to form CT-like voxel data sets, and segmented by any technique of image or data processing for realizing computed tomographic (CT) images and representations thereof.
  • the C-arm X-ray unit and the associated image processing may be of the type described in US PG-Pub Application US2006/0120507, entitled “Angiographic X-ray Diagnostic Device for Rotational Angiography”, filed on Nov. 21, 2005, which is incorporated herein by reference.
  • a patient support table 25 may be used for some or all of the examination steps and thus may transfer the patient 20 between various sensors or otherwise position the patient 20 .
  • the patient support table 25 is trackable, using motion or position sensors, so that the table may be located and relocated with respect to the coordinate system of the C-arm 10 .
  • the motion sensor 15 may transmit data to the image processor 38 through a wired connection or in wireless form.
  • the function of mitigating motion artifacts may include compensating for motions that are due to breathing.
  • a chest belt using suitable sensors may be used to ascertain the breathing amplitude and frequency, and initiate corrective calculations in the image processor 38 or control the timing of the X-ray images using the X-ray controller 28 so that motion artifacts are mitigated, as in the gating sketch below.
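  • A minimal gating sketch, assuming a sampled breathing-amplitude signal from such a chest belt, is shown here; the idea is simply to mark samples near end-expiration at which X-ray frames could be triggered, and the 20 % window is an illustrative value rather than a clinical recommendation.

```python
import numpy as np

def gating_windows(breathing_amplitude, fraction=0.2):
    """Boolean mask marking samples whose breathing amplitude lies in the
    lowest 'fraction' of its observed range (roughly end-expiration),
    i.e. moments at which X-ray frames could be triggered to reduce
    respiratory motion artifacts.  The 20 % window is an illustrative
    assumption, not a clinical value.
    """
    amp = np.asarray(breathing_amplitude, dtype=float)
    lo, hi = amp.min(), amp.max()
    return amp <= lo + fraction * (hi - lo)

# Example with a synthetic breathing trace sampled at 20 Hz.
t = np.arange(0.0, 30.0, 0.05)
trace = 0.5 * (1.0 - np.cos(2.0 * np.pi * t / 4.0))   # ~4 s breathing cycle
mask = gating_windows(trace)
print(f"{mask.mean():.0%} of samples fall inside the gating window")
```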
  • a variety of video displays 33, which may be flat panel displays, may be provided to present images and data for manipulation and analysis.
  • a computer device 70 may be a notebook PC computer, or other processing device with which the demographic, history, diagnosis or therapy data of the patient can be recorded, called up and sent to and from the medical information management system of the hospital over a local area network (LAN) or the like.
  • the computer device 70 may be provided with a data interface for retrieving data from an HMO (health maintenance organization), health insurance smart card, or other patient data base, and may be connected to the remainder of the therapy suite by a wired or a wireless connection.
  • a user input device 71 such as a keyboard, computer display device, and mouse, may be provided for manual input and control.
  • the computer device may also perform signal and data processing operations that have been described herein as being performed by separate computer devices, and the allocation of system functions to one or more computers is a matter of design choice, which may be expected in different embodiments.
  • Additional, different, or fewer devices may be provided in a therapy suite.
  • the devices and functions shown are representative, but not inclusive.
  • the individual units, devices, or functions may communicate with each other over cables, wires, or in a wireless manner.
  • FIG. 2A shows a schematic plan view of a radiographic image that would be produced where a bull's eye view orientation of a target 200 and an interventional instrument, which may be a needle 250, is achieved.
  • the image area 300 produced by, for example, X-ray detector 14 receiving radiation emitted by radiation source 11 is centered so that the central point 350 of the image area 300 lies on, and is perpendicular to, the optical axis of the C-arm device 10. That is, the radiation source 11 and the X-ray detector are mounted to the C-arm 26 and disposed such that a central ray of the emitted X-radiation is aligned with and passes through the needle 250 and the target 200.
  • the target 200 is representative of the internal location in the patient 20 towards which the needle 250 would be advanced during the interventional procedure.
  • the target may be visualized in the radiographic image taken by the X-ray device, or be synthesized using the previously acquired voxel data for the patient.
  • the position of the needle 250 on the central axis may be computed and displayed on the synthesized image without emitting radiation. Combinations of synthesis and real-time radiographs (fluoroscopy) may also be used.
  • a symmetrical collimator may be used to minimize the angular width of the radiation field 6 needed to obtain further radiographic images.
  • a laser pointer, or fan beams whose sources may be mounted to the detector 14 may be used to align the needle 250 with the interventional apparatus optical axis.
  • FIG. 2B illustrates a situation where the aligned image of the needle 250 and the target 200 (whether a synthesized image or a real-time radiograph) is displaced from the central point 350 of the image.
  • This is a generalized bull's eye view. That is, a synthesized image of a needle 250 may be aligned so as to point towards the target 200 ; however the long axis of the needle 250 does not coincide with the optical axis of the C-arm device 10 . In this circumstance the needle may be guided or aligned by a mechanical fixture, as the laser pointer approach was intended to be used along the optical axis of the imaging modality.
  • X-ray radiation may be needed so that the long axis of the needle can be aligned. So, for a generalized bull's eye view, the final alignment of the needle 250 is generally done under X-ray guidance.
  • a collimator may have to be adjusted to permit a wider cone angle of emitted radiation, particularly if the collimator is limited to symmetrical x-ray beams.
  • FIG. 2C shows a situation where the needle position is neither a bull's eye view nor a generalized bull's eye view. In this situation, the planning process has not resulted in a satisfactory solution. The planning process may be continued, however other criteria may have to be changed, including, for example, the position of the patient 20 on the patient support table 25 .
  • the adjustment of the needle 250 under radiation conditions requires additional safety precautions, and the doctor may use tongs or other holding devices to manipulate the needle 250 while remaining outside of the direct radiation field.
  • the sensor portions of the therapy unit may be located in a therapy room. Some of, or all of, the signal and data processing and data display may also be located in the therapy room; however, some of, or all of, the equipment and functionality not directly associated with the sensing of the patient and the imaging modality data sensors, may be remotely located. Such remote location of portions of the equipment may be facilitated by high-speed data communications on local-area networks, wide-area networks, or the Internet.
  • the therapy unit may thus be located remotely from the specialists making the diagnosis and for determining the appropriate course of treatment. Of course, the specialists may also be present with the patient in the treatment room.
  • the 3-dimensional image data set used to identify the target or to plan and monitor the procedure may be obtained using a variety of known image generating systems in which typical targets (e.g., tumors) can be seen clearly. Examples of such systems include magnetic resonance imaging (MRI), Positron emission tomography (PET), computer tomography (CT), and C-arm X-ray. If the target is visible using 2D X-ray imaging, the target may be localized using multiple x-ray views and triangulation techniques.
  • the 3-dimensional image data set may be obtained by taking a plurality of X-ray images acquired under different view directions by the C-arm X-ray device.
  • images obtained by the C-arm X-ray device and a 3D image data set obtained by another imaging modality may be registered by any of the known registration methods, or by registration methods which may be developed in the future.
  • a 2D radiographic image may be computed by, for example, a maximum intensity projection (MIP) algorithm or any other volume rendering technique using the 3D data set as the source data.
  • the fluoroscopic images and rendered projections are intrinsically registered.
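  • A minimal sketch of the MIP idea mentioned above, assuming the 3D data set is available as a NumPy voxel array, is given below; clinical volume renderers cast rays along arbitrary view directions with interpolation, whereas this example projects along an array axis only to keep the principle visible.

```python
import numpy as np

def axis_aligned_mip(voxels, axis=0):
    """Maximum intensity projection of a 3D voxel array along one of its
    principal axes.  Clinical renderers cast rays along arbitrary view
    directions with interpolation; projecting along an array axis keeps
    the idea visible in a few lines."""
    return np.asarray(voxels).max(axis=axis)

# Example: a toy 64^3 volume with a single bright "target" voxel.
volume = np.random.default_rng(0).normal(0.0, 1.0, (64, 64, 64))
volume[40, 20, 30] = 50.0
mip_image = axis_aligned_mip(volume, axis=2)   # 64 x 64 projection image
print(mip_image.shape, float(mip_image.max()))
```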
  • the registration step ensures that the fluoroscopic (X-ray) images of the patient obtained using the C-arm device to be used for the procedure match the images of the patient constructed from the 3-dimensional data set. This enables instrument positioning using information on target position obtained from the 3-dimensional data set.
  • the C-arm X-ray apparatus may be fitted with a collision control system so as to maintain a relationship of the physical aspects of the C-arm X-ray device with respect to the patient, the attending staff, ancillary equipment and the like.
  • This collision avoidance system may use any of a variety of sensors, which may be acoustic, optical or electromagnetic, as well as motion and distance sensors attached to or integrated with the X-ray apparatus. Since the motion of the C-arm may be controlled robotically, the C-arm may be fitted with sensors so that the spatial location of the C-arm with respect to the laboratory coordinate system, and to other devices located within the laboratory coordinate system may be determined so as to plan or execute a motion and to predict or avoid interference (“collisions”).
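  • The kind of geometric predicate involved in such collision planning can be sketched under the simplifying assumption that the C-arm and the patient support table are each represented by an axis-aligned bounding box; real collision control uses richer geometric models and the sensors mentioned above.

```python
import numpy as np

def boxes_collide(box_a, box_b, clearance_mm=50.0):
    """Axis-aligned bounding-box interference test.

    Each box is ((xmin, ymin, zmin), (xmax, ymax, zmax)) in mm.
    A safety clearance (an illustrative 50 mm) is grown around box_a.
    Real collision control uses detailed geometric models and the
    sensors mentioned above; this only shows the geometric predicate.
    """
    a_min, a_max = np.asarray(box_a, dtype=float)
    b_min, b_max = np.asarray(box_b, dtype=float)
    a_min = a_min - clearance_mm
    a_max = a_max + clearance_mm
    return bool(np.all(a_min <= b_max) and np.all(b_min <= a_max))

# Example: C-arm detector housing versus patient support table at one angulation.
c_arm_box = ((-200.0, 300.0, 100.0), (200.0, 500.0, 400.0))
table_box = ((-300.0, -100.0, 0.0), (300.0, 100.0, 2000.0))
print(boxes_collide(c_arm_box, table_box))
```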
  • the planned orientation of the X-ray apparatus and the patient, including any motion of the patient support table may be simulated by manipulating the reconstruction of the 3D and 2D image data during the planning of the procedure.
  • the orientation of the C-arm is changed analytically and images are reconstructed from the image data base so as to represent data obtained from the particular angulation selected.
  • the equipment motions may thus be simulated within the known working environment so as to enable visualizing of the procedure.
  • the radiation (optical) axis of the C-arm X-ray device can be brought into an orientation such that, based on a “bulls-eye” view, a laser may be used to define the virtual path of the intervention device (e.g., biopsy needle), without a collision occurring, or exceeding any other equipment capability.
  • the same procedure and computer programs may be used to determine whether the C-arm X-ray device may be oriented so as to project the virtual path onto the detector in a "generalized bull's eye" view. This circumstance may be more efficient in terms of the number of procedural steps required, but may expose the patient and physician to higher radiation doses.
  • the path and angulation of the C-arm needs to be re-planned.
  • This re-planning may reorient the planned interventional device path so as to use the laser orientation or the generalized bull's eye view.
  • a generalized bull's eye view may also be re-planned to bring the orientation of the interventional path into better alignment with the optical axis, or to achieve a bull's eye view.
  • the re-planning can be performed as described above, using a display of the 3D and 2D data and a graphical user interface, or by the use of a device configured such that it may be tracked on the skin surface of the patient by the location/collision-avoidance system, or by a device specific tracking system.
  • the doctor aligns the puncture needle.
  • the C-arm X-ray device can be aligned so that the radiation central axis (optical axis) passes through the puncture location and the internal target location.
  • a laser may be used to align the puncture needle with the virtual path of the planned procedure.
  • Using a laser alignment apparatus and method may not require additional X-radiation during the alignment step.
  • the alignment of the interventional device may use X-radiation.
  • the progress of the procedure may be monitored by one of virtual images, derived from the motion sensors and position sensors previously described, or by obtaining progression view X-ray images at periodic intervals.
  • a combination of the two techniques may be used so as to conform to the doctor's experience in obtaining the required level of precision in guidance of the interventional device.
  • the process of aligning a needle along the interventional path selected for the procedure, or re-planning the path may track a skin entry point (puncture point) using a position sensor which may be placed on the patient's skin.
  • the position sensor, which may be optical or electromagnetic, may be secured to, or integrated into, a device that is capable of determining both the location and the direction of a vector aligned with the interventional instrument. If this position sensor is moved on the surface of the skin, an entry point can be identified with the position of the position sensor, and the entry point re-determined with respect to a previously established target (usually within the body) until a satisfactory interventional plan has been made.
  • the virtual needle path may be plotted in the 3D volume or with respect to a fluoroscopic overlay image derived from the 3D data set using volume rendering techniques.
  • a marking may be made on the skin for use in making the puncture during the procedure, such as a color marking, optionally combined with contrast agent injection, a small puncture, or the like. If the instrument is a puncture guide needle, the needle can be inserted into the skin surface (after local anesthesia) and thus fixed in position.
  • a variety of positioning devices may be used, such as a SeeStar (available from RADI Medical Systems AB, Uppsala, Sweden), a flexible securing arm with mounting provisions for instrument guidance (a guide sleeve, instrument securing or fastening device), or the like. These devices may be robotically controllable. With such a robotic device, the puncture direction may be defined without touching the patient.
  • the SeeStar device is an instrument guide that produces an elongated artifact in an X-ray image. This elongated artifact indicates the trajectory of an instrument inserted through the SeeStar, and thus one can determine whether the selected trajectory will intersect the target as desired.
  • An elongated radio-opaque object may also be used to represent the needle during planning, including the orientation of a locating fixture.
  • radio-opaque markers may be temporarily affixed to the patient throughout the process, so that a radiograph can quickly confirm the correct positioning.
  • a hand guided device such as a biopsy gun may be tracked by devices similar to those previously described, and the guidance path of the intervention displayed on the previously obtained images. If the orientation of the biopsy gun is determined to be sufficiently well positioned with respect to the target, then the gun can be activated to obtain a biopsy sample.
  • Such hand guided devices may be designed and constructed so as to be powered by batteries contained in the device so that power cords may be avoided and the maneuvering of the device in a congested area simplified.
  • an ultrasound device whose location and orientation can be determined, e.g., by localizing a guide sleeve attached to it, may be used to orient a needle or other interventional device.
  • the guidance path of the needle and the skin entry point can also be displayed in the ultrasound images.
  • the position and orientation of the guide sleeve may be tracked during the procedure and an updated virtual image may be displayed showing the progress of the needle with respect to the planned needle path.
  • the planning process for determining the puncture location and the orientation of the guidance path so as to reach the target may use the 3D data previously obtained and projected in a forward-looking (“endoscopic”) view so as to enable visualization of the internal bodily tissues along the path of the planned intervention; that is, the view is along the virtual path to the target.
  • other views from the 3D data set may involve segmenting the image so as to, for example, remove the soft tissue, while leaving other bodily parts such as bones and blood vessels in place. This assists in determining if the planned interventional path will encounter obstacles such as bone, or if the interventional path intersects with blood vessels or the like.
  • An analogous 3D data set could be created with an ultrasound head that is movably mounted and then tracked with respect to the securing arm of a guiding device.
  • a 2D ultrasonic (US) transducer producing a forward-looking view may also be used as a planning tool. Both the target and the direction of the percutaneous intervention can be determined. If the orientation of the US transducer and the C-arm are recorded with reference to each other, the US path planning information can be communicated to the 3D C-arm CT data set so as to facilitate aligning the C-arm.
  • the projection of the skin entry point and “target” may not be in the central region of the detector but in some outer region that cannot be shielded by a central collimator. This may be mitigated by movement of the table on which the patient is positioned.
  • the table may be moved such that the "generalized bulls-eye view" extends through the central collimation opening. Further, if the table can be moved such that the "generalized bulls-eye view" coincides with the optical axis of the C-arm, then the conventional "bulls-eye view" orientation will have been reached. In this situation laser guidance can be used.
  • the instrument may be aligned using fluoroscopy; this results in additional X-ray exposure.
  • the “bulls-eye view” and the “generalized bulls-eye view” become the same view when the C-arm and the table have been oriented such that the planned instrument trajectory in the patient extends through the isocenter of the C-arm.
  • the C-arm device may have a collimator capable of automatic/dynamic collimation around the planned instrument path, or the tracked path, where the instrument path is tracked so as to reduce the area subject to radiation during each step of the procedure.
  • collimation may be restricted to symmetry about the optical axis of the C-arm, and this may influence the planning of the intervention.
  • Collimators having asymmetrical properties may also be used. They may be better suited for shielding when working under a “generalized bulls-eye view”.
  • the optical axis may be considered to be defined by the direction of the laser beam.
  • the deviation from the target point (for example, in degrees or cm) may be displayed at the user display of the C-arm system. This may also be represented as a "traffic light" display where the color of an indicator changes as one gets closer to the target, as in the sketch below.
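  • A hedged sketch of such a deviation read-out follows; the angular deviation between the current instrument axis and the planned path is computed and mapped onto traffic-light colors, with the 1 and 3 degree thresholds being illustrative assumptions.

```python
import numpy as np

def angular_deviation_deg(device_axis, planned_axis):
    """Angle in degrees between the current instrument axis and the
    planned interventional path."""
    a = np.asarray(device_axis, float)
    b = np.asarray(planned_axis, float)
    cos_angle = a @ b / (np.linalg.norm(a) * np.linalg.norm(b))
    return float(np.degrees(np.arccos(np.clip(cos_angle, -1.0, 1.0))))

def traffic_light(deviation_deg, green_deg=1.0, yellow_deg=3.0):
    """Map the deviation onto the 'traffic light' indicator mentioned
    above; the 1 and 3 degree thresholds are illustrative assumptions."""
    if deviation_deg <= green_deg:
        return "green"
    if deviation_deg <= yellow_deg:
        return "yellow"
    return "red"

print(traffic_light(angular_deviation_deg([0.0, 0.0, 1.0], [0.02, 0.0, 1.0])))
```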
  • the procedure may be commenced by moving the C-arm into a “generalized bulls-eye view”.
  • the table can be manually moved until the patient is positioned in such a way that the C-arm can be moved into a viewing direction in which the optical system axis intersects the target point.
  • the laser may be turned on. If the C-arm is moved manually, then a haptic controller may be used to assure that the position can be approached in a controlled way but without overshooting.
  • the table in particular can be moved automatically to a position favorable for the laser mode.
  • the C-arm may be reoriented so as to follow the progress of the intervention.
  • the C-arm may have been oriented in the "bull's eye view"; however, this orientation may not be suitable for visualizing the internal location (the progress) of the interventional device, as the distance into the patient body would be along the length of the instrument (which may be a needle), and the instrument is visualized as a point or small circle in the "bull's eye view". Oblique views would therefore be used rather than axial views to show the progression of the device.
  • the C-arm angulation would need to be changed to obtain a first progression view at an appropriate oblique angle. It would perhaps also be changed again so as to obtain a second progression view, and the like.
  • During this reorientation of the angulation of the C-arm, it may be possible for collisions to occur. That is, the physical structure of the C-arm and attachments thereto may intrude on the physical space of other equipment, personnel, or the patient. Collisions may be avoided by using the coordinate location information of the overall system and component parts thereof, by using collision sensors, or by limiting the motions of the C-arm during the course of an intervention. For example, rotations about the patient may be performed along the center plane (using a CRAN/CAUD angle of 0 degrees), as in the sketch below.
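  • One plausible way to choose such a progression view, sketched here under assumed axis conventions, is to pick a viewing direction perpendicular to the planned needle path while keeping the cranio-caudal component zero, so that the needle appears as a line whose advance can be followed.

```python
import numpy as np

def progression_view_direction(path_direction, table_long_axis=(0.0, 0.0, 1.0)):
    """Viewing direction for a progression view: perpendicular to the
    planned needle path, so the needle is seen as a line rather than a
    point, and with no component along the table's long axis (the
    CRAN/CAUD angle of 0 degrees suggested above).

    The axis conventions are assumptions made for this illustration.
    """
    p = np.asarray(path_direction, float)
    p = p / np.linalg.norm(p)
    z = np.asarray(table_long_axis, float)
    z = z / np.linalg.norm(z)
    p_trans = p - (p @ z) * z              # path component transverse to the table axis
    if np.linalg.norm(p_trans) < 1e-9:
        # Path runs head-to-foot: any transverse direction is perpendicular to it.
        view = np.array([1.0, 0.0, 0.0])
    else:
        view = np.cross(z, p_trans / np.linalg.norm(p_trans))
    return view / np.linalg.norm(view)

print(progression_view_direction([0.3, 0.8, 0.5]))
```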
  • the combination of these techniques for planning and performing percutaneous interventions with the C-arm X-ray device makes it possible to align the needle using the laser beam, without exposing the doctor or patient to radiation, and facilitates the planning and evaluation of realizable needle paths when the C-arm or the patient table cannot be positioned so as to use the laser and X-radiation is used to perform the alignment.
  • the combination of hardware and software to accomplish the tasks described herein may be termed a platform or “therapy unit”.
  • the instructions for implementing processes of the platform may be provided on computer-readable storage media or memories, such as a cache, buffer, RAM, FLASH, removable media, hard drive or other computer readable storage media.
  • Computer readable storage media include various types of volatile and nonvolatile storage media.
  • the functions, acts or tasks illustrated or described herein may be executed in response to one or more sets of instructions stored in or on computer readable storage media.
  • the functions, acts or tasks may be independent of the particular type of instruction set, storage media, processor or processing strategy and may be performed by software, hardware, integrated circuits, firmware, micro code and the like, operating alone or in combination. Some aspects of the functions, acts, or tasks may be performed by dedicated hardware, or manually by an operator.
  • the instructions may be stored on a removable media device for reading by local or remote systems.
  • the instructions may be stored in a remote location for transfer through a computer network, a local or wide area network, by wireless techniques, or over telephone lines.
  • the instructions are stored within a given computer, system, or device.
  • where a term such as "data network", "web" or "Internet" is used, the intent is to describe an internetworking environment, including both local and wide area networks, where defined transmission protocols are used to facilitate communications between diverse, possibly geographically dispersed, entities.
  • An example of such an environment is the world-wide-web (WWW) and the use of the TCP/IP data packet protocol, and the use of Ethernet or other known or later developed hardware and software protocols for some of the data paths.
  • the internetworking environment is provided, in whole or in part, as an attribute of the facility in which the platform is located.
  • Wireless communication may include, audio, radio, lightwave or other technique not requiring a physical connection between a transmitting device and a corresponding receiving device. While the communication may be described as being from a transmitter to a receiver, this does not exclude the reverse path, and a wireless communications device may include both transmitting and receiving functions.
  • Such wireless communication may be performed by electronic devices capable of modulating data as signals on carrier waves for transmission, and receiving and demodulating such signals to recover the data.
  • the devices may be compatible with an industry standard protocol such as IEEE 802.11b/g, or other protocols that exist, or may be developed.
  • a medical workflow follows hospital routine procedures prior to and subsequent to the performance of diagnostic tests and treatment.
  • the patient will have been admitted to the hospital and assigned to a medical treatment specialty.
  • the patient demographic information may be entered into the hospital information processing system as part of the admitting procedures, and may be updated as needed.
  • the patient may be transported to the treatment room and positioned on the treatment table.
  • a transportation step may be manual or may use robotic devices.
  • robotic devices may be generally substituted for a human activity, or used to facilitate a human activity, as such robotic devices are being introduced into the hospital environment.
  • the use of such robotic devices may be presumed as at least an optional part of the workflow, unless specifically excluded.
  • Three dimensional imaging data may have been previously obtained during diagnosis. Often this data is suitable for at least the planning stage of the interventional treatment, as the target will have been already identified. Providing that the previously obtained data is satisfactory, the voxel data may be processed to produce images that are registered with images obtained by the C-arm X-ray device used during the interventional phase. Such images may have been taken with a breathing monitor, if the thoracic region is involved, using fixtures to position the patient, or by applying radio-opaque markers to the skin of the patient in the area of treatment.
  • the planning of intervention may be performed using synthetic images formed from the prior voxel data.
  • Such planning may be performed entirely as a synthetic (computational) process, using the voxel data and a parameter set locating the various components of the treatment apparatus, including the C-arm, patient table, the interventional device (e.g., the needle), the target, and the patient.
  • An objective of the planning process is to determine if the patient can be positioned such that the intervention can be performed with the needle aligned with the optical axis of the C-arm. Success in this planning would result in a “bull's eye view”.
  • Such planning may need to be performed in an iterative manner, as a first plan may result in a "generalized bull's eye view." The doctor may decide that the generalized bull's eye view is sufficient for the procedure.
  • the positioning may be re-planned so that the generalized bull's eye view lies closer to the optical axis of the C-arm system so as to make more efficient use of the collimator. This usually would result in re-positioning of the patient table.
  • the skin penetration point may be identified by the synthetic views and the radiation views used to align the axis of the needle, while confirming the coincidence of the aligned needle with the target. Both the active radiation view and synthetic views may be used in the process.
  • the positioning may be re-planned so that the “bull's eye view” is obtained.
  • the skin entry point and the target lie along the optical axis of the C-arm.
  • This may be confirmed, if desired, by radiation with a narrow collimation, or the alignment of the needle may be performed using a laser guide defining the optical axis of the C-arm.
  • the laser guide is usually attached perpendicularly to the X-ray detector, as the X-ray emitter is usually positioned beneath the patient table so as to reduce the radiation dosage to the patient's eyes when X-rays are being emitted.
  • the angulations that may be desired for obtaining progression views may be explored. That is, the oblique views which may be desired to confirm the progress of the needle along its path, and the depth of penetration, and the intersection of the needle with the target may be simulated. The entire image may be reconstructed, or only the entry point and target may be used. A synthetic path may be displayed so as to assist in visualizing the expected result. In this manner, the planning process also considers the possibility that satisfactory progression views may not be obtainable due to interference between the C-arm, other equipment, the patient, and the doctor. The interventional path may be re-planned to determine if a more satisfactory planned path for the needle may be determined.
  • the step of moving the patient into position on the table is performed so that the patient is placed in the preplanned position. This may be confirmed by taking X-ray images if deemed desirable.
  • the needle or other interventional device may be aligned by observing the laser spot on the patient skin, placing the penetrating end on the spot, and orienting the axis of the device along the laser beam.
  • a fixture attached so that the interventional device may be physically oriented may also be used. Once the interventional device has been oriented, one of a number of known techniques may be used to maintain the orientation prior to, during, or after penetration.
  • the C-arm is positioned using the planned angulation for penetration, and using a real time fluoroscopic image.
  • the penetrating point is positioned at the determined penetration point by comparing the real-time position with the synthetic bull's eye image, or by orienting the needle so that a real-time generalized bull's eye is obtained.
  • the interventional device may be held in position mechanically, similarly to the situation which obtains for the bull's eye case.
  • the procedure may now proceed using the aligned interventional instrument.
  • the C-arm angulation may be changed to one of the positions that may be suitable for progression views, and such additional X-ray images obtained as are considered needed by the doctor to verify or correct the path of the interventional instrument towards the target, and to verify reaching the target.
  • a method of planning or performing an interventional procedure 500 may include positioning a patient such that a three dimensional voxel image data set can be obtained (step 510 ).
  • the three dimensional voxel data set may be used to identify a target area or target volume to be accessed during the interventional procedure (step 520 ).
  • Volume rendering techniques may be used to plan the path of an interventional device (step 530 ).
  • the angulation of a C-arm X-ray device is preferably planned such that skin entry point and target in the patient are projected onto each other (step 540 ).
  • the positioning may be re-planned by, for example, repositioning the patient table to bring the target area closer to the optical axis of the C-arm (step 550 ).
  • Step 550 may be performed iteratively until a satisfactory result is obtained, although this may still not be a bull's eye situation.
  • an optional step may be performed to determine whether the C-arm may be positioned so as to obtain satisfactory progression views. This step is dependent on the complexity of the procedure to be performed.
  • the patient (if not already present) is placed on the patient table and the orientation of the patient with respect to the C-arm adjusted so as to conform to the planned location.
  • This location may be confirmed by sensors, by radiographic markers, or by the registration of fluoroscopic images with the previously obtained voxel data set (step 570 ).
  • the penetrating end of the interventional device is positioned with respect to the skin entry point using a laser, or a mechanical device oriented along the optical axis of the C-arm (step 580 ), and the long axis of the interventional device oriented so as to coincide with the optical axis.
  • Other laser configurations such as a fan beam, may be used. After alignment, the position may be maintained or regained using known techniques.
  • the penetrating end of the interventional device is located with respect to one of the target or a synthetic entry point on a real-time radiographic image, and the axis of the interventional device is oriented so as to lie along the path between the determined skin entry point and the target (step 590 ). After alignment, the position may be maintained or regained using known techniques.
  • the interventional procedure may be then performed by advancing the interventional device so as to penetrate the skin (step 600 ).
  • the needle for example, having been aligned as previously described is advanced so as to penetrate the patient skin along the pre-planned intervention path (step 610 ).
  • the C-arm is re-angulated so as to obtain a progression fluoroscopic view (step 620 ).
  • a synthetic line may be overlaid on the radiograph indicating the projection of the planned interventional path on the specific view. Intermediate positions may be presented in synthesized views.
  • the position of the C-arm relative to the room coordinate system is known through the previously performed registration steps.
  • Step 620 is performed as many times as the doctor considers necessary to achieve a satisfactory positioning of the needle with respect to the target, which may be, for example, a suspected tumor to be biopsied.
  • the angulation may be returned to that of the bull's eye view, or the generalized bull's eye view that was determined in steps 540 and 550 so as to verify or adjust the axial alignment.
  • the specific procedure may be performed (step 630 ).
  • the procedure may include obtaining a biopsy, administering a medication, applying cement, or the like, depending on the syndrome being treated or diagnosed.
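  • For orientation, the workflow of steps 510 through 630 can be collected into a single sketch; every method name below is a placeholder standing in for equipment or planning actions, not an actual system API, and the sketch only makes the ordering, the branch between laser and fluoroscopic alignment, and the iteration around step 550 explicit.

```python
def percutaneous_workflow(system, max_replanning=3):
    """Ordering of steps 510-630 described above.

    'system' is a placeholder object whose methods stand in for the
    planning and equipment actions; none of the names below are real
    device APIs.
    """
    voxels = system.acquire_voxel_data()                        # step 510
    target = system.identify_target(voxels)                     # step 520
    path = system.plan_interventional_path(voxels, target)      # step 530
    view = system.plan_angulation(path)                         # step 540
    attempts = 0
    while view != "bulls_eye" and attempts < max_replanning:    # step 550, iterated
        system.reposition_table(path)
        view = system.plan_angulation(path)
        attempts += 1
    system.position_patient(path)                               # steps 560/570
    if view == "bulls_eye":
        system.align_device_with_laser(path)                    # step 580
    else:
        system.align_device_under_fluoroscopy(path)             # step 590
    system.advance_device(path)                                 # steps 600/610
    while not system.target_reached(target):                    # step 620, repeated
        system.acquire_progression_view(path)
        system.correct_device_path(path)
    return system.perform_procedure(target)                     # step 630
```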

Abstract

A system and method for planning and performing a percutaneous medical procedure is described. A three dimensional data set of a patient is registered with respect to a patient position and an interventional apparatus, which may be a C-arm X-ray device. A target within the patient is identified in the image data, and a skin entry point chosen for planning the procedure. The image data set is processed so as to compute a two dimensional fluoroscopic overlay image upon which the target and the skin entry point are displayed. The angulation of the volumetric representation of the C-arm is controlled so as to plan the guiding path for an interventional device, and the planning attempts to achieve one of a bull's eye orientation or a generalized bull's eye orientation. The interventional device is aligned with the guiding axis to perform the procedure, which may be monitored using X-ray progression views.

Description

  • This application claims the benefit of U.S. provisional application Ser. No. 61/049,803, filed on May 2, 2008, which is incorporated herein by reference in its entirety.
  • TECHNICAL FIELD
  • The present application may relate to a system and method for performing percutaneous medical procedures using imaging modalities for instrument alignment and guidance.
  • BACKGROUND
  • With modern angiography X-ray systems (such as Axiom Artis made by Siemens, Munich, Germany), not only can 2-dimensional radiographic images be obtained but also, by rotation of the C-arm around the patient, 3-dimensional (3D), computed tomographic (CT)-like images can also be attained. These 3D data sets can be registered anatomically correctly to the 2D radiographic images (2D/3D registration) for purposes of, for example, intervention planning or intervention navigation. In particular, percutaneous punctures, for obtaining biopsies or administering treatment can be planned on these data sets. For all systems involving the exposure of patients and personnel to ionizing radiation, the total radiation dosage is a concern and should be held to the lowest value consistent with the medical benefit.
  • German Patent application DE 10200729119, filed on Jul. 25, 2007, and published as DE 10 2007 029 199 on Jan. 2, 2009, having a common inventor, entitled “Method for Aligning a Target Guidance System for a Puncture, and an X-ray Angiographic System,” describes a method of orienting a biopsy needle using a laser aligned along the planned puncture path. This application is incorporated herein by reference. The aiming of the laser includes manual adjustments (such as of the table position) in order to achieve a match between the needle and the planned path. This is attained by moving the C-arm, the patient support table, or both, so that certain markings on a display screen match. The needle can be aligned without further irradiation of the patient (and doctor); however, the approach is limited to the positions that are feasible for orientation of the C-arm or the patient support table, which may also limit the orientation of the central (optical) axis of the radiation beam with respect to the patient.
  • U.S. patent application Ser. No. 12/329,657, filed on Dec. 8, 2008, having a common inventor, and entitled “X-ray device and Workflow for Guiding Percutaneous Procedures,” describes a method which permits a general alignment of the needle with a planned puncture path (virtual path). This application is incorporated herein by reference. The C-arm is positioned in such a way that the puncture target and the primary penetration point of the body, which together define a puncture path, are projected onto the X-ray detector so as to coincide (known as a “generalized bulls-eye view”). This method generally results in fewer limitations on the orientations of the radiation beam, and manual movement of the patient support table is not needed. However, the method requires that the alignment be performed under X-radiation of the patient (and the doctor).
  • SUMMARY
  • A method of planning or performing a percutaneous medical procedure is disclosed, the method including the steps of obtaining three-dimensional image data of a patient, and orienting a representation of the patient, or the patient, on a patient support table of an interventional apparatus. The three-dimensional image data is registered with respect to the coordinate system of the oriented patient, and a percutaneous medical procedure is planned using the three-dimensional image data. The planned path is such that a guiding axis for an interventional device passes through a skin entry point and a target. The computed angulation of the interventional apparatus is adjusted such that one of a bull's eye or a generalized bull's eye image is obtained in a projection view.
  • In an aspect, a computer program product includes instructions for configuring a computer to accept three-dimensional image data of a patient and register the three-dimensional image data with respect to the patient. Images suitable for planning a guiding path between a skin entry point and a target in the patient are produced, and a planned path is one whose axis passes through the skin entry point and the target. The selected angulation of the planned path is one which meets one of a bull's eye or generalized bull's eye criterion. For the selected angulation, the relation between a volumetric representation of the interventional apparatus and the patient support table is computed so as to determine whether there would be a collision between the interventional apparatus and the patient support table.
  • BRIEF DESCRIPTION OF THE DRAWINGS
  • FIG. 1 is a block diagram showing the component equipment of a treatment unit for planning and performing a percutaneous medical procedure;
  • FIG. 2 illustrates schematically the orientation of a skin entry point and a target in a patient body on a radiograph for (A) a bull's eye view; (B) a generalized bull's eye view; and, (C) no alignment, where the representations of the skin entry point and the target are shown in a relative size for clarity in illustration; and
  • FIG. 3 is a workflow for a method of planning and performing the percutaneous procedure.
  • DETAILED DESCRIPTION
  • Exemplary embodiments may be better understood with reference to the drawings. In the interest of clarity, not all the routine features of the implementations described herein are described. It will of course be appreciated that in the development of any such actual implementation, numerous implementation-specific decisions must be made to achieve a developer's specific goals, such as compliance with system and business related constraints, medical protocols and regulatory requirements, and that these constraints will vary from one implementation to another.
  • A system for planning or performing a percutaneous procedure is disclosed. The system may comprise an imaging system having a movable arm, an X-ray source and an X-ray detector; and, a display and a system controller connected to and in communication with the imaging system and display, and include a machine-readable storage medium encoded with a computer program code such that, when the computer program code is executed by a processor, the processor performs a method as described herein.
  • The examples of diseases, syndromes, conditions, and the like, and the types of examination and treatment protocols described herein are by way of example, and are not meant to suggest that the method and apparatus are limited to those named, or the equivalents thereof. As the medical arts are continually advancing, the use of the methods and apparatus described herein may be expected to encompass a broader scope in the diagnosis and treatment of patients.
  • The following terms are used herein: A “patient 3-dimensional image data set” is a three dimensional numerical array whose elements hold the values of specific physical properties at small volumes in space inside the patient's body; each volume is known as a “voxel” and may be considered to be comparable to a “pixel” in a 2-dimensional image.
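  • As a minimal illustration only (the array size, voxel spacing, origin, and names below are assumptions for this sketch, not values from this disclosure), such a voxel data set may be held as a three-dimensional array together with a spacing and origin that map array indices to positions in the patient coordinate system:

    import numpy as np

    volume = np.zeros((256, 256, 200), dtype=np.int16)   # hypothetical CT-like voxel values
    spacing = np.array([0.5, 0.5, 1.0])                   # assumed voxel size in mm along x, y, z
    origin = np.array([-64.0, -64.0, -100.0])             # assumed patient-space position of voxel (0, 0, 0)

    def voxel_to_patient(index_xyz):
        """Map an (i, j, k) voxel index to a position, in mm, in the patient coordinate system."""
        return origin + np.asarray(index_xyz, dtype=float) * spacing

    print(voxel_to_patient((128, 128, 100)))              # the voxel at the array centre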
  • A “multiplanar reformation image (MPR)” is a planar cross-section of the patient 3-dimensional image data set generated by cutting through the three-dimensional data set at some orientation (e.g., axial, coronal, sagittal, or oblique), which may be a slice of CT or CT-like data.
  • A “fluoroscopic image” or “radiographic image” is a two-dimensional x-ray projection image showing internal tissues of a region of the body. This image may be either live or synthesized (rendered) from previously obtained three dimensional voxel data. Two-dimensional representations of the 3D volume data set generated using volume rendering techniques may also be referred to as fluoroscopy overlay images. Fluoroscopy overlay images may include graphical representations of instrument trajectories or of instruments together with the surrounding anatomy.
  • A “live fluoroscopic image” is a sequence of x-ray images taken successively showing live movement of internal tissues of a region of the body. A “combined image” is an image in which an X-ray image is combined with a 2D representation of a 3D data set computed using volume rendering techniques. Such images may also be overlaid with icons or other representations of a target, a skin entry point, or a planned or estimated position of an interventional device.
  • “Registering” or “co-registration” means aligning an X-ray image, such as a fluoroscopic image, with the patient 3-dimensional image data set such that associated features of the X-ray image and a two-dimensional overlay image generated from the previously obtained patient 3-dimensional image data set appear at the same location on a display when the X-ray image and the overlay image are shown together. Registration can be point-based or gray-level based. In point-based registration, a transform is applied to the 3-dimensional image data set such that points in the resulting overlay image line up with their counterparts in the x-ray image and meeting a metric, which may be analytic or visual. Gray-level based registration techniques determine the transform not by minimizing the distance between associated points in the overlay image and X-ray image, but by minimizing an error metric based on the resulting overlay image gray levels and the X-ray image gray levels.
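  • The following is a minimal sketch of the point-based case, assuming paired fiducial points have already been identified in both coordinate systems; the least-squares (Procrustes) fit shown is one common way to estimate such a rigid transform and is not asserted to be the specific transform used by this system:

    import numpy as np

    def fit_rigid_transform(src, dst):
        """Estimate rotation R and translation t minimising ||R @ src[i] + t - dst[i]|| over paired points."""
        src, dst = np.asarray(src, float), np.asarray(dst, float)
        src_c, dst_c = src.mean(axis=0), dst.mean(axis=0)
        H = (src - src_c).T @ (dst - dst_c)
        U, _, Vt = np.linalg.svd(H)
        R = Vt.T @ U.T
        if np.linalg.det(R) < 0:        # guard against a reflection solution
            Vt[-1, :] *= -1
            R = Vt.T @ U.T
        t = dst_c - R @ src_c
        return R, t

    # Example with three hypothetical fiducial pairs (mm): a pure translation of (10, 5, 0).
    src = [[0, 0, 0], [100, 0, 0], [0, 100, 0]]
    dst = [[10, 5, 0], [110, 5, 0], [10, 105, 0]]
    R, t = fit_rigid_transform(src, dst)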
  • “Interventional device” or “Interventional Instrument” refers to any object which may pierce tissue of a patient, a non-limiting listing of which include needles and other biopsy devices, screws, implants, cannula, endoscopes, and anything else that can be inserted into a patient's body percutaneously.
  • An “Interventional apparatus” refers to an imaging modality such as a C-arm X-ray device for obtaining internal image data of a patient, or other device performing a similar function.
  • A “skin entry point” is the position on a patient's skin at which an interventional device is inserted. “Skin entry point data” is data representative of the skin entry point within the patient 3-dimensional image data set or within two X-ray views taken under different view orientations using a triangulation technique.
  • A “target” or “target point” is a point within the body of a patient that is the object of a percutaneous procedure.
  • An “interventional path” is a line between the skin entry point and the target point (the “target”).
  • “Instrument trajectory” is a desired motion of the instrument along the interventional path.
  • A “progression view” is an x-ray image taken at an oblique angle with respect to a line joining the skin entry point and the target.
  • A “collimator” is a device used to narrow the radiation field to a size needed for the examination at hand. For this purpose, the collimator may have sets of lead plates providing either a round or a square-shaped radiation field. The aperture of the collimator may be adjusted either automatically or manually, depending on the system. Many collimators are axially symmetric and are oriented so that the axis of symmetry of the collimator is aligned with the optical axis of the X-ray system.
  • An “optical axis” (or “central axis”) of a C-arm X-ray device is defined by a line orthogonal to a detector and passing through the center of radiation of an X-ray source. The optical axis is usually positioned so as to pass through a central point of the detector, and is aligned with the C-arm X-ray device's central ray of the X-ray radiation cone.
  • A “Bull's Eye View” is an x-ray view under which a target point and another point along the instrument trajectory are projected onto each other. The other point along the instrument trajectory may be the skin entry point. The view direction of the imaging modality can be visualized using a graphical overlay in which the target point and skin entry point, forward-projected from 3-dimensions to 2-dimensions, are displayed as individual circles. If the Bull's Eye View has been reached, these two circles are projected at the same 2-dimensional position (i.e., they appear concentrically aligned). Further, the skin entry point and the target point lie on the optical axis of the X-ray device.
  • A “Generalized Bull's Eye View” has the same properties as a “Bulls Eye View”; however, a ray different from the central ray of the x-ray cone is used to project target and another point along the instrument trajectory onto each other. As a consequence, the concentrically aligned circles are not located on the optical axis of the X-ray device. In an aspect, where a central point of a flat panel display may be considered to represent the optical axis of the X-ray device, the bull's eye view places the aligned circles at the central point, whereas the generalized bulls-eye view places the aligned circles at any other point on the flat panel detector.
  • The terms “bulls-eye” view and “generalized bulls-eye” view are conceptually adopted from the sport of target shooting, where a target having concentric rings is placed at a distance and a firearm aimed at the central ring and discharged. The central, smallest ring, which is typically a solid circle, is known as the “bull's eye”. Aiming the firearm, whether using “iron sights” or a telescopic sight, includes the steps of generally pointing the firearm at the target, and aligning longitudinally displaced front and rear sights of the firearm so that the sights and the bulls eye of the target coincide. This is considered to be the alignment that will place the bullet in the central ring. Where a telescopic sight is used, a reticle or crosshair is provided to be aligned with the target bulls eye. It might also be visualized as a “down-the-gunbarrel view,” or “down-the-barrel view”.
  • The analogy is not so stretched, as the system and method described herein may be used, for example, with a biopsy gun. In the present context, aiming refers to an angulation of the C-arm of the X-ray device such that a needle appears as a circle in a radiographic image, rather than as a line. That is, a near end of an object such as a needle, and the far end of the object, appear to be collapsed into a circle when projected onto the flat panel detector using the radiated X-rays, or in a computer simulation of the same orientation. Where a target, such as a bodily structure or organ to be treated or investigated, such as by biopsy, also coincides with the circle, bull's eye aiming has been performed. When planning is performed on a C-arm X-ray device, for example, this also means that a long axis of the needle is aligned along the optical axis of the C-arm device, else the bull's eye circle will not be obtained. This circumstance also places the bull's eye in the center of the flat panel detector, providing that the flat-panel detector is disposed so as to be orthogonal to the optical axis and symmetrically arranged with respect thereto.
  • The bull's eye situation may be optimum from the viewpoint of minimizing the radiation exposure of the patient, as the fluoroscopic images that may be needed during an alignment procedure for the needle may be obtained with the optimum adjustment of the collimator. Such collimator devices are typically axially symmetric with the X-ray beam axis.
  • In a “Generalized Bull's Eye View” the C-arm has an angulation direction such that the start point of an object, such as a needle, is projected onto its end point on a 2D detector. However, the position at which the X-ray passing through the object start point and the object end point hits the detector can be anywhere on the detector. In this circumstance, the entry point and the target are not on the optical axis of the X-ray system.
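  • A minimal computational sketch of these two view criteria follows; the pinhole projection geometry, tolerance, and variable names are assumptions for illustration, and the detector axes are taken to be orthonormal vectors in the detector plane:

    import numpy as np

    def project(point, source, det_center, det_normal, u_axis, v_axis):
        """Project a 3D point along the ray from the X-ray source onto the detector plane,
        returning 2D coordinates (mm) relative to the detector centre."""
        ray = point - source
        s = np.dot(det_center - source, det_normal) / np.dot(ray, det_normal)
        hit = source + s * ray
        return np.array([np.dot(hit - det_center, u_axis), np.dot(hit - det_center, v_axis)])

    def classify_view(target, entry, source, det_center, det_normal, u_axis, v_axis, tol_mm=2.0):
        p_t = project(np.asarray(target, float), source, det_center, det_normal, u_axis, v_axis)
        p_e = project(np.asarray(entry, float), source, det_center, det_normal, u_axis, v_axis)
        if np.linalg.norm(p_t - p_e) > tol_mm:
            return "no alignment"                    # compare FIG. 2C
        if np.linalg.norm(p_t) <= tol_mm:            # shared projection coincides with the detector centre
            return "bull's eye view"                 # compare FIG. 2A
        return "generalized bull's eye view"         # compare FIG. 2B

    source = np.array([0.0, 0.0, -600.0])
    det_center, det_normal = np.array([0.0, 0.0, 600.0]), np.array([0.0, 0.0, 1.0])
    u_axis, v_axis = np.array([1.0, 0.0, 0.0]), np.array([0.0, 1.0, 0.0])
    print(classify_view([0.0, 0.0, 50.0], [0.0, 0.0, -100.0], source, det_center, det_normal, u_axis, v_axis))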
  • A 3D (volumetric) image data set of a patient may be acquired using a C-arm X-ray device, a computerized tomographic unit (CT), a Magnetic Resonance Imager (MRI), or any other suitable imaging modality. This 3D data may be obtained contemporaneously with the planning and execution of a medical procedure, or at an earlier time.
  • In an example, a C-arm X-ray device 10 as shown in FIG. 1 may be representative of imaging modalities which may be used in the system and method described herein. Other imaging modalities such as MRI, ultrasound (US), CT, and the like may be used for all or part of the procedure.
  • A C-arm 26 has a radiation source 11 and a detector 14 attached thereto, and the C-arm may be moved in arcuate or other three-dimensional paths, around, or partially around the patient 20. The patient may be positioned on a patient support table 25. A collimator (not shown) may be associated with the radiation source 11 so as to limit the angular field and direction of the radiation 6.
  • The C-arm X-ray device 10 may be rotated such that a sequence of projection X-ray images is obtained by an X-ray detector 14, which may be a flat panel solid state two-dimensional detector positioned on an opposite side of the patient 20 from the X-ray source 11 so as to obtain data for 3-dimensional imaging and produce CT-like data.
  • The X-ray detector 14 may be an amorphous silicon (a-Si), amorphous selenium (a-Se), PbI2, CdTe, or HgI2 detector, or the like, using direct detection or TFT technology, or an indirect detector, as is known in the art or may be subsequently developed, to provide high-resolution, high-dynamic-range, essentially real-time X-ray detection.
  • A C-arm X-ray device does not completely surround a patient, as does a conventional CT apparatus. Thus, the C-arm X-ray device may provide more convenient access to the patient during interventional treatment, without moving the patient with respect to the imaging modality, or disconnecting monitoring and therapy equipment. The CT-like data and corresponding fluoroscopic data may be obtained with the same device. Images can be reconstructed so as to form CT-like voxel data sets, and segmented by any technique of image or data processing for realizing computed tomographic (CT) images and representations thereof.
  • The C-arm X-ray unit and the associated image processing may be of the type described in US PG-Pub Application US2006/0120507, entitled “Angiographic X-ray Diagnostic Device for Rotational Angiography”, filed on Nov. 21, 2005, which is incorporated herein by reference.
  • A patient support table 25 may be used for some or all of the examination steps and thus may transfer the patient 20 between various sensors or otherwise position the patient 20. The patient support table 25 is trackable, using motion or position sensors, so that the table may be located and relocated with respect to the coordinate system of the C-arm device 10. The motion sensor 15 may transmit data to the image processor 38 through a wired connection or in wireless form. The mitigation of motion artifacts may include accounting for motions that are due to breathing. For example, a chest belt using suitable sensors may be used to ascertain the breathing amplitude and frequency, and to initiate corrective calculations in the image processor 38 or control the timing of the X-ray images using the X-ray controller 28, so that motion artifacts are mitigated. A variety of video displays 33, which may be flat panel displays, may be provided to present images and data for manipulation and analysis.
  • A computer device 70 may be a notebook PC computer, or other processing device with which the demographic, history, diagnosis or therapy data of the patient can be recorded, called up and sent to and from the medical information management system of the hospital over a local area network (LAN) or the like. The computer device 70 may be provided with a data interface for retrieving data from an HMO (health maintenance organization), health insurance smart card, or other patient data base, and may be connected to the remainder of the therapy suite by a wired or a wireless connection. A user input device 71, such as a keyboard, computer display device, and mouse, may be provided for manual input and control. The computer device may also perform signal and data processing operations that have been described herein as being performed by separate computer devices, and the allocation of system functions to one or more computers is a matter of design choice, which may be expected in different embodiments.
  • Additional, different, or fewer devices may be provided in a therapy suite. The devices and functions shown are representative, but not inclusive. The individual units, devices, or functions may communicate with each other over cables, wires, or in a wireless manner.
  • FIG. 2A shows a schematic plan view of a radiographic image that would be produced where a bull's eye view orientation of a target 200 and an interventional instrument, which may be a needle 250, is achieved. The image area 300, produced by, for example, X-ray detector 14 receiving radiation emitted by radiation source 11, is centered so that the central point 350 of the image area 300 lies on, and is perpendicular to, the optical axis of the C-arm device 10. That is, the radiation source 11 and the X-ray detector are mounted to the C-arm 26 and disposed such that a central ray of the emitted X-radiation is aligned with and passes through the needle 250 and the target 200. The target 200 is representative of the internal location in the patient 20 towards which the needle 250 would be advanced during the interventional procedure. The target may be visualized in the radiographic image taken by the X-ray device, or be synthesized using the previously acquired voxel data for the patient. Similarly, the position of the needle 250 on the central axis may be computed and displayed on the synthesized image without emitting radiation. Combinations of synthesis and real-time radiographs (fluoroscopy) may also be used.
  • Where a bull's eye alignment is achieved, a symmetrical collimator may be used to minimize the angular width of the radiation field 6 needed to obtain further radiographic images. A laser pointer, or fan beams, whose sources may be mounted to the detector 14, may be used to align the needle 250 with the optical axis of the interventional apparatus.
  • FIG. 2B illustrates a situation where the alignment of the synthetic image of the needle 250 with the target 200 (whether a synthesized target image or a real-time radiograph) is displaced from the central point 350 of the image. This is a generalized bull's eye view. That is, a synthesized image of a needle 250 may be aligned so as to point towards the target 200; however, the long axis of the needle 250 does not coincide with the optical axis of the C-arm device 10. In this circumstance the needle may be guided or aligned by a mechanical fixture, as the laser pointer approach was intended to be used along the optical axis of the imaging modality. In order to position the needle 250 so that the skin entry point and the target 200 are along the axis of the needle 250, X-ray radiation may be needed so that the long axis of the needle can be aligned. So, for a generalized bull's eye view, the final alignment of the needle 250 is generally done under X-ray guidance. As the area to be imaged is off of the optical axis of the X-ray device 10, a collimator may have to be adjusted to permit a wider cone angle of emitted radiation, particularly if the collimator is limited to symmetrical X-ray beams.
  • FIG. 2C shows a situation where the needle position is neither a bull's eye view nor a generalized bull's eye view. In this situation, the planning process has not resulted in a satisfactory solution. The planning process may be continued; however, other criteria may have to be changed, including, for example, the position of the patient 20 on the patient support table 25.
  • The adjustment of the needle 250 under radiation conditions requires additional safety precautions, and the doctor may use tongs or other holding devices to manipulate the needle 250 while remaining outside of the direct radiation field.
  • The sensor portions of the therapy unit may be located in a therapy room. Some of, or all of, the signal and data processing and data display may also be located in the therapy room; however, some of, or all of, the equipment and functionality not directly associated with the sensing of the patient and the imaging modality data sensors, may be remotely located. Such remote location of portions of the equipment may be facilitated by high-speed data communications on local-area networks, wide-area networks, or the Internet. The therapy unit may thus be located remotely from the specialists making the diagnosis and for determining the appropriate course of treatment. Of course, the specialists may also be present with the patient in the treatment room.
  • The 3-dimensional image data set used to identify the target or to plan and monitor the procedure may be obtained using a variety of known image generating systems in which typical targets (e.g., tumors) can be seen clearly. Examples of such systems include magnetic resonance imaging (MRI), Positron emission tomography (PET), computer tomography (CT), and C-arm X-ray. If the target is visible using 2D X-ray imaging, the target may be localized using multiple x-ray views and triangulation techniques. The 3-dimensional image data set may be obtained by taking a plurality of X-ray images acquired under different view directions by the C-arm X-ray device.
  • Where the 3D image data set is obtained by other than the C-arm X-ray device, images obtained by the C-arm X-ray device and a 3D image data set obtained by another imaging modality may be registered by any of the known registration methods, or by registration methods which may be developed in the future. When the C-arm X-ray device has been used to obtain the 3D data set, a 2D radiographic image may be computed by, for example, a maximum intensity projection (MIP) algorithm or any other volume rendering technique using the 3D data set as the source data. When the same system has been used for both 2D and 3D imaging, the fluoroscopic images and rendered projections are intrinsically registered. The registration step ensures that the fluoroscopic (X-ray) images of the patient obtained using the C-arm device to be used for the procedure match the images of the patient constructed from the 3-dimensional data set. This enables instrument positioning using information on target position obtained from the 3-dimensional data set.
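  • As a minimal sketch, a maximum intensity projection along one axis of the voxel data set may be computed as shown below; a real system would cast rays along the selected C-arm view direction, and the data here are hypothetical:

    import numpy as np

    def mip(volume, axis=2):
        """Collapse a 3D voxel data set to a 2D image by taking the maximum value along one axis."""
        return volume.max(axis=axis)

    volume = np.random.randint(0, 2000, size=(128, 128, 96)).astype(np.int16)   # hypothetical voxel data
    overlay = mip(volume)                                                        # a 128 x 128 projection image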
  • The C-arm X-ray apparatus may be fitted with a collision control system so as to maintain a relationship of the physical aspects of the C-arm X-ray device with respect to the patient, the attending staff, ancillary equipment and the like. This collision avoidance system may use any of a variety of sensors, which may be acoustic, optical or electromagnetic, as well as motion and distance sensors attached to or integrated with the X-ray apparatus. Since the motion of the C-arm may be controlled robotically, the C-arm may be fitted with sensors so that the spatial location of the C-arm with respect to the laboratory coordinate system, and to other devices located within the laboratory coordinate system may be determined so as to plan or execute a motion and to predict or avoid interference (“collisions”).
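  • A minimal sketch of such a pre-check follows, using bounding spheres as a stand-in for the volumetric models of the C-arm, table, and patient; the disclosure does not specify the geometric representation, so the shapes, positions, radii, and safety margin below are assumptions:

    import numpy as np

    def spheres_collide(center_a, radius_a, center_b, radius_b, margin_mm=50.0):
        """Return True if two bounding spheres, padded by a safety margin, would intersect."""
        d = np.linalg.norm(np.asarray(center_a, float) - np.asarray(center_b, float))
        return d < radius_a + radius_b + margin_mm

    detector_housing = ([0.0, 450.0, 0.0], 250.0)   # hypothetical centre (mm) and bounding radius
    table_corner = ([0.0, 150.0, 300.0], 400.0)
    print(spheres_collide(*detector_housing, *table_corner))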
  • The planned orientation of the X-ray apparatus and the patient, including any motion of the patient support table may be simulated by manipulating the reconstruction of the 3D and 2D image data during the planning of the procedure. In this procedure, the orientation of the C-arm is changed analytically and images are reconstructed from the image data base so as to represent data obtained from the particular angulation selected. Without moving the actual equipment, the equipment motions may thus be simulated within the known working environment so as to enable visualizing of the procedure. In this manner, it may be determined if the radiation (optical) axis of the C-arm X-ray device can be brought into an orientation such that, based on a “bulls-eye” view, a laser may be used to define the virtual path of the intervention device (e.g., biopsy needle), without a collision occurring, or exceeding any other equipment capability.
  • Alternatively, the same procedure and computer programs may be used to determine whether the C-arm X-ray device may be oriented so as to project the virtual path onto the detector in a “generalized bulls eye” view. This circumstance may be more efficient in terms of the number of procedural steps required, but may expose the patient and physician to higher radiation doses.
  • If the path cannot be planned so as to achieve one of the “bull's eye” or “generalized bull's eye” orientations as described above, the path and angulation of the C-arm needs to be re-planned. This re-planning may be so as to reorient the planned interventional device path so as to use the laser orientation, or the generalized bull's eye view. A generalized bull's eye view may also be re-planned to bring the orientation of the interventional path into better alignment with the optical axis, or to achieve a bull's eye view. The re-planning can be performed as described above, using a display of the 3D and 2D data and a graphical user interface, or by the use of a device configured such that it may be tracked on the skin surface of the patient by the location/collision-avoidance system, or by a device-specific tracking system.
  • Once the procedure path has been planned, including identifying the puncture location on the patient body and the orientation of the puncture needle, the doctor aligns the puncture needle. Where the C-arm X-ray device can be aligned so that the central radiation (optical) axis passes through the puncture location and the internal target location, a laser may be used to align the puncture needle with the virtual path of the planned procedure. Using a laser alignment apparatus and method may not require additional X-radiation during the alignment step. Where the generalized bull's eye orientation is used, the alignment of the interventional device may use X-radiation.
  • The progress of the procedure may be monitored by one of virtual images, derived from the motion sensors and position sensors previously described, or by obtaining progression view X-ray images at periodic intervals. A combination of the two techniques may be used so as to conform to the doctor's experience in obtaining the required level of precision in guidance of the interventional device.
  • In an aspect, the process of aligning a needle along the interventional path selected for the procedure, or re-planning the path, may track a skin entry point (puncture point) using a position sensor which may be placed on the patient's skin. The position sensor, which may be optical or electromagnetic, may be secured to, or integrated into, a device that is capable of determining both the location and the direction of a vector aligned with the interventional instrument. If this position sensor is moved on the surface of the skin, an entry point can be identified with the position of the position sensor, and the entry point re-determined with respect to a previously established target (usually within the body) until a satisfactory interventional plan has been made. During the re-planning stage, the virtual needle path may be plotted in the 3D volume or with respect to a fluoroscopic overlay image derived from the 3D data set using volume rendering techniques.
  • Once a suitable needle guidance path has been found and the entry point thus determined, then, for instance, one can make a marking on the skin for use in making the puncture during the procedure (such as a color marking, optionally combined with contrast agent injection, a small puncture, or the like). If the instrument is a puncture guide needle, the needle can be inserted into the skin surface (after local anesthesia) and thus fixed in position.
  • A variety of positioning devices may be used, such as a SeeStar (available from RADI Medical Systems AB, Uppsala, Sweden), a flexible securing arm with mounting provisions for instrument guidance (a guide sleeve, instrument securing or fastening device), or the like. These devices may be robotically controllable. With such a robotic device, the puncture direction may be defined without touching the patient. The SeeStar device is an instrument guide that produces an elongated artifact in an X-ray image. This elongated artifact indicates the trajectory of an instrument inserted through the SeeStar, and thus one can determine whether the selected trajectory will intersect the target as desired. An elongated radio-opaque object may also be used to represent the needle during planning, including the orientation of a locating fixture.
  • Once the needle path direction is fixed, one can introduce the instrument into the patient. During this process, care would be taken so that the position of the patient did not change and that the puncture is made during the same respiratory phase used to determine the guidance path. In another aspect, radio-opaque markers may be temporarily affixed to the patient throughout the process, so that a radiograph can quickly confirm the correct positioning.
  • In an alternative, a hand-guided device such as a biopsy gun may be tracked by devices similar to those previously described, and the guidance path of the intervention displayed on the previously obtained images. If the orientation of the biopsy gun is determined to be sufficiently well positioned with respect to the target, then the gun can be activated to obtain a biopsy sample. Such hand-guided devices may be designed and constructed so as to be powered by batteries contained in the device, so that power cords may be avoided and the maneuvering of the device in a congested area simplified.
  • In another aspect, an ultrasound device, whose location and orientation can be determined, e.g., by localizing a guide sleeve attached to it, may be used to orient a needle or other interventional device. In this case, the guidance path of the needle and the skin entry point can also be displayed in the ultrasound images. Moreover, the position and orientation of the guide sleeve may be tracked during the procedure and an updated virtual image may be displayed showing the progress of the needle with respect to the planned needle path.
  • The planning process for determining the puncture location and the orientation of the guidance path so as to reach the target may use the 3D data previously obtained and projected in a forward-looking (“endoscopic”) view so as to enable visualization of the internal bodily tissues along the path of the planned intervention; that is, the view is along the virtual path to the target. Further, other views from the 3D data set may involve segmenting the image so as to, for example, remove the soft tissue, while leaving other bodily parts such as bones and blood vessels in place. This assists in determining if the planned interventional path will encounter obstacles such as bone, or if the interventional path intersects with blood vessels or the like.
  • An analogous 3D data set could be created with an ultrasound head that is movably mounted and then tracked with respect to the securing arm of a guiding device. In an aspect, a 2D ultrasonic (US) transducer producing a forward-looking view may also be used as a planning tool. Both the target and the direction of the percutaneous intervention can be determined. If the orientations of the US transducer and the C-arm are recorded with reference to each other, the US path planning information can be communicated to the 3D C-arm CT data set so as to facilitate aligning the C-arm.
  • If the “generalized bulls-eye view” is used for instrument guidance, the projection of the skin entry point and “target” may not be in the central region of the detector but in some outer region that cannot be shielded by a central collimator. This may be mitigated by movement of the table on which the patient is positioned. The table may be moved such that the “generalized bulls-eye view” extends through the central collimation opening. Further, if the table can be moved such that the “generalized bulls-eye view” coincides with the optical axis of the C-arm, then the conventional “bulls-eye view” orientation will have been reached. In this situation laser guidance can be used. However, where the “generalized bulls-eye view” and the optical axis of the C-arm are not in agreement, then the instrument may be aligned using fluoroscopy; this results in additional X-ray exposure. The “bulls-eye view” and the “generalized bulls-eye view” become the same view when the C-arm and the table have been oriented such that the planned instrument trajectory in the patient extends through the isocenter of the C-arm.
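  • The following minimal sketch (names and coordinates are purely illustrative) computes the table translation that would move the planned entry-target line through the C-arm isocenter, the condition under which the two views coincide:

    import numpy as np

    def table_shift_to_isocenter(entry, target, isocenter):
        """Translation to apply to the patient/table so the entry-target line passes through the isocenter."""
        entry, target, isocenter = (np.asarray(p, float) for p in (entry, target, isocenter))
        direction = (target - entry) / np.linalg.norm(target - entry)
        nearest = entry + np.dot(isocenter - entry, direction) * direction   # point on the line closest to the isocenter
        return isocenter - nearest

    print(table_shift_to_isocenter([120.0, -60.0, 30.0], [80.0, -20.0, 90.0], [0.0, 0.0, 0.0]))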
  • The C-arm device may have a collimator capable of automatic/dynamic collimation around the planned instrument path, or the tracked path, where the instrument path is tracked so as to reduce the area subject to radiation during each step of the procedure. Depending on the specific equipment configuration, such collimation may be restricted to symmetry about the optical axis of the C-arm, and this may influence the planning of the intervention. Collimators having asymmetrical properties may also be used. They may be better suited for shielding when working under a “generalized bulls-eye view”.
  • As an additional aid to ascertain how precisely the planned needle path agrees with the (X-ray) optical axis of the C-arm system, the optical axis may be considered to be defined by the direction of the laser beam. The deviation from the target point (for example, in degrees or cm) may be displayed at the user display of the C-arm system. This may also be represented as a “traffic light” display where the color of an indicator changes as one gets closer to the target.
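  • A minimal sketch of such feedback, with angle thresholds chosen only for illustration, is:

    import numpy as np

    def deviation_deg(path_direction, optical_axis):
        """Angle, in degrees, between the planned needle path and the laser-defined optical axis."""
        a = np.asarray(path_direction, float); a /= np.linalg.norm(a)
        b = np.asarray(optical_axis, float);   b /= np.linalg.norm(b)
        return float(np.degrees(np.arccos(np.clip(np.dot(a, b), -1.0, 1.0))))

    def traffic_light(angle_deg, green_deg=1.0, yellow_deg=5.0):
        """Map the remaining deviation to a simple indicator colour."""
        if angle_deg <= green_deg:
            return "green"
        return "yellow" if angle_deg <= yellow_deg else "red"

    angle = deviation_deg([0.05, 0.0, 1.0], [0.0, 0.0, 1.0])
    print(angle, traffic_light(angle))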
  • Using such a feedback mechanism, the procedure may be commenced by moving the C-arm into a “generalized bulls-eye view”. When using the laser, the table can be manually moved until the patient is positioned in such a way that the C-arm can be moved into a viewing direction in which the optical system axis intersects the target point. Once the patient and the C-arm have been moved to the correct relative location, the laser may be turned on. If the C-arm is moved manually, then a haptic controller may be used to assure that the position can be approached in a controlled way but without overshooting. The table in particular can be moved automatically to a position favorable for the laser mode.
  • During the interventional procedure, the C-arm may be reoriented so as to follow the progress of the intervention. At first, the C-arm may have been oriented in the “bull's eye view;” however, this orientation may not be suitable for visualizing the internal location (the progress) of the interventional device, as the distance into the patient body would be along the length of the instrument, which may be a needle that is visualized as a point or small circle in the “bull's eye view”. Oblique views would be used rather than axial views to show the progression of the device.
  • The C-arm angulation would need to be changed to obtain a first progression view at an appropriate oblique angle. It would perhaps also be changed again so as to obtain a second progression view, and the like. During this reorientation of the angulation of the C-arm, it may be possible for collisions to occur. That is, the physical structure of the C-arm and attachments thereto may intrude on the physical space of other equipment, personnel, or the patient. Collision may be avoided by using the coordinate location information of the overall system and component parts thereof, by using collision sensors, or by limiting the motions of the C-arm during the course of an intervention. For example, rotations about the patient may be performed along the center plane (using a CRAN/CAUD angle of 0 degrees). Thus, in moving from one orientation to another, one may first retract the angulation of a C-arm (in the CRAN/CAUD direction) and then approach a new rotation angle (in the LAO/RAO direction), and then set the new angulation. This would provide some certainty of the motions of the C-arm device that could occur, particularly when the doctor is standing close to the patient, but has to pivot the C-arm sharply so as to obtain the desired angulation.
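  • The conservative motion sequence described above may be expressed, purely as an illustrative sketch with assumed angle names and command format, as:

    def reorientation_sequence(target_lao_rao_deg, target_cran_caud_deg):
        """Commands to reach a new angulation: retract CRAN/CAUD, rotate LAO/RAO, then re-apply CRAN/CAUD."""
        return [
            ("set_cran_caud", 0.0),                  # first retract the angulation to the centre plane
            ("set_lao_rao", target_lao_rao_deg),     # rotate about the patient within the centre plane
            ("set_cran_caud", target_cran_caud_deg)  # finally set the new cranial/caudal angulation
        ]

    for command in reorientation_sequence(45.0, -15.0):
        print(command)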
  • Using the combination of the techniques described herein for planning and performing percutaneous interventions with the C-arm X-ray device allows the needle to be aligned using the laser beam, without exposing the doctor or patient to radiation, and facilitates planning and evaluating realizable needle paths in situations where the C-arm or the patient table cannot be positioned so as to use the laser and X-radiation is used to perform the alignment.
  • The combination of hardware and software to accomplish the tasks described herein may be termed a platform or “therapy unit”. The instructions for implementing processes of the platform may be provided on computer-readable storage media or memories, such as a cache, buffer, RAM, FLASH, removable media, hard drive or other computer readable storage media. Computer readable storage media include various types of volatile and nonvolatile storage media. The functions, acts or tasks illustrated or described herein may be executed in response to one or more sets of instructions stored in or on computer readable storage media. The functions, acts or tasks may be independent of the particular type of instruction set, storage media, processor or processing strategy and may be performed by software, hardware, integrated circuits, firmware, micro code and the like, operating alone or in combination. Some aspects of the functions, acts, or tasks may be performed by dedicated hardware, or manually by an operator.
  • In an embodiment, the instructions may be stored on a removable media device for reading by local or remote systems. In other embodiments, the instructions may be stored in a remote location for transfer through a computer network, a local or wide area network, by wireless techniques, or over telephone lines. In yet other embodiments, the instructions are stored within a given computer, system, or device.
  • Where the term “data network”, “web” or “Internet” is used, the intent is to describe an internetworking environment, including both local and wide area networks, where defined transmission protocols are used to facilitate communications between diverse, possibly geographically dispersed, entities. An example of such an environment is the world-wide-web (WWW) and the use of the TCP/IP data packet protocol, and the use of Ethernet or other known or later developed hardware and software protocols for some of the data paths. Often the internetworking environment is provided, in whole or in part, as an attribute of the facility in which the platform is located.
  • Communications between the devices, systems and applications may be by the use of either wired or wireless connections. Wireless communication may include, audio, radio, lightwave or other technique not requiring a physical connection between a transmitting device and a corresponding receiving device. While the communication may be described as being from a transmitter to a receiver, this does not exclude the reverse path, and a wireless communications device may include both transmitting and receiving functions. Such wireless communication may be performed by electronic devices capable of modulating data as signals on carrier waves for transmission, and receiving and demodulating such signals to recover the data. The devices may be compatible with an industry standard protocol such as IEEE 802.11b/g, or other protocols that exist, or may be developed.
  • Generally a medical workflow follows hospital routine procedures prior to and subsequent to the performance of diagnostic tests and treatment. The patient will have been admitted to the hospital and assigned to a medical treatment specialty. The patient demographic information may be entered into the hospital information processing system as part of the admitting procedures, and may be updated as needed.
  • In a workflow method example, the patient may be transported to the treatment room and positioned on the treatment table. Such a transportation step may be manual or may use robotic devices. Herein, it would be understood by a person of skill in the art that robotic devices may be generally substituted for a human activity, or used to facilitate a human activity, as such robotic devices are being introduced into the hospital environment. The use of such robotic devices may be presumed as at least an optional part of the workflow, unless specifically excluded.
  • Three dimensional imaging data (voxel data) may have been previously obtained during diagnosis. Often this data is suitable for at least the planning stage of the interventional treatment, as the target will have been already identified. Providing that the previously obtained data is satisfactory, the voxel data may be processed to produce images that are registered with images obtained by the C-arm X-ray device used during the interventional phase. Such images may have been taken with a breathing monitor, if the thoracic region is involved, using fixtures to position the patient, or by applying radio-opaque markers to the skin of the patient in the area of treatment.
  • To the extent that the previously obtained voxel data can be co-registered with the present C-arm apparatus, the planning of the intervention may be performed using synthetic images formed from the prior voxel data. Such planning may be performed entirely as a synthetic (computational) process, using the voxel data and a parameter set locating the various components of the treatment apparatus, including the C-arm, patient table, the interventional device (e.g., the needle), the target, and the patient. An objective of the planning process is to determine if the patient can be positioned such that the intervention can be performed with the needle aligned with the optical axis of the C-arm. Success in this planning would result in a “bull's eye view”. Such planning may need to be performed in an iterative manner, as a first plan may result in a “generalized bull's eye view.” The doctor may decide that the generalized bull's eye view is sufficient for the procedure.
  • Alternatively, the positioning may be re-planned so that the generalized bull's eye view lies closer to the optical axis of the C-arm system so as to make more efficient use of the collimator. This usually would result in re-positioning of the patient table. In this circumstance, the skin penetration point may be identified by the synthetic views and the radiation views used to align the axis of the needle, while confirming the coincidence of the aligned needle with the target. Both the active radiation view and synthetic views may be used in the process.
  • In another alternative, the positioning may be re-planned so that the “bull's eye view” is obtained. In this circumstance, the skin entry point and the target lie along the optical axis of the C-arm. This may be confirmed, if desired, by radiation with a narrow collimation, or the alignment of the needle may be performed using a laser guide defining the optical axis of the C-arm. The laser guide is usually attached perpendicularly to the X-ray detector, as the X-ray emitter is usually positioned beneath the patient table so as to reduce the radiation dosage to the patient's eyes when X-rays are being emitted.
  • In another aspect, during the planning or re-planning process, once an apparently satisfactory angulation and patient table location has been obtained, the angulations that may be desired for obtaining progression views may be explored. That is, the oblique views which may be desired to confirm the progress of the needle along its path, and the depth of penetration, and the intersection of the needle with the target may be simulated. The entire image may be reconstructed, or only the entry point and target may be used. A synthetic path may be displayed so as to assist in visualizing the expected result. In this manner, the planning process also considers the possibility that satisfactory progression views may not be obtainable due to interference between the C-arm, other equipment, the patient, and the doctor. The interventional path may be re-planned to determine if a more satisfactory planned path for the needle may be determined.
  • If the patient is not already on the patient table, the step of moving the patient into position on the table is performed so that the patient is in the preplanned position. This may be confirmed by taking X-ray images if deemed desirable.
  • Where a bull's eye view has been obtained, the needle or other interventional device may be aligned by observing the laser spot on the patient skin, placing the penetrating end on the spot, and orienting the axis of the device along the laser beam. Similarly, a fixture attached so that the interventional device may be physically oriented may also be used. Once the interventional device has been oriented, one of a number of known techniques may be used to maintain the orientation prior to, during, or after penetration.
  • Where a generalized bull's eye view has been obtained, the C-arm is positioned using the planned angulation for penetration, and using a real time fluoroscopic image. The penetrating point is positioned at the determined penetration point by comparing the real-time position with the synthetic bull's eye image, or by orienting the needle so that a real-time generalized bull's eye is obtained. Once the interventional device has been oriented such that the penetration axis intersects the target, the interventional device may be held in position mechanically, similarly to the situation which obtains for the bull's eye case.
  • The procedure may now proceed using the aligned interventional instrument. As the instrument is advanced into the patient's body, the C-arm angulation may be changed to one of the positions that may be suitable for progression views, and such additional X-ray images obtained as are considered needed by the doctor to verify or correct the path of the interventional instrument towards the target, and to verify reaching the target.
  • In an example, shown in FIG. 3, a method of planning or performing an interventional procedure 500 may include positioning a patient such that a three dimensional voxel image data set can be obtained (step 510). The three dimensional voxel data set may be used to identify a target area or target volume to be accessed during the interventional procedure (step 520). Volume rendering techniques may be used to plan the path of an interventional device (step 530). For a position of the patient on the patient table, the angulation of a C-arm X-ray device is preferably planned such that skin entry point and target in the patient are projected onto each other (step 540). In the event that a satisfactory “bull's eye” is not obtained (but a generalized bull's eye” is obtained) the positioning may be re-planned by, for example, repositioning the patient table to bring the target area closer to the optical axis of the C-arm (step 550). Step 550 may be performed iteratively until a satisfactory result is obtained, although this may still not be a bull's eye situation.
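  • As a minimal sketch of step 540 (the angle conventions and example coordinates are assumptions for illustration), the angulation that projects the skin entry point and the target onto each other is the one whose optical axis is parallel to the entry-target line:

    import numpy as np

    def bulls_eye_angulation(entry, target):
        """Return assumed (LAO/RAO, CRAN/CAUD) angles, in degrees, aligning the optical axis with the entry-target line."""
        entry, target = np.asarray(entry, float), np.asarray(target, float)
        d = (target - entry) / np.linalg.norm(target - entry)
        lao_rao = np.degrees(np.arctan2(d[0], d[2]))    # rotation in the axial plane (assumed convention)
        cran_caud = np.degrees(np.arcsin(d[1]))         # cranial/caudal tilt (assumed convention)
        return lao_rao, cran_caud

    print(bulls_eye_angulation([10.0, -80.0, 40.0], [25.0, -20.0, 90.0]))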
  • When the desired angulation and patient table positions have been determined, an optional step (step 560) may be performed to determine whether the C-arm may be positioned so as to obtain satisfactory progression views. This step is dependent on the complexity of the procedure to be performed.
  • After planning the procedure, the patient (if not already present) is placed on the patient table and the orientation of the patient with respect to the C-arm adjusted so as to conform to the planned location. This location may be confirmed by sensors, by radiographic markers, or by the registration of fluoroscopic images with the previously obtained voxel data set (step 570).
  • Where a bull's eye orientation has been obtained, the penetrating end of the interventional device is positioned with respect to the skin entry point using a laser, or a mechanical device oriented along the optical axis of the C-arm (step 580), and the long axis of the interventional device oriented so as to coincide with the optical axis. Other laser configurations, such as a fan beam, may be used. After alignment, the position may be maintained or regained using known techniques.
  • Where a generalized bull's eye is the most satisfactory orientation that is available for use in the circumstance, the penetrating end of the interventional device is located with respect to one of the target or a synthetic entry point on a real-time radiographic image, and the axis of the interventional device is oriented so as to lie along the path between the determined skin entry point and the target (step 590). After alignment, the position may be maintained or regained using known techniques.
  • The interventional procedure may be then performed by advancing the interventional device so as to penetrate the skin (step 600). During the interventional procedure, the needle, for example, having been aligned as previously described is advanced so as to penetrate the patient skin along the pre-planned intervention path (step 610).
  • At a juncture where the doctor may wish to confirm the alignment or penetration depth of the needle, the C-arm is re-angulated so as to obtain a progression fluoroscopic view (step 620). To aid the interpretation of the resultant radiograph, a synthetic line may be overlaid on the radiograph indicating the projection of the planned interventional path on the specific view. Intermediate positions may be presented in synthesized views. The position of the C-arm relative to the room coordinate system is known through the previously performed registration steps. Step 620 is performed as many times as the doctor considers necessary to achieve a satisfactory positioning of the needle with respect to the target, which may be, for example, a suspected tumor to be biopsied.
  • The angulation may be returned to that of the bull's eye view, or the generalized bull's eye view that was determined in steps 540 and 550 so as to verify or adjust the axial alignment. When the progression views or a “dead-reckoning” calculation shows that the needle tip is in the proper relationship to the target (angle and distance), the specific procedure may be performed (step 630). The procedure may include obtaining a biopsy, administering a medication, applying cement, or the like, depending on the syndrome being treated or diagnosed.
  • While the workflows and methods disclosed herein have been described and shown with reference to particular steps performed in a particular order, it will be understood that these steps may be combined, sub-divided, or reordered to form an equivalent workflow method without departing from the teachings of the present invention. Accordingly, unless specifically indicated herein, the order and grouping of steps is not a limitation of the present invention.
  • Although only a few exemplary embodiments of this invention have been described in detail above, those skilled in the art will readily appreciate that many modifications are possible in the exemplary embodiments without materially departing from the novel teachings and advantages of the invention. Accordingly, all such modifications are intended to be included within the scope of this invention as set forth in the following claims.

Claims (30)

1. A method for planning a percutaneous medical procedure, the method comprising:
obtaining three-dimensional image data of a patient;
orienting a representation of the patient, or the patient, with respect to a trackable patient support table of an interventional apparatus;
registering the three-dimensional image data with respect to the patient; and
planning the percutaneous medical procedure using the three-dimensional image data,
wherein an intervention path is planned so that an intervention device is alignable along a guiding axis passing through a skin entry point and a target, and one of a bull's eye or a generalized bull's eye image is obtained in an X-ray projection image view.
2. The method of claim 1 wherein the interventional apparatus is a C-arm X-ray apparatus.
3. The method of claim 1, wherein the interventional device is a biopsy needle.
4. The method of claim 1, wherein a bull's eye view is obtained when the skin entry point and the target are aligned along the central axis of the interventional apparatus.
5. The method of claim 4, wherein the interventional device is aligned with the optical axis using one of a laser pointer or a plurality of laser fan beams.
6. The method of claim 5, wherein the aligned device is maintained in position using a mechanical fixture.
7. The method of claim 1, wherein a generalized bull's eye view is obtained when the interventional path does not lie along an optical axis of the interventional apparatus.
8. The method of claim 7, wherein, when a generalized bull's eye view is obtained, re-orienting the patient to plan an intervention path lying closer to the interventional apparatus optical axis.
9. The method of claim 8, wherein the step of re-orienting includes re-positioning the patient support table.
10. The method of claim 1, wherein a first planned angulation of the interventional apparatus is associated with the interventional path.
11. The method of claim 1, further comprising:
comparing a volumetric location of the interventional apparatus for a first planned angulation of the interventional apparatus with a volumetric location of the patient and the patient table, and determining that a collision does not result.
12. The method of claim 1, wherein a second planned angulation of the interventional apparatus is associated with a progression view of the interventional path.
13. The method of claim 1, wherein a second planned angulation of the interventional axis is associated with a progression view of the interventional path and the target.
14. The method of claim 1, wherein orienting a representation of the patient comprises:
generating two-dimensional image data from the three-dimensional image data of the patient and registering the coordinates of the patient with respect to coordinates of the interventional apparatus.
15. The method of claim 14, further comprising:
positioning the patient so that the patient coincides with a registered skin surface image, at least within the field of view of an image to be obtained by the interventional apparatus.
16. The method of claim 1, further comprising:
positioning a tip of the interventional device such that a skin penetration is made at the skin entry point, and an axis of the interventional device passes through the target.
17. The method of claim 1, further comprising:
performing the planned medical procedure.
18. The method of claim 17, wherein the step of performing the planned interventional procedure further comprises:
advancing the intervention device along the guiding axis through the skin entry point;
adjusting an angulation of the interventional apparatus so as to obtain a progression image of the interventional device; and
determining that the interventional device is aligned along the guiding axis; or,
determining that the interventional device is not aligned along the guiding axis and correcting an alignment of the interventional device.
19. The method of claim 18, further comprising:
repeating the steps of advancing, adjusting and determining until a tip of the interventional device is positioned in a planned position with respect to the target.
20. The method of claim 18, further comprising:
using the interventional device to perform a procedure on the target.
21. The method of claim 1, wherein the target is identified in the three-dimensional image data.
22. The method of claim 21, wherein the target is displayed as a synthetic image in a two-dimensional image derived from the three-dimensional image data.
23. The method of claim 21, wherein the target is displayed as a synthetic image in a fluoroscopic image obtained by the interventional apparatus.
24. The method of claim 1, wherein an endoscopic view along the guiding path is produced from the three-dimensional image data.
25. A computer program product for planning a medical procedure, the product being stored on a computer readable medium, comprising:
instructions for configuring a computer to:
accept three-dimensional image data of a patient;
register the three-dimensional image data with respect to the patient;
produce images suitable for planning a guiding path between a skin entry point and a target in the patient;
determine the axis of the guiding path and a corresponding angulation of an interventional apparatus having imaging capability;
determine the volumetric relationship between the interventional apparatus, when oriented at the angulation, and a patient support; and
determine whether there is a collision between the interventional apparatus and the patient support.
26. The computer program product of claim 25, wherein, when the guiding path does not lie on an optical axis of the interventional apparatus, re-planning the procedure by changing at least one of the angulation or the position of the patient support table to bring the guiding path into closer angular alignment with the optical axis.
27. The computer program product of claim 25, wherein a synthetic image of the target is superimposed on one of the images used for planning.
28. The computer program product of claim 25, wherein an icon representing a skin entry point is superimposed on the image.
29. The computer program product of claim 28, wherein, when the planned angulation of the interventional apparatus is adjusted, a relative position of the icon and the synthetic image of the target is compared so as to determine whether a bull's eye or generalized bull's eye alignment has been achieved.
30. The computer program product of claim 29, wherein an operator input device controls the computed angulation of the interventional apparatus, and the corresponding image.
US12/429,546 2008-05-02 2009-04-24 System and method for a medical procedure using computed tomography Abandoned US20090281452A1 (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
US12/429,546 US20090281452A1 (en) 2008-05-02 2009-04-24 System and method for a medical procedure using computed tomography

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
US4980308P 2008-05-02 2008-05-02
US12/429,546 US20090281452A1 (en) 2008-05-02 2009-04-24 System and method for a medical procedure using computed tomography

Publications (1)

Publication Number Publication Date
US20090281452A1 true US20090281452A1 (en) 2009-11-12

Family

ID=41257080

Family Applications (2)

Application Number Title Priority Date Filing Date
US12/429,546 Abandoned US20090281452A1 (en) 2008-05-02 2009-04-24 System and method for a medical procedure using computed tomography
US12/431,518 Active 2030-06-07 US8165660B2 (en) 2008-05-02 2009-04-28 System and method for selecting a guidance mode for performing a percutaneous procedure

Family Applications After (1)

Application Number Title Priority Date Filing Date
US12/431,518 Active 2030-06-07 US8165660B2 (en) 2008-05-02 2009-04-28 System and method for selecting a guidance mode for performing a percutaneous procedure

Country Status (1)

Country Link
US (2) US20090281452A1 (en)

Cited By (25)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2010130056A1 (en) * 2009-05-14 2010-11-18 University Health Network Quantitative endoscopy
US20110235876A1 (en) * 2010-03-24 2011-09-29 Marcus Pfister Method and Device for Automatically Adapting a Reference Image
US20110313285A1 (en) * 2010-06-22 2011-12-22 Pascal Fallavollita C-arm pose estimation using intensity-based registration of imaging modalities
US20120065498A1 (en) * 2010-09-14 2012-03-15 Thomas Redel Apparatus and method for minimally invasive therapy of mitral regurgitation
DE102011075435A1 (en) * 2011-05-06 2012-11-08 Siemens Aktiengesellschaft display unit
US20120289777A1 (en) * 2011-05-13 2012-11-15 Intuitive Surgical Operations, Inc. Medical system providing dynamic registration of a model of an anatomical structure for image-guided surgery
US20120327081A1 (en) * 2010-02-24 2012-12-27 Yuji Suda Percutaneous puncture support system
US20130218003A1 (en) * 2012-02-21 2013-08-22 Siemens Aktiengesellschaft Rapid entry point localization for percutaneous interventions
US20130298329A1 (en) * 2012-05-14 2013-11-14 Hanns Eder Patient support apparatus, a medical imaging apparatus with the patient support apparatus and a method for marking a maximum occupancy area
US20130345554A1 (en) * 2012-06-08 2013-12-26 Baylis Medical Company Inc. Methods for recanalization of vessels
US20140270067A1 (en) * 2013-03-14 2014-09-18 Stephen Knecht Clark Radiographic marker
US8848874B2 (en) 2010-09-20 2014-09-30 Siemens Medical Solutions Usa, Inc. System for recovering from collision of components of an X-ray imaging unit
RU2551791C2 (en) * 2009-12-18 2015-05-27 Koninklijke Philips Electronics N.V. Multi-section alignment of imaging data
US20150178886A1 (en) * 2013-12-20 2015-06-25 Marcus Pfister Image Monitoring During an Interventional Procedure, X-Ray Device, Computer Program and Data Medium
WO2015132787A1 (en) * 2014-03-04 2015-09-11 Xact Robotics Ltd. Dynamic planning method for needle insertion
CN104994787A (en) * 2013-02-14 2015-10-21 株式会社东芝 X-ray diagnostic device
DE102011086941B4 (en) * 2011-11-23 2016-01-21 Kuka Roboter Gmbh industrial robots
WO2016043411A1 (en) * 2014-09-18 2016-03-24 Samsung Electronics Co., Ltd. X-ray apparatus and method of scanning the same
US9349288B2 (en) 2014-07-28 2016-05-24 Econolite Group, Inc. Self-configuring traffic signal controller
US20170245951A1 (en) * 2012-06-21 2017-08-31 Globus Medical, Inc. Surgical robot platform
WO2018064566A1 (en) 2016-09-30 2018-04-05 Intuitive Surgical Operations, Inc. Systems and methods for entry point localization
US20180325474A1 (en) * 2017-05-10 2018-11-15 Siemens Healthcare Gmbh X-ray imaging system and method for recording x-ray images
US10614917B2 (en) * 2015-08-19 2020-04-07 Siemens Healthcare Gmbh Medical apparatus and method of controlling a medical apparatus
EP3515311A4 (en) * 2016-09-20 2020-06-24 Kornerstone Devices Pvt. Ltd. Light and shadow guided needle positioning system and method
US11490872B2 (en) 2020-08-21 2022-11-08 GE Precision Healthcare LLC C-arm imaging system and method

Families Citing this family (60)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP5317933B2 (en) * 2009-11-17 2013-10-16 富士フイルム株式会社 Image display device and program thereof
DE102010031943A1 (en) * 2010-07-22 2012-01-26 Siemens Aktiengesellschaft Method for marking a predetermined guide path of a medical instrument and orientation device
US9078618B2 (en) * 2010-10-29 2015-07-14 General Electric Company Methods and systems for patient alignment for nuclear medicine imaging
US10631712B2 (en) * 2011-02-10 2020-04-28 Karl Storz Imaging, Inc. Surgeon's aid for medical display
EP2699166B1 (en) 2011-04-21 2019-09-04 Koninklijke Philips N.V. Mpr slice selection for visualization of catheter in three-dimensional ultrasound
JP5731888B2 (en) * 2011-04-22 2015-06-10 株式会社東芝 X-ray diagnostic imaging equipment
JP5501290B2 (en) * 2011-05-23 2014-05-21 富士フイルム株式会社 Image processing apparatus, radiographic image capturing system, and image processing program
EP2526868A1 (en) * 2011-05-26 2012-11-28 General Electric Company X-ray imaging apparatus having a variable distance between an X-ray source and an object to be imaged
WO2012168749A1 (en) * 2011-06-06 2012-12-13 Sarr Souleymane Removable guide device for radiofluoroscopic infiltration having an image intensifier
CN103118595B (en) * 2011-07-06 2015-09-16 株式会社东芝 Medical diagnostic imaging apparatus
DE102011083876B4 (en) * 2011-09-30 2018-12-27 Siemens Healthcare Gmbh Method for controlling the movement of an X-ray device and X-ray system
US9039283B2 (en) * 2011-10-11 2015-05-26 Siemens Aktiengesellschaft Method and apparatus for producing an X-ray projection image in a desired direction
US9510771B1 (en) 2011-10-28 2016-12-06 Nuvasive, Inc. Systems and methods for performing spine surgery
JP5954762B2 (en) * 2011-11-29 2016-07-20 東芝メディカルシステムズ株式会社 X-ray diagnostic imaging equipment
JP6283872B2 (en) * 2012-01-27 2018-02-28 キヤノンメディカルシステムズ株式会社 X-ray CT system, X-ray CT system
BR112014032112A2 (en) * 2012-06-28 2017-06-27 Koninklijke Philips Nv image acquisition system; and method for multimodal image acquisition
US10039518B2 (en) 2012-10-05 2018-08-07 Koninklijke Philips N.V. ROI painting
KR101811817B1 (en) 2013-02-14 2018-01-25 세이코 엡슨 가부시키가이샤 Head mounted display and control method for head mounted display
DE102013208793B4 (en) * 2013-05-14 2015-04-23 Siemens Aktiengesellschaft A method for generating a 3D image data set from a volume to be examined as a basis for display image data
FR3007962B1 (en) * 2013-07-04 2015-06-26 X Nov Ip GRAPHICAL SELECTION OF BONE ANCHOR PROSTHESIS
US10433911B2 (en) * 2013-09-18 2019-10-08 iMIRGE Medical INC. Optical targeting and visualization of trajectories
US9877795B2 (en) * 2013-09-18 2018-01-30 Imirge Medical Inc Optical targeting and visualization of trajectories
US9848922B2 (en) 2013-10-09 2017-12-26 Nuvasive, Inc. Systems and methods for performing spine surgery
US10130388B2 (en) 2014-03-13 2018-11-20 New York University Position guidance device with bubble level
WO2015142762A1 (en) * 2014-03-17 2015-09-24 Brown Roy A Surgical targeting systems and methods
DE102014219436A1 (en) * 2014-09-25 2016-03-31 Siemens Aktiengesellschaft Mobile X-ray machine
US10162935B2 (en) * 2014-11-26 2018-12-25 Koninklijke Philips N.V. Efficient management of visible light still images and/or video
EP3028675B1 (en) * 2014-12-05 2017-07-19 X.Nov IP Sarl Graphical selection of a prosthesis with bone anchor
JP6625347B2 (en) * 2015-06-01 2019-12-25 キヤノンメディカルシステムズ株式会社 X-ray angiography device
WO2016195684A1 (en) * 2015-06-04 2016-12-08 Siemens Healthcare Gmbh Apparatus and methods for a projection display device on x-ray imaging devices
WO2017009398A1 (en) * 2015-07-16 2017-01-19 Koninklijke Philips N.V. Device for remote fluoroscopy, nearby fluoroscopy and radiology
KR101798939B1 (en) * 2015-09-08 2017-11-17 삼성전자주식회사 X-ray image apparatus and control method for the same
EP3355822A1 (en) * 2015-09-28 2018-08-08 Koninklijke Philips N.V. Optical registration of a remote center of motion robot
US20180303559A1 (en) * 2015-10-19 2018-10-25 New York University Electronic position guidance device with real-time auditory and visual feedback
US11302435B2 (en) 2016-01-06 2022-04-12 Boston Scientific Scimed, Inc. Systems and methods for planning medical procedures
EP3988027A1 (en) 2016-03-13 2022-04-27 Vuze Medical Ltd. Apparatus and methods for use with skeletal procedures
WO2017157974A1 (en) * 2016-03-16 2017-09-21 Koninklijke Philips N.V. System for assisting in performing an interventional procedure
NL2016800B1 (en) * 2016-05-19 2017-12-05 Umc Utrecht Holding Bv Method of positioning an interventional device.
US20170354387A1 (en) * 2016-06-08 2017-12-14 General Electric Company Fluoroscopic Guidance System With Offset Light Source and Method Of Use
DE102016212467A1 (en) * 2016-07-08 2018-01-11 Siemens Healthcare Gmbh Motion control for mobile X-ray device
DE102016214319A1 (en) * 2016-08-03 2018-02-08 Siemens Healthcare Gmbh biopsy unit
US11815347B2 (en) * 2016-09-28 2023-11-14 Kla-Tencor Corporation Optical near-field metrology
JP6828465B2 (en) * 2017-01-30 2021-02-10 セイコーエプソン株式会社 Endoscope operation support system
JP6599913B2 (en) * 2017-02-28 2019-10-30 株式会社メディカロイド Robot operating table operating device
EP4344658A2 (en) 2017-05-10 2024-04-03 MAKO Surgical Corp. Robotic spine surgery system
US10667869B2 (en) * 2017-05-17 2020-06-02 General Electric Company Guidance system for needle procedures
WO2019012520A1 (en) 2017-07-08 2019-01-17 Vuze Medical Ltd. Apparatus and methods for use with image-guided skeletal procedures
CN111526794A (en) * 2017-12-04 2020-08-11 柯惠有限合伙公司 Automatic segmentation ablation antenna from CT image
JP7271183B2 (en) * 2018-01-15 2023-05-11 キヤノンメディカルシステムズ株式会社 Medical information processing device, X-ray diagnostic system and medical information processing program
EP3646790A1 (en) * 2018-10-31 2020-05-06 Koninklijke Philips N.V. Guidance during x-ray imaging
WO2020105049A1 (en) * 2018-11-22 2020-05-28 Vuze Medical Ltd. Apparatus and methods for use with image-guided skeletal procedures
CN109481018A (en) 2018-12-29 2019-03-19 上海联影医疗科技有限公司 A kind of navigation equipment and method applied in medical care precess
DE102019108753A1 (en) * 2019-04-03 2020-10-08 Amedo Smart Tracking Solutions Gmbh 3D laser-based positioning system
US11813026B2 (en) 2019-04-05 2023-11-14 Medos International Sarl Systems, devices, and methods for providing surgical trajectory guidance
DE102019204920A1 (en) * 2019-04-05 2020-10-08 Siemens Healthcare Gmbh Guide unit for guiding a needle with an X-ray-opaque contour; Management system; Investment; Computer program product and method
US11627924B2 (en) 2019-09-24 2023-04-18 Covidien Lp Systems and methods for image-guided navigation of percutaneously-inserted devices
CN110680481B (en) * 2019-11-15 2021-01-05 元亨同基医疗器械(北京)有限公司 Method for adjusting launching position of guide holder of puncture positioning instrument
CN113367779A (en) * 2021-06-16 2021-09-10 张涛 Puncture system and method based on C-arm CT and semiconductor laser
CN114271838B (en) * 2021-12-01 2022-11-11 赛诺威盛科技(北京)股份有限公司 Multifunctional scanning method, multifunctional scanning support, electronic equipment and storage medium
DE102022201952B3 (en) * 2022-02-25 2023-06-22 Siemens Healthcare Gmbh Trajectory planning for a medical robot system

Family Cites Families (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
DE19501069A1 (en) * 1995-01-16 1996-07-18 Wolfgang Kloess Light sighting device for marking guide path of instrument, esp. diagnostic or therapeutic needle
US6195577B1 (en) * 1998-10-08 2001-02-27 Regents Of The University Of Minnesota Method and apparatus for positioning a device in a body
DE10147160C1 (en) * 2001-09-25 2003-04-24 Siemens Ag C-arm x-ray system with flexible detector positioning
US20060039537A1 (en) * 2004-05-28 2006-02-23 Strobel Norbert K C-arm device with adjustable detector offset for cone beam imaging involving partial circle scan trajectories
DE102006020403B3 (en) * 2006-04-28 2007-08-16 Siemens Ag X-ray C-arch system for patients anatomy imaging comprises rotatable and slidable C-arch on which X-ray source and surface detector with even upper surface is arranged and a puncture needle holder adapted from two cross-end struts
DE102006037565B3 (en) * 2006-08-10 2008-02-21 Siemens Ag Object area`s e.g. kidney, image x-ray recording method for C-curve-system, involves reconstructing three-dimensional image of area and adjusting object bearing device synchronously to rotation so that area lies within jet cone of bundle
US8265731B2 (en) * 2007-02-13 2012-09-11 Siemens Medical Solutions Usa, Inc. Apparatus and method for aligning a light pointer with a medical interventional device trajectory
US9370627B2 (en) * 2007-02-20 2016-06-21 Siemens Medical Solutions Usa, Inc. Needle guidance with a dual-headed laser

Patent Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US6064904A (en) * 1997-11-28 2000-05-16 Picker International, Inc. Frameless stereotactic CT scanner with virtual needle display for planning image guided interventional procedures
US6470207B1 (en) * 1999-03-23 2002-10-22 Surgical Navigation Technologies, Inc. Navigational guidance via computer-assisted fluoroscopic imaging
US20060120507A1 (en) * 2004-11-26 2006-06-08 Thomas Brunner Angiographic x-ray diagnostic device for rotation angiography
US20060274888A1 (en) * 2005-05-19 2006-12-07 Philipp Bernhardt Medical imaging system with a part which can be moved about a patient and a collision protection method
US20070003014A1 (en) * 2005-06-30 2007-01-04 Siemens Aktiengesellschaft Method or x-ray device for creating a series of recordings of medical x-ray images of a patient who might possibly be moving during the recording of the series images
US20070055131A1 (en) * 2005-09-01 2007-03-08 Siemens Aktiengesellschaft Method for displaying a medical implant in an image and a medical imaging system

Cited By (42)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2010130056A1 (en) * 2009-05-14 2010-11-18 University Health Network Quantitative endoscopy
US9138597B2 (en) 2009-05-14 2015-09-22 University Health Network Quantitative endoscopy
RU2551791C2 (en) * 2009-12-18 2015-05-27 Koninklijke Philips Electronics N.V. Multi-section alignment of imaging data
US20120327081A1 (en) * 2010-02-24 2012-12-27 Yuji Suda Percutaneous puncture support system
US20110235876A1 (en) * 2010-03-24 2011-09-29 Marcus Pfister Method and Device for Automatically Adapting a Reference Image
US8929631B2 (en) * 2010-03-24 2015-01-06 Siemens Aktiengesellschaft Method and device for automatically adapting a reference image
US20110313285A1 (en) * 2010-06-22 2011-12-22 Pascal Fallavollita C-arm pose estimation using intensity-based registration of imaging modalities
US9282944B2 (en) * 2010-06-22 2016-03-15 Queen's University At Kingston C-arm pose estimation using intensity-based registration of imaging modalities
US20120065498A1 (en) * 2010-09-14 2012-03-15 Thomas Redel Apparatus and method for minimally invasive therapy of mitral regurgitation
CN102440811A (en) * 2010-09-14 2012-05-09 西门子公司 Apparatus and method for minimally invasive therapy of mitral regurgitation
US8848874B2 (en) 2010-09-20 2014-09-30 Siemens Medical Solutions Usa, Inc. System for recovering from collision of components of an X-ray imaging unit
DE102011075435A1 (en) * 2011-05-06 2012-11-08 Siemens Aktiengesellschaft display unit
US8900131B2 (en) * 2011-05-13 2014-12-02 Intuitive Surgical Operations, Inc. Medical system providing dynamic registration of a model of an anatomical structure for image-guided surgery
US20120289777A1 (en) * 2011-05-13 2012-11-15 Intuitive Surgical Operations, Inc. Medical system providing dynamic registration of a model of an anatomical structure for image-guided surgery
DE102011086941B4 (en) * 2011-11-23 2016-01-21 Kuka Roboter Gmbh industrial robots
US20130218003A1 (en) * 2012-02-21 2013-08-22 Siemens Aktiengesellschaft Rapid entry point localization for percutaneous interventions
US9289181B2 (en) * 2012-05-14 2016-03-22 Siemens Aktiengesellschaft Patient support apparatus, a medical imaging apparatus with the patient support apparatus and a method for marking a maximum occupancy area
US20130298329A1 (en) * 2012-05-14 2013-11-14 Hanns Eder Patient support apparatus, a medical imaging apparatus with the patient support apparatus and a method for marking a maximum occupancy area
US20130345554A1 (en) * 2012-06-08 2013-12-26 Baylis Medical Company Inc. Methods for recanalization of vessels
US10485617B2 (en) * 2012-06-21 2019-11-26 Globus Medical, Inc. Surgical robot platform
US20170245951A1 (en) * 2012-06-21 2017-08-31 Globus Medical, Inc. Surgical robot platform
CN104994787A (en) * 2013-02-14 2015-10-21 株式会社东芝 X-ray diagnostic device
US20140270067A1 (en) * 2013-03-14 2014-09-18 Stephen Knecht Clark Radiographic marker
US9501835B2 (en) * 2013-12-20 2016-11-22 Siemens Aktiengesellschaft Image monitoring during an interventional procedure, X-ray device, computer program and data medium
US20150178886A1 (en) * 2013-12-20 2015-06-25 Marcus Pfister Image Monitoring During an Interventional Procedure, X-Ray Device, Computer Program and Data Medium
US10245110B2 (en) 2014-03-04 2019-04-02 Xact Robotics Ltd. Dynamic planning method for needle insertion
US11452567B2 (en) 2014-03-04 2022-09-27 Xact Robotics Ltd. Dynamic planning method for needle insertion
US10702341B2 (en) 2014-03-04 2020-07-07 Xact Robotics Ltd Dynamic planning method for needle insertion
WO2015132787A1 (en) * 2014-03-04 2015-09-11 Xact Robotics Ltd. Dynamic planning method for needle insertion
US9349288B2 (en) 2014-07-28 2016-05-24 Econolite Group, Inc. Self-configuring traffic signal controller
US9978270B2 (en) 2014-07-28 2018-05-22 Econolite Group, Inc. Self-configuring traffic signal controller
US10991243B2 (en) 2014-07-28 2021-04-27 Econolite Group, Inc. Self-configuring traffic signal controller
US10198943B2 (en) 2014-07-28 2019-02-05 Econolite Group, Inc. Self-configuring traffic signal controller
WO2016043411A1 (en) * 2014-09-18 2016-03-24 Samsung Electronics Co., Ltd. X-ray apparatus and method of scanning the same
US10614917B2 (en) * 2015-08-19 2020-04-07 Siemens Healthcare Gmbh Medical apparatus and method of controlling a medical apparatus
EP3515311A4 (en) * 2016-09-20 2020-06-24 Kornerstone Devices Pvt. Ltd. Light and shadow guided needle positioning system and method
EP3518807A4 (en) * 2016-09-30 2020-05-27 Intuitive Surgical Operations Inc. Systems and methods for entry point localization
US11219490B2 (en) 2016-09-30 2022-01-11 Intuitive Surgical Operations, Inc. Systems and methods for entry point localization
WO2018064566A1 (en) 2016-09-30 2018-04-05 Intuitive Surgical Operations, Inc. Systems and methods for entry point localization
US11779405B2 (en) 2016-09-30 2023-10-10 Intuitive Surgical Operations, Inc. Systems and methods for entry point localization
US20180325474A1 (en) * 2017-05-10 2018-11-15 Siemens Healthcare Gmbh X-ray imaging system and method for recording x-ray images
US11490872B2 (en) 2020-08-21 2022-11-08 GE Precision Healthcare LLC C-arm imaging system and method

Also Published As

Publication number Publication date
US8165660B2 (en) 2012-04-24
US20090274271A1 (en) 2009-11-05

Similar Documents

Publication Publication Date Title
US20090281452A1 (en) System and method for a medical procedure using computed tomography
US9259195B2 (en) Remotely held needle guide for CT fluoroscopy
US7603155B2 (en) Method and system of acquiring images with a medical imaging device
US11759272B2 (en) System and method for registration between coordinate systems and navigation
JP3947707B2 (en) Method and apparatus for acquiring and displaying computed tomographic images using a fluoroscopic imaging system
US8886286B2 (en) Determining and verifying the coordinate transformation between an X-ray system and a surgery navigation system
US9427286B2 (en) Method of image registration in a multi-source/single detector radiographic imaging system, and image acquisition apparatus
US11559266B2 (en) System and method for local three dimensional volume reconstruction using a standard fluoroscope
Schulz et al. Accuracy and speed of robotic assisted needle interventions using a modern cone beam computed tomography intervention suite: a phantom study
EP3148643B1 (en) Systems for brachytherapy planning based on imaging data
JP2008126075A (en) System and method for visual verification of ct registration and feedback
US20100111389A1 (en) System and method for planning and guiding percutaneous procedures
EP3254627A1 (en) Fluoroscopic guidance system with offset light source and method of use
Yaniv et al. Needle-based interventions with the image-guided surgery toolkit (IGSTK): from phantoms to clinical trials
JP6349278B2 (en) Radiation imaging apparatus, image processing method, and program
US11918297B2 (en) System and method for registration between coordinate systems and navigation
US8489176B1 (en) Radioactive emission detector equipped with a position tracking system and utilization thereof with medical systems and in medical procedures
CN109152929B (en) Image-guided treatment delivery
Bhattacharji et al. Application of real-time 3D navigation system in CT-guided percutaneous interventional procedures: a feasibility study
Toporek et al. Cone-beam computed tomography-guided stereotactic liver punctures: a phantom study
US20070055129A1 (en) Method and device for displaying a surgical instrument during placement thereof in a patient during a treatment
US20160183919A1 (en) Method for displaying stored high-resolution diagnostic 3-d image data and 2-d realtime sectional image data simultaneously, continuously, and in parallel during a medical intervention of a patient and arrangement for carrying out said method
US20080285707A1 (en) System and Method for Medical Navigation
US20230190377A1 (en) Technique Of Determining A Scan Region To Be Imaged By A Medical Image Acquisition Device
Lin et al. Phantom evaluation of an image-guided navigation system based on electromagnetic tracking and open source software

Legal Events

Date Code Title Description
AS Assignment

Owner name: SIEMENS AKTIENGESELLSCHAFT, GERMANY

Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNORS:PFISTER, MARCUS;STROBEL, NORBERT;REEL/FRAME:023004/0213

Effective date: 20090525

STCB Information on status: application discontinuation

Free format text: ABANDONED -- FAILURE TO RESPOND TO AN OFFICE ACTION