US20110054303A1 - Apparatus for registering and tracking an instrument - Google Patents


Info

Publication number
US20110054303A1
Authority
US
United States
Prior art keywords
curvature sensor
patient
computer
curvature
fiducials
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Abandoned
Application number
US12/782,108
Inventor
Earl Frederick Barrick
Judy Barrick
Kenneth J. Hintz
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
George Mason Intellectual Properties Inc
Original Assignee
George Mason Intellectual Properties Inc
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by George Mason Intellectual Properties Inc filed Critical George Mason Intellectual Properties Inc
Priority to US 12/782,108
Publication of US20110054303A1
Status: Abandoned

Classifications

    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B5/00 Measuring for diagnostic purposes; Identification of persons
    • A61B5/103 Detecting, measuring or recording devices for testing the shape, pattern, colour, size or movement of the body or parts thereof, for diagnostic purposes
    • A61B5/11 Measuring movement of the entire body or parts thereof, e.g. head or hand tremor, mobility of a limb
    • A61B5/1126 Measuring movement of the entire body or parts thereof using a particular sensing technique
    • A61B5/107 Measuring physical dimensions, e.g. size of the entire body or parts thereof
    • A61B5/1077 Measuring of profiles
    • A61B5/45 For evaluating or diagnosing the musculoskeletal system or teeth
    • A61B5/4504 Bones
    • A61B5/68 Arrangements of detecting, measuring or recording means, e.g. sensors, in relation to patient
    • A61B5/6801 Arrangements specially adapted to be attached to or worn on the body surface
    • A61B5/6802 Sensor mounted on worn items
    • A61B5/6804 Garments; Clothes
    • A61B5/6813 Specially adapted to be attached to a specific body part
    • A61B5/6814 Head
    • A61B5/74 Details of notification to user or communication with user or patient; user input means
    • A61B5/742 Details using visual displays
    • A61B5/7445 Display arrangements, e.g. multiple display units
    • A61B34/00 Computer-aided surgery; Manipulators or robots specially adapted for use in surgery
    • A61B34/20 Surgical navigation systems; Devices for tracking or guiding surgical instruments, e.g. for frameless stereotaxis
    • A61B2034/2046 Tracking techniques
    • A61B2034/2055 Optical tracking systems
    • A61B2034/2072 Reference field transducer attached to an instrument or patient
    • A61B90/00 Instruments, implements or accessories specially adapted for surgery or diagnosis and not covered by any of the groups A61B1/00 - A61B50/00, e.g. for luxation treatment or for protecting wound edges
    • A61B90/36 Image-producing devices or illumination devices not otherwise provided for
    • A61B2090/364 Correlation of different images or relation of image positions in respect to the body

Definitions

  • This invention pertains to computer assisted surgery and/or therapy using medical imaging systems, such as computed tomography (CT), fluoroscopy and/or magnetic resonance imaging (MRI) for guidance, and more particularly to a fiducial reference system and position sensing.
  • Computer assisted image guided stereotactic surgery is beneficial in the biopsy and ablation of primary brain tumors, benign and malignant, as well as in many other intracranial procedures using computed tomography, MRI, positron emission tomography (PET) and single photon emissions tomography (SPECT). It is especially useful in accurate localization of intracranial vital structures.
  • the passive articulated arm has been shown to be useful in the resection of brain metastases.
  • Surgical navigation is used in head and neck tumors, employing MRI and CT imaging.
  • Stereotactic interstitial brachytherapy has been used for inoperable head and neck cancers, with a head holder used for immobilization.
  • In brachytherapy, the insertion of an ablative radioactive material into an otherwise inoperable tumor, the material can be placed accurately and through a smaller incision using computer assisted image guided stereotactic surgery.
  • Other uses include vaporization of tumors guided by CT slices and MRI, digital angiography, and computer-resident stereotactic atlases. Such methods are particularly utilized in neurosurgical and otolaryngological procedures of the head and orthopaedic procedures of the pelvis and spine.
  • An essential capability of and step in the use of computer assisted surgery is registering the computer system and the digitized CT or MRI image data set to the patient in a common frame of reference in order to correlate the virtual CT or MRI image with the actual body section so imaged.
  • Image-to-instrument registration at its most basic level requires some fiducials distributed in 3-dimensional space to be preoperatively imaged simultaneously with the patient. These fiducials provide an image frame of reference (ImFOR).
  • the fiducials can either be synthetically added or consist of a set of pre-existing anatomical landmarks.
  • One method of registration uses CT or MRI imageable-markers or “fiducials” that can be recognized in renderings of the data set and also on the object body segment, and by touching or matching them point-to-point with a digitizing probe.
  • A digitizing probe with sensors, emitters or reflectors for a waveform tracking system is then touched to each fiducial, enabling a computer to match the fiducials with the same points identified on the reconstructed images, planar and/or three dimensional, on a computer workstation.
  • the computer program determines if an accurate match is obtained.
  • This manual registration procedure locates the fiducials relative to an instrument frame of reference (InFOR). It is typical to use the operating room as the primary frame of reference (ORFOR) with the InFOR having a measured offset within the ORFOR.
  • This method is referred to as point-to-point registration.
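In essence, point-to-point registration estimates the rigid transform that maps the fiducials' image-space coordinates onto the same fiducials as touched by the digitizing probe. As a minimal illustrative sketch (not the patent's implementation), three non-collinear fiducials suffice to build matching orthonormal frames in each space; all function names below are hypothetical:

```python
import math

def sub(a, b): return tuple(x - y for x, y in zip(a, b))
def add_v(a, b): return tuple(x + y for x, y in zip(a, b))
def dot(a, b): return sum(x * y for x, y in zip(a, b))

def cross(a, b):
    return (a[1] * b[2] - a[2] * b[1],
            a[2] * b[0] - a[0] * b[2],
            a[0] * b[1] - a[1] * b[0])

def normalize(v):
    n = math.sqrt(dot(v, v))
    return tuple(x / n for x in v)

def frame_from_fiducials(p1, p2, p3):
    """Orthonormal frame built from three non-collinear fiducial points
    by Gram-Schmidt orthogonalization."""
    e1 = normalize(sub(p2, p1))
    t = sub(p3, p1)
    e2 = normalize(sub(t, tuple(dot(t, e1) * c for c in e1)))
    return (e1, e2, cross(e1, e2))

def register(image_fids, patient_fids):
    """Return a mapping from image space to patient (physical) space, given
    the same three fiducials identified in both spaces (image rendering
    vs. probe touches)."""
    fi = frame_from_fiducials(*image_fids)
    fp = frame_from_fiducials(*patient_fids)
    def apply(pt):
        d = sub(pt, image_fids[0])
        coords = tuple(dot(d, e) for e in fi)   # express point in image frame
        out = patient_fids[0]
        for c, e in zip(coords, fp):            # rebuild it in patient frame
            out = add_v(out, tuple(c * x for x in e))
        return out
    return apply
```

With an exact rigid displacement between the two fiducial sets, this mapping reproduces the transform exactly; in practice, measurement noise would motivate a least-squares fit over more than three fiducials.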
  • a related registration method using fiducials attached to the patient involves mounting a reference frame directly to the patient's skeleton, such as the spine, pelvis or femur.
  • the skull can be fixed to a table mounted frame.
  • the position of this frame of reference is optically tracked in real-time using the same video cameras used to track the surgical or therapeutic instrument.
  • With the fiducials' physical location known relative to the InFOR, the InFOR known relative to the ORFOR, and the fiducials also known relative to the image, arbitrary points in the image can be located in physical space.
  • the instrument is then tracked (passive navigation) using one of several methods.
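The chain of frames of reference just described (image to instrument to operating room) can be illustrated with homogeneous 4×4 transforms: composing the known offsets lets any point in the image be expressed in operating-room coordinates. A minimal sketch with hypothetical, made-up offsets:

```python
import math

def matmul(A, B):
    """4x4 matrix product; A is applied after B when acting on column vectors."""
    return [[sum(A[i][k] * B[k][j] for k in range(4)) for j in range(4)]
            for i in range(4)]

def transform(T, p):
    """Apply a homogeneous transform T to a 3-D point p."""
    v = [p[0], p[1], p[2], 1.0]
    return tuple(sum(T[i][k] * v[k] for k in range(4)) for i in range(3))

def translation(tx, ty, tz):
    return [[1, 0, 0, tx], [0, 1, 0, ty], [0, 0, 1, tz], [0, 0, 0, 1]]

def rot_z(theta):
    c, s = math.cos(theta), math.sin(theta)
    return [[c, -s, 0, 0], [s, c, 0, 0], [0, 0, 1, 0], [0, 0, 0, 1]]

# Hypothetical measured offsets:
T_or_in = translation(10, 0, 0)                            # InFOR pose within the ORFOR
T_in_im = matmul(rot_z(math.pi / 2), translation(1, 2, 0)) # ImFOR pose within the InFOR
T_or_im = matmul(T_or_in, T_in_im)                         # chain: image -> instrument -> OR

# Any point located in the image can now be located in operating-room space:
p_or = transform(T_or_im, (0.0, 0.0, 0.0))
```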
  • a second method of registration involves touching a segment of the body multiple times with a digitizing probe to obtain a multitude of points that will give a surface or shape that can be matched with the anatomic shape.
  • Another version of this method uses an ultrasound probe with sensors, emitters or reflectors to map the surface of underlying bone to obtain a shape or surface.
  • The computer program then matches the shape formed by the combined points to the reconstructed image of the object part of the body. This method is referred to as shape or surface matching registration.
  • a third method of registration involves taking an X-ray of the body segment with a digitizing fluoroscope and matching the two-dimensional images with the three-dimensional data set.
  • the first registration methods require the surgeon to place fiducials on or in a patient before an imaging study and then use a digitizing probe to touch point fiducials or a surface, which is a tedious process.
  • Using anatomic fiducials to register vertebrae for the insertion of pedicle screws has proven tedious and time consuming, so much so that this method has not gained general acceptance by orthopedic surgeons.
  • the second method requires the exposure of a large area of bony surface in some cases, which is contradictory to one of the aims of using small incisions.
  • the third method is more automatic but requires that a portable X-ray fluoroscopy machine be used.
  • Image to patient registration has been performed cutaneously by attaching spheres to a patient's scalp and then intraoperatively imaging these spheres using an ultrasonic sensor. Direct ultrasonic registration of bony tissues with their CT images is being developed.
  • a key component in any IGT/IGS system is the 3-dimensional (3-D) instrument tracker that informs the system computer of where the surgical or therapeutic instrument is and how it is oriented in 3-D space within the frame of reference.
  • Ultrasonic digitizers utilizing time-difference of arrival techniques have been used to locate instruments, but with limited success due to their: sensitivity to changes in the speed of sound; sensitivity to other operating room noises and echoes; and unacceptable accuracy in large operating volumes.
  • Magnetic field tracking of instruments has been tried, but suffered from operational difficulties caused by interfering fields associated with nearby metal objects and unacceptable positional accuracy for surgical or therapeutic use.
  • a surgical device for performing surgery or therapeutic interventions on a patient comprising a first curvature sensor configured to be placed on the patient, an attachment fixture attached to the first curvature sensor, a computer electronically coupled to the curvature sensor, a plurality of fiducials capable of being detected by a medical imaging system, a second curvature sensor electronically coupled to the computer, the second curvature sensor having a first end and a second end and capable of being coupled to the attachment fixture at the first end, and a tool connector coupled to the second end of the second curvature sensor.
  • a surgical device for performing surgery or therapeutic intervention on a patient comprising an attachment fixture, at least one fiducial capable of being detected by a medical imaging system, a curvature sensor coupled to the attachment fixture at one end and coupled to a tool connector at the other end, and a computer electronically coupled to the curvature sensor.
  • a device for use in an image guided therapy or image guided surgery system comprising a curvature sensor configured to be applied to a patient, an attachment fixture coupled to the curvature sensor, and a plurality of fiducials coupled to the curvature sensor.
  • a device for generating a frame of reference for an image guided therapy and image guided surgery system comprising a curvature sensor configured to be applied to a patient, an attachment fixture and at least one fiducial.
  • a device for generating a frame of reference for an image guided therapy and image guided surgery system comprising a ribbon comprised of one or a combination of plastic, metal wire, metal strip, fabric, rubber, synthetic rubber, nylon, thread, glass, or paper, a plurality of fiducials attached at known inter-fiducial distances along the ribbon, and an attachment fixture coupled to the ribbon at a known position with respect to the plurality of fiducials.
  • a sensing mesh comprising at least one curvature sensor, a plurality of filaments coupled to the curvature sensor(s), and a plurality of fiducials coupled to the curvature sensor(s) or to the plurality of filaments.
  • the sensing mesh is configured as a garment, such as a cap or as a garment to fit a human pelvis or torso.
  • a system for monitoring or enabling surgery or therapeutic intervention on a patient at a distance comprising a first curvature sensor configured to be placed on the patient, an attachment fixture attached to the first curvature sensor, a computer electronically coupled to the curvature sensor, a second curvature sensor electronically coupled to the computer, the second curvature sensor having a first end and a second end and capable of being coupled at the first end to the attachment fixture, a surgical tool capable of being coupled to the second end of the second curvature sensor, and a communication device electronically coupled to the computer.
  • a device for monitoring the motions of a body comprising a garment configured to be worn by a body, the garment including at least one curvature sensor and a plurality of filaments coupled to the curvature sensor(s) to form a mesh, and a communication device coupled to the curvature sensor(s) and configured to communicate the output of the curvature sensors to a distant receiver.
  • a method of locating fiducials within a CT or MRI image of a patient comprising the steps of placing an array of fiducials on the patient, each fiducial within the array being located at known inter-fiducial distances apart, imaging the patient, identifying and locating in the image a reference point on the array of fiducials, inspecting the image one inter-fiducial distance from the reference point and identifying a fiducial using an image recognition means, inspecting the image one inter-fiducial distance from the last identified fiducial and identifying a fiducial using an image recognition means, and repeating the last step until all fiducials are located.
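The search loop in this fiducial-locating method can be sketched as follows; `walk_fiducials` and its tolerance parameter are illustrative assumptions, standing in for whatever image recognition means supplies the candidate fiducial detections:

```python
import math

def dist(a, b):
    return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))

def walk_fiducials(candidates, reference, spacing, tol=0.5):
    """Starting from a known reference fiducial, repeatedly search the set
    of candidate detections for one lying one inter-fiducial spacing away
    from the last confirmed fiducial, until no further match is found."""
    found = [reference]
    remaining = [c for c in candidates if dist(c, reference) > tol]
    while True:
        matches = [c for c in remaining
                   if abs(dist(c, found[-1]) - spacing) <= tol]
        if not matches:
            return found
        # Prefer the candidate closest to exactly one spacing away.
        nxt = min(matches, key=lambda c: abs(dist(c, found[-1]) - spacing))
        found.append(nxt)
        remaining.remove(nxt)
```

Because each step only inspects a thin spherical shell of radius one inter-fiducial distance around the last fiducial, spurious detections elsewhere in the image are never considered, which is the labor-saving point of the known-spacing constraint.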
  • a method of registering a patient to an image from a CT or MRI system comprising the steps of placing a curvature sensor on the patient, the curvature sensor being coupled to at least one fiducial, imaging the patient using a CT or MRI imaging system to produce an imaging study, analyzing the imaging study to create a volumetric data set in a computer database, the data set including identification of the at least one fiducial and the curvature sensor, electronically connecting the computer to the curvature sensor, determining the three-dimensional shape of the curvature sensor by using the computer to analyze the signal produced by the curvature sensor, and correlating the volumetric data set in the computer database to the three-dimensional shape of the curvature sensor by identifying the position of the at least one fiducial as a common point in a frame of reference.
  • a method for conducting surgery on a body comprising the steps of placing a first curvature sensor on the body, the first curvature sensor having at least one fiducial in a known position with respect to the first curvature sensor, conducting an imaging study of the body using a CT or MRI system, the imaging study recording the position of the at least one fiducial with respect to the body, processing the imaging study to create an image data set and storing the image data set in a computer, the data set including the position of the at least one fiducial with respect to the body, connecting the first curvature sensor to the computer and using the first curvature sensor information to register the first curvature sensor and the at least one fiducial to the image data set, coupling one end of a second curvature sensor to the body at a known position and orientation with respect to the at least one fiducial and coupling a surgical tool to the other end of the second curvature sensor, and displaying an image of the body from the image data set superimposed with an image of the surgical tool in its tracked position and orientation.
  • FIG. 1 is a perspective view of an embodiment of the invention positioned on a human head.
  • FIG. 2 is a perspective view of a computer monitor showing graphic images.
  • FIG. 3 is a perspective view of an embodiment of the invention attached to the ilium with metallic screws.
  • FIG. 4 is a perspective view of a human head with a cranial mesh embodiment and fiber optic curvature sensor attachment for tool tracking in accordance with an embodiment of the invention.
  • FIG. 5 is a perspective view of a pelvic region with a mesh embodiment and fiber optic attachment for tool tracking in accordance with an embodiment of the invention.
  • FIG. 6 is a system diagram of an embodiment of the invention applied in a surgery on a femur with an intramedullary nail.
  • FIG. 7 is a system diagram of an image guided therapy/image guided surgery system in accordance with an embodiment of the present invention.
  • FIG. 8A is a flow diagram for producing an imaging study of a relevant portion of a patient wherein a 3-D internal image set is taken in preparation for pre-operative planning and intra-operative usage as per an aspect of an embodiment of the present invention.
  • FIG. 8B is a flow diagram for utilizing the imaging study of FIG. 8A in pre-operative planning, intra-operative use, and post-operative analysis and reconstruction as per an aspect of an embodiment of the present invention.
  • Curvature sensors are able to precisely measure curvature and output electronic signals encoding their shape in three-dimensional space, and can carry at least one position reference marker, or fiducial, observable by a medical imaging system.
  • One or more curvature sensors, as described herein, may be applied to the skin of a patient to electronically measure, in real-time, the precise contour of a portion of the patient's body and provide this three-dimensional surface contour data to a computer where such data may be correlated with a volumetric image data set obtained from a CT or MRI imaging study of the patient.
  • Attaching a positional reference fiducial marker to the curvature sensor(s) at a known position with respect to the curvature sensor(s) permits the curvature sensor to be located in a CT or MRI imaging study in three-dimensional space.
  • the computer of an image guided therapy or image guided surgery system can easily register (i.e. dimensionally correlate) the data set to the real-time surface contour measurements to create a correlated frame of reference for monitoring the position of a tracked instrument with respect to the patient.
  • With the images thus correlated, the computer generates an image that superimposes the position and orientation of the surgical instrument on the correlated volumetric image of the patient drawn from the imaging study data set.
  • near-automatic registration of the patient to the image study data set may be accomplished intraoperatively, even when the patient is moved during therapy or surgery.
  • the inventors further realized that by coupling one end of a second curvature sensor to the patient at a known place in three-dimensional space, such as at or near the positional reference fiducial marker or the first curvature sensor, and coupling a surgical instrument to the other end of the second curvature sensor, the position and orientation of the surgical or therapeutic instrument can be registered to the patient and the imaging study data set intraoperatively in real-time.
  • Using a second curvature sensor, anchored or coupled to a registerable known position on the patient, to track the surgical or therapeutic instrument enables a computerized image guided therapy or image guided surgery system that does not require an optical tracking or electromechanical tracking system. Eliminating optical and electromechanical tracking eliminates the problems and cost associated with such devices.
  • the resulting computer aided therapy or computer aided surgery system comprises at least one fiducial reference point attached to a first curvature sensor for measuring the surface shape of a portion of a patient and/or a set of physically constrained fiducials for computing the surface shape of a portion of a patient, a second curvature sensor configured to hold a surgical or therapeutic tool or instrument on one end and to be attached at the other end to a known position with respect to the fiducial reference point or the first curvature sensor (e.g. at an attachment fixture), and
  • a computer system with software configured to determine the three-dimensional positions of the first and second curvature sensors, to register those positions with respect to an image data set, and to display the image and surgical or therapeutic tool in the same frame of reference for use as a guide for the doctor or therapist.
  • Curvature sensors include any sensor or combination of sensors whose function is to measure the curvature of a linear element in three-dimensional space with respect to a reference point, such as a fixed end, and output a digital signal that communicates the measurements to a computer.
  • the linear element of a curvature sensor may be in the form of a fiber, fiber optic, cable, bundle of fibers, strip, tape, or band, and, as is described in greater detail herein, a plurality of linear elements may be coupled to interconnecting filaments to form a flexible mesh that will measure the 3-D shape of a surface or manifold.
  • the curvature sensor may also comprise an electronic interface device for receiving measurement signals from the sensor's linear element and transforming such measurement signals into a digital signal output readable by a computer.
  • the term “curvature sensor” may encompass such an electronic interface device.
  • One curvature sensor that is suitable for use in various embodiments of the present invention and is illustrated in the drawings relies on linear, bipolar modulation of light throughput in specially treated fiber optic loops that are sealed in absorptive layers.
  • This fiber optic curvature sensor consists of paired loops of optical fibers that have been treated on one side to lose light proportional to bending of the fiber. The lost light is contained in absorptive layers that prevent the interaction of light with the environment.
  • An electronics interface box attached to the fiber optics illuminates the loops, measures return light, encodes the measurements and relays information to a computer having software that calculates the 3-D instantaneous shape of the sensor.
  • the computer is able to generate a 3-D model of the sensor and display a graphic image of the sensor's linear element on a computer screen.
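How a computer might turn per-segment bend measurements into such a 3-D model can be sketched by dead-reckoning along the sensor. This is an illustrative simplification under assumed conventions (yaw/pitch bend angles per segment, fixed end at the origin pointing along +x), not the Danisch/Measurand algorithm:

```python
import math

def rot_z(v, a):
    """Rotate vector v by angle a about the z axis (yaw)."""
    c, s = math.cos(a), math.sin(a)
    return (c * v[0] - s * v[1], s * v[0] + c * v[1], v[2])

def rot_y(v, a):
    """Rotate vector v by angle a about the y axis (pitch)."""
    c, s = math.cos(a), math.sin(a)
    return (c * v[0] + s * v[2], v[1], -s * v[0] + c * v[2])

def reconstruct(bends, seg_len):
    """Dead-reckon the 3-D polyline of the sensor: start at the fixed end
    pointing along +x, and at each segment apply that segment's measured
    yaw and pitch bend to the running direction before stepping forward."""
    pos = (0.0, 0.0, 0.0)
    direction = (1.0, 0.0, 0.0)
    pts = [pos]
    for yaw, pitch in bends:
        direction = rot_y(rot_z(direction, yaw), pitch)
        pos = tuple(p + seg_len * d for p, d in zip(pos, direction))
        pts.append(pos)
    return pts
```

The resulting polyline is what the computer would render on screen as the sensor's instantaneous shape; a finer segment pitch gives a closer approximation to the continuous fiber.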
  • the fiber optic type curvature sensor is disclosed in U.S. Pat. Nos. 5,321,257 and 5,633,494 issued to Danisch, the specifications of which are hereby incorporated by reference in their entirety.
  • a commercial version of the curvature sensor is produced by Measurand Inc. (New Brunswick, Canada), comprising a flexible, fiber-optic linear element that provides position measurements of the entire length and shape of the tape including its endpoint. Position determination is accomplished by the electronic processing of light signals transmitted down the fiber optic cable.
  • Because the curvature sensor uses internally sensed fiber optic cables to determine position, the sensor can be made of little more than fiber-optic fibers surrounded by an absorptive layer, reducing the interconnections between the patient's frame of reference and the instrument to a non-interfering, extremely low-inertia, highly flexible, thin encapsulated glass fiber that is easily sterilized and may be made to be disposable.
  • curvature sensors may employ: conductors whose electrical resistance varies when bent, such as strips of conductive polymers, or flexible strips of semiconductor or metal oxide materials; conductive wires covered by insulator material whose insulation properties vary when subjected to bending stress; or flexible cables, such as special co-axial cables, whose inductive or capacitive properties vary when the cable is bent (e.g. by reducing the gap between a central conductor and one or more surrounding conductors).
  • Such curvature sensors would employ a pulsed or oscillating current and an electronic interface/detector to locate the distance along the sensor to a bend and the amount and direction of the bend in two dimensions, and output this information in a form readable by a computer.
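For such electrical variants, the first processing step would be converting raw per-segment readings into bend angles. A hypothetical per-segment linear calibration (both constants are assumptions that would be determined for a specific sensor design) might look like:

```python
def readings_to_bends(raw, zero_offset, gain):
    """Map raw per-segment electrical readings (e.g. resistance or
    capacitance counts) to bend angles in radians via a linear
    calibration: angle = gain * (reading - zero_offset)."""
    return [g * (r - z) for r, z, g in zip(raw, zero_offset, gain)]
```

The resulting angle list could then feed the same shape-reconstruction step used for the fiber optic sensor.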
  • curvature sensor should be understood herein as encompassing any sensor capable of performing the functions of measuring the three-dimensional position of the length of a linear element, either continuously or at intervals (i.e. points along the linear element) with respect to a reference point (e.g. an end or mid point), and providing an output signal that can be read by a computer, including sensors that employ an intermediary electronic interface device to produce a computer-readable output, such that the computer can determine the 3-D positions and orientations of the linear element along its length.
  • a curvature sensor need not be a physical device which determines its own position in 3-D space, but can also include a set of physically constrained fiducial points which are capable of being imaged.
  • the physical constraints interconnecting the fiducial points can be used to aid in the automatic detection and localization of the fiducial points in the image as well as be used for the piecewise-linear (wire-frame) representation of the curvature of the surface which carries the fiducial points.
  • Fiducial refers to anatomic or anatomically-fixed landmarks recognizable by an imaging system and used to locate known points with respect to a frame of reference, and more particularly to radioopaque (i.e. CT-visible) or MRI-visible markers applied to the skin surface overlying the site of an operation or attached to the underlying bone.
  • Fiducials may be radioopaque spheres (e.g. lead, tungsten, or titanium spheres) for CT-imaging or fatty vitamin pills for MRI imaging, for instance.
  • attachment fixture refers to any fixture that is imageable and whose position is accurately known with respect to a set of fiducials and/or a curvature sensor.
  • the function of the attachment fixture is to provide a known point of reference for the fiducials and curvature sensors that can be correlated to the imaging study data set when the patient and data set are registered.
  • the attachment fixture may be as simple as an easily-recognized CT or MRI imageable feature on a garment, bandage, tape, band, wrap, screw or other patient-attachment item.
  • the attachment fixture is both an easily recognized fiducial at a known position with respect to an array of fiducials on a curvature sensor mesh, and a hard-point for attaching one end of the second curvature sensor to a known 3-D position with respect to the patient so the IGT/IGS can determine the position of the surgical or therapeutic tool with respect to the patient at the other end of the curvature sensor.
  • the attachment fixture comprises a fiducial, a clip securing one end of the curvature sensor in a known position and orientation, and a means for mounting the attachment fixture on the patient, such as being sewn, stapled or glued to a garment to be worn by the patient.
  • the attachment fixture is simply an easily recognizable fiducial that is at a known position with respect to the other fiducials, such as a radioopaque metal (e.g. lead, tungsten or titanium) cross attached (e.g. sewn or glued) to a garment comprising an array of fiducials.
  • the clip or latching mechanism for attaching one end of the curvature sensor may be any suitable physical interconnect that will hold one end of the curvature sensor linear element securely in 3-dimensions with a fixed orientation, including a spring clip, threaded connection, clamp, tongue-in-groove connection, or cylindrical cavity with a detent for receiving a grooved rod.
  • the clip will permit easy connect and disconnect of curvature sensors to enable patient preparation, sterilization of instruments, movement of the patient, etc.
  • the attachment fixture may be disposable or of a more permanent nature. There may be more than one attachment fixture provided in a particular embodiment. The attachment fixture may provide for attaching a plurality of curvature sensors to the fixture simultaneously. And the attachment fixture may be integrated with other elements of the various embodiments, including, but not limited to, a garment comprising an array of fiducials, a curvature sensor, a curvature sensor garment, an electronics interface device for the curvature sensors, a patient restraint, a medical device holder or positioner, the operating table, a patient monitor (e.g. temperature, blood pressure or pulse monitor), or any combination of devices that will be placed in a fixed position and orientation with respect to the patient during imaging studies and the treatment/operation.
  • medical imaging system refers to any imaging capability, device or system capable of obtaining an image of a body, preferably a volumetric image, including without limitation computed tomography (CT), fluoroscopy, magnetic resonance imaging (MRI), positron emission tomography (PET) or single photon emission tomography (SPECT).
  • tool refers to any device used in a surgery or therapy, including without limitation any instrument, probe, drill, scalpel, stent, suture, tool, scissors, clamp, or imager (such as a fiberscope).
  • the term “tool” refers to any device that aids the operator in accomplishing a task, including without limitation any probe, drill, wedge, imager, screwdriver, pick, scissors, clamp, wrench, key or other tool.
  • an IGT/IGS system employing a first curvature sensor to measure the shape and orientation of a portion of a patient's body, at least one fiducial to enable registration to an imaging study data set, a second curvature sensor configured to hold and track an instrument (e.g. a surgical instrument or probe), a computer to receive the three-dimensional information from the first and second curvature sensors and calculate therefrom their positions and orientations with respect to a frame of reference and register their positions to an imaging study data set, and a computer monitor for displaying the correlated images of the instrument, the patient and the imaging study data set.
  • the CT or MRI imaging study data set is obtained with the first curvature sensor device fixed in place (such as with adhesive) on the object portion of the patient's body and imaged simultaneously with the anatomy.
  • the flexible curvature sensor may be in the form of a strip, tape, band or mesh, as described herein, that can be laid upon or wrapped about the patient in the area where surgery is to be performed.
  • CT or MRI readable fiducials may be incorporated within the curvature sensor strip, tape, band or mesh containing the curvature sensor(s), such as at set distances apart (forming an inter-fiducial distance).
  • a second flexible fiber optic curvature sensor device is physically attached at one end to the first curvature sensor, or to a structure that provides a known positional reference point for determining the location and orientation of the end of the second curvature sensor in 3-D space, such as an attachment fixture.
  • the attachment fixture itself is imageable on the CT or MRI scan, or incorporates an imageable fiducial, so that its frame of reference position is established in the imaging study, thereby providing a known position for the end or origin of the second flexible fiber optic curvature sensor.
  • the attachment device may be separate from or integrated with an electronics interface device that electronically couples the curvature sensor to the computer system.
  • the second curvature sensor device is electronically linked to the computer system either directly or through an electronic interface device, which may be the same electronic interface device coupled to the first curvature sensor or a separate electronic interface device.
  • This second curvature sensor has attached at its other end (i.e. the end not attached to the attachment fixture) a tool connector or holder for holding the surgical or therapeutic tool, instrument or probe to be used in the surgical or therapy procedure.
  • the tool connector may be any structure or mechanism suitable for securely holding a tool, instrument or probe in a fixed orientation with respect to the end of the second curvature sensor, including any one of or a combination of a clasp, slot, opening, flange, or threaded member.
  • the 3-D position of the second curvature sensor is measured or calculated in relation to the known reference point on the patient's body, which is registered in the computer system to the imaging study data set and to the first curvature sensor, which informs the computer of the position and orientation of the object part of the patient's body.
  • the tool or surgical instrument is tracked and its position and orientation is determined by the computer system in relation to the object anatomy as recorded in the imaging study.
  • the second curvature sensor coupled to a tool holder for holding a surgical or therapeutic instrument or probe is used in an IGT/IGS system that retains an optical tracking system, with the positional information generated by the second curvature sensor used to supplement the instrument tracking provided by an optical tracking system in order to improve tracking accuracy and/or to provide continuity of instrument tracking when an object, such as the physician's body, blocks the lines of sight between the optical trackers and the instrument or probe.
  • This first embodiment may be understood with reference to FIGS. 1 through 7 .
  • a flexible fiber optic curvature sensor 100 is applied with adhesive to a human head 110 .
  • an electronic interface box 105 that transmits, via a cord 109 to a computer 120 , the information that specifies the shape of the curvature sensor 100 .
  • the light quantity is measured in the electronic interface box 105 .
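The shape recovery implied by these bullets (light loss per fiber segment converted into a curvature value by the interface box, then integrated into a spatial curve) can be sketched in two dimensions. The patent does not specify a reconstruction algorithm; the function below is an illustrative model and all names are hypothetical:

```python
import math

def reconstruct_shape_2d(curvatures, ds=0.01):
    """Integrate per-segment curvature (1/m) into a 2-D polyline.

    Illustrative sketch: assume the interface box reports one curvature
    value per fiber segment of arc length ds; the shape is then the
    running integral of heading and position along the fiber.
    """
    x, y, heading = 0.0, 0.0, 0.0
    points = [(x, y)]
    for kappa in curvatures:
        heading += kappa * ds          # curvature is the rate of change of heading
        x += math.cos(heading) * ds
        y += math.sin(heading) * ds
        points.append((x, y))
    return points

# Sanity check: constant curvature 2*pi over total length 1.0 closes a circle,
# so the reconstructed end point returns to the origin.
n = 100
pts = reconstruct_shape_2d([2 * math.pi] * n, ds=1.0 / n)
end_x, end_y = pts[-1]
```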
  • the curvature sensor 100 has radioopaque or MRI visible fiducials in the form of bands 102 on each side of the head 110 .
  • Atop the electronic interface box 105 is attached a light emitting diode (LED) array 108 .
  • the LED array 108 is tracked by a camera 150 attached to the computer 120 .
  • the position of the electronic interface box 105 , the curvature sensor 100 and the head 110 can be tracked by the computer 120 in all six degrees-of-freedom (DOF).
  • the graphic shape 140 of the curvature sensor 100 is displayed on a computer monitor 130 .
  • An outline 142 of the graphic shape 140 corresponds to radioopaque/MRI visible bands 102 .
  • a monitor 130 shows the three-dimensional reconstruction 210 of the head 110 .
  • Bands 202 are visualized on the reconstruction 210 .
  • the computer 120 using shape matching software, unites the fiducial bands 202 with the graphic shape 140 of the curvature sensor 100 .
  • the outline 142 of the graphic shape 140 is matched to radioopaque/MRI visible fiducial bands 102 .
  • spatial relationships between the reconstruction 210 of the head 110 and the graphic shape 140 of the curvature sensor 100 can now be made.
  • FIG. 3 shows an embodiment applicable to registration of a major bone.
  • Large screws 330 are drilled into an ilium 340 illustrated in a human body 310 .
  • Caps 320 are placed over the screws 330 .
  • Each cap 320 has a slot 321 through which a flexible curvature sensor 100 is placed.
  • Setscrews 322 hold the curvature sensor 100 firmly after it has been placed under tension through the slots 321.
  • Setscrews 322 also hold the caps 320 firmly against the screws 330 .
  • the curvature sensor 100 is held firmly in a fixed position in relation to the ilium 340 .
  • a CT scan is then performed that provides a digital data set of the pelvis 340 and the curvature sensor 100 in a fixed relation.
  • the shape of curvature sensor 100 is seen in a reconstruction of the CT data, such as illustrated in FIG. 1 , as provided by radioopaque fiducials or bands 102 .
  • a cranial mesh 400 of flexible fiber optic curvature sensors 410 is held together with small connecting filaments 415 , forming a cap on a head 440 .
  • An integrated electronics interface box/attachment fixture 420 connects to a computer via cable 470 and attaches to a second flexible fiber optic curvature sensor device 430 .
  • the cranial mesh 400 is visualized on the CT scan of the head 440 and also by the graphic representation thereof. The two images are merged or superimposed so that a CT of the cranial mesh 400 is registered with the graphic representation of the cranial mesh 400 .
  • a graphic representation of a cranial mesh 400 is registered to a CT of the head 440 .
  • the graphic shape of a second flexible fiber optic curvature sensor device 430 is thereby registered to the CT of the head 440 , and a surgical probe 460 is thus registered to the CT of the head 440 .
  • Spaces 418 between the flexible fiber optic curvature sensor devices 410 and filaments 415 permit room for a surgical probe 460 to be used in surgical operations on the head 440 .
  • Flexible fiber optic curvature sensor devices 410 are wired individually to an electronic interface box 420 so that one flexible fiber optic curvature sensor device 410 can be disconnected and moved if needed for positioning the surgical probe 460 without affecting registration.
  • FIG. 5 illustrates a pelvic mesh 500 of flexible fiber optic curvature sensors 410 held together with small connecting filaments 415 , forming a pants-like garment on a pelvis region 540 .
  • An electronics interface box 420 connects to the computer via a cable 470 and has an attachment fixture 422 where a second flexible fiber optic curvature sensor device 430 is attached.
  • the pelvic mesh 500 is visualized on the CT scan of the pelvis region 540 and also by the graphic representation thereof. The two images are merged or superimposed so that the CT of the pelvic mesh 500 is registered with the graphic representation of the pelvic mesh 500 .
  • a graphic representation of the pelvic mesh 500 is registered to the CT of the pelvic region 540 and thus, more specifically, to the bony pelvis.
  • the graphic shape of the second flexible fiber optic curvature sensor device 430 is therefore registered to the CT of the pelvic region 540, and a surgical drill 560 is thus registered to that CT as well.
  • Spaces 418 between flexible fiber optic curvature sensor devices 410 and filaments 415 permit room for the surgical drill 560 to be used in surgical operations on the bony pelvis situated in the pelvic region 540.
  • Flexible fiber optic curvature sensor devices 410 are wired individually to an electronic interface box 420 so that one flexible fiber optic curvature sensor device 410 can be disconnected and moved if needed for positioning the surgical drill 560 without affecting registration.
  • the dorsal mesh 580 is elastic to provide a good fit onto the pelvic region 540 .
  • the pelvic mesh 500 has an open perineal section 590 for normal excretory functions.
  • a flexible fiber optic curvature sensor 100 is applied with adhesive to the thigh 605 .
  • a fracture 610 of the femur 611 has been fixed with an intramedullary nail 612 without exposing the fracture site.
  • a mobile fluoroscope 600 acquires two or more X-ray images 650 of holes 615 in the intramedullary nail 612 .
  • a computer 120 processes X-ray images 650 that include radioopaque markers 102 attached to the flexible fiber optic curvature sensor 100 .
  • An interface box 420 connects to the computer via a cable 470 and has an attachment fixture 422 with a second flexible fiber optic curvature sensor device 430 attached.
  • a second interface box 660 attached to a drill 666 connects to the computer 120 with a second cable 670 .
  • the position of the second flexible fiber optic curvature sensor device 430 is more accurately determined as it is attached to the electronic interface box 420 at one end and to another electronic interface box 660 at the other end.
  • the position of the drill 666 in relation to the holes 615 of the intramedullary nail 612 may be more accurately determined.
  • an IGT/IGS system comprises the functional elements of a 3-D imaging system 710 , a computer image processing system 720 , a curvature sensor system 730 , an image display device 740 and a user interface 750 .
  • the 3-D imaging system 710, which may be a CT or MRI imager, provides a volumetric image digitized data set 715 to the computer image processing system 720.
  • the curvature sensor system 730 provides digitized information 735 on the 3-D position and orientation of the individual curvature sensors to the computer image processing system 720 .
  • the computer image processing system 720 correlates the image data set and the curvature sensor 3-D position information and provides a video output 745 to the image display device 740 that superimposes an image of the surgical instrument on the correlated volumetric image of the patient. Operator commands 755 are provided from the user interface 750 to the computer image processing system 720.
  • Prior to the operation, the patient undergoes an imaging study (step 800) wherein a 3-D internal image set is taken of the portion of the patient's body that will be operated upon.
  • the curvature sensor(s) (e.g. a curvature sensor garment), fiducials and an attachment fixture for the curvature sensor are applied to the patient, such as with adhesive, so their positions on the body are recorded in the same imaging study.
  • the imaging study data set is then processed (step 805 ) wherein the computer image processor locates the position of the attachment fixture with respect to the patient's anatomy, the fiducials and, if employed, the curvature sensor garment, and calculates their positions and orientations within the image data set.
  • step 810 the attachment fixture is marked on the image data set.
  • the IGT/IGS system may be employed with a curvature sensor (which incorporates fiducials) applied to the patient, or with only an array of fiducials applied to the patient with no curvature sensor on the patient.
  • the operation of the system differs slightly as summarized in FIG. 8A .
  • the computer image processing system obtains the 3-D position information from the curvature sensor (step 820 ). Then the computer image processing system calculates the position of the attachment fixture relative to the fiducials (step 830 ) using the known relative positional information of the fiducials to the curvature sensor.
  • the computer image processing system processes the image to detect and locate fiducials (step 825 ) based upon their shape, opacity, geometric position with respect to the attachment fixture (e.g. fiducials coupled in known locations in a garment coupled to the attachment fixture), or other image-recognizable property.
  • the fiducial locations in the image data set are marked (step 835) to make them obvious to the user.
  • the physician may plan the operation using the imagery (step 840 ), wherein cues, templates, guide markers or other visual clues may be provided to the computer image processor for display during the operation.
  • the computer image processor will obtain positional information from the curvature sensors (step 845 ), such as via electronic interface device(s), on a near-real time basis.
  • the computer image processor uses the curvature sensor information and the imaging study data set, and in particular the location of the attachment fixture, to compute (step 850 ) the position and orientation of the surgical instrument relative to the patient.
  • the computer image processor uses this position and orientation information to compute (step 855 ) a near-real time image of the patient with an integrated display of position and orientation of the surgical instrument.
  • the computer also records (step 860 ) the position and orientation of the surgical instrument in a computer database. This stored position/orientation information permits post-operative reconstruction (step 865 ) of the operation for analysis and/or training.
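The intraoperative sequence of steps 845 through 860, together with the recording that enables the post-operative reconstruction of step 865, can be sketched as a loop. All function names below are hypothetical stand-ins for the sensor, pose-computation, display and database components; the patent specifies the steps, not an implementation:

```python
import json
import time

def tracking_loop(read_sensor, compute_pose, render, log, n_frames=3):
    """Sketch of the intraoperative loop: poll the curvature sensors
    (step 845), compute the instrument's position and orientation
    relative to the attachment fixture (step 850), update the display
    (step 855), and record each pose (step 860) so the operation can be
    reconstructed afterwards (step 865)."""
    record = []
    for frame in range(n_frames):
        shape = read_sensor()                        # step 845: near-real-time sensor data
        position, orientation = compute_pose(shape)  # step 850: pose relative to patient
        render(position, orientation)                # step 855: integrated display
        entry = {"frame": frame, "t": time.time(),
                 "position": position, "orientation": orientation}
        record.append(entry)                         # step 860: database record
        log(json.dumps(entry))
    return record                                    # basis for step 865 reconstruction
```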
  • this embodiment will enable establishing an attachment fixture-centered frame of reference. Since an attachment fixture-centered frame of reference may be divorced from the operating room coordinate system, this embodiment may be employed to accommodate movements of the patient or an extremity undergoing surgery without requiring extensive re-registration of the instrument or probe to the data set drawn from the image study. This capability would enable a surgeon to reposition a body or extremity during an operation if necessary without interrupting or delaying the procedure, or introducing positional errors into the IGT/IGS system.
  • the direct connection between the patient and the instrument via a second curvature sensor also makes the system far more compact than video tracking systems.
  • the IGT/IGS system uses a conventional optical tracking system to track the surgical instrument, but makes use of a curvature sensor, fiducials, and a computer workstation that synthesizes CT or MRI data into usable graphics to provide automatic registration of the object section of the body with the CT or MRI image data set.
  • This embodiment allows the surgeon to guide instruments using the optical tracker, while alleviating several significant disadvantages of existing passive navigation systems for IGT/IGS systems.
  • a plurality of curvature sensor linear elements are laid out in an array and held together with strong filaments to form a mesh or fabric of multiple curvature sensors. This embodiment is illustrated in FIGS. 4 and 5 .
  • the curvature sensors 410 are positioned in parallel to each other and coupled to cross-running filaments 415 to form a mesh.
  • the cross-running filaments may be any number of materials, including any one or combination of plastic, metal wire, metal band, polymer plastic, paper, cloth, nylon, rayon, segmented solid pieces of plastic, metal or wood, or similar formable material.
  • the curvature sensors 410 are coupled to an electronic interface device 420 which sends curvature information to a computer (not shown) via cable 470 .
  • the mesh or fabric may be shaped to conform to a targeted body part much like a garment, which may provide significant clinical advantages.
  • a cranial mesh garment 400 may be shaped like a cap to conform to a patient's head 440 .
  • the curvature sensors 410 are aligned in parallel hoops or bands and held in place with filaments 415 configured in a radial pattern originating at the crown.
  • the curvature sensor mesh may be configured as a pelvic garment 500 that is shaped much like a pair of bicycle shorts with a cutout 590 for the perineum, as illustrated in FIG. 5 .
  • the curvature sensors 410 are provided on the exposed portion of the mesh while the patient is lying in bed, usually the anterior or ventral section.
  • the dorsal or posterior portion may be an elastic band or mesh 580 to provide a better fit to the abdomen 540 .
  • the electronic interface device 420 attachments for the plurality of curvature sensors 410 in the mesh may be at the edge of the mesh or may be part of the attachment fixture 422, as illustrated in FIG. 5, and/or part of the electronic interface device for the second curvature sensor 430 that tracks a tool 560, as is also illustrated in FIG. 5.
  • the garment may comprise an array of fiducials at regular points in the mesh and a rigidly affixed attachment point or attachment fixture for securing a dynamic frame of reference (DFOR) to a patient.
  • a fiber optic curvature sensor is attached to the surface of the patient by an adhesive, as may be appropriate in neurosurgical cases.
  • metal pins or screws attach the fiber optic curvature sensor to bone.
  • fiducial markers detectable by X-ray or magnetic resonance imaging, such as fiducials made of titanium, may be incorporated into the curvature sensor at known dimensions from a reference point, such as a set distance apart along a flexible wire or tape fixed or attached to an attachment fixture/reference fiducial, or at a set distance along fibers in a fabric, bandage or garment that is fixed or attached to an attachment fixture/reference fiducial.
  • a sensor or emitter appropriate to the waveform tracking device employed may be attached to the device.
  • a three-dimensional data set of the object body section is then obtained. This data set is then transferred to the computer workstation to be used in a surgical procedure.
  • registration is accomplished automatically.
  • the computer ascertains the shape of the fiber optic curvature sensor, which has been positioned on the patient, using the information passed by the electronic interface device.
  • the digitized shape of the curvature sensor is then matched with the shape that is evident in the imaging data set, such as determined by the positions of the fiducials incorporated within the curvature sensor. Registration can then be determined automatically without the surgeon needing to use a probe or direct the use of a fluoroscope.
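One standard way to perform such automatic shape matching is least-squares rigid point-set registration via the Kabsch/SVD method. The patent does not name a specific matching algorithm, so the following is an illustrative sketch under the assumption that corresponding fiducial points are available from both the sensor and the image:

```python
import numpy as np

def rigid_register(sensor_pts, image_pts):
    """Least-squares rigid registration (Kabsch/SVD) of corresponding
    point sets: one conventional way to match the digitized curvature
    sensor shape to the fiducial positions found in the imaging data.
    Returns rotation R and translation t with image ~ R @ sensor + t."""
    sensor_pts = np.asarray(sensor_pts, float)
    image_pts = np.asarray(image_pts, float)
    cs, ci = sensor_pts.mean(axis=0), image_pts.mean(axis=0)
    H = (sensor_pts - cs).T @ (image_pts - ci)      # cross-covariance matrix
    U, _, Vt = np.linalg.svd(H)
    # Guard against a reflection in the least-squares solution:
    D = np.diag([1.0, 1.0, np.sign(np.linalg.det(Vt.T @ U.T))])
    R = Vt.T @ D @ U.T
    t = ci - R @ cs
    return R, t
```

For noise-free corresponding points the recovered transform is exact; with measurement noise it is the least-squares optimum.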
  • In an embodiment appropriate for cranial procedures, the curvature sensor, with integrated fiducials, is placed completely around the head in a position such that it will not interfere with the anticipated place for surgical entry.
  • the curvature sensor and fiducials assembly may be adhered to the skin surface, making it very stable and secure, thus assuring that the built-in reference frame remains in the same position.
  • the curvature sensor and fiducials may be in the form of a garment shaped like a cap, or may be in the form of a tape or bandage that is wrapped about the head.
  • Another embodiment of this invention comprises the use of a curvature sensor to provide two-dimensional image tracking using a fluoroscope.
  • a fluoroscope obtains images of the object body section that has attached the flexible fiber optic curvature sensor device comprising radioopaque fiducial markers, which may be at set distances along a wire or at set positions within a grid.
  • the position of the flexible fiber optic curvature sensor device is determined in relation to the object body section.
  • Tracking of surgical tools in relation to the object body section is then accomplished by using an optical tracking system with LEDs attached to the flexible fiber optic curvature sensor device.
  • tracking may be accomplished by attaching the tools to a second flexible fiber optic curvature sensor device, eliminating the need for the reference frame and optical tracking system.
  • further X-rays are not needed as the relative position of the surgical tool to the object body is now recorded.
  • the present invention includes two embodiments associated with two registration options.
  • One embodiment comprises a curvature sensor and fiducials affixed to a dynamic frame of reference (DFOR) wrap or garment.
  • the other embodiment comprises only fiducials affixed to the garment with no curvature sensor.
  • a curvature sensor may be connected to the DFOR wrap or garment at an attachment fixture whose position is accurately known with respect to the wrap or garment frame of reference. This attachment fixture provides a physical and positional connection between the preoperatively volumetrically-imaged garment frame of reference and the instrument.
  • the physical interconnect provided by the attachment fixture allows for the continuous tracking of the 6-degrees of freedom state of the instrument without the need for extraneous optical or articulated arm instrument tracking equipment in the operating room.
  • the instrument's kinematic state can then be displayed on a monitor viewable by the surgeon as a computer-generated image embedded in the preoperatively obtained volumetric image.
  • the attachment fixture may comprise a latching mechanism to physically attach one or more curvature sensors to a known reference point on the attachment fixture, and one or more fiducials which may be imaged with CT and/or MRI systems in order to establish the known position of the attachment fixture in the imaged frame of reference.
  • a further embodiment of the present invention comprises a fiber optic curvature sensor-enabled garment which can dynamically track the movements of the fiducials on a patient's moving body. These tracked fiducial points can then be used to dynamically warp the preoperative image to more realistically present an image to the therapist as the patient motion is happening.
  • fiducials and a fiber optic curvature sensor may be affixed to a patient's skin, either adhesively or embedded in a garment, bandage, tape or other structure, at the time of volumetric pre-operative imaging. This garment is then left affixed to the patient for the duration of the treatment or surgery.
  • the fiducials in the garment provide the image frame of reference as well as the attachment fixture for attaching an instrument connection fixture, also herein referred to as an attachment fixture, to a known reference point.
  • the embedded fiber optic curvature sensors provide the dynamic garment frame of reference, not only in the sense of being affixed to the patient, but also in the sense that it can track, in real-time, the location of the fiducials.
  • the real-time, intraoperative location of the fiducials can be used to synchronize the acquisition of instrument tracking data with the preoperative images for improved accuracy in certain dynamic scenarios such as therapy in the chest area.
  • Since the fiber optic curvature sensor is included in the volumetric image data set, its position is known relative to the image and it comprises a set of distributed fiducial points. Since the fiber optic curvature sensor measures its own position, a rigid attachment point provided by the attachment fixture can be part of the curvature sensor device or garment and used for the rigid attachment of a second curvature sensor whose other end is attached to a surgical instrument.
  • the first step in a general framework frame of reference registration is the definition of a relation among the different coordinate systems or, as used herein, frames of reference.
  • Current methods register preoperative images, fiducials, and instruments to a frame of reference affixed to the operating room. Assuming that the errors in establishing these relationships and their registration are independent, the cumulative error, at least to a first approximation, is the root sum square of the individual errors in estimating these relationships. It is therefore clear that the methods and embodiments disclosed herein that eliminate or reduce the number of frame registration steps will increase the accuracy of instrument positioning. It is also clear that calibration techniques can outperform registration techniques in terms of accuracy.
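The root-sum-square accumulation of independent registration errors can be illustrated numerically; the per-step error values below are made up for illustration:

```python
import math

def cumulative_error(step_errors_mm):
    """Root-sum-square of independent per-step registration errors:
    to first approximation, total error grows with the number of chained
    registrations, so eliminating a step reduces the total."""
    return math.sqrt(sum(e * e for e in step_errors_mm))

# Hypothetical per-step errors of 1 mm each: dropping one registration
# step shrinks the first-order cumulative error.
three_step = cumulative_error([1.0, 1.0, 1.0])   # sqrt(3), about 1.73 mm
two_step = cumulative_error([1.0, 1.0])          # sqrt(2), about 1.41 mm
```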
  • Historically, volumetric imaging was a visualization aid for surgeons and diagnosticians, and nonlinearities in the image were relatively unimportant. When these same images are used for navigation, linearity becomes a significant issue. Such nonlinearities or distortion in computer aided surgical navigation with fluoroscopy have been recognized and correction methods have been developed.
  • a second issue associated with the image itself is the spatial quantization of the image. Typically the image is digitally constructed from a series of slices or a helical scan of the patient. The fact that these individual slices have a finite thickness limits the number of samples taken on each fiducial.
  • the accurate estimation of the centroid of the fiducial can become problematic.
  • using a 5 mm sphere and 1 mm scan width yields only 5 spatial samples on the sphere, thereby limiting the accuracy that can be achieved in the image frame of reference (ImFOR) itself.
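A toy model of this quantization effect, assuming (as the example states) a 5 mm spherical fiducial (radius 2.5 mm) scanned with 1 mm slices, with each intersecting slice contributing a sample weighted by its cross-sectional disk area. The model is illustrative, not taken from the patent:

```python
def slice_centroid(center_z, radius=2.5, slice_width=1.0, n=60):
    """Estimate a spherical fiducial's z-centroid from CT slices.

    Toy model: slices lie at integer multiples of slice_width; each
    slice intersecting the sphere contributes a sample weighted by a
    quantity proportional to its cross-sectional disk area.
    """
    samples = []
    for i in range(-n, n + 1):
        z = i * slice_width              # slice center position (mm)
        dz = z - center_z
        if abs(dz) < radius:
            area = radius**2 - dz**2     # proportional to the disk area
            samples.append((z, area))
    total = sum(a for _, a in samples)
    centroid = sum(z * a for z, a in samples) / total
    return centroid, len(samples)

# A 5 mm sphere with 1 mm slices is hit by only ~5 slices:
est0, hits0 = slice_centroid(0.0)   # centered on a slice: symmetric, exact centroid
est1, hits1 = slice_centroid(0.3)   # off-grid center: estimate is slightly biased
```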
  • the methods of rigid-body transformations from one frame of reference to another are well known in the art. The difficulty is in accurately establishing the several frames of reference and their relations.
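These rigid-body transformations are conventionally chained as 4x4 homogeneous matrices, so a point known in the sensor-tip frame can be expressed in the image frame by composing the known image-to-fixture and fixture-to-tip relations. The frame names and numeric values below are illustrative only:

```python
import numpy as np

def make_transform(R, t):
    """Build a 4x4 homogeneous transform from a 3x3 rotation and a translation."""
    T = np.eye(4)
    T[:3, :3] = R
    T[:3, 3] = t
    return T

def rot_z(theta):
    """Rotation about the z axis by theta radians."""
    c, s = np.cos(theta), np.sin(theta)
    return np.array([[c, -s, 0.0], [s, c, 0.0], [0.0, 0.0, 1.0]])

# Illustrative chain (values made up): image <- attachment fixture <- sensor tip.
T_image_fixture = make_transform(rot_z(np.pi / 2), [10.0, 0.0, 0.0])
T_fixture_tip = make_transform(np.eye(3), [0.0, 5.0, 0.0])
T_image_tip = T_image_fixture @ T_fixture_tip        # composition of the two frames

tip_point = np.array([1.0, 0.0, 0.0, 1.0])           # homogeneous coordinates
image_point = T_image_tip @ tip_point                # point expressed in image frame
```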
  • these registration errors are significantly reduced by eliminating several registration procedures and exchanging them for one registration step and one calibration procedure. Furthermore, the calibration of the surgical instruments to the garment frame of reference can be done preoperatively, thus minimizing time in the operating room.
  • the direct connection between the patient and the instrument provided by the second curvature sensor eliminates the need for intraoperative video tracking and its associated equipment. Precise positioning of surgical instruments relative to 3-D CT and MRI (volumetric) and 2-D fluoroscopic images is important for the delivery of anti-cancer drugs, localized ablation of tumors, biopsy, and execution of pre-planned surgery. In the embodiment providing surgical instrument navigation, diagnostic and treatment modalities can be performed more easily and more cost-effectively than current means allow. The ease of use of various embodiments of this invention will make it possible to precisely and repeatably place an instrument in particular positions at specified angles by therapists untrained in the details of its operation.
  • This invention may significantly reduce patient morbidity, physician intraoperative difficulties, radiation exposure, and operating time, while at the same time improving repeatability of instrument placement, thus improving the accuracy of therapies and deliveries of medication.
  • This improved accuracy will increase the information obtained from each set of experiments because of the repeatability of the procedures on the same or different individuals.
  • Data on instrument position can also be recorded and associated with each operation to determine which instrument position provides the best effect, as well as for training of other therapists once the procedure is approved.
  • this invention could be used in an out-patient setting, even in a physician's office, enabling precision procedures, such as percutaneous biopsy, to be done accurately and safely.
  • fiducials are positioned at known positions along a flexible fiber, such as a plastic or fabric ribbon, wire, metal band, or plastic or fabric tape, that originates at a known reference position, such as an attachment fixture, and can be taped to or wrapped around a patient.
  • ribbon refers to a long, narrow flexible structure capable of fixing at least one fiducial in a known position along its length and being bent such as to conform to the contours or wrap about a body.
  • the ribbon may be made of any material that is flexible or semi-rigid so as to be laid on top of or wrapped about a patient, including one or a combination of plastic, metal wire, metal strip, fabric, rubber, synthetic rubber, nylon, thread, glass (including fiber optic glass), or paper (as may be suitable for a pre-sterilized, disposable fiducial wrap).
  • This embodiment significantly facilitates and enhances the registration of CT or MRI images since the fiducials are easily located by a computer because each fiducial is at a known dimension from the next. This reduces inaccuracies associated with using an intensity or threshold value determination of fiducials, which often results in false or missed fiducials since bodily tissues may result in images that are similar to those created by such fiducials.
  • This embodiment enables a method of locating fiducials for an IGT/IGS system, comprising the steps of placing an array of fiducials on the patient, each fiducial within the array of fiducials being located at a known inter-fiducial dimension apart from one another, identifying and locating a reference point on the array of fiducials, such as an attachment fixture, inspecting the image one inter-fiducial length from the reference point and identifying a fiducial using an image recognition means, the identified fiducial becoming a last-identified fiducial, inspecting the image one inter-fiducial length from the last-identified fiducial and identifying a next fiducial using an image recognition means, the identified next fiducial then becoming a last-identified fiducial, and repeating the previous step until all fiducials within the array of fiducials have been identified in the image.
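The stepwise locating method above can be sketched as follows, with `find_near` standing in for the unspecified image recognition means; the interface and the toy 1-D usage are hypothetical:

```python
def locate_fiducial_chain(find_near, reference_point, inter_fiducial_dist,
                          expected_count):
    """Sketch of the claimed locating method: starting at the reference
    point (e.g. the attachment fixture), inspect the image one
    inter-fiducial length from the last-identified fiducial, identify
    the next fiducial, and repeat until the whole array is identified."""
    found = []
    last = reference_point
    for _ in range(expected_count):
        candidate = find_near(last, inter_fiducial_dist)
        if candidate is None:
            break                        # nothing recognized at that distance
        found.append(candidate)
        last = candidate
    return found

# Toy 1-D usage: fiducials every 10 mm outward from the fixture at 0.
truth = [10.0, 20.0, 30.0, 40.0]

def find_near(point, dist, tol=0.5):
    """Stand-in for image recognition: return the fiducial roughly one
    inter-fiducial length beyond the given point, if any."""
    for f in truth:
        if f > point and abs((f - point) - dist) < tol:
            return f
    return None

chain = locate_fiducial_chain(find_near, 0.0, 10.0, expected_count=4)
```

Searching only one known inter-fiducial length from the last find is what reduces the false and missed detections that a global intensity threshold would produce.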
  • the curvature sensor garment is coupled to a communications device, such as a cable to a computer with an internet connection, a radio or cellular telephone data link, or a satellite communication data link, so that the positional and curvature information provided by the curvature sensors is communicated to a remote location.
  • Such a garment and communication device would enable remote surgery or therapies.
  • Such a garment and communication device would also allow the dynamic monitoring of a patient, such as while freely walking. This embodiment could have several useful applications for remotely monitoring the movements of a patient.
  • a curvature sensor garment, mesh or fabric, with or without incorporated fiducials is applied to a patient at a remote location, such as a battlefield medical facility.
  • a data set of the injury is obtained using fluoroscopy or other means to create a digitized volumetric data set of the patient.
  • a second curvature sensor is attached to the curvature sensor garment or fabric at an attachment fixture whose position is registered in the volumetric data set.
  • the volumetric data set is communicated to another location, such as a hospital or a physician's office, where it is loaded on a computer.
  • the precise positional information on the patient's frame of reference, provided by the curvature sensor garment or fabric, and the precise location and orientation of a surgical tool, provided by the second curvature sensor, are communicated by the communication device to the distant location.
  • a computer registers the volumetric data set with the patient's frame of reference and the position and orientation of the surgical tool, and displays the result on a computer monitor in the distant location.
  • a physician at the distant location may then direct or observe the conduct of the remote surgery with greater confidence and precision than would be possible with only a video link between the two locations.
  • This system may incorporate an IGT/IGS at the site of the remote surgery, but one is not necessary.
  • the present invention offers several significant advantages over the state of the art. With this invention there is no exposure of additional bone for recording anatomical landmarks or percutaneous attachment of fiducials to bones, both of which require surgery in addition to that required for the intended surgery or therapy. Intraoperative manual registration is not required because the instrument is directly connected to the patient's frame of reference by a curvature sensor which continually reports its position in 6-D space. The need for articulated mechanical arms or a frame containing multiple video cameras to reduce instrument blind spots is eliminated. The elimination of cumbersome tracking equipment reduces the sterilization problem to one of using disposable fiber optic cables which may attach between the patient's garment and the instrument.
  • a small electronics interface box may be required as a part of the curvature sensor which can be easily draped since it is at one end of the curvature sensor. Since the position of the instrument is measured relative to a frame of reference which is affixed to the patient, patient movement ceases to be a problem. The physical interconnection between the patient and the instrument also reduces position estimation errors by replacing (intraoperative) registration steps with (preoperative) calibration.
  • Various embodiments of the present invention provide a device and method for an IGT/IGS system that is non-invasive, self-contained, passively-navigated, dynamically-referenced, and automatically image-registered, eliminating the need for the surgeon to do registration manually.

Abstract

There is provided a device for generating a frame of reference and tracking the position and orientation of a tool in a computer-assisted image guided surgery or therapy system. A first curvature sensor including fiducial markers is provided for positioning on a patient prior to volumetric imaging, and sensing the patient's body position during surgery. A second curvature sensor is coupled to the first curvature sensor at one end and to a tool at the other end to inform the computer-assisted image guided surgery or therapy system of the position and orientation of the tool with respect to the patient's body. A system is provided that incorporates curvature sensors, a garment for sensing the body position of a person, and a method for registering a patient's body to a volumetric image data set in preparation for computer-assisted surgery or other therapeutic interventions. This system can be adapted for remote applications as well.

Description

    CROSS-REFERENCE TO RELATED APPLICATIONS
  • This application is a continuation of application Ser. No. 09/752,557, filed Jan. 3, 2001, which claims the benefit of U.S. provisional patent applications No. 60/174,343 filed Jan. 4, 2000 and 60/179,073 filed Jan. 31, 2000, which are all hereby incorporated by reference in their entirety.
  • FIELD OF THE INVENTION
  • This invention pertains to computer assisted surgery and/or therapy using medical imaging systems, such as computed tomography (CT), fluoroscopy and/or magnetic resonance imaging (MRI) for guidance, and more particularly to a fiducial reference system and position sensing.
  • BACKGROUND OF THE INVENTION
  • The concept of computer-assisted stereotactic methods effectively began in 1979. By 1996 it was generally accepted that volumetric stereotactic procedures were feasible including the use of stereotactically directed instruments with respect to pre- or intraoperative displayed images.
  • Computer assisted image guided stereotactic surgery is beneficial in the biopsy and ablation of primary brain tumors, benign and malignant, as well as in many other intracranial procedures using computed tomography, MRI, positron emission tomography (PET) and single photon emission tomography (SPECT). It is especially useful in accurate localization of intracranial vital structures. The passive articulated arm has been shown to be useful in the resection of brain metastases. Surgical navigation is used in head and neck tumors, employing MRI and CT imaging. Stereotactic interstitial brachytherapy has been used for inoperable head and neck cancers, with a head holder used for immobilization. Brachytherapy, the insertion of an ablative radioactive material into an otherwise inoperable tumor, can be placed accurately and through a smaller incision using computer assisted image guided stereotactic surgery. Other uses include slice-by-slice vaporization of tumors, and the use of MRI, digital angiography, and computer-resident stereotactic atlases. Such methods are particularly utilized in neurosurgical and otolaryngological procedures of the head and orthopaedic procedures of the pelvis and spine.
  • The insertion of pedicle screws for spine fusion procedures is enhanced by computer-assisted methods. At first, 3-D images from CT scans were used but these have been replaced by computer-assisted fluoroscopy. For the insertion of iliosacral screws for pelvic ring disruption, the use of CT images has been shown to be accurate and safe, and can be employed when conventional X-ray is not useful due to the presence of contrast media in the bowels, or anatomic variations resulting in a narrow passage for the screw.
  • An essential capability of and step in the use of computer assisted surgery is registering the computer system and the digitized CT or MRI image data set to the patient in a common frame of reference in order to correlate the virtual CT or MRI image with the actual body section so imaged. Image-to-instrument registration at its most basic level requires some fiducials distributed in 3-dimensional space to be preoperatively imaged simultaneously with the patient. These fiducials provide an image frame of reference (ImFOR). The fiducials can either be synthetically added or consist of a set of pre-existing anatomical landmarks. There are three current methods of registering the data set of CT or MRI images of the object body segment to the actual body segment in the operating suite.
  • One method of registration uses CT or MRI imageable-markers or “fiducials” that can be recognized in renderings of the data set and also on the object body segment, and by touching or matching them point-to-point with a digitizing probe. Just before and during an operation, a digitizing probe with sensors or emitters or reflectors for a waveform tracking system is then touched to each fiducial, enabling a computer to match the fiducials with the same points identified on the reconstructed images, planar and/or three dimensional, on a computer workstation. After a plurality of such fiducial points are matched, the computer program determines if an accurate match is obtained. This manual registration procedure locates the fiducials relative to an instrument frame of reference (InFOR). It is typical to use the operating room as the primary frame of reference (ORFOR) with the InFOR having a measured offset within the ORFOR. Thus the anatomy is registered to the image. This method is referred to as point-to-point registration.
  • A related registration method using fiducials attached to the patient involves mounting a reference frame directly to the patient's skeleton, such as the spine, pelvis or femur. In some instances, the skull can be fixed to a table mounted frame. The position of this frame of reference is optically tracked in real-time using the same video cameras used to track the surgical or therapeutic instrument. With the fiducials' physical location being known relative to the InFOR and the InFOR being known relative to the ORFOR and the fiducials also being known relative to the image, the location of arbitrary points in the image can be located in the physical space. Mathematically there is a bilinear transformation between the two spaces and an isomorphism exists, so operations in one space accurately reflect operations in the other. The instrument is then tracked (passive navigation) using one of several methods.
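  • The point-to-point match between fiducials located in the image and the same fiducials located in physical space amounts to estimating a rigid transform between the two frames of reference. A minimal sketch, assuming NumPy, at least three non-collinear matched fiducials, and the standard SVD (Kabsch) least-squares method, which is one common way such a transform is computed but is not specified by the patent:

```python
import numpy as np

def register_fiducials(image_pts, physical_pts):
    """Least-squares rigid transform (R, t) carrying image-space fiducial
    coordinates onto their physical-space (InFOR) counterparts, via the
    SVD/Kabsch method.  Inputs are N x 3 arrays in matched order."""
    P = np.asarray(image_pts, dtype=float)
    Q = np.asarray(physical_pts, dtype=float)
    cp, cq = P.mean(axis=0), Q.mean(axis=0)
    H = (P - cp).T @ (Q - cq)                 # cross-covariance of centered sets
    U, _, Vt = np.linalg.svd(H)
    d = np.sign(np.linalg.det(Vt.T @ U.T))    # guard against a reflection
    R = Vt.T @ np.diag([1.0, 1.0, d]) @ U.T
    t = cq - R @ cp
    return R, t                               # physical = R @ image + t
```

Once (R, t) is known, any point planned in the image can be mapped into physical space and vice versa, which is the isomorphism between the two spaces referred to above.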
  • A second method of registration involves touching a segment of the body multiple times with a digitizing probe to obtain a multitude of points that will give a surface or shape that can be matched with the anatomic shape. Another version of this method uses an ultrasound probe with sensors, emitters or reflectors to map the surface of underlying bone to obtain a shape or surface. The computer program then matches the shape of the combined points to match the reconstructed image of the object part of the body. This method is referred to as shape or surface matching registration.
  • A third method of registration involves taking an X-ray of the body segment with a digitizing fluoroscope and matching the two-dimensional images with the three-dimensional data set.
  • The first registration methods require the surgeon to place fiducials on or in a patient before an imaging study and then use a digitizing probe to touch point fiducials or a surface, which is a tedious process. Using anatomic fiducials to register vertebrae for the insertion of pedicle screws has proven tedious and time consuming, so much so that this method has not gained general acceptance by orthopedic surgeons. The second method requires the exposure of a large area of bony surface in some cases, which is contradictory to one of the aims of using small incisions. The third method is more automatic but requires that a portable X-ray fluoroscopy machine be used.
  • Image to patient registration has been performed cutaneously by attaching spheres to a patient's scalp and then intraoperatively imaging these spheres using an ultrasonic sensor. Direct ultrasonic registration of bony tissue with their CT images is being developed.
  • A key component in any IGT/IGS system is the 3-dimensional (3-D) instrument tracker that informs the system computer of where the surgical or therapeutic instrument is and how it is oriented in 3-D space within the frame of reference. Currently there are four approaches to digitizing the position of the surgical or therapeutic instrument relative to some frame of reference: electromechanical; ultra-sonic; tuned, low-frequency, magnetic field transmitter source and a sensor-pointer; and infra-red optical.
  • An early approach to instrument tracking borrowed technology from robotic manipulators. These systems use articulated arms with optical shaft encoders or angle potentiometers to measure the angular displacements of each of the joints. Such measurements were combined to provide a mathematical estimate of the instrument's position and orientation. However, electromechanical passive articulated arms present several disadvantages that have limited their use, including: limited working volume due to constraints on arm weight; difficulties in moving free objects due to joint friction; positional accuracy limitations; the need for multiple manipulator arms in many situations; the inability to detect erroneous readings made by optical encoders at one or more joints; and difficulties associated with sterilizing or draping the large articulated arms.
  • Ultrasonic digitizers utilizing time-difference of arrival techniques have been used to locate instruments, but with limited success due to their: sensitivity to changes in the speed of sound; sensitivity to other operating room noises and echoes; and unacceptable accuracy in large operating volumes.
  • Magnetic field tracking of instruments has been tried, but suffered from operational difficulties caused by interfering fields associated with nearby metal objects and unacceptable positional accuracy for surgical or therapeutic use.
  • SUMMARY OF THE INVENTION
  • According to one aspect of the present invention, there is provided a surgical device for performing surgery or therapeutic interventions on a patient, comprising a first curvature sensor configured to be placed on the patient, an attachment fixture attached to the first curvature sensor, a computer electronically coupled to the curvature sensor, a plurality of fiducials capable of being detected by a medical imaging system, a second curvature sensor electronically coupled to the computer, the second curvature sensor having a first end and a second end and capable of being coupled to the attachment fixture at the first end, and a tool connector coupled to the second end of the second curvature sensor.
  • According to another aspect of the present invention, there is provided a surgical device for performing surgery or therapeutic intervention on a patient comprising an attachment fixture, at least one fiducial capable of being detected by a medical imaging system, a curvature sensor coupled to the attachment fixture at one end and coupled to a tool connector at the other end, and a computer electronically coupled to the curvature sensor.
  • According to another aspect of the present invention, there is provided a device for use in an image guided therapy or image guided surgery system comprising a curvature sensor configured to be applied to a patient, an attachment fixture coupled to the curvature sensor, and a plurality of fiducials coupled to the curvature sensor.
  • According to another aspect of the present invention, there is provided a device for generating a frame of reference for an image guided therapy and image guided surgery system, comprising a curvature sensor configured to be applied to a patient, an attachment fixture and at least one fiducial.
  • According to another aspect of the present invention, there is provided a device for generating a frame of reference for an image guided therapy and image guided surgery system, comprising a ribbon comprised of one or a combination of plastic, metal wire, metal strip, fabric, rubber, synthetic rubber, nylon, thread, glass, or paper, a plurality of fiducials attached at known inter-fiducial distances along the ribbon, and an attachment fixture coupled to the ribbon at a known position with respect to the plurality of fiducials.
  • According to another aspect of the present invention, there is provided a sensing mesh, comprising at least one curvature sensor, a plurality of filaments coupled to the at least one curvature sensor, and a plurality of fiducials coupled to the curvature sensor(s) or to the plurality of filaments. In a further embodiment of this aspect, the sensing mesh is configured as a garment, such as a cap or as a garment to fit a human pelvis or torso.
  • According to another aspect of the present invention, there is provided a system for monitoring or enabling surgery or therapeutic intervention on a patient at a distance, comprising a first curvature sensor configured to be placed on the patient, an attachment fixture attached to the first curvature sensor, a computer electronically coupled to the curvature sensor, a second curvature sensor electronically coupled to the computer, the second curvature sensor having a first end and a second end and capable of being coupled at the first end to the attachment fixture, a surgical tool capable of being coupled to the second end of the second curvature sensor, and a communication device electronically coupled to the computer.
  • According to another aspect of the present invention, there is provided a device for monitoring the motions of a body, comprising a garment configured to be worn by a body, the garment including at least one curvature sensor and a plurality of filaments coupled to the curvature sensor(s) to form a mesh, and a communication device coupled to the curvature sensor(s) and configured to communicate the output of the curvature sensor(s) to a distant receiver.
  • According to another aspect of the present invention, there is provided a method of locating fiducials within a CT or MRI image of a patient comprising the steps of placing an array of fiducials on the patient, each fiducial within the array being located at known inter-fiducial distances apart, imaging the patient, identifying and locating in the image a reference point on the array of fiducials, inspecting the image one inter-fiducial distance from the reference point and identifying a fiducial using an image recognition means, inspecting the image one inter-fiducial distance from the last identified fiducial and identifying a fiducial using an image recognition means, and repeating the last step until all fiducials are located.
  • According to another aspect of the present invention, there is provided a method of registering a patient to an image from a CT or MRI system, comprising the steps of placing a curvature sensor on the patient, the curvature sensor being coupled to at least one fiducial, imaging the patient using a CT or MRI imaging system to produce an imaging study, analyzing the imaging study to create a volumetric data set in a computer database, the data set including identification of the at least one fiducial and the curvature sensor, electronically connecting the computer to the curvature sensor, determining the three-dimensional shape of the curvature sensor by using the computer to analyze the signal produced by the curvature sensor, and correlating the volumetric data set in the computer database to the three-dimensional shape of the curvature sensor by identifying the position of the at least one fiducial as a common point in a frame of reference.
  • According to another aspect of the present invention, there is provided a method for conducting surgery on a body, comprising the steps of placing a first curvature sensor on the body, the first curvature sensor having at least one fiducial in a known position with respect to the first curvature sensor, conducting an imaging study of the body using a CT or MRI system, the imaging study recording the position of the at least one fiducial with respect to the body, processing the imaging study to create an image data set and storing the image data set in a computer, the data set including the position of the at least one fiducial with respect to the body, connecting the first curvature sensor to the computer and using the first curvature sensor information to register the first curvature sensor and the at least one fiducial to the image data set, coupling one end of a second curvature sensor to the body at a known position and orientation with respect to the at least one fiducial and coupling a surgical tool to the other end of the second curvature sensor, displaying an image of the body from the image data set superimposed with an image of the position and orientation of the surgical tool with respect to the body; and using the superimposed image of the surgical tool on the image of the body to guide the surgical tool.
  • BRIEF DESCRIPTION OF THE DRAWINGS
  • FIG. 1 is a perspective view of an embodiment of the invention positioned on a human head.
  • FIG. 2 is a perspective view of a computer monitor showing graphic images.
  • FIG. 3 is a perspective view of an embodiment of the invention attached to the ilium with metallic screws.
  • FIG. 4 is a perspective view of a human head with a cranial mesh embodiment and fiber optic curvature sensor attachment for tool tracking in accordance with an embodiment of the invention.
  • FIG. 5 is a perspective view of a pelvic region with a mesh embodiment and fiber optic attachment for tool tracking in accordance with an embodiment of the invention.
  • FIG. 6 is a system diagram of an embodiment of the invention applied in a surgery on a femur with an intramedullary nail.
  • FIG. 7 is a system diagram of an image guided therapy/image guided surgery system in accordance with an embodiment of the present invention.
  • FIG. 8A is a flow diagram for producing an imaging study of a relevant portion of a patient wherein a 3-D internal image set is taken in preparation for pre-operative planning and intra-operative usage as per an aspect of an embodiment of the present invention.
  • FIG. 8B is a flow diagram for utilizing the imaging study of FIG. 8A in pre-operative planning, intra-operative use, and post-operative analysis and reconstruction as per an aspect of an embodiment of the present invention.
  • DETAILED DESCRIPTION OF THE PREFERRED EMBODIMENTS
  • The present inventors have realized that the disadvantages of current image guided therapy and image guided surgery (IGT/IGS) systems may be reduced or eliminated by combining curvature sensors to generate an attachment fixture-centered frame of reference for the therapy or surgery system. Curvature sensors are able to precisely measure curvature and output electronic signals encoding their shape in three-dimensional space, and may be combined with at least one position reference marker or fiducial observable by an imaging sensor. One or more curvature sensors, as described herein, may be applied to the skin of a patient to electronically measure, in real-time, the precise contour of a portion of the patient's body and provide this three-dimensional surface contour data to a computer where such data may be correlated with a volumetric image data set obtained from a CT or MRI imaging study of the patient. Attaching a positional reference fiducial marker to the curvature sensor(s) at a known position with respect to the curvature sensor(s) permits the curvature sensor to be located in a CT or MRI imaging study in three-dimensional space. With at least one fiducial point located in the imaging study data set and on the curvature sensor, the computer of an image guided therapy or image guided surgery system can easily register (i.e. dimensionally correlate) the data set to the real-time surface contour measurements to create a correlated frame of reference for monitoring the position of a tracked instrument with respect to the patient. With the images thus correlated, the computer generates an image that superimposes the position and orientation of the surgical instrument on the correlated volumetric image of the patient drawn from the imaging study data set. Thus, near-automatic registration of the patient to the image study data set may be accomplished intraoperatively, even when the patient is moved during therapy or surgery.
  • The inventors further realized that by coupling one end of a second curvature sensor to the patient at a known place in three-dimensional space, such as at or near the positional reference fiducial marker or the first curvature sensor, and coupling a surgical instrument to the other end of the second curvature sensor, the position and orientation of the surgical or therapeutic instrument can be registered to the patient and the imaging study data set intraoperatively in real-time. Using a second curvature sensor, anchored or coupled to a registerable known position on the patient, to track the surgical or therapeutic instrument enables a computerized image guided therapy or image guided surgery system that does not require an optical tracking or electromechanical tracking system. Eliminating optical and electromechanical tracking removes the problems and cost associated with such devices. The resulting computer aided therapy or computer aided surgery system according to one embodiment of the present invention comprises at least one fiducial reference point attached to a first curvature sensor for measuring the surface shape of a portion of a patient and/or a set of physically constrained fiducials for computing the surface shape of a portion of a patient, a second curvature sensor configured to hold a surgical or therapeutic tool or instrument on one end and to be attached at the other end to a known position with respect to the fiducial reference point or the first curvature sensor (e.g. fastened to the fiducial reference), and a computer system with software configured to determine the three-dimensional positions of the first and second curvature sensors, to register those positions with respect to an image data set, and to display the image and surgical or therapeutic tool in the same frame of reference for use as a guide for the doctor or therapist.
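  • The chain of frames of reference described above can be sketched as a simple composition of rigid transforms: the second curvature sensor reports the tool tip relative to the attachment fixture, and registration of the first curvature sensor supplies the fixture-to-image transform. The matrix and function names below are illustrative assumptions, not terms from the patent, and NumPy is assumed:

```python
import numpy as np

def rigid(R, t):
    """Pack a rotation matrix R (3x3) and translation t (3,) into a
    4x4 homogeneous rigid-transform matrix."""
    T = np.eye(4)
    T[:3, :3] = np.asarray(R, dtype=float)
    T[:3, 3] = np.asarray(t, dtype=float)
    return T

def tool_tip_in_image(T_img_from_fixture, tip_in_fixture):
    """Map the tool tip, as reported by the second curvature sensor in
    the attachment-fixture frame, into image-study coordinates so it
    can be superimposed on the volumetric display."""
    p = np.append(np.asarray(tip_in_fixture, dtype=float), 1.0)
    return (T_img_from_fixture @ p)[:3]
```

Because the fixture rides on the patient, updating the fixture-to-image transform from the first sensor's real-time shape keeps the displayed tool position valid even when the patient moves.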
  • Curvature sensors, as used herein, include any sensor or combination of sensors whose function is to measure the curvature of a linear element in three-dimensional space with respect to a reference point, such as a fixed end, and output a digital signal that communicates the measurements to a computer. The linear element of a curvature sensor may be in the form of a fiber, fiber optic, cable, bundle of fibers, strip, tape, or band, and, as is described in greater detail herein, a plurality of linear elements may be coupled to interconnecting filaments to form a flexible mesh that will measure the 3-D shape of a surface or manifold. The curvature sensor may also comprise an electronic interface device for receiving measurement signals from the sensor's linear element and transforming such measurement signals into a digital signal output readable by a computer. As used herein, the term “curvature sensor” may encompass such an electronic interface device.
  • One curvature sensor that is suitable for use in various embodiments of the present invention and is illustrated in the drawings relies on linear, bipolar modulation of light throughput in specially treated fiber optic loops that are sealed in absorptive layers. This fiber optic curvature sensor consists of paired loops of optical fibers that have been treated on one side to lose light proportional to bending of the fiber. The lost light is contained in absorptive layers that prevent the interaction of light with the environment. An electronics interface box attached to the fiber optics illuminates the loops, measures return light, encodes the measurements and relays information to a computer having software that calculates the 3-D instantaneous shape of the sensor. Using this information, the computer is able to generate a 3-D model of the sensor and display a graphic image of the sensor's linear element on a computer screen. The fiber optic type curvature sensor is disclosed in U.S. Pat. Nos. 5,321,257 and 5,633,494 issued to Danisch, the specifications of which are hereby incorporated by reference in their entirety. A commercial version of the curvature sensor is produced by Measurand Inc. (New Brunswick, Canada), comprising a flexible, fiber-optic linear element that provides position measurements of the entire length and shape of the tape including its endpoint. Position determination is accomplished by the electronic processing of light signals transmitted down the fiber optic cable. Since the curvature sensor uses internally sensed fiber optic cables to determine their position, the sensor can be made of little more than fiber-optic fibers surrounded by an absorptive layer, reducing the interconnections between the patient's frame of reference and the instrument to a non-interfering, extremely low-inertia, highly flexible, thin encapsulated glass fiber that is easily sterilized and may be made to be disposable.
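  • The computer-side shape calculation can be illustrated with a planar sketch: treat the tape as short segments with measured per-segment bend angles and chain them outward from the fixed end. This is a simplified assumption for illustration only; a real fiber-optic sensor such as the one described above resolves bend about two axes and reconstructs a curve in 3-D, and the unit segment length here is arbitrary:

```python
import math

def reconstruct_shape(bend_angles, segment_len=1.0):
    """Reconstruct the planar shape of a sensing tape from per-segment
    bend angles (radians), chaining segments from the fixed end at the
    origin.  Returns the list of joint positions, including both ends."""
    x, y, heading = 0.0, 0.0, 0.0     # fixed end, initially pointing along +x
    pts = [(x, y)]
    for a in bend_angles:
        heading += a                  # accumulate curvature into heading
        x += segment_len * math.cos(heading)
        y += segment_len * math.sin(heading)
        pts.append((x, y))
    return pts
```

Chaining measured curvatures in this way is what lets the sensor report not just its endpoint but the entire contour it is laid along, which is the property exploited for registering the patient's surface.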
  • While such a fiber optic curvature sensor is illustrated in the figures and referenced herein, other types of curvature sensors may also be used and are contemplated as part of this invention. For example, other curvature sensors may employ: conductors whose electrical resistance varies when bent, such as strips of conductive polymers, or flexible strips of semiconductor or metal oxide materials; conductive wires covered by insulator material whose insulation properties vary when subjected to bending stress; or flexible cables, such as special co-axial cables, whose inductive or capacitive properties vary when the cable is bent (e.g. by reducing the gap between a central conductor and one or more surrounding conductors). As with the fiber optic curvature sensors described herein, electrically-based curvature sensors would employ a pulsed or oscillating current and an electronic interface/detector to locate the distance along the sensor to a bend, the amount of bend and direction of a bend in two-dimensions, and output this information in a form readable by a computer. With several alternative types of curvature sensors useable, the term curvature sensor should be understood herein as encompassing any sensor capable of performing the functions of measuring the three-dimensional position of the length of a linear element, either continuously or at intervals (i.e. points along the linear element) with respect to a reference point (e.g. an end or mid point), and providing an output signal that can be read by a computer, including sensors that employ an intermediary electronic interface device to produce a computer-readable output, such that the computer can determine the 3-D positions and orientations of the linear element along its length.
  • A curvature sensor need not be a physical device which determines its own position in 3-D space, but can also include a set of physically constrained fiducial points which are capable of being imaged. The physical constraints interconnecting the fiducial points can be used to aid in the automatic detection and localization of the fiducial points in the image as well as be used for the piecewise-linear (wire-frame) representation of the curvature of the surface which carries the fiducial points.
  • The term “fiducial” as used herein refers to anatomic or anatomically-fixed landmarks recognizable by an imaging system and used to locate known points with respect to a frame of reference, and more particularly to radioopaque (i.e. CT-visible) or MRI-visible markers applied to the skin surface overlying the site of an operation or attached to the underlying bone. Fiducials may be radioopaque spheres (e.g. lead, tungsten, or titanium spheres) for CT-imaging or fatty vitamin pills for MRI imaging, for instance.
  • The term “attachment fixture” as used herein refers to any fixture that is imageable and whose position is accurately known with respect to a set of fiducials and/or a curvature sensor. The function of the attachment fixture is to provide a known point of reference for the fiducials and curvature sensors that can be correlated to the imaging study data set when the patient and data set are registered. The attachment fixture may be as simple as an easily-recognized CT or MRI imageable feature on a garment, bandage, tape, band, wrap, screw or other patient-attachment item. In a preferred embodiment, the attachment fixture is both an easily recognized fiducial at a known position with respect to an array of fiducials on a curvature sensor mesh, and a hard-point for attaching one end of the second curvature sensor to a known 3-D position with respect to the patient so the IGT/IGS can determine the position of the surgical or therapeutic tool with respect to the patient at the other end of the curvature sensor. In this embodiment, the attachment fixture comprises a fiducial, a clip securing one end of the curvature sensor in a known position and orientation, and a means for mounting the attachment fixture on the patient, such as being sewn, stapled or glued to a garment to be worn by the patient. In another embodiment disclosed herein, the attachment fixture is simply an easily recognizable fiducial that is at a known position with respect to the other fiducials, such as a radioopaque metal (e.g. lead, tungsten or titanium) cross attached (e.g. sewn or glued) to a garment comprising an array of fiducials. 
The clip or latching mechanism for attaching one end of the curvature sensor may be any suitable physical interconnect that will hold one end of the curvature sensor linear element securely in 3-dimensions with a fixed orientation, including a spring clip, threaded connection, clamp, tongue-in-groove connection, or cylindrical cavity with a detent for receiving a grooved rod. Preferably, the clip will permit easy connect and disconnect of curvature sensors to enable patient preparation, sterilization of instruments, movement of the patient, etc. The attachment fixture may be disposable or of a more permanent nature. There may be more than one attachment fixture provided in a particular embodiment. The attachment fixture may provide for attaching a plurality of curvature sensors to the fixture simultaneously. And the attachment fixture may be integrated with other elements of the various embodiments, including, but not limited to, a garment comprising an array of fiducials, a curvature sensor, a curvature sensor garment, an electronics interface device for the curvature sensors, a patient restraint, a medical device holder or positioner, the operating table, a patient monitor (e.g. temperature, blood pressure or pulse monitor), or any combination of devices that will be placed in a fixed position and orientation with respect to the patient during imaging studies and the treatment/operation.
  • The term “medical imaging system” as used herein refers to any imaging capability, device or system capable of obtaining an image of a body, preferably a volumetric image, including without limitation computed tomography (CT), fluoroscopy, magnetic resonance imaging (MRI), positron emission tomography (PET) or single photon emission tomography (SPECT).
  • The terms “tool” and “surgical or therapeutic tool” as used herein refer to any device used in a surgery or therapy, including without limitation any instrument, probe, drill, scalpel, stent, suture, tool, scissors, clamp, or imager (such as a fiberscope). In embodiments not related to medical or therapeutic applications, the term “tool” refers to any device that aids the operator in accomplishing a task, including without limitation any probe, drill, wedge, imager, screwdriver, pick, scissors, clamp, wrench, key or other tool.
  • In a first embodiment of the present invention, an IGT/IGS system is provided employing a first curvature sensor to measure the shape and orientation of a portion of a patient's body, at least one fiducial to enable registration to an imaging study data set, a second curvature sensor configured to hold and track an instrument (e.g. a surgical instrument or probe), a computer to receive the three-dimensional information from the first and second curvature sensors and calculate therefrom their positions and orientations with respect to a frame of reference and register their positions to an imaging study data set, and a computer monitor for displaying the correlated images of the instrument, the patient and the imaging study data set. Using a second curvature sensor to track the position of the instrument enables an IGT/IGS system that measures instrument position directly instead of indirectly as with an optical tracking system. The CT or MRI imaging study data set is obtained with the first curvature sensor device fixed in place (such as with adhesive) on the object portion of the patient's body and imaged simultaneously with the anatomy. The flexible curvature sensor may be in the form of a strip, tape, band or mesh, as described herein, that can be laid upon or wrapped about the patient in the area where surgery is to be performed. CT or MRI readable fiducials may be incorporated within the curvature sensor strip, tape, band or mesh containing the curvature sensor(s), such as at set distances apart (forming an inter-fiducial distance). The relationship of the curvature sensor is thus established in relation to the anatomy seen on the imaging study. 
In surgery, a second flexible fiber optic curvature sensor device is physically attached at one end to the first curvature sensor, or to a structure that provides a known positional reference point for determining the location and orientation of the end of the second curvature sensor in 3-D space, such as an attachment fixture. In a preferred embodiment, the attachment fixture itself is imageable on the CT or MRI scan, or incorporates an imageable fiducial, so that its frame of reference position is established in the imaging study, thereby providing a known position for the end or origin of the second flexible fiber optic curvature sensor. The attachment device may be separate from or integrated with an electronics interface device that electronically couples the curvature sensor to the computer system. The second curvature sensor device is electronically linked to the computer system either directly or through an electronic interface device, which may be the same electronic interface device coupled to the first curvature sensor or a separate electronic interface device. This second curvature sensor has attached at its other end (i.e. the end not attached to the attachment fixture) a tool connector or holder for holding the surgical or therapeutic tool, instrument or probe to be used in the surgical or therapy procedure. The tool connector may be any structure or mechanism suitable for securely holding a tool, instrument or probe in a fixed orientation with respect to the end of the second curvature sensor, including any one of or a combination of a clasp, slot, opening, flange, or threaded member. 
The 3-D position of the second curvature sensor, particularly the location and orientation of the tool connector at its end, is measured or calculated in relation to the known reference point on the patient's body, which is registered in the computer system to the imaging study data set and the first curvature sensor which informs the computer of the position and orientation of the object part of the patient's body. Thus, the tool or surgical instrument is tracked and its position and orientation is determined by the computer system in relation to the object anatomy as recorded in the imaging study.
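The chain of known positions described above — imaging frame to attachment fixture, fixture to the far end of the second curvature sensor, and sensor end to tool connector — amounts to a composition of rigid transforms. A minimal sketch, using illustrative identity rotations and made-up translations (none of the values are drawn from the disclosure):

```python
import numpy as np

def transform(R, t):
    """Build a 4x4 homogeneous transform from rotation R and translation t."""
    T = np.eye(4)
    T[:3, :3] = R
    T[:3, 3] = t
    return T

# Hypothetical transforms, named frameA_T_frameB for "pose of B in A":
# image_T_fixture -- attachment fixture pose located in the imaging study
# fixture_T_end   -- pose of the sensor's free end, reported by the sensor
# end_T_tooltip   -- fixed offset from the sensor end to the tool tip
image_T_fixture = transform(np.eye(3), [50.0, 20.0, 0.0])
fixture_T_end = transform(np.eye(3), [0.0, 0.0, 120.0])
end_T_tooltip = transform(np.eye(3), [0.0, 0.0, 15.0])

# The tool tip in image coordinates is the composition of the chain.
image_T_tooltip = image_T_fixture @ fixture_T_end @ end_T_tooltip
tip = image_T_tooltip[:3, 3]   # -> [50., 20., 135.]
```

Because every link in the chain is measured directly, no operating-room camera frame enters the composition, which is the point of tracking the instrument with a second curvature sensor.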
  • In an alternative of this embodiment, the second curvature sensor coupled to a tool holder for holding a surgical or therapeutic instrument or probe is used in an IGT/IGS system that retains an optical tracking system, with the positional information generated by the second curvature sensor used to supplement the instrument tracking provided by an optical tracking system in order to improve tracking accuracy and/or to provide continuity of instrument tracking when an object, such as the physician's body, blocks the lines of sight between the optical trackers and the instrument or probe.
  • This first embodiment may be understood with reference to FIGS. 1 through 7.
  • Referring to FIG. 1, a flexible fiber optic curvature sensor 100 is applied with adhesive to a human head 110. At one end of the curvature sensor 100 is an electronic interface box 105 that transmits, via a cord 109 to a computer 120, the information that specifies the shape of the curvature sensor 100. The light quantity is measured in the electronic interface box 105. The curvature sensor 100 has radioopaque or MRI visible fiducials in the form of bands 102 on each side of the head 110. Atop the electronic interface box 105 is attached a light emitting diode (LED) array 108. The LED array 108 is tracked by a camera 150 attached to the computer 120. The position of the electronic interface box 105, the curvature sensor 100 and the head 110 can be tracked by the computer 120 in all six degrees-of-freedom (DOF). The graphic shape 140 of the curvature sensor 100 is displayed on a computer monitor 130. An outline 142 of the graphic shape 140 corresponds to radioopaque/MRI visible bands 102.
  • Referring now to FIG. 2, a monitor 130 shows the three-dimensional reconstruction 210 of the head 110. Bands 202 are visualized on the reconstruction 210. The computer 120, using shape matching software, unites the fiducial bands 202 with the graphic shape 140 of the curvature sensor 100. The outline 142 of the graphic shape 140 is matched to radioopaque/MRI visible fiducial bands 102. As the position of the actual curvature sensor 100 has been determined by an optical tracking system comprising the camera 150 and the computer 120, the reconstruction 210 can now be related to the actual head 110.
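The disclosure leaves the shape matching software unspecified; one standard way to fit a measured sensor shape to corresponding fiducial positions is a least-squares rigid registration (the Kabsch construction), sketched below on synthetic corresponding point sets:

```python
import numpy as np

def rigid_register(src, dst):
    """Least-squares rigid fit (Kabsch): find R, t minimizing ||R@src + t - dst||."""
    sc, dc = src.mean(axis=0), dst.mean(axis=0)
    H = (src - sc).T @ (dst - dc)           # cross-covariance of centered sets
    U, _, Vt = np.linalg.svd(H)
    d = np.sign(np.linalg.det(Vt.T @ U.T))  # guard against a reflection
    R = Vt.T @ np.diag([1.0, 1.0, d]) @ U.T
    t = dc - R @ sc
    return R, t

# Synthetic sensor-shape points and their images under a known rigid motion
rng = np.random.default_rng(0)
src = rng.random((6, 3))
theta = np.pi / 6
R_true = np.array([[np.cos(theta), -np.sin(theta), 0.0],
                   [np.sin(theta),  np.cos(theta), 0.0],
                   [0.0, 0.0, 1.0]])
dst = src @ R_true.T + np.array([5.0, -2.0, 1.0])

R, t = rigid_register(src, dst)
# Applying the recovered R, t maps the sensor shape onto the fiducial positions.
```

With noise-free correspondences, as here, the fit is exact; with measurement noise it is the optimal least-squares alignment of the two point sets.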
  • FIG. 3 shows an embodiment applicable to registration of a major bone. Large screws 330 are drilled into an ilium 340 illustrated in a human body 310. Caps 320 are placed over the screws 330. Each cap 320 has a slot 321 through which a flexible curvature sensor 100 is placed. Setscrews 322 hold the curvature sensor 100 firmly after it has been placed under tension through the slots 321. Setscrews 322 also hold the caps 320 firmly against the screws 330. Thus the curvature sensor 100 is held firmly in a fixed position in relation to the ilium 340. A CT scan is then performed that provides a digital data set of the ilium 340 and the curvature sensor 100 in a fixed relation. The shape of curvature sensor 100 is seen in a reconstruction of the CT data, such as illustrated in FIG. 1, as provided by radioopaque fiducials or bands 102.
  • In FIG. 4, a cranial mesh 400 of flexible fiber optic curvature sensors 410 is held together with small connecting filaments 415, forming a cap on a head 440. An integrated electronics interface box/attachment fixture 420 connects to a computer via cable 470 and attaches to a second flexible fiber optic curvature sensor device 430. The cranial mesh 400 is visualized on the CT scan of the head 440 and also by the graphic representation thereof. The two images are merged or superimposed so that a CT of the cranial mesh 400 is registered with the graphic representation of the cranial mesh 400. Thus, a graphic representation of a cranial mesh 400 is registered to a CT of the head 440. The graphic shape of a second flexible fiber optic curvature sensor device 430 is thereby registered to the CT of the head 440, and a surgical probe 460 is thus registered to the CT of the head 440. Spaces 418 between the flexible fiber optic curvature sensor devices 410 and filaments 415 permit room for a surgical probe 460 to be used in surgical operations on the head 440. Flexible fiber optic curvature sensor devices 410 are wired individually to an electronic interface box 420 so that one flexible fiber optic curvature sensor device 410 can be disconnected and moved if needed for positioning the surgical probe 460 without affecting registration.
  • FIG. 5 illustrates a pelvic mesh 500 of flexible fiber optic curvature sensors 410 held together with small connecting filaments 415, forming a pants-like garment on a pelvis region 540. An electronics interface box 420 connects to the computer via a cable 470 and has an attachment fixture 422 where a second flexible fiber optic curvature sensor device 430 is attached. The pelvic mesh 500 is visualized on the CT scan of the pelvis region 540 and also by the graphic representation thereof. The two images are merged or superimposed so that the CT of the pelvic mesh 500 is registered with the graphic representation of the pelvic mesh 500. Thus, a graphic representation of the pelvic mesh 500 is registered to the CT of the pelvic region 540 and thus, more specifically, to the bony pelvis. The graphic shape of the second flexible fiber optic curvature sensor device 430 is therefore registered to the CT of the pelvic region 540, and a surgical drill 560 is thus registered to the CT of the pelvic region 540. Spaces 418 between flexible fiber optic curvature sensor devices 410 and filaments 415 permit room for the surgical drill 560 to be used in surgical operations on the bony pelvis situated in the pelvic region 540. Flexible fiber optic curvature sensor devices 410 are wired individually to an electronic interface box 420 so that one flexible fiber optic curvature sensor device 410 can be disconnected and moved if needed for positioning the surgical drill 560 without affecting registration. The dorsal mesh 580 is elastic to provide a good fit onto the pelvic region 540. The pelvic mesh 500 has an open perineal section 590 for normal excretory functions.
  • Referring to FIG. 6, a flexible fiber optic curvature sensor 100 is applied with adhesive to the thigh 605. A fracture 610 of the femur 611 has been fixed with an intramedullary nail 612 without exposing the fracture site. A mobile fluoroscope 600 acquires two or more X-ray images 650 of holes 615 in the intramedullary nail 612. A computer 120 processes X-ray images 650 that include radioopaque markers 102 attached to the flexible fiber optic curvature sensor 100. An interface box 420 connects to the computer via a cable 470 and has an attachment fixture 422 with a second flexible fiber optic curvature sensor device 430 attached. A second interface box 660 attached to a drill 666 connects to the computer 120 with a second cable 670. The position of the second flexible fiber optic curvature sensor device 430 is more accurately determined as it is attached to the electronic interface box 420 at one end and to another electronic interface box 660 at the other end. Thus, the position of the drill 666 in relation to the holes 615 of the intramedullary nail 612 may be more accurately determined.
  • Referring to FIG. 7, an IGT/IGS system comprises the functional elements of a 3-D imaging system 710, a computer image processing system 720, a curvature sensor system 730, an image display device 740 and a user interface 750. The 3-D imaging system 710, which may be a CT or MRI imager, provides a volumetric image digitized data set 715 to the computer image processing system 720. The curvature sensor system 730 provides digitized information 735 on the 3-D position and orientation of the individual curvature sensors to the computer image processing system 720. The computer image processing system 720 correlates the image data set and the curvature sensor 3-D position information and provides a video output 745 to the image display device 740 that superimposes an image of the surgical instrument on the correlated volumetric image of the patient. Operator commands 755 are provided from the user interface 750 to the computer image processing system 720.
  • The operation and method of using an IGT/IGS embodiment of the present invention may be explained with reference to FIGS. 8A and 8B. Prior to the operation, the patient undergoes an imaging study (step 800) wherein a 3-D internal image set is taken of the portion of the patient's body that will be operated upon. In preparation for this imaging study, curvature sensor(s) (e.g. a curvature sensor garment), fiducials and/or an attachment fixture for the curvature sensor are applied to the patient, such as with adhesive so their positions on the body are recorded in the same imaging study. The imaging study data set is then processed (step 805) wherein the computer image processor locates the position of the attachment fixture with respect to the patient's anatomy, the fiducials and, if employed, the curvature sensor garment, and calculates their positions and orientations within the image data set. Next, (step 810) the attachment fixture is marked on the image data set.
  • As described herein, the IGT/IGS system may be employed with a curvature sensor (which incorporates fiducials) applied to the patient, or with only an array of fiducials applied to the patient with no curvature sensor on the patient. The operation of the system differs slightly as summarized in FIG. 8A.
  • If a curvature sensor is applied to the patient, the computer image processing system obtains the 3-D position information from the curvature sensor (step 820). Then the computer image processing system calculates the position of the attachment fixture relative to the fiducials (step 830) using the known relative positional information of the fiducials to the curvature sensor.
  • If a curvature sensor is not applied to the patient, the computer image processing system processes the image to detect and locate fiducials (step 825) based upon their shape, opacity, geometric position with respect to the attachment fixture (e.g. fiducials coupled in known locations in a garment coupled to the attachment fixture), or other image-recognizable property.
  • With the position of fiducials determined, the image data set is marked (step 835) to make them obvious to the user.
  • Referring to FIG. 8B, the physician may plan the operation using the imagery (step 840), wherein cues, templates, guide markers or other visual clues may be provided to the computer image processor for display during the operation.
  • During the operation, the computer image processor will obtain positional information from the curvature sensors (step 845), such as via electronic interface device(s), on a near-real time basis. The computer image processor then uses the curvature sensor information and the imaging study data set, and in particular the location of the attachment fixture, to compute (step 850) the position and orientation of the surgical instrument relative to the patient. Using this position and orientation information, the computer image processor generates (step 855) a near-real time image of the patient with an integrated display of position and orientation of the surgical instrument. As the instrument position is displayed, the computer also records (step 860) the position and orientation of the surgical instrument in a computer database. This stored position/orientation information permits post-operative reconstruction (step 865) of the operation for analysis and/or training.
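Steps 845 through 860 can be summarized as a simple acquisition loop. The sketch below is schematic only: the sensor read-out and the registration step are stand-ins (here, a plain positional offset added to a fixture position) for the fuller transform composition described above, and all names are hypothetical:

```python
import numpy as np

def tracking_loop(read_sensor, fixture_position, frames, log):
    """Schematic intraoperative loop: read the curvature sensor (step 845),
    compute the instrument position relative to the attachment fixture
    (step 850), and record each pose for post-operative reconstruction
    and training (steps 860/865)."""
    for frame in range(frames):
        tip_offset = read_sensor(frame)               # step 845: sensor data
        tip_in_image = fixture_position + tip_offset  # step 850: register to image frame
        log.append((frame, tuple(tip_in_image)))      # step 860: record pose
    return log

# A fake sensor whose reported tip offset drifts 1 unit per frame:
fake_sensor = lambda k: np.array([0.0, 0.0, 100.0 + k])
log = tracking_loop(fake_sensor, np.array([10.0, 0.0, 0.0]), 3, [])
# log now holds one (frame, position) record per acquisition for replay.
```

The recorded log is what enables the post-operative reconstruction of step 865 without any additional imaging.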
  • With a first curvature sensor affixed to the patient and a known reference point provided at the attachment fixture, this embodiment will enable establishing an attachment fixture-centered frame of reference. Since an attachment fixture-centered frame of reference may be divorced from the operating room coordinate system, this embodiment may be employed to accommodate movements of the patient or an extremity undergoing surgery without requiring extensive re-registration of the instrument or probe to the data set drawn from the image study. This capability would enable a surgeon to reposition a body or extremity during an operation if necessary without interrupting or delaying the procedure, or introducing positional errors into the IGT/IGS system. The direct connection between the patient and the instrument via a second curvature sensor also makes the system far more compact than video tracking systems.
  • An alternative embodiment IGT/IGS system uses a conventional optical tracking system to track the surgical instrument, but makes use of a curvature sensor, fiducials, and a computer workstation that synthesizes CT or MRI data into usable graphics to provide automatic registration of the object section of the body with the CT or MRI image data set. This embodiment allows the surgeon to guide instruments using the optical tracker, while alleviating several significant disadvantages of existing passive navigation systems for IGT/IGS systems.
  • In another embodiment of the present invention, a plurality of curvature sensor linear elements (e.g. encased optical fibers) are laid out in an array and held together with strong filaments to form a mesh or fabric of multiple curvature sensors. This embodiment is illustrated in FIGS. 4 and 5.
  • Referring to FIG. 5, the curvature sensors 410 are positioned in parallel to each other and coupled to cross-running filaments 415 to form a mesh. The cross-running filaments may be made of any of a number of materials, including any one or combination of plastic, metal wire, metal band, polymer, paper, cloth, nylon, rayon, segmented solid pieces of plastic, metal or wood, or similar formable material. The curvature sensors 410 are coupled to an electronic interface device 420 which sends curvature information to a computer (not shown) via cable 470.
  • The mesh or fabric may be shaped to conform to a targeted body part much like a garment, which may provide significant clinical advantages. For example, referring to FIG. 4, a cranial mesh garment 400 may be shaped like a cap to conform to a patient's head 440. In this embodiment, the curvature sensors 410 are aligned in parallel hoops or bands and held in place with filaments 415 configured in a radial pattern originating at the crown. As another example, the curvature sensor mesh may be configured as a pelvic garment 500 that is shaped much like a pair of bicycle shorts with a cutout 590 for the perineum, as illustrated in FIG. 5. The curvature sensors 410 are provided on the portion of the mesh that remains exposed while the patient is lying in bed, usually the anterior or ventral section. The dorsal or posterior portion may be an elastic band or mesh 580 to provide a better fit to the pelvic region 540. The electronic interface device 420 attachments for the plurality of curvature sensors 410 in the mesh may be at the edge of the mesh or may be part of the attachment fixture 422, as illustrated in FIG. 5; the same unit may also serve as the electronic interface for the second curvature sensor 430 that tracks a tool 560, as is also illustrated in FIG. 5. The garment may comprise an array of fiducials at regular points in the mesh and a rigidly affixed attachment point or attachment fixture for securing a dynamic frame of reference (DFOR) to a patient.
  • In another embodiment, a fiber optic curvature sensor is attached to the surface of the patient by an adhesive, as may be appropriate in neurosurgical cases. In another embodiment, metal pins or screws attach the fiber optic curvature sensor to bone. In either embodiment, fiducial markers detectable by X-ray or magnetic resonance imaging, such as fiducials made of titanium, may be incorporated into the curvature sensor at known dimensions from a reference point, such as a set distance apart along a flexible wire or tape fixed or attached to an attachment fixture/reference fiducial, or at a set distance along fibers in a fabric, bandage or garment that is fixed or attached to an attachment fixture/reference fiducial. A sensor or emitter appropriate to the waveform tracking device employed may be attached to the device. A three-dimensional data set of the object body section is then obtained. This data set is then transferred to the computer workstation to be used in a surgical procedure.
  • In one embodiment, registration is accomplished automatically. During surgery, the computer ascertains the shape of the fiber optic curvature sensor, which has been positioned on the patient, using the information passed by the electronic interface device. The digitized shape of the curvature sensor is then matched with the shape that is evident in the imaging data set, such as determined by the positions of the fiducials incorporated within the curvature sensor. Registration can then be determined automatically without the surgeon needing to use a probe or direct the use of a fluoroscope.
  • In an embodiment appropriate for cranial procedures, the curvature sensor, with integrated fiducials, is placed completely around the head in a position such that it will not interfere with the anticipated place for surgical entry. The curvature sensor and fiducials assembly may be adhered to the skin surface, making it very stable and secure, thus assuring that the built-in reference frame remains in the same position. In this embodiment, the curvature sensor and fiducials may be in the form of a garment shaped like a cap, or may be in the form of a tape or bandage that is wrapped about the head.
  • Another embodiment of this invention comprises the use of a curvature sensor to provide two-dimensional image tracking using a fluoroscope. A fluoroscope obtains images of the object body section that has attached the flexible fiber optic curvature sensor device comprising radioopaque fiducial markers, which may be at set distances along a wire or at set positions within a grid. Thus, the position of the flexible fiber optic curvature sensor device is determined in relation to the object body section. Tracking of surgical tools in relation to the object body section is then accomplished by using an optical tracking system with LEDs attached to the flexible fiber optic curvature sensor device. Alternatively, tracking may be accomplished by attaching the tools to a second flexible fiber optic curvature sensor device, eliminating the need for the reference frame and optical tracking system. In either embodiment, once the initial set of X-ray images has been obtained, further X-rays are not needed as the relative position of the surgical tool to the object body is now recorded.
  • The present invention includes two embodiments associated with two registration options. One embodiment comprises a curvature sensor and fiducials affixed to a dynamic frame of reference (DFOR) wrap or garment. The other embodiment comprises only fiducials affixed to the garment with no curvature sensor. In both embodiments, a curvature sensor may be connected to the DFOR wrap or garment at an attachment fixture whose position is accurately known with respect to the wrap or garment frame of reference. This attachment fixture provides a physical and positional connection between the attachment fixture, which has been preoperatively volumetrically-imaged with the garment frame of reference, and the instrument. The physical interconnect provided by the attachment fixture allows for the continuous tracking of the 6-degrees-of-freedom state of the instrument without the need for extraneous optical or articulated arm instrument tracking equipment in the operating room. The instrument's kinematic state can then be displayed on a monitor viewable by the surgeon as a computer-generated image embedded in the preoperatively obtained volumetric image. There may be more than one attachment fixture provided on a curvature sensor mesh or garment. The attachment fixture may comprise a latching mechanism to physically attach one or more curvature sensors to a known reference point on the attachment fixture, and one or more fiducials which may be imaged with CT and/or MRI systems in order to establish the known position of the attachment fixture in the imaged frame of reference.
  • Since the entire fiber optic curvature sensor or other fiducials affixed to the garment are imaged, the coordinates of each fiducial are known in real-time, intraoperatively. More importantly, the position of each fiducial is known as the patient moves, such as in the process of breathing and expanding the chest. Since each fiducial's internal coordinates are known during the motion based upon information provided by the curvature sensor, this information can be used to provide a natural approach to warping images in real-time. Thus, a further embodiment of the present invention comprises a fiber optic curvature sensor-enabled garment which can dynamically track the movements of the fiducials on a patient's moving body. These tracked fiducial points can then be used to dynamically warp the preoperative image to more realistically present an image to the therapist as the patient motion is happening.
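One simple stand-in for the dynamic warping described above is to interpolate a displacement field from the tracked fiducial motions; a practical system might use spline-based warps, but an inverse-distance-weighted sketch conveys the idea (all names hypothetical):

```python
import numpy as np

def warp_displacement(point, fiducials_ref, fiducials_now, eps=1e-9):
    """Estimate the displacement at `point` by inverse-distance weighting of
    the tracked fiducial displacements. A toy substitute for the image
    warping the disclosure describes, not a clinical algorithm."""
    disp = fiducials_now - fiducials_ref            # tracked fiducial motion
    d = np.linalg.norm(fiducials_ref - point, axis=1)
    if np.any(d < eps):                             # point sits on a fiducial
        return disp[np.argmin(d)]
    w = 1.0 / d**2
    return (w[:, None] * disp).sum(axis=0) / w.sum()

# Two fiducials on the chest, both displaced 2 units as the chest rises:
ref = np.array([[0.0, 0.0, 0.0], [10.0, 0.0, 0.0]])
now = ref + np.array([0.0, 2.0, 0.0])
# A voxel between the fiducials inherits the interpolated displacement:
shift = warp_displacement(np.array([5.0, 0.0, 0.0]), ref, now)
```

Applying such a displacement field to each displayed voxel would deform the preoperative image to follow the patient's motion in near-real time.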
  • As a non-invasive alternative to registration methods, fiducials and a fiber optic curvature sensor may be affixed to a patient's skin, either adhesively or embedded in a garment, bandage, tape or other structure, at the time of volumetric pre-operative imaging. This garment is then left affixed to the patient for the duration of the treatment or surgery. The fiducials in the garment provide the image frame of reference as well as the attachment fixture for attaching an instrument connection fixture, also herein referred to as an attachment fixture, to a known reference point. The embedded fiber optic curvature sensors provide the dynamic garment frame of reference, not only in the sense of being affixed to the patient, but also in the sense that it can track, in real-time, the location of the fiducials. The real-time, intraoperative location of the fiducials can be used to synchronize the acquisition of instrument tracking data with the preoperative images for improved accuracy in certain dynamic scenarios such as therapy in the chest area.
  • Since the fiber optic curvature sensor is included in the volumetric image data set, its position is known relative to the image and it comprises a set of distributed fiducial points. Since the fiber optic curvature sensor measures its own position, a rigid attachment point provided by the attachment fixture can be part of the curvature sensor device or garment and used for the rigid attachment of a second curvature sensor whose other end is attached to a surgical instrument.
  • The first step in a general framework frame of reference registration is the definition of a relation among the different coordinate systems or, as used herein, frames of reference. Current methods register preoperative images, fiducials, and instruments to a frame of reference affixed to the operating room. Assuming that the errors in establishing these relationships and their registration are independent, then the cumulative error, at least to a first approximation, is the root sum square of the individual errors in estimating these relationships. It is therefore clear that the methods and embodiments disclosed herein that eliminate or reduce the number of frame registration steps will increase the accuracy of instrument positioning. It is also clear that calibration techniques can outperform registration techniques in terms of accuracy.
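The root-sum-square accumulation of independent registration errors is easy to make concrete; the sketch below shows how eliminating one registration step reduces the cumulative error (the 1 mm per-step figure is purely illustrative):

```python
import math

def rss(errors):
    """Root sum square: first-order cumulative error of independent steps."""
    return math.sqrt(sum(e * e for e in errors))

# Three chained registrations, each contributing an assumed 1 mm error:
three_step = rss([1.0, 1.0, 1.0])   # about 1.73 mm cumulative
# Eliminating one registration step, as the disclosed embodiments do:
two_step = rss([1.0, 1.0])          # about 1.41 mm cumulative
```

Each registration step removed therefore shrinks the error budget by more than its own contribution would suggest at the margin, since the terms combine in quadrature.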
  • In addition to frame of reference registration problems, there is also a difficulty associated with the nonlinearities associated with CT, MRI, and other volumetric images. Originally, volumetric imaging was a visualization aid for surgeons and diagnosticians and nonlinearities in the image were relatively unimportant. When these same images are used for navigation, linearity becomes a significant issue. Such nonlinearities or distortion in computer aided surgical navigation with fluoroscopy have been recognized and correction methods have been developed. A second issue associated with the image itself is the spatial quantization of the image. Typically the image is digitally constructed from a series of slices or a helical scan of the patient. The fact that these individual slices have a finite thickness limits the number of samples taken on each fiducial. With a limited number of samples, the accurate estimation of the centroid of the fiducial can become problematic. For example, using a 5 mm sphere and 1 mm scan width yields only 5 spatial samples on the sphere, thereby limiting the accuracy that can be achieved in the image frame of reference (ImFOR) itself. In general, the methods of rigid-body transformations from one frame of reference to another are well known in the art. The difficulty is in accurately establishing the several frames of reference and their relations.
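The spatial-quantization point — a 5 mm sphere sampled by 1 mm slices yields only about 5 samples — can be illustrated with a toy centroid estimator that weights each slice's nominal z position by the sphere's cross-sectional area in that slice (an idealized model, not a scanner simulation):

```python
import numpy as np

def slice_centroid(center_z, radius=2.5, slice_width=1.0):
    """Estimate a sphere's z-centroid from finite-width axial slices, as a
    CT reconstruction effectively does: each slice contributes its
    cross-sectional area at the slice's nominal z position. With a 5 mm
    sphere and 1 mm slices only about 5 slices intersect the sphere."""
    zs = np.arange(np.floor(center_z - radius),
                   np.ceil(center_z + radius) + 1) * slice_width
    # Cross-sectional area is pi*(r^2 - dz^2); the constant pi cancels.
    areas = np.clip(radius**2 - (zs - center_z)**2, 0.0, None)
    return (zs * areas).sum() / areas.sum()

# The estimate depends on where the sphere sits relative to the slice grid:
errs = [abs(slice_centroid(10.0 + off) - (10.0 + off))
        for off in np.linspace(0.0, 1.0, 11)]
```

Even in this idealized model the centroid estimate wobbles with the sphere's sub-slice position; a real scan adds partial-volume and noise effects on top, which bounds the accuracy achievable in the image frame of reference.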
  • In an embodiment of the present invention that provides a dynamic patient-based frame of reference, these registration errors are significantly reduced by eliminating several registration procedures and exchanging them for one registration step and one calibration procedure. Furthermore, the calibration of the surgical instruments to the garment frame of reference can be done preoperatively, thus minimizing time in the operating room.
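The single remaining registration step, relating imaged fiducial positions to the garment frame of reference, is a standard point-based rigid registration problem. A minimal sketch using the well-known SVD-based (Kabsch) least-squares method follows; the fiducial coordinates and the 20-degree rotation are illustrative assumptions, not values from the disclosure:

```python
import numpy as np

def rigid_register(src, dst):
    """Least-squares rigid transform (R, t) mapping src points onto dst (Kabsch)."""
    src_c = src - src.mean(axis=0)
    dst_c = dst - dst.mean(axis=0)
    H = src_c.T @ dst_c
    U, _, Vt = np.linalg.svd(H)
    d = np.sign(np.linalg.det(Vt.T @ U.T))
    D = np.diag([1.0, 1.0, d])          # guard against reflections
    R = Vt.T @ D @ U.T
    t = dst.mean(axis=0) - R @ src.mean(axis=0)
    return R, t

# Hypothetical fiducial centroids located in the image frame (mm)
img = np.array([[0, 0, 0], [30, 0, 0], [0, 30, 0], [0, 0, 30]], float)

# The same fiducials expressed in the garment (patient-based) frame
theta = np.radians(20)
R_true = np.array([[np.cos(theta), -np.sin(theta), 0],
                   [np.sin(theta),  np.cos(theta), 0],
                   [0, 0, 1]])
garment = img @ R_true.T + np.array([5.0, -2.0, 12.0])

R, t = rigid_register(img, garment)
print(np.allclose(img @ R.T + t, garment))  # True
```

With the fiducials fixed to the garment, this registration can be computed automatically from the image, with no intraoperative manual point picking.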
  • The direct connection between the patient and the instrument provided by the second curvature sensor eliminates the need for intraoperative video tracking and its associated equipment. Precise positioning of surgical instruments relative to 3-D CT and MRI (volumetric) and 2-D fluoroscopic images is important for the delivery of anti-cancer drugs, localized ablation of tumors, biopsy, and execution of pre-planned surgery. In the embodiment providing surgical instrument navigation, diagnostic and treatment modalities can be performed more easily and more cost-effectively than current means allow. The ease of use of various embodiments of this invention will make it possible for therapists untrained in the details of its operation to precisely and repeatably place an instrument in particular positions at specified angles. This invention may significantly reduce patient morbidity, physician intraoperative difficulties, radiation exposure, and operating time, while at the same time improving the repeatability of instrument placement, thus improving the accuracy of therapies and deliveries of medication. This improved accuracy will increase the information obtained from each set of experiments because of the repeatability of the procedures on the same or different individuals. Data on instrument position can also be recorded and associated with each operation to determine which instrument position provides the best effect, as well as for training of other therapists once the procedure is approved. These benefits are in addition to the elimination of the need for cumbersome operating room real-time optical tracking systems.
  • The use of this invention could be extended to vertebral biopsy and aspiration of the hip, which now require an operating room environment with fluoroscopy. Using already obtained MRI data for herniated nucleus pulposus with radiculopathy, injection of corticosteroids around the nerve root could be facilitated. Other procedures requiring accurate needle placement would also be more readily performed.
  • Being less invasive and less cumbersome than current systems, this invention could be used in an out-patient setting, even in a physician's office, enabling precision procedures, such as percutaneous biopsy, to be done accurately and safely.
  • In an embodiment of the present invention, fiducials are positioned at known positions along a flexible fiber, such as a plastic or fabric ribbon, wire, metal band, or plastic or fabric tape, that originates at a known reference position, such as an attachment fixture, and can be taped to or wrapped around a patient. As used herein, the term “ribbon” refers to a long, narrow flexible structure capable of fixing at least one fiducial in a known position along its length and of being bent so as to conform to the contours of, or wrap about, a body. The ribbon may be made of any material that is flexible or semi-rigid enough to be laid on top of or wrapped about a patient, including one or a combination of plastic, metal wire, metal strip, fabric, rubber, synthetic rubber, nylon, thread, glass (including fiber optic glass), or paper (as may be suitable for a pre-sterilized, disposable fiducial wrap). This embodiment significantly facilitates and enhances the registration of CT or MRI images, since each fiducial lies at a known dimension from the next and is therefore easily located by a computer. This reduces the inaccuracies associated with locating fiducials by an intensity or threshold value determination, which often results in false or missed fiducials because bodily tissues may produce image features similar to those created by such fiducials.
This embodiment enables a method of locating fiducials for an IGT/IGS system, comprising the steps of: (a) placing an array of fiducials on the patient, each fiducial within the array being located at a known inter-fiducial dimension from the next; (b) identifying and locating a reference point on the array of fiducials, such as an attachment fixture; (c) inspecting the image one inter-fiducial length from the reference point and identifying a fiducial using an image recognition means, the identified fiducial becoming the last-identified fiducial; (d) inspecting the image one inter-fiducial length from the last-identified fiducial and identifying a next fiducial using an image recognition means, that fiducial then becoming the last-identified fiducial; and (e) repeating step (d) until all fiducials within the array of fiducials have been identified in the image.
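The stepwise search above can be sketched as a simple walk from the attachment fixture, accepting at each step only detections lying one inter-fiducial length from the last-identified fiducial. For simplicity the sketch assumes a straight ribbon (so the inter-fiducial arc length equals the straight-line distance); all coordinates and tolerances are hypothetical:

```python
import math

def walk_fiducials(fixture, candidates, spacing, tol, count):
    """Identify ribbon fiducials by walking known inter-fiducial distances.

    fixture:    (x, y, z) of the attachment fixture located in the image
    candidates: detections from an image-recognition step, some possibly false
    spacing:    known inter-fiducial distance (mm)
    tol:        acceptance tolerance (mm)
    count:      number of fiducials on the ribbon
    """
    identified = []
    last = fixture
    remaining = list(candidates)
    for _ in range(count):
        # Keep only candidates one inter-fiducial length from the last find.
        near = [c for c in remaining if abs(math.dist(c, last) - spacing) <= tol]
        if not near:
            break
        best = min(near, key=lambda c: abs(math.dist(c, last) - spacing))
        identified.append(best)
        remaining.remove(best)
        last = best
    return identified

# Hypothetical detections: four true fiducials 20 mm apart plus two false hits
fixture = (0.0, 0.0, 0.0)
true_fids = [(20.0, 0.0, 0.0), (40.0, 0.0, 0.0), (60.0, 0.0, 0.0), (80.0, 0.0, 0.0)]
false_hits = [(33.0, 5.0, 0.0), (71.0, -4.0, 0.0)]
found = walk_fiducials(fixture, true_fids + false_hits, 20.0, 2.0, 4)
print(found)  # the four true fiducials, recovered in ribbon order
```

The known spacing acts as a geometric filter: tissue artifacts that would pass a pure intensity threshold are rejected because they do not lie at the expected distance from the previously identified fiducial.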
  • In a further embodiment of this invention, the curvature sensor garment is coupled to a communications device, such as a cable to a computer with an internet connection, a radio or cellular telephone data link, or a satellite communication data link, so that the positional and curvature information provided by the curvature sensors is communicated to a remote location. Such a garment and communication device would enable remote surgery or therapies. Such a garment and communication device would also allow the dynamic monitoring of a patient, such as while freely walking. This embodiment could have several useful applications for remotely monitoring the movements of athletes in training, patients engaged in physical therapy exercises, soldiers or rescue workers operating in adverse environments, or other applications where the precise, real-time position of body parts and/or the tools they are using needs to be known.
  • Another embodiment comprising a communication device is a system for enabling remotely conducted or remotely monitored precision surgery. In this embodiment, a curvature sensor garment, mesh or fabric, with or without incorporated fiducials, is applied to a patient at a remote location, such as a battlefield medical facility. A data set of the injury is obtained using fluoroscopy or other means to create a digitized volumetric data set of the patient. A second curvature sensor is attached to the curvature sensor garment or fabric at an attachment fixture whose position is registered in the volumetric data set. The volumetric data set is communicated to another location, such as a hospital or a physician's office, where it is loaded on a computer. During surgery, the precise positional information on the patient's frame of reference, provided by the curvature sensor garment or fabric, and the precise location and orientation of a surgical tool, provided by the second curvature sensor, are communicated by the communication device to the distant location. In the distant location, a computer registers the volumetric data set with the patient's frame of reference and the position and orientation of the surgical tool, and displays the result on a computer monitor in the distant location. Using a verbal, video, telerobotic or other communication means, a physician at the distant location may then direct or observe the conduct of the remote surgery with greater confidence and precision than would be possible with only a video link between the two locations. This system may incorporate an IGT/IGS at the site of the remote surgery, but one is not necessary.
  • The present invention offers several significant advantages over the state of the art. With this invention there is no exposure of additional bone for recording anatomical landmarks, and no percutaneous attachment of fiducials to bones, both of which require surgery in addition to that required for the intended surgery or therapy. Intraoperative manual registration is not required because the instrument is directly connected to the patient's frame of reference by a curvature sensor which continually reports its position in 6-D space. The need for articulated mechanical arms or a frame containing multiple video cameras to reduce instrument blind spots is eliminated. The elimination of cumbersome tracking equipment reduces the sterilization problem to one of using disposable fiber optic cables which may attach between the patient's garment and the instrument. A small electronics interface box may be required as part of the curvature sensor; it can be easily draped since it is at one end of the curvature sensor. Since the position of the instrument is measured relative to a frame of reference affixed to the patient, patient movement ceases to be a problem. The physical interconnection between the patient and the instrument also reduces position estimation errors by replacing (intraoperative) registration steps with (preoperative) calibration.
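The chain from image, to garment attachment fixture, to curvature sensor tip, to tool point can be sketched as a composition of homogeneous transforms; the patient's own motion drops out because every link is measured relative to the garment. For readability the rotations are identities, and all translation values are illustrative assumptions:

```python
import numpy as np

def homogeneous(R, t):
    """Build a 4x4 homogeneous transform from rotation R and translation t."""
    T = np.eye(4)
    T[:3, :3] = R
    T[:3, 3] = t
    return T

# Hypothetical transforms (all values illustrative):
# image <- garment fixture: from the single automatic registration step
T_img_fix = homogeneous(np.eye(3), [100.0, 50.0, 0.0])
# fixture <- sensor tip: reported continuously by the second curvature sensor
T_fix_tip = homogeneous(np.eye(3), [0.0, 0.0, 150.0])
# sensor tip <- tool point: from the preoperative calibration
T_tip_tool = homogeneous(np.eye(3), [0.0, 0.0, 30.0])

# Tool point in image coordinates: compose the chain, then map the origin.
T_img_tool = T_img_fix @ T_fix_tip @ T_tip_tool
tool_in_image = T_img_tool @ np.array([0.0, 0.0, 0.0, 1.0])
print(tool_in_image[:3])  # tool point in the image frame: (100, 50, 180) mm
```

Only the middle link changes intraoperatively; the first comes from the one registration step and the last from the preoperative calibration, matching the error reduction argued above.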
  • Various embodiments of the present invention provide a device and method for an IGT/IGS system that is non-invasive, self-contained, passively-navigated, dynamically-referenced, and automatically image-registered, eliminating the need for the surgeon to do registration manually.
  • Numerous non-medical applications of the present invention are possible, including veterinary treatment/surgery systems, archeological research, explosive ordnance disposal, and other applications where an imaging study is used to precisely guide a tool or device and there is a need for precise image registration and tool tracking, or for reconstruction of the physical motion of a tool or body part after the fact. This position and tool tracking data could be stored on a storage device associated with a personal computer worn by a person.
  • While the invention has been described in detail and with reference to specific embodiments thereof, it will be apparent to one skilled in the art that various changes and modifications can be made therein without departing from the scope of the invention. Thus, the breadth and scope of the present invention should not be limited by any of the above-described exemplary embodiments, but should be defined only in accordance with the following claims and their equivalents.

Claims (23)

1-31. (canceled)
32. A device comprising:
a) a first non-invasive sensor configured to be placed external to an entity, the first non-invasive sensor providing first external location data, the first non-invasive sensor including at least two physically constrained imageable fiducials;
b) a first attachment fixture coupled to the first non-invasive sensor; and
c) a computer configured to receive the first external location data and relate the first external location data to:
i) the location of imageable fiducials; and
ii) a 3-D internal image set of the entity.
33. The device of claim 32, further comprising a second non-invasive sensor providing second external location data, the second non-invasive sensor having a first end and a second end; the first end configured to couple to the first attachment fixture.
34. The device of claim 33, further comprising a tool connector configured to be coupled to the second end.
35. The device of claim 33, further comprising a second attachment fixture configured to be coupled to the second end.
36. The device of claim 33, further comprising a tool connector configured to be coupled between the first end and the second end.
37. The device of claim 34, further comprising a mechanism for displaying the location of the tool connector with respect to the entity.
38. The device of claim 36, further comprising a mechanism for displaying the location of the tool connector with respect to the entity.
39. The device of claim 34, further comprising an optical tracking system electronically coupled to the computer and configured to track the location of the tool connector.
40. The device of claim 34, further comprising an optical tracking system electronically coupled to the computer and configured to track the location of the tool.
41. The device of claim 32, wherein the computer is configured to define a first attachment fixture centered frame of reference using the first external location data.
42. The device of claim 32, wherein the first non-invasive sensor comprises a fiber optic curvature sensor.
43. The device of claim 32, wherein the device is configured to be used during the performance of an operation on the entity.
44. The device of claim 43, wherein the operation is at least one of the following:
a) a surgery;
b) a therapeutic intervention; or
c) a combination of the above.
45. The device of claim 44, wherein the operation is at least one of the following:
a) image guided surgery;
b) image guided therapeutic intervention; or
c) a combination of the above.
46. The device of claim 32, wherein the entity is a patient.
47. The device of claim 32, wherein the first attachment fixture includes at least one imageable fiducial.
48. The device of claim 32, wherein imageable fiducials are configured to be detectable by a medical imaging system.
49. The device of claim 32, wherein the device is configured to define a patient based frame of reference.
50. The device of claim 32, wherein at least two of the imageable fiducials are at known inter-fiducial distances.
51. The device of claim 32, wherein the device is configured to perform an operation on the entity at a distance.
52. The device of claim 32, wherein the first non-invasive sensor is configured as a sensing mesh that includes at least one filament coupled to at least one imageable fiducial.
53. A sensing mesh according to claim 52, wherein the sensing mesh is configured as a garment.
US12/782,108 2000-01-04 2010-05-18 Apparatus for registering and tracking an instrument Abandoned US20110054303A1 (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
US12/782,108 US20110054303A1 (en) 2000-01-04 2010-05-18 Apparatus for registering and tracking an instrument

Applications Claiming Priority (4)

Application Number Priority Date Filing Date Title
US17434300P 2000-01-04 2000-01-04
US17907300P 2000-01-31 2000-01-31
US09/752,557 US7747312B2 (en) 2000-01-04 2001-01-03 System and method for automatic shape registration and instrument tracking
US12/782,108 US20110054303A1 (en) 2000-01-04 2010-05-18 Apparatus for registering and tracking an instrument

Related Parent Applications (1)

Application Number Title Priority Date Filing Date
US09/752,557 Continuation US7747312B2 (en) 2000-01-04 2001-01-03 System and method for automatic shape registration and instrument tracking

Publications (1)

Publication Number Publication Date
US20110054303A1 true US20110054303A1 (en) 2011-03-03

Family

ID=27390407

Family Applications (2)

Application Number Title Priority Date Filing Date
US09/752,557 Expired - Fee Related US7747312B2 (en) 2000-01-04 2001-01-03 System and method for automatic shape registration and instrument tracking
US12/782,108 Abandoned US20110054303A1 (en) 2000-01-04 2010-05-18 Apparatus for registering and tracking an instrument

Family Applications Before (1)

Application Number Title Priority Date Filing Date
US09/752,557 Expired - Fee Related US7747312B2 (en) 2000-01-04 2001-01-03 System and method for automatic shape registration and instrument tracking

Country Status (1)

Country Link
US (2) US7747312B2 (en)

Cited By (40)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20120114201A1 (en) * 2010-11-08 2012-05-10 Cranial Technologies, Inc. Method and apparatus for processing image representative data
US20120113116A1 (en) * 2010-11-08 2012-05-10 Cranial Technologies, Inc. Method and apparatus for preparing image representative data
WO2012168869A1 (en) * 2011-06-10 2012-12-13 Koninklijke Philips Electronics N.V. Medical imaging system with motion detection
WO2012168836A3 (en) * 2011-06-10 2013-01-31 Koninklijke Philips Electronics N.V. Dynamic constraining with optical shape sensing
WO2013057703A1 (en) * 2011-10-21 2013-04-25 Koninklijke Philips Electronics N.V. Body surface feedback for medical interventions
US8435033B2 (en) 2010-07-19 2013-05-07 Rainbow Medical Ltd. Dental navigation techniques
US20130116574A1 (en) * 2010-07-15 2013-05-09 Naviswiss Ag Method for ascertaining spatial coordinates
WO2013119801A3 (en) * 2012-02-07 2015-06-25 Joint Vue, LLC Three-dimensional guided injection device and methods
US20150193946A1 (en) * 2013-03-15 2015-07-09 Hansen Medical, Inc. System and methods for tracking robotically controlled medical instruments
US20180144501A1 (en) * 2015-03-31 2018-05-24 Consejo Superior De Investigaciones Cientificas (Csic) Device for extracting three-dimensional information from x-ray images of an object, method for calibrating said device, and method for generating said x-ray images
US10022192B1 (en) 2017-06-23 2018-07-17 Auris Health, Inc. Automatically-initialized robotic systems for navigation of luminal networks
US10064689B2 (en) 2014-03-17 2018-09-04 Intuitive Surgical Operations, Inc. System and method for aligning with a reference target
US10123755B2 (en) 2013-03-13 2018-11-13 Auris Health, Inc. Reducing incremental measurement sensor error
US10143360B2 (en) 2010-06-24 2018-12-04 Auris Health, Inc. Methods and devices for controlling a shapeable medical device
US10169875B2 (en) 2015-09-18 2019-01-01 Auris Health, Inc. Navigation of tubular networks
US10524866B2 (en) 2018-03-28 2020-01-07 Auris Health, Inc. Systems and methods for registration of location sensors
US10555778B2 (en) 2017-10-13 2020-02-11 Auris Health, Inc. Image-based branch detection and mapping for navigation
US10575756B2 (en) 2014-05-14 2020-03-03 Stryker European Holdings I, Llc Navigation system for and method of tracking the position of a work target
US10806535B2 (en) 2015-11-30 2020-10-20 Auris Health, Inc. Robot-assisted driving systems and methods
US10827913B2 (en) 2018-03-28 2020-11-10 Auris Health, Inc. Systems and methods for displaying estimated location of instrument
US10898286B2 (en) 2018-05-31 2021-01-26 Auris Health, Inc. Path-based navigation of tubular networks
US10898275B2 (en) 2018-05-31 2021-01-26 Auris Health, Inc. Image-based airway analysis and mapping
US10905499B2 (en) 2018-05-30 2021-02-02 Auris Health, Inc. Systems and methods for location sensor-based branch prediction
US11020016B2 (en) 2013-05-30 2021-06-01 Auris Health, Inc. System and method for displaying anatomy and devices on a movable display
US11058493B2 (en) 2017-10-13 2021-07-13 Auris Health, Inc. Robotic system configured for navigation path tracing
US11147633B2 (en) 2019-08-30 2021-10-19 Auris Health, Inc. Instrument image reliability systems and methods
US11160615B2 (en) 2017-12-18 2021-11-02 Auris Health, Inc. Methods and systems for instrument tracking and navigation within luminal networks
US11207141B2 (en) 2019-08-30 2021-12-28 Auris Health, Inc. Systems and methods for weight-based registration of location sensors
US11298195B2 (en) 2019-12-31 2022-04-12 Auris Health, Inc. Anatomical feature identification and targeting
US11324558B2 (en) 2019-09-03 2022-05-10 Auris Health, Inc. Electromagnetic distortion detection and compensation
US11395703B2 (en) 2017-06-28 2022-07-26 Auris Health, Inc. Electromagnetic distortion detection
US11426095B2 (en) 2013-03-15 2022-08-30 Auris Health, Inc. Flexible instrument localization from both remote and elongation sensors
US11490782B2 (en) 2017-03-31 2022-11-08 Auris Health, Inc. Robotic systems for navigation of luminal networks that compensate for physiological noise
US11504187B2 (en) 2013-03-15 2022-11-22 Auris Health, Inc. Systems and methods for localizing, tracking and/or controlling medical instruments
US11503986B2 (en) 2018-05-31 2022-11-22 Auris Health, Inc. Robotic systems and methods for navigation of luminal network that detect physiological noise
US11510736B2 (en) 2017-12-14 2022-11-29 Auris Health, Inc. System and method for estimating instrument location
US11602372B2 (en) 2019-12-31 2023-03-14 Auris Health, Inc. Alignment interfaces for percutaneous access
US11660147B2 (en) 2019-12-31 2023-05-30 Auris Health, Inc. Alignment techniques for percutaneous access
US11771309B2 (en) 2016-12-28 2023-10-03 Auris Health, Inc. Detecting endolumenal buckling of flexible instruments
US11832889B2 (en) 2017-06-28 2023-12-05 Auris Health, Inc. Electromagnetic field generator alignment

Families Citing this family (184)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US5803089A (en) * 1994-09-15 1998-09-08 Visualization Technology, Inc. Position tracking and imaging system for use in medical applications
US7366562B2 (en) 2003-10-17 2008-04-29 Medtronic Navigation, Inc. Method and apparatus for surgical navigation
US8644907B2 (en) 1999-10-28 2014-02-04 Medtronic Navigaton, Inc. Method and apparatus for surgical navigation
US8239001B2 (en) 2003-10-17 2012-08-07 Medtronic Navigation, Inc. Method and apparatus for surgical navigation
US6499488B1 (en) * 1999-10-28 2002-12-31 Winchester Development Associates Surgical sensor
US11331150B2 (en) 1999-10-28 2022-05-17 Medtronic Navigation, Inc. Method and apparatus for surgical navigation
US7555333B2 (en) 2000-06-19 2009-06-30 University Of Washington Integrated optical scanning image acquisition and display
US6837892B2 (en) * 2000-07-24 2005-01-04 Mazor Surgical Technologies Ltd. Miniature bone-mounted surgical robot
US8489176B1 (en) 2000-08-21 2013-07-16 Spectrum Dynamics Llc Radioactive emission detector equipped with a position tracking system and utilization thereof with medical systems and in medical procedures
US7826889B2 (en) * 2000-08-21 2010-11-02 Spectrum Dynamics Llc Radioactive emission detector equipped with a position tracking system and utilization thereof with medical systems and in medical procedures
US8909325B2 (en) 2000-08-21 2014-12-09 Biosensors International Group, Ltd. Radioactive emission detector equipped with a position tracking system and utilization thereof with medical systems and in medical procedures
US8565860B2 (en) 2000-08-21 2013-10-22 Biosensors International Group, Ltd. Radioactive emission detector equipped with a position tracking system
WO2004042546A1 (en) * 2002-11-04 2004-05-21 V-Target Technologies Ltd. Apparatus and methods for imaging and attenuation correction
US8036731B2 (en) 2001-01-22 2011-10-11 Spectrum Dynamics Llc Ingestible pill for diagnosing a gastrointestinal tract
US7548865B2 (en) * 2000-10-20 2009-06-16 Arthrex, Inc. Method of selling procedure specific allografts and associated instrumentation
IL157007A0 (en) 2001-01-22 2004-02-08 Target Technologies Ltd V Ingestible device
US7899681B2 (en) * 2002-03-29 2011-03-01 3M Innovative Properties Company Electronic management of sterilization process information
TW528593B (en) * 2002-05-17 2003-04-21 Jang-Min Yang Device for monitoring physiological status and method for using the device
US20040030237A1 (en) * 2002-07-29 2004-02-12 Lee David M. Fiducial marker devices and methods
US7787934B2 (en) * 2002-07-29 2010-08-31 Medtronic, Inc. Fiducial marker devices, tools, and methods
US7720522B2 (en) * 2003-02-25 2010-05-18 Medtronic, Inc. Fiducial marker devices, tools, and methods
EP2151215B1 (en) * 2002-08-09 2012-09-19 Kinamed, Inc. Non-imaging tracking tools for hip replacement surgery
JP4168282B2 (en) * 2002-09-06 2008-10-22 裕之 田井 A simple stereotaxic device and a band used to determine the location of the device on the patient's head
US7869861B2 (en) * 2002-10-25 2011-01-11 Howmedica Leibinger Inc. Flexible tracking article and method of using the same
US8355773B2 (en) * 2003-01-21 2013-01-15 Aesculap Ag Recording localization device tool positional parameters
US20040199072A1 (en) * 2003-04-01 2004-10-07 Stacy Sprouse Integrated electromagnetic navigation and patient positioning device
US7783336B2 (en) * 2003-06-06 2010-08-24 Ethicon Endo-Surgery, Inc. Subcutaneous biopsy cavity marker device
DE10340002B3 (en) * 2003-08-29 2005-04-14 Siemens Ag Positioning device for positioning a patient
US7862570B2 (en) 2003-10-03 2011-01-04 Smith & Nephew, Inc. Surgical positioners
US7840253B2 (en) 2003-10-17 2010-11-23 Medtronic Navigation, Inc. Method and apparatus for surgical navigation
US7764985B2 (en) 2003-10-20 2010-07-27 Smith & Nephew, Inc. Surgical navigation system component fault interfaces and related processes
WO2005048851A1 (en) 2003-11-14 2005-06-02 Smith & Nephew, Inc. Adjustable surgical cutting systems
US7901348B2 (en) * 2003-12-12 2011-03-08 University Of Washington Catheterscope 3D guidance and interface system
US7176466B2 (en) 2004-01-13 2007-02-13 Spectrum Dynamics Llc Multi-dimensional image reconstruction
WO2005118659A2 (en) 2004-06-01 2005-12-15 Spectrum Dynamics Llc Methods of view selection for radioactive emission measurements
US8586932B2 (en) 2004-11-09 2013-11-19 Spectrum Dynamics Llc System and method for radioactive emission measurement
US9470801B2 (en) 2004-01-13 2016-10-18 Spectrum Dynamics Llc Gating with anatomically varying durations
US9040016B2 (en) 2004-01-13 2015-05-26 Biosensors International Group, Ltd. Diagnostic kit and methods for radioimaging myocardial perfusion
WO2007010534A2 (en) 2005-07-19 2007-01-25 Spectrum Dynamics Llc Imaging protocols
US7968851B2 (en) 2004-01-13 2011-06-28 Spectrum Dynamics Llc Dynamic spect camera
US8571881B2 (en) 2004-11-09 2013-10-29 Spectrum Dynamics, Llc Radiopharmaceutical dispensing, administration, and imaging
CA2555473A1 (en) 2004-02-17 2005-09-01 Traxtal Technologies Inc. Method and apparatus for registration, verification, and referencing of internal organs
US8177702B2 (en) * 2004-04-15 2012-05-15 Neuronetics, Inc. Method and apparatus for determining the proximity of a TMS coil to a subject's head
WO2005104978A1 (en) 2004-04-21 2005-11-10 Smith & Nephew, Inc. Computer-aided methods, systems, and apparatuses for shoulder arthroplasty
EP1778957A4 (en) 2004-06-01 2015-12-23 Biosensors Int Group Ltd Radioactive-emission-measurement optimization to specific body structures
US20050277827A1 (en) * 2004-06-09 2005-12-15 Alexandre Carvalho Pleural fluid localization device and method of using
US20050288574A1 (en) * 2004-06-23 2005-12-29 Thornton Thomas M Wireless (disposable) fiducial based registration and EM distoration based surface registration
WO2006047400A2 (en) * 2004-10-25 2006-05-04 Eastern Virginia Medical School System, method and medium for simulating normal and abnormal medical conditions
US7722565B2 (en) 2004-11-05 2010-05-25 Traxtal, Inc. Access system
EP1827505A4 (en) 2004-11-09 2017-07-12 Biosensors International Group, Ltd. Radioimaging
US9943274B2 (en) 2004-11-09 2018-04-17 Spectrum Dynamics Medical Limited Radioimaging using low dose isotope
US8615405B2 (en) 2004-11-09 2013-12-24 Biosensors International Group, Ltd. Imaging system customization using data from radiopharmaceutical-associated data carrier
US9316743B2 (en) 2004-11-09 2016-04-19 Biosensors International Group, Ltd. System and method for radioactive emission measurement
US8000773B2 (en) 2004-11-09 2011-08-16 Spectrum Dynamics Llc Radioimaging
US7751868B2 (en) 2004-11-12 2010-07-06 Philips Electronics Ltd Integrated skin-mounted multifunction device for use in image-guided surgery
US7805269B2 (en) 2004-11-12 2010-09-28 Philips Electronics Ltd Device and method for ensuring the accuracy of a tracking device in a volume
WO2008059489A2 (en) 2006-11-13 2008-05-22 Spectrum Dynamics Llc Radioimaging applications of and novel formulations of teboroxime
DE102004058122A1 (en) * 2004-12-02 2006-07-13 Siemens Ag Medical image registration aid for landmarks by computerized and photon emission tomographies, comprises permeable radioactive substance is filled with the emission tomography as radiation permeable containers, a belt and patient body bowl
EP1844351A4 (en) 2005-01-13 2017-07-05 Biosensors International Group, Ltd. Multi-dimensional image reconstruction and analysis for expert-system diagnosis
US8611983B2 (en) 2005-01-18 2013-12-17 Philips Electronics Ltd Method and apparatus for guiding an instrument to a target in the lung
JP2008531091A (en) 2005-02-22 2008-08-14 スミス アンド ネフュー インコーポレーテッド In-line milling system
US7530948B2 (en) 2005-02-28 2009-05-12 University Of Washington Tethered capsule endoscope for Barrett's Esophagus screening
US20060241638A1 (en) * 2005-04-08 2006-10-26 Zimmer Technology, Inc. Anatomical landmark guide
EP1898775B1 (en) 2005-06-21 2013-02-13 Philips Electronics LTD System and apparatus for navigated therapy and diagnosis
CA2612603C (en) * 2005-06-21 2015-05-19 Traxtal Inc. Device and method for a trackable ultrasound
US20070001905A1 (en) * 2005-06-30 2007-01-04 Esa Eronen Detecting the position of X-ray detector
EP1908011B1 (en) 2005-07-19 2013-09-04 Spectrum Dynamics LLC Reconstruction stabilizer and active vision
US8837793B2 (en) 2005-07-19 2014-09-16 Biosensors International Group, Ltd. Reconstruction stabilizer and active vision
EP1952180B1 (en) * 2005-11-09 2017-01-04 Biosensors International Group, Ltd. Dynamic spect camera
US7498811B2 (en) * 2005-11-16 2009-03-03 Macfarlane Duncan L Apparatus and method for patient movement tracking
US7911207B2 (en) 2005-11-16 2011-03-22 Board Of Regents, The University Of Texas System Method for determining location and movement of a moving object
US7977942B2 (en) * 2005-11-16 2011-07-12 Board Of Regents, The University Of Texas System Apparatus and method for tracking movement of a target
US20070129626A1 (en) * 2005-11-23 2007-06-07 Prakash Mahesh Methods and systems for facilitating surgical procedures
WO2007067163A1 (en) 2005-11-23 2007-06-14 University Of Washington Scanning beam with variable sequential framing using interrupted scanning resonance
WO2007074466A2 (en) 2005-12-28 2007-07-05 Starhome Gmbh Late forwarding to local voicemail system of calls to roaming users
US20070156066A1 (en) * 2006-01-03 2007-07-05 Zimmer Technology, Inc. Device for determining the shape of an anatomic surface
JP2009528128A (en) 2006-03-03 2009-08-06 ユニヴァーシティ オブ ワシントン Multi-clad optical fiber scanner
EP1996108A4 (en) 2006-03-23 2017-06-21 Orthosoft, Inc. Method and system for tracking tools in computer-assisted surgery
US7515690B2 (en) * 2006-05-05 2009-04-07 Mackey J Kevin Radiological scanning orientation indicator
US8894974B2 (en) 2006-05-11 2014-11-25 Spectrum Dynamics Llc Radiopharmaceuticals for diagnosis and therapy
EP2023812B1 (en) 2006-05-19 2016-01-27 The Queen's Medical Center Motion tracking system for real time adaptive imaging and spectroscopy
DE502006003187D1 (en) * 2006-05-31 2009-04-30 Brainlab Ag Registration by means of radiation marking elements
US8560047B2 (en) 2006-06-16 2013-10-15 Board Of Regents Of The University Of Nebraska Method and apparatus for computer aided surgery
US8152726B2 (en) * 2006-07-21 2012-04-10 Orthosoft Inc. Non-invasive tracking of bones for surgery
US9451928B2 (en) * 2006-09-13 2016-09-27 Elekta Ltd. Incorporating internal anatomy in clinical radiotherapy setups
US20080086051A1 (en) * 2006-09-20 2008-04-10 Ethicon Endo-Surgery, Inc. System, storage medium for a computer program, and method for displaying medical images
US9275451B2 (en) 2006-12-20 2016-03-01 Biosensors International Group, Ltd. Method, a system, and an apparatus for using and processing multidimensional data
US20080319307A1 (en) * 2007-06-19 2008-12-25 Ethicon Endo-Surgery, Inc. Method for medical imaging using fluorescent nanoparticles
US8155728B2 (en) * 2007-08-22 2012-04-10 Ethicon Endo-Surgery, Inc. Medical system, method, and storage medium concerning a natural orifice transluminal medical procedure
US8457718B2 (en) * 2007-03-21 2013-06-04 Ethicon Endo-Surgery, Inc. Recognizing a real world fiducial in a patient image data
US20080221388A1 (en) * 2007-03-09 2008-09-11 University Of Washington Side viewing optical fiber endoscope
US20080221434A1 (en) * 2007-03-09 2008-09-11 Voegele James W Displaying an internal image of a body lumen of a patient
US20080234544A1 (en) * 2007-03-20 2008-09-25 Ethicon Endo-Surgery, Inc. Displaying images interior and exterior to a body lumen of a patient
US8081810B2 (en) * 2007-03-22 2011-12-20 Ethicon Endo-Surgery, Inc. Recognizing a real world fiducial in image data of a patient
US8840566B2 (en) 2007-04-02 2014-09-23 University Of Washington Catheter with imaging capability acts as guidewire for cannula tools
US8108025B2 (en) * 2007-04-24 2012-01-31 Medtronic, Inc. Flexible array for use in navigated surgery
US8301226B2 (en) 2007-04-24 2012-10-30 Medtronic, Inc. Method and apparatus for performing a navigated procedure
EP2142130B1 (en) * 2007-04-24 2010-08-11 Medtronic, Inc. Flexible array for use in navigated surgery
US20090012509A1 (en) * 2007-04-24 2009-01-08 Medtronic, Inc. Navigated Soft Tissue Penetrating Laser System
US8311611B2 (en) * 2007-04-24 2012-11-13 Medtronic, Inc. Method for performing multiple registrations in a navigated procedure
US8734466B2 (en) 2007-04-25 2014-05-27 Medtronic, Inc. Method and apparatus for controlled insertion and withdrawal of electrodes
US9289270B2 (en) * 2007-04-24 2016-03-22 Medtronic, Inc. Method and apparatus for performing a navigated procedure
US8320995B2 (en) * 2007-04-26 2012-11-27 Schwamb Jr John P Fiducial marker with rings
US7952718B2 (en) 2007-05-03 2011-05-31 University Of Washington High resolution optical coherence tomography based imaging for intraluminal and interstitial use implemented with a reduced form factor
US9532848B2 (en) * 2007-06-15 2017-01-03 Orthosoft, Inc. Computer-assisted surgery system and method
US8249317B2 (en) * 2007-07-20 2012-08-21 Elekta Ltd. Methods and systems for compensating for changes in anatomy of radiotherapy patients
WO2009012576A1 (en) * 2007-07-20 2009-01-29 Resonant Medical Inc. Methods and systems for guiding the acquisition of ultrasound images
US8135198B2 (en) * 2007-08-08 2012-03-13 Resonant Medical, Inc. Systems and methods for constructing images
AU2008308686B2 (en) 2007-10-02 2015-01-22 Labrador Diagnostics Llc Modular point-of-care devices and uses thereof
CA2606267A1 (en) * 2007-10-11 2009-04-11 Hydro-Quebec System and method for three-dimensional mapping of a structural surface
US8521253B2 (en) 2007-10-29 2013-08-27 Spectrum Dynamics Llc Prostate imaging
CA2706728A1 (en) * 2007-11-26 2009-06-04 Ecole De Technologie Superieure Harness system for kinematic analysis of the knee
TW200930850A (en) * 2008-01-03 2009-07-16 Green Energy Technology Inc Cooling structure for body of crystal growth furnace
US8390291B2 (en) * 2008-05-19 2013-03-05 The Board Of Regents, The University Of Texas System Apparatus and method for tracking movement of a target
US8189738B2 (en) * 2008-06-02 2012-05-29 Elekta Ltd. Methods and systems for guiding clinical radiotherapy setups
EP2153794B1 (en) * 2008-08-15 2016-11-09 Stryker European Holdings I, LLC System for and method of visualizing an interior of a body
WO2010063117A1 (en) 2008-12-02 2010-06-10 Andre Novomir Hladio Method and system for aligning a prosthesis during surgery using active sensors
US8366719B2 (en) 2009-03-18 2013-02-05 Integrated Spinal Concepts, Inc. Image-guided minimal-step placement of screw into bone
US10542962B2 (en) 2009-07-10 2020-01-28 Elekta Ltd. Adaptive radiotherapy treatment using ultrasound
US8338788B2 (en) 2009-07-29 2012-12-25 Spectrum Dynamics Llc Method and system of optimized volumetric imaging
BR112012008497A2 (en) * 2009-10-15 2019-09-24 Koninklijke Philips Electronics N.V. Apparatus and method for measuring a child's cranial deformity
US20110172526A1 (en) 2010-01-12 2011-07-14 Martin Lachaine Feature Tracking Using Ultrasound
US9248316B2 (en) 2010-01-12 2016-02-02 Elekta Ltd. Feature tracking using ultrasound
US20140179981A1 (en) * 2010-11-01 2014-06-26 Neuronix Ltd. Method and system for positioning a transcranial magnetic stimulation (tms) device
CN103402450A (en) 2010-12-17 2013-11-20 阿韦尼尔医药公司 Method and system for aligning a prosthesis during surgery
AR085087A1 (en) 2011-01-21 2013-09-11 Theranos Inc SYSTEMS AND METHODS TO MAXIMIZE THE USE OF SAMPLES
EP2518436B1 (en) * 2011-04-28 2015-06-17 Storz Endoskop Produktions GmbH Bend sensor
US9498231B2 (en) 2011-06-27 2016-11-22 Board Of Regents Of The University Of Nebraska On-board tool tracking system and methods of computer assisted surgery
US11911117B2 (en) 2011-06-27 2024-02-27 Board Of Regents Of The University Of Nebraska On-board tool tracking system and methods of computer assisted surgery
CN106913366B (en) 2011-06-27 2021-02-26 内布拉斯加大学评议会 On-tool tracking system and computer-assisted surgery method
US9606209B2 (en) 2011-08-26 2017-03-28 Kineticor, Inc. Methods, systems, and devices for intra-scan motion correction
US9664702B2 (en) 2011-09-25 2017-05-30 Theranos, Inc. Fluid handling apparatus and configurations
US9268915B2 (en) * 2011-09-25 2016-02-23 Theranos, Inc. Systems and methods for diagnosis or treatment
US9619627B2 (en) 2011-09-25 2017-04-11 Theranos, Inc. Systems and methods for collecting and transmitting assay results
US20140170735A1 (en) 2011-09-25 2014-06-19 Elizabeth A. Holmes Systems and methods for multi-analysis
US9632102B2 (en) 2011-09-25 2017-04-25 Theranos, Inc. Systems and methods for multi-purpose analysis
US8435738B2 (en) 2011-09-25 2013-05-07 Theranos, Inc. Systems and methods for multi-analysis
US8840838B2 (en) 2011-09-25 2014-09-23 Theranos, Inc. Centrifuge configurations
US8475739B2 (en) 2011-09-25 2013-07-02 Theranos, Inc. Systems and methods for fluid handling
US9810704B2 (en) 2013-02-18 2017-11-07 Theranos, Inc. Systems and methods for multi-analysis
US9250229B2 (en) 2011-09-25 2016-02-02 Theranos, Inc. Systems and methods for multi-analysis
US10012664B2 (en) 2011-09-25 2018-07-03 Theranos Ip Company, Llc Systems and methods for fluid and component handling
WO2013116140A1 (en) 2012-02-03 2013-08-08 Intuitive Surgical Operations, Inc. Steerable flexible needle with embedded shape sensing
US9314188B2 (en) 2012-04-12 2016-04-19 Intellijoint Surgical Inc. Computer-assisted joint replacement surgery and navigation systems
US10561861B2 (en) * 2012-05-02 2020-02-18 Viewray Technologies, Inc. Videographic display of real-time medical treatment
US9483122B2 (en) 2012-05-10 2016-11-01 Koninklijke Philips N.V. Optical shape sensing device and gesture control
CN104334085A (en) * 2012-05-24 2015-02-04 皇家飞利浦有限公司 Image generation apparatus
US9008757B2 (en) 2012-09-26 2015-04-14 Stryker Corporation Navigation system including optical and non-optical sensors
JP6246213B2 (en) 2012-10-01 2017-12-13 コーニンクレッカ フィリップス エヌ ヴェKoninklijke Philips N.V. Alignment system, method and computer program
US10327708B2 (en) 2013-01-24 2019-06-25 Kineticor, Inc. Systems, devices, and methods for tracking and compensating for patient motion during a medical imaging scan
US9305365B2 (en) 2013-01-24 2016-04-05 Kineticor, Inc. Systems, devices, and methods for tracking moving targets
CN105392423B (en) 2013-02-01 2018-08-17 凯内蒂科尔股份有限公司 The motion tracking system of real-time adaptive motion compensation in biomedical imaging
KR20140108047A (en) * 2013-02-28 2014-09-05 삼성전자주식회사 Method for tracking a moving object and a controlling apparatus capable of tracking a moving object
US9168002B2 (en) * 2013-03-14 2015-10-27 Malecare, Inc. Device and method for measuring radiation exposure
US9247998B2 (en) 2013-03-15 2016-02-02 Intellijoint Surgical Inc. System and method for intra-operative leg position measurement
US10105149B2 (en) 2013-03-15 2018-10-23 Board Of Regents Of The University Of Nebraska On-board tool tracking system and methods of computer assisted surgery
CN105050525B (en) * 2013-03-15 2018-07-31 直观外科手术操作公司 Shape sensor system and application method for tracking intervention apparatus
WO2015010859A1 (en) * 2013-07-23 2015-01-29 Koninklijke Philips N.V. Registration system for registering an imaging device with a tracking device
IL229527A (en) * 2013-11-21 2015-06-30 Elbit Systems Ltd Medical wide field of view optical tracking system
US10292772B2 (en) * 2014-01-31 2019-05-21 Edda Technology, Inc. Method and system for determining optimal timing for surgical instrument insertion in image-guided surgical procedures
US10524723B2 (en) * 2014-07-23 2020-01-07 Alphatec Spine, Inc. Method for measuring the displacements of a vertebral column
EP3188660A4 (en) 2014-07-23 2018-05-16 Kineticor, Inc. Systems, devices, and methods for tracking and compensating for patient motion during a medical imaging scan
EP3188662A4 (en) * 2014-09-05 2018-05-23 Procept Biorobotics Corporation Physician controlled tissue resection integrated with treatment mapping of target organ images
CN106793977A (en) * 2014-10-17 2017-05-31 安德曼有限公司 Improvement to position feedback device
WO2016114834A2 (en) * 2014-10-22 2016-07-21 Think Surgical, Inc. Actively controlled optical tracker with a robot
IL236003A (en) 2014-11-30 2016-02-29 Ben-Yishai Rani Model registration system and method
US10660711B2 (en) 2015-02-25 2020-05-26 Mako Surgical Corp. Navigation systems and methods for reducing tracking interruptions during a surgical procedure
US10631793B1 (en) * 2015-04-14 2020-04-28 Eric Levell Luster Impact indicator
US9943247B2 (en) 2015-07-28 2018-04-17 The University Of Hawai'i Systems, devices, and methods for detecting false movements for motion correction during a medical imaging scan
US20180279916A1 (en) * 2015-10-02 2018-10-04 Mas Innovation (Pvt) Limited System and Method for Monitoring the Running Technique of a User
US10716515B2 (en) 2015-11-23 2020-07-21 Kineticor, Inc. Systems, devices, and methods for tracking and compensating for patient motion during a medical imaging scan
US10667868B2 (en) 2015-12-31 2020-06-02 Stryker Corporation System and methods for performing surgery on a patient at a target site defined by a virtual object
CN109219415B (en) * 2016-02-23 2021-10-26 桑尼布鲁克研究所 Patient-specific headphones for diagnosis and treatment of transcranial procedures
US10271851B2 (en) * 2016-04-01 2019-04-30 Ethicon Llc Modular surgical stapling system comprising a display
EP3448257A4 (en) * 2016-04-26 2019-12-04 Kineticor, Inc. Systems, devices, and methods for tracking and compensating for patient motion during a medical imaging scan
US20180028159A1 (en) * 2016-07-29 2018-02-01 Butterfly Network, Inc. Rearward acoustic diffusion for ultrasound-on-a-chip transducer array
CH712867A1 (en) * 2016-08-30 2018-03-15 Medivation Ag Portable immobilization and fixation device with calibration unit for x-ray stereomicrographs.
WO2018185729A1 (en) 2017-04-07 2018-10-11 Orthosoft Inc. Non-invasive system and method for tracking bones
EP3621545B1 (en) 2017-05-10 2024-02-21 MAKO Surgical Corp. Robotic spine surgery system
US10292774B2 (en) 2017-07-28 2019-05-21 Zimmer, Inc. Bone and tool tracking with optical waveguide modeling system in computer-assisted surgery using patient-attached multicore optical fiber
US11612345B2 (en) * 2018-03-15 2023-03-28 Ricoh Company, Ltd. Input device, measurement system, and computer-readable medium
CA3053904A1 (en) 2018-08-31 2020-02-29 Orthosoft Inc. System and method for tracking bones
US11576729B2 (en) * 2019-06-17 2023-02-14 Koninklijke Philips N.V. Cranial surgery using optical shape sensing
CN111329587A (en) * 2020-02-19 2020-06-26 上海理工大学 Surgical registration system using shape sensing fiber optic mesh

Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US5873822A (en) * 1994-09-15 1999-02-23 Visualization Technology, Inc. Automatic registration system for use with position tracking and imaging system for use in medical applications
US6127672A (en) * 1997-05-23 2000-10-03 Canadian Space Agency Topological and motion measuring tool
US6314310B1 (en) * 1997-02-14 2001-11-06 Biosense, Inc. X-ray guided surgical location system with extended mapping volume
US20020156363A1 (en) * 1999-10-28 2002-10-24 Hunter Mark W. Registration of human anatomy integrated for electromagnetic localization

Family Cites Families (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US6611700B1 (en) * 1999-12-30 2003-08-26 Brainlab Ag Method and apparatus for positioning a body for radiation using a position sensor

Cited By (83)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US10143360B2 (en) 2010-06-24 2018-12-04 Auris Health, Inc. Methods and devices for controlling a shapeable medical device
US11051681B2 (en) 2010-06-24 2021-07-06 Auris Health, Inc. Methods and devices for controlling a shapeable medical device
US11857156B2 (en) 2010-06-24 2024-01-02 Auris Health, Inc. Methods and devices for controlling a shapeable medical device
US20130116574A1 (en) * 2010-07-15 2013-05-09 Naviswiss Ag Method for ascertaining spatial coordinates
US8435033B2 (en) 2010-07-19 2013-05-07 Rainbow Medical Ltd. Dental navigation techniques
US8494237B2 (en) * 2010-11-08 2013-07-23 Cranial Technologies, Inc Method and apparatus for processing digital image representations of a head shape
US20120113116A1 (en) * 2010-11-08 2012-05-10 Cranial Technologies, Inc. Method and apparatus for preparing image representative data
US20120114201A1 (en) * 2010-11-08 2012-05-10 Cranial Technologies, Inc. Method and apparatus for processing image representative data
US8442288B2 (en) * 2010-11-08 2013-05-14 Cranial Technologies, Inc. Method and apparatus for processing three-dimensional digital mesh image representative data of three-dimensional subjects
US20140088377A1 (en) * 2011-06-10 2014-03-27 Koninklijke Philips N.V. Dynamic constraining with optical shape sensing
CN103596494A (en) * 2011-06-10 2014-02-19 皇家飞利浦有限公司 Medical imaging system with motion detection
CN103607949A (en) * 2011-06-10 2014-02-26 皇家飞利浦有限公司 Dynamic constraining with optical shape sensing
WO2012168836A3 (en) * 2011-06-10 2013-01-31 Koninklijke Philips Electronics N.V. Dynamic constraining with optical shape sensing
JP2014525764A (en) * 2011-06-10 2014-10-02 コーニンクレッカ フィリップス エヌ ヴェ Dynamic constraints associated with optical shape sensing
WO2012168869A1 (en) * 2011-06-10 2012-12-13 Koninklijke Philips Electronics N.V. Medical imaging system with motion detection
EP2578148A1 (en) * 2011-10-04 2013-04-10 Koninklijke Philips Electronics N.V. Medical imaging system with motion detection
WO2013057703A1 (en) * 2011-10-21 2013-04-25 Koninklijke Philips Electronics N.V. Body surface feedback for medical interventions
JP2014534848A (en) * 2011-10-21 2014-12-25 コーニンクレッカ フィリップス エヌ ヴェ Body surface feedback for medical intervention
CN103889259A (en) * 2011-10-21 2014-06-25 皇家飞利浦有限公司 Body surface feedback for medical interventions
WO2013119801A3 (en) * 2012-02-07 2015-06-25 Joint Vue, LLC Three-dimensional guided injection device and methods
EP2812050A4 (en) * 2012-02-07 2016-07-20 Joint Vue Llc Three-dimensional guided injection device and methods
US10492741B2 (en) 2013-03-13 2019-12-03 Auris Health, Inc. Reducing incremental measurement sensor error
US10123755B2 (en) 2013-03-13 2018-11-13 Auris Health, Inc. Reducing incremental measurement sensor error
US11241203B2 (en) 2013-03-13 2022-02-08 Auris Health, Inc. Reducing measurement sensor error
US10531864B2 (en) 2013-03-15 2020-01-14 Auris Health, Inc. System and methods for tracking robotically controlled medical instruments
US20150193946A1 (en) * 2013-03-15 2015-07-09 Hansen Medical, Inc. System and methods for tracking robotically controlled medical instruments
US10130345B2 (en) * 2013-03-15 2018-11-20 Auris Health, Inc. System and methods for tracking robotically controlled medical instruments
US11129602B2 (en) 2013-03-15 2021-09-28 Auris Health, Inc. Systems and methods for tracking robotically controlled medical instruments
US9710921B2 (en) * 2013-03-15 2017-07-18 Hansen Medical, Inc. System and methods for tracking robotically controlled medical instruments
US11504187B2 (en) 2013-03-15 2022-11-22 Auris Health, Inc. Systems and methods for localizing, tracking and/or controlling medical instruments
US11426095B2 (en) 2013-03-15 2022-08-30 Auris Health, Inc. Flexible instrument localization from both remote and elongation sensors
US11020016B2 (en) 2013-05-30 2021-06-01 Auris Health, Inc. System and method for displaying anatomy and devices on a movable display
US11707337B2 (en) 2014-03-17 2023-07-25 Intuitive Surgical Operations, Inc. System and method for maintaining a tool position and orientation
US10575910B2 (en) 2014-03-17 2020-03-03 Intuitive Surgical Operations, Inc. System and method for maintaining a tool position and orientation
US10610316B2 (en) 2014-03-17 2020-04-07 Intuitive Surgical Operations, Inc. System and method for aligning with a reference target
US10064689B2 (en) 2014-03-17 2018-09-04 Intuitive Surgical Operations, Inc. System and method for aligning with a reference target
US11129684B2 (en) 2014-03-17 2021-09-28 Intuitive Surgical Operations, Inc. System and method for maintaining a tool position and orientation
US10070931B2 (en) * 2014-03-17 2018-09-11 Intuitive Surgical Operations, Inc. System and method for maintaining a tool pose
US11540742B2 (en) 2014-05-14 2023-01-03 Stryker European Operations Holdings Llc Navigation system for and method of tracking the position of a work target
US10575756B2 (en) 2014-05-14 2020-03-03 Stryker European Holdings I, Llc Navigation system for and method of tracking the position of a work target
US10672145B2 (en) * 2015-03-31 2020-06-02 Consejo Superior De Investigaciones Cientificas (Csic) Device for extracting three-dimensional information from X-ray images of an object, method for calibrating said device, and method for generating said X-ray images
US20180144501A1 (en) * 2015-03-31 2018-05-24 Consejo Superior De Investigaciones Cientificas (Csic) Device for extracting three-dimensional information from x-ray images of an object, method for calibrating said device, and method for generating said x-ray images
US11403759B2 (en) 2015-09-18 2022-08-02 Auris Health, Inc. Navigation of tubular networks
US10796432B2 (en) 2015-09-18 2020-10-06 Auris Health, Inc. Navigation of tubular networks
US10169875B2 (en) 2015-09-18 2019-01-01 Auris Health, Inc. Navigation of tubular networks
US10482599B2 (en) 2015-09-18 2019-11-19 Auris Health, Inc. Navigation of tubular networks
US11464591B2 (en) 2015-11-30 2022-10-11 Auris Health, Inc. Robot-assisted driving systems and methods
US10813711B2 (en) 2015-11-30 2020-10-27 Auris Health, Inc. Robot-assisted driving systems and methods
US10806535B2 (en) 2015-11-30 2020-10-20 Auris Health, Inc. Robot-assisted driving systems and methods
US11771309B2 (en) 2016-12-28 2023-10-03 Auris Health, Inc. Detecting endolumenal buckling of flexible instruments
US11490782B2 (en) 2017-03-31 2022-11-08 Auris Health, Inc. Robotic systems for navigation of luminal networks that compensate for physiological noise
US11759266B2 (en) 2017-06-23 2023-09-19 Auris Health, Inc. Robotic systems for determining a roll of a medical device in luminal networks
US10022192B1 (en) 2017-06-23 2018-07-17 Auris Health, Inc. Automatically-initialized robotic systems for navigation of luminal networks
US11278357B2 (en) 2017-06-23 2022-03-22 Auris Health, Inc. Robotic systems for determining an angular degree of freedom of a medical device in luminal networks
US10159532B1 (en) 2017-06-23 2018-12-25 Auris Health, Inc. Robotic systems for determining a roll of a medical device in luminal networks
US11832889B2 (en) 2017-06-28 2023-12-05 Auris Health, Inc. Electromagnetic field generator alignment
US11395703B2 (en) 2017-06-28 2022-07-26 Auris Health, Inc. Electromagnetic distortion detection
US11850008B2 (en) 2017-10-13 2023-12-26 Auris Health, Inc. Image-based branch detection and mapping for navigation
US10555778B2 (en) 2017-10-13 2020-02-11 Auris Health, Inc. Image-based branch detection and mapping for navigation
US11058493B2 (en) 2017-10-13 2021-07-13 Auris Health, Inc. Robotic system configured for navigation path tracing
US11510736B2 (en) 2017-12-14 2022-11-29 Auris Health, Inc. System and method for estimating instrument location
US11160615B2 (en) 2017-12-18 2021-11-02 Auris Health, Inc. Methods and systems for instrument tracking and navigation within luminal networks
US10524866B2 (en) 2018-03-28 2020-01-07 Auris Health, Inc. Systems and methods for registration of location sensors
US10898277B2 (en) 2018-03-28 2021-01-26 Auris Health, Inc. Systems and methods for registration of location sensors
US11712173B2 (en) 2018-03-28 2023-08-01 Auris Health, Inc. Systems and methods for displaying estimated location of instrument
US10827913B2 (en) 2018-03-28 2020-11-10 Auris Health, Inc. Systems and methods for displaying estimated location of instrument
US11950898B2 (en) 2018-03-28 2024-04-09 Auris Health, Inc. Systems and methods for displaying estimated location of instrument
US11576730B2 (en) 2018-03-28 2023-02-14 Auris Health, Inc. Systems and methods for registration of location sensors
US10905499B2 (en) 2018-05-30 2021-02-02 Auris Health, Inc. Systems and methods for location sensor-based branch prediction
US11793580B2 (en) 2018-05-30 2023-10-24 Auris Health, Inc. Systems and methods for location sensor-based branch prediction
US11759090B2 (en) 2018-05-31 2023-09-19 Auris Health, Inc. Image-based airway analysis and mapping
US11503986B2 (en) 2018-05-31 2022-11-22 Auris Health, Inc. Robotic systems and methods for navigation of luminal network that detect physiological noise
US10898275B2 (en) 2018-05-31 2021-01-26 Auris Health, Inc. Image-based airway analysis and mapping
US10898286B2 (en) 2018-05-31 2021-01-26 Auris Health, Inc. Path-based navigation of tubular networks
US11864850B2 (en) 2018-05-31 2024-01-09 Auris Health, Inc. Path-based navigation of tubular networks
US11147633B2 (en) 2019-08-30 2021-10-19 Auris Health, Inc. Instrument image reliability systems and methods
US11944422B2 (en) 2019-08-30 2024-04-02 Auris Health, Inc. Image reliability determination for instrument localization
US11207141B2 (en) 2019-08-30 2021-12-28 Auris Health, Inc. Systems and methods for weight-based registration of location sensors
US11324558B2 (en) 2019-09-03 2022-05-10 Auris Health, Inc. Electromagnetic distortion detection and compensation
US11864848B2 (en) 2019-09-03 2024-01-09 Auris Health, Inc. Electromagnetic distortion detection and compensation
US11660147B2 (en) 2019-12-31 2023-05-30 Auris Health, Inc. Alignment techniques for percutaneous access
US11602372B2 (en) 2019-12-31 2023-03-14 Auris Health, Inc. Alignment interfaces for percutaneous access
US11298195B2 (en) 2019-12-31 2022-04-12 Auris Health, Inc. Anatomical feature identification and targeting

Also Published As

Publication number Publication date
US7747312B2 (en) 2010-06-29
US20020087101A1 (en) 2002-07-04

Similar Documents

Publication Publication Date Title
US7747312B2 (en) System and method for automatic shape registration and instrument tracking
Zamorano et al. Interactive intraoperative localization using an infrared-based system
US9320569B2 (en) Systems and methods for implant distance measurement
EP0501993B1 (en) Probe-correlated viewing of anatomical image data
US8682413B2 (en) Systems and methods for automated tracker-driven image selection
US8239001B2 (en) Method and apparatus for surgical navigation
US6259943B1 (en) Frameless to frame-based registration system
US8271069B2 (en) Method and apparatus for surgical navigation
US7831096B2 (en) Medical navigation system with tool and/or implant integration into fluoroscopic image projections and method of use
US5483961A (en) Magnetic field digitizer for stereotactic surgery
US20040015176A1 (en) Stereotactic localizer system with dental impression
US20080119725A1 (en) Systems and Methods for Visual Verification of CT Registration and Feedback
US20220378508A1 (en) Non-invasive system and method for tracking bones
US20060030771A1 (en) System and method for sensor integration
US20080119712A1 (en) Systems and Methods for Automated Image Registration
CN117323001A (en) Optical fiber shape sensing system
US20080154120A1 (en) Systems and methods for intraoperative measurements on navigated placements of implants
WO1996032059A1 (en) Magnetic field digitizer for stereotactic surgery
KR20030053039A (en) Wireless position sensor
Schmerber et al. Accuracy evaluation of a CAS system: laboratory protocol and results with 6D localizers, and clinical experiences in otorhinolaryngology
US8067726B2 (en) Universal instrument calibration system and method of use
WO2004075716A2 (en) A stereotactic localizer system with dental impression
WO2011158113A1 (en) A device for magnetic localization and tracking
Franceschini et al. Computer-Aided Surgery in Otolaryngology
EP1594406A2 (en) A stereotactic localizer system with dental impression

Legal Events

Date Code Title Description
STCB Information on status: application discontinuation

Free format text: ABANDONED -- FAILURE TO RESPOND TO AN OFFICE ACTION