US20090221907A1 - Location system with virtual touch screen - Google Patents


Info

Publication number
US20090221907A1
US20090221907A1 (application US12/039,779)
Authority
US
United States
Prior art keywords
responsively
display device
medical instrument
signals
magnetic fields
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
US12/039,779
Other versions
US8926511B2
Inventor
Meir Bar-tal
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Biosense Webster Inc
Original Assignee
Biosense Webster Inc
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Biosense Webster Inc
Priority to US12/039,779 (published as US8926511B2)
Assigned to BIOSENSE WEBSTER, INC. Assignors: BAR-TAL, MEIR (see document for details)
Priority to IL197318A (IL197318)
Priority to AU2009200770A (AU2009200770B2)
Priority to CN201510029456.5A (CN104605855B)
Priority to CA2656309A (CA2656309C)
Priority to KR1020090016871A (KR101612278B1)
Priority to EP09250552.8A (EP2096523B1)
Priority to CN200910130767A (CN101530325A)
Priority to JP2009045364A (JP5436886B2)
Priority to BRPI0901476A (BRPI0901476B8)
Priority to MX2009002363A (MX350265B)
Publication of US20090221907A1
Publication of US8926511B2
Application granted
Legal status: Active
Adjusted expiration

Classifications

    • A: HUMAN NECESSITIES
    • A61: MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B: DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B 5/00: Measuring for diagnostic purposes; Identification of persons
    • A61B 5/06: Devices, other than using radiation, for detecting or locating foreign bodies; determining position of probes within or on the body of the patient
    • A61B 5/061: Determining position of a probe within the body employing means separate from the probe, e.g. sensing internal probe position employing impedance electrodes on the surface of the body
    • A61B 5/062: Determining position of a probe within the body employing means separate from the probe, e.g. sensing internal probe position employing impedance electrodes on the surface of the body, using magnetic field
    • A: HUMAN NECESSITIES
    • A61: MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B: DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B 18/00: Surgical instruments, devices or methods for transferring non-mechanical forms of energy to or from the body
    • A61B 18/18: Surgical instruments, devices or methods for transferring non-mechanical forms of energy to or from the body by applying electromagnetic radiation, e.g. microwaves
    • A: HUMAN NECESSITIES
    • A61: MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B: DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B 5/00: Measuring for diagnostic purposes; Identification of persons
    • A61B 5/06: Devices, other than using radiation, for detecting or locating foreign bodies; determining position of probes within or on the body of the patient
    • A: HUMAN NECESSITIES
    • A61: MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B: DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B 5/00: Measuring for diagnostic purposes; Identification of persons
    • A61B 5/06: Devices, other than using radiation, for detecting or locating foreign bodies; determining position of probes within or on the body of the patient
    • A61B 5/061: Determining position of a probe within the body employing means separate from the probe, e.g. sensing internal probe position employing impedance electrodes on the surface of the body
    • A61B 5/064: Determining position of a probe within the body employing means separate from the probe, e.g. sensing internal probe position employing impedance electrodes on the surface of the body, using markers
    • A: HUMAN NECESSITIES
    • A61: MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B: DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B 90/00: Instruments, implements or accessories specially adapted for surgery or diagnosis and not covered by any of the groups A61B1/00 - A61B50/00, e.g. for luxation treatment or for protecting wound edges
    • A: HUMAN NECESSITIES
    • A61: MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B: DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B 34/00: Computer-aided surgery; Manipulators or robots specially adapted for use in surgery
    • A61B 34/20: Surgical navigation systems; Devices for tracking or guiding surgical instruments, e.g. for frameless stereotaxis
    • A61B 2034/2046: Tracking techniques
    • A61B 2034/2051: Electromagnetic tracking systems
    • A: HUMAN NECESSITIES
    • A61: MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B: DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B 90/00: Instruments, implements or accessories specially adapted for surgery or diagnosis and not covered by any of the groups A61B1/00 - A61B50/00, e.g. for luxation treatment or for protecting wound edges
    • A61B 90/50: Supports for surgical instruments, e.g. articulated arms
    • A61B 2090/502: Headgear, e.g. helmet, spectacles
    • A: HUMAN NECESSITIES
    • A61: MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B: DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B 5/00: Measuring for diagnostic purposes; Identification of persons
    • A61B 5/74: Details of notification to user or communication with user or patient; user input means
    • A61B 5/742: Details of notification to user or communication with user or patient; user input means using visual displays
    • A61B 5/7445: Display arrangements, e.g. multiple display units
    • A: HUMAN NECESSITIES
    • A61: MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B: DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B 5/00: Measuring for diagnostic purposes; Identification of persons
    • A61B 5/74: Details of notification to user or communication with user or patient; user input means
    • A61B 5/7475: User input or interface means, e.g. keyboard, pointing device, joystick

Definitions

  • This invention relates to systems for invasive medical procedures. More particularly, this invention relates to using magnetic fields to track a medical instrument within a living body.
  • Magnetic tracking systems for medical applications use magnetic fields to detect the locations both of points in the patient's body and of invasive devices, such as catheters and surgical tools, that are in proximity to or inside the body.
  • A magnetic field generator produces a field in and around an area of the body, and sensors in the body and in the invasive device detect the field.
  • A system console receives the sensor signals and displays the location of the invasive device relative to the body.
  • An integral processing and display unit includes a plurality of radiator coils, along with processing circuitry and a display.
  • The radiator coils generate electromagnetic fields in a vicinity of the tissue, thereby causing currents to flow in the sensor coils.
  • The processing circuitry processes the currents to determine coordinates of the tag relative to the medical device.
  • The display is driven by the processing circuitry so as to present a visual indication to an operator of the medical device of an orientation of the device relative to the tag.
  • U.S. Pat. No. 5,913,820, issued to Bladen, et al., which is herein incorporated by reference, discloses methods and apparatus for locating the position, preferably in three dimensions, of a sensor by generating magnetic fields that are detected at the sensor.
  • The magnetic fields are generated from a plurality of locations and, in one embodiment of the invention, enable both the orientation and the location of a single-coil sensor to be determined.
  • The system allows an operator to wear small, single-coil sensors about his body to enable his movements to be detected and interpreted by a machine without requiring physical contact between the operator and the machine.
  • The positioning system could enable an operator to interact with images on a television or computer screen without the use of a conventional keyboard, mouse or stylus.
  • U.S. Pat. No. 6,427,079, issued to Schneider, et al., which is herein incorporated by reference, discloses a remote location determination system that uses splines of magnetic field values to determine location parameters.
  • The location determination system is used on a laser catheter that is operable to perform myocardial revascularization.
  • An automatic calibration technique compensates for any variations in gain in a sensor and related components. Methods for reducing the effects of eddy currents in surrounding conductive objects are used in electromagnetic position and orientation measurement systems.
  • In order to interact with the console, the system operator, such as a physician, must generally use a conventional user interface device, e.g., a keyboard, mouse or touch screen.
  • The operator may therefore have to disengage from manipulating the invasive device and move to a different position to work the user interface. Alternatively, he must instruct an assistant to take the necessary actions.
  • Embodiments of the present invention provide new methods and devices for user interaction with a system for medical treatment and/or diagnosis that uses magnetic position tracking. These methods and devices permit the system operator to interact with the console without leaving his normal operating position.
  • The operator is provided with a stylus or other user interface device containing a magnetic sensor, which is linked to the console.
  • The interface device may itself have a dual function as an invasive medical instrument. As long as the stylus is near the patient's body, its sensor senses the fields generated by the magnetic field generator. In other embodiments, the interface device and the medical instrument generate magnetic fields, which are sensed by an external position sensor.
  • A position processor in the console is thus able to determine the location of the stylus just as it determines the locations of the other elements of the system.
  • The system console displays a cursor on a screen, which moves as the operator moves the stylus.
  • The operator can use this cursor to actuate on-screen controls, to draw lines on the screen, to mark points, and otherwise to interact with images and maps that are displayed on the screen.
  • The effect of the stylus and magnetic tracking system is to provide a “virtual touch screen” that the system operator can use conveniently while operating on the patient.
  • Some embodiments of the present invention permit the system operator to view a virtual image of an anatomical structure, in the actual location of the structure, using a “virtual reality” or “augmented reality” display, and to use the stylus to interact with the image.
  • For example, the display with which the operator interacts using the stylus may be presented on goggles worn by the system operator.
  • The goggles contain a position sensor, so that the display is registered with the body of the patient.
  • An embodiment of the invention provides apparatus for invasive medical operations in the body of a living subject.
  • The apparatus includes one or more field generating elements disposed at known locations for generating magnetic fields at respective frequencies, and a medical instrument adapted for insertion into the body.
  • The medical instrument has a first magnetic position sensor coupled thereto that emits first signals responsively to the magnetic fields.
  • An interface device has a second magnetic position sensor coupled thereto that emits second signals responsively to the magnetic fields.
  • The apparatus further includes a position processor operative to receive the first signals and the second signals and to determine respective positions of the interface device and the medical instrument relative to the known locations, responsively to the first signals and the second signals, and a display device operative to display an image responsively to the position of the medical instrument.
  • The display device has a cursor moveable thereon under control of the position processor responsively to changes in the position of the interface device.
  • The display device has a display control that is actuated responsively to a superimposition of the cursor thereon.
  • Alternatively, the display device has a display control that is actuated responsively to a displacement of the interface device generally toward the display device while the cursor is superimposed on the display control.
  • In some embodiments, the display device is a virtual reality display device having a third magnetic position sensor that emits third signals responsively to the magnetic fields.
  • Positioning controls are provided for the medical instrument, and the interface device is disposed within reach of an operator of the positioning controls.
  • The first magnetic position sensor and the second magnetic position sensor comprise at least two sensor coils.
  • FIG. 1 is a pictorial illustration of a system for medical imaging using a virtual touch screen, in accordance with a disclosed embodiment of the invention.
  • FIG. 2 is a pictorial illustration of a catheter that may be used in the system shown in FIG. 1, in accordance with an embodiment of the present invention.
  • FIG. 3 is a pictorial illustration of an interface device that may be used in the system shown in FIG. 1, in accordance with an alternate embodiment of the invention.
  • FIG. 4 is a pictorial illustration of a device that produces a virtual reality display that may be used in the system shown in FIG. 1, in accordance with another alternate embodiment of the invention.
  • FIG. 5 is a flow chart showing a method for performing invasive medical operations with the assistance of a virtual touch screen, in accordance with a disclosed embodiment of the invention.
  • FIG. 6 is a flow chart showing a method for imaging an anatomical structure on the virtual reality display of FIG. 4 , in accordance with a disclosed embodiment of the invention.
  • Software programming code, which embodies aspects of the present invention, is typically maintained in permanent storage, such as a computer readable medium.
  • The software programming code may be stored on a client or a server.
  • The software programming code may be embodied on any of a variety of known media for use with a data processing system, such as a diskette, hard drive, or CD-ROM.
  • The code may be distributed on such media, or may be distributed to users from the memory or storage of one computer system over a network to other computer systems for use by users of such other systems.
  • FIG. 1 is a pictorial illustration of a system 20 that tracks and operates a medical instrument within a living body using a virtual touch screen, which is constructed and operative in accordance with a disclosed embodiment of the invention.
  • An operator, for example a physician 22, may use system 20 to obtain medical images using a probe, such as a catheter 23, which may be inserted into an internal body cavity, such as a chamber of a heart 24 of a subject 26.
  • Catheter 23 is typically used for diagnostic or therapeutic medical procedures, such as mapping electrical potentials in the heart or performing ablation of heart tissue.
  • The catheter or other intra-body device may alternatively be used for other purposes, by itself or in conjunction with other treatment devices.
  • The cardiac application described with respect to FIG. 1 is exemplary. The principles of the invention are applicable to many invasive medical and surgical procedures throughout the body.
  • FIG. 2 is a pictorial illustration of catheter 23 , in accordance with an embodiment of the present invention.
  • The catheter shown is exemplary; many other types of catheters may be used as catheter 23.
  • Catheter 23 typically comprises positioning controls 27 on a handle 28 to enable the physician to steer, locate, orient, and operate a distal end 29 of catheter 23 as desired.
  • A pointing device, e.g., a joystick 52, is attached to handle 28.
  • Handle 28 also comprises one or more touch-activated switches, shown as buttons 56.
  • Alternatively, buttons 56 may be located on joystick 52.
  • Joystick 52 and buttons 56 are used for controlling system 20 , as described in detail herein below.
  • Distal end 29 and joystick 52 include position sensors 32 and 54 respectively, each comprising sensor coils 35 as described herein below.
  • Distal end 29 comprises an ultrasonic imaging sensor 39.
  • Ultrasonic imaging sensor 39 typically transmits a short burst of ultrasound energy and converts the reflected ultrasound into electrical signals, which are transmitted via cables 33 to console 34 (FIG. 1), as is known in the art.
  • Distal end 29 also comprises at least one electrode 42 for performing diagnostic functions, therapeutic functions, or both, such as electro-physiological mapping and radiofrequency (RF) ablation.
  • Electrode 42 is used for sensing local electrical potentials. The electrical potentials measured by electrode 42 may be used in mapping the local electrical activity on the endocardial surface.
  • At each point contacted by distal end 29, the electrode measures the local electrical potential. The measured potentials are converted into electrical signals and sent through catheter 23 to an image processor 43 (FIG. 1), which converts the signals into an electro-anatomical map.
  • Alternatively, electrode 42 may be used to measure parameters other than the electrical potentials described above, such as various tissue characteristics, temperature, and blood flow.
  • System 20 comprises a positioning subsystem 30 that measures location and orientation coordinates of distal end 29 of catheter 23.
  • In this context, “location” refers to the spatial coordinates of an object, “orientation” refers to its angular coordinates, and “position” refers to the full positional information of the object, comprising both location and orientation coordinates.
  • Positioning subsystem 30 comprises a magnetic position tracking system that determines the position of distal end 29 of catheter 23.
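The location/orientation/position terminology defined above can be illustrated with a small data structure. This is only an illustrative sketch; the `Position` class and its field names are hypothetical and not part of the patent:

```python
from dataclasses import dataclass

@dataclass
class Position:
    """Full positional information: location plus orientation."""
    # Location: spatial coordinates of the object
    x: float
    y: float
    z: float
    # Orientation: angular coordinates of the object
    roll: float
    pitch: float
    yaw: float

    @property
    def location(self):
        return (self.x, self.y, self.z)

    @property
    def orientation(self):
        return (self.roll, self.pitch, self.yaw)
```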
  • Positioning subsystem 30 typically comprises a set of external radiators, such as field generating elements, e.g., coils 31, which are in fixed, known locations external to the subject. Coils 31 generate fields, typically magnetic fields, in the vicinity of heart 24.
  • Position sensor 32 senses the fields generated by coils 31 and transmits, in response to the sensed fields, position-related electrical signals over cables 33 running through catheter 23 to console 34 (FIG. 1). Alternatively, position sensor 32 may transmit signals to the console over a wireless link.
  • Position sensor 32 comprises at least two, and preferably three, sensor coils 35, adapted to the frequency of one of coils 31, as is known in the art.
  • Sensor coils 35 are wound on either air cores or cores of material.
  • The axes of sensor coils 35 should be non-parallel, and are preferably mutually orthogonal.
  • Position sensor 54, which is located in joystick 52, preferably in its handle, is similar to position sensor 32. Position sensor 54 senses the fields generated by coils 31, and is used to determine the position of the handle of joystick 52, including its angular orientation in space. Position sensor 54 requires at least one sensing coil, and preferably has three coils.
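Because each of coils 31 radiates at its own frequency, the signal induced in a single sensor coil can be separated into per-generator amplitudes by frequency analysis. The sketch below uses the Goertzel algorithm for this separation; the sample rate, generator frequencies, and amplitudes are illustrative assumptions, not values from the patent:

```python
import math

def goertzel_amplitude(samples, sample_rate, target_freq):
    """Estimate the amplitude of one frequency component of a sampled signal."""
    n = len(samples)
    k = round(target_freq * n / sample_rate)   # nearest DFT bin
    coeff = 2.0 * math.cos(2.0 * math.pi * k / n)
    s_prev = s_prev2 = 0.0
    for x in samples:
        s = x + coeff * s_prev - s_prev2
        s_prev2, s_prev = s_prev, s
    power = s_prev * s_prev + s_prev2 * s_prev2 - coeff * s_prev * s_prev2
    return 2.0 * math.sqrt(max(power, 0.0)) / n

# Hypothetical example: two generator coils at 1 kHz and 3 kHz,
# sensed together on one sensor coil over a 0.1 s window.
fs = 48000
signal = [0.7 * math.sin(2 * math.pi * 1000 * i / fs)
          + 0.2 * math.sin(2 * math.pi * 3000 * i / fs)
          for i in range(4800)]
a1 = goertzel_amplitude(signal, fs, 1000)   # close to 0.7
a2 = goertzel_amplitude(signal, fs, 3000)   # close to 0.2
```

In a real system, the per-frequency amplitudes measured on each of the three orthogonal sensor coils would then feed the position-fitting computation in the console.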
  • Console 34 comprises a position processor 36 that calculates the location and orientation of distal end 29 of catheter 23 based on the signals sent by position sensor 32 (FIG. 2).
  • Position processor 36 typically receives, amplifies, filters, digitizes, and otherwise processes signals from catheter 23.
  • System 20 and position processor 36 may also be realized as elements of the CARTO XP EP Navigation and Ablation System, available from Biosense Webster, Inc., 3333 Diamond Canyon Road, Diamond Bar, Calif. 91765, suitably modified to execute the principles of the present invention.
  • Image processor 43 uses the electrical signals received from ultrasonic imaging sensor 39 (FIG. 2) and positional information received from position sensor 32 in distal end 29 of catheter 23 to produce an image of a target structure of the subject's heart.
  • The images may be enhanced using electrical information derived from electrode 42.
  • Alternatively, image processor 43 may not produce a medical image, but may merely produce an image of distal end 29 of catheter 23 overlaid on a representation of subject 26, or may simply show the position of distal end 29 with respect to a target within the subject, in order to assist physician 22 with a medical procedure.
  • Images produced by image processor 43 are output on display device 44.
  • FIG. 1 shows an image 46 of part of heart 24 .
  • System 20 typically provides display controls, for example a GUI (Graphical User Interface), comprising windows, icons and menus, for manipulating and viewing images produced by image processor 43 .
  • An interface device is used to move a cursor 48 on display device 44 .
  • In one embodiment, the interface device comprises joystick 52 (FIG. 2), which is within reach of physician 22 while he is using positioning controls 27.
  • For example, rotation of the joystick may continuously control a parameter such as the edge threshold in an edge detection algorithm.
  • Other joystick motions and button commands may be user-assigned in order to control other aspects of the operation of system 20.
  • As physician 22 moves joystick 52, signals from position sensor 54 are transmitted to console 34, where position processor 36 (FIG. 1) tracks the location of the sensor and registers it on display 44.
  • Position processor 36 thus translates joystick movements into movements of cursor 48 on display device 44.
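The translation of joystick movement into cursor movement can be as simple as scaling the relative XY displacement of position sensor 54 and clamping the result to the screen. A minimal sketch; the gain and screen dimensions are illustrative assumptions:

```python
def move_cursor(cursor, prev_pos, new_pos, gain=400.0, screen=(1920, 1080)):
    """Map a relative XY displacement of the interface device to a
    cursor displacement, clamped to the screen boundaries."""
    dx = (new_pos[0] - prev_pos[0]) * gain
    dy = (new_pos[1] - prev_pos[1]) * gain
    x = min(max(cursor[0] + dx, 0), screen[0] - 1)
    y = min(max(cursor[1] + dy, 0), screen[1] - 1)
    return (x, y)
```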
  • FIG. 3 is a diagram of an exemplary interface device 60 for use with system 20 ( FIG. 1 ), in accordance with an alternate embodiment of the invention.
  • Interface device 60 may be a wand or stylus, and is shaped to be easily graspable and manipulable by physician 22 ( FIG. 1 ).
  • Interface device 60 comprises position sensor 54 and buttons 56 , as described above.
  • Position sensor 54 senses magnetic fields produced by coils 31 (FIG. 1) and transmits, in response to the sensed fields, position-related electrical signals over cables 63 to console 34.
  • Alternatively, position sensor 54 may transmit signals to the console over a wireless link. In either case, system 20 is able to determine the position of interface device 60.
  • A 3-dimensional spatial region 61, including screen 62 of display 40, is mapped by position processor 36 to a spatial region 67 near or including device 60.
  • A displacement of device 60 within region 67 that changes its XY coordinates in coordinate system 65 produces a corresponding movement of a cursor on screen 62.
  • To “touch” the screen, device 60 is displaced so as to change its Z-coordinate and intersect virtual plane 70. This event stimulates the graphical user interface of display 40 as though a physical touch screen had been contacted at the point corresponding to the XY coordinates of the intersection with plane 70.
  • Icons and menus (not shown) on display 40 are actuated by superimposing the cursor on them.
  • Alternatively, the icons and menus are actuated by passing the cursor over them while pressing one of buttons 56. This causes an electrical signal to be transmitted along cables 63 to console 34, where the processor interprets the signal to activate the icon or menu.
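The mapping of region 67 to screen 62 and the virtual-plane test can be sketched as follows. The coordinate conventions, the region extents, and the assumption that motion toward the display reduces the Z-coordinate are all illustrative, not taken from the patent:

```python
def device_to_screen(dev_xy, region, screen=(1920, 1080)):
    """Map the device's XY position within spatial region 67 to pixel
    coordinates on screen 62.  `region` is (xmin, xmax, ymin, ymax),
    the extent of region 67 in device coordinates."""
    xmin, xmax, ymin, ymax = region
    u = min(max((dev_xy[0] - xmin) / (xmax - xmin), 0.0), 1.0)
    v = min(max((dev_xy[1] - ymin) / (ymax - ymin), 0.0), 1.0)
    return (u * (screen[0] - 1), v * (screen[1] - 1))

def crossed_virtual_plane(dev_z, plane_z):
    """Report a 'touch' when the device's Z-coordinate passes virtual
    plane 70 (assuming Z decreases as the device moves toward the display)."""
    return dev_z <= plane_z
```

When `crossed_virtual_plane` first returns true, the GUI would be sent a synthetic press event at the pixel given by `device_to_screen`, mimicking contact with a physical touch screen.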
  • The tracking of a pointing device for a GUI is well known in the art, and is not described further here.
  • Using interface device 60, physician 22 may move cursor 48 from a first position to a second position in order to draw a corresponding line via the GUI from the first position to the second position, mark points using buttons 56, and otherwise interact with images and maps that are displayed on the display device.
  • In some embodiments, the images are displayed on a virtual reality display rather than a conventional display monitor.
  • FIG. 4 is a pictorial illustration of a device that produces a virtual reality display, in accordance with an alternate embodiment of the invention.
  • Virtual reality goggles 100 comprise at least one, and typically two, display devices 105, supported by a frame 110, constructed so that physician 22 (FIG. 1) may wear goggles 100 with display devices 105 in front of his eyes.
  • Display devices 105 show virtual images, for example of a part of heart 24 (FIG. 1) and distal end 29 of catheter 23 (FIG. 2), as described herein below.
  • Display devices 105 may be transparent, or partially transparent, in order to provide augmented reality images in which the virtual images are superimposed on the body of subject 26 (FIG. 1).
  • Goggles 100 comprise a position sensor 132, similar to position sensor 32, which senses magnetic fields produced by coils 31 (FIG. 1) and transmits, in response to the sensed fields, position-related electrical signals to console 34 (FIG. 1) using a wireless transmitter 140.
  • Wireless transmitter 140 may also be used as a receiver for images to be displayed on display devices 105 . Alternatively, the transmitter may be wired to the console.
  • Position sensor 132 is similar to position sensor 32 , but may comprise a miniaturized position sensor, for example as described in U.S. Pat. No. 6,201,387, issued to Govari, which is incorporated herein by reference.
  • Alternatively, position sensor 132 may comprise a wireless position sensor.
  • A suitable device is described in U.S. Patent Application Publication No. 2005/0099290, which is incorporated herein by reference.
  • In that case, wireless transmitter 140 acts solely as a receiver for images from image processor 43 (FIG. 1).
  • Further alternatively, position sensor 132 may transmit signals to the console over a cable (not shown). However, this alternative is less convenient. Similarly, images to be displayed on display devices 105 may be received over cables (not shown). Because the positions of display devices 105 are fixed in relation to position sensor 132, system 20 is able to determine the position of each of display devices 105. Using the information provided by position sensor 132, position processor 36 (FIG. 1) can register the virtual reality display with the body of the patient. In this manner, the operator can view an image of an organ superimposed on an image of the patient's body in the proper position and orientation, and can use device 60 (FIG. 3) to interact with the images as described above.
  • Alternatively, each of display devices 105 may be attached to its own position sensor 132. This allows greater flexibility of movement of the goggles, since the relative positions of display devices 105 need not then be constant.
  • Although FIG. 4 shows each position sensor 132 connected to a separate wireless transmitter 140, a single wireless transmitter 140 may be used.
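Because display devices 105 are fixed relative to position sensor 132, each display's location follows from the sensor's tracked pose by composing a constant offset with the sensor's orientation. A minimal sketch, restricted to rotation about one axis for brevity; all numerical values are illustrative assumptions:

```python
import math

def rot_z(theta):
    """3x3 rotation matrix about the Z axis."""
    c, s = math.cos(theta), math.sin(theta)
    return [[c, -s, 0.0], [s, c, 0.0], [0.0, 0.0, 1.0]]

def mat_vec(m, v):
    """Multiply a 3x3 matrix by a 3-vector."""
    return tuple(sum(m[i][j] * v[j] for j in range(3)) for i in range(3))

def display_position(sensor_loc, sensor_rot, offset):
    """Location of a display that sits at a fixed `offset` from the
    goggles' position sensor, given the sensor's tracked location
    and orientation (rotation matrix)."""
    d = mat_vec(sensor_rot, offset)
    return tuple(sensor_loc[i] + d[i] for i in range(3))
```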
  • The virtual reality image may be manipulated using many combinations of interface devices, such as joystick 52 or interface device 60, as described above.
  • Depending on the phase of the procedure, some embodiments may be less convenient than others. For example, some phases may be hazardous, e.g., taking place under conditions of radiation exposure, and may require hands-off actuation of the medical instrument on the part of physician 22. In such cases the use of goggles 100 may be preferable. In other situations, the lighting conditions in the operatory may be unsuitable for use of goggles 100.
  • In alternative embodiments, position sensors 32, 54, 132 may be replaced by radiators, e.g., coils, that generate magnetic fields, which are received by sensors outside the subject's body.
  • In such embodiments, the external sensors generate the position-related electrical signals.
  • FIG. 5 is a flow chart showing a method for performing invasive medical operations with the assistance of a virtual touch screen, in accordance with a disclosed embodiment of the invention.
  • The method begins at an initial step 150, where the position of distal end 29 (FIG. 1) of catheter 23 is determined, typically using the magnetic fields produced by coils 31 and sensed by position sensor 32 (FIG. 2).
  • Alternatively, the position of distal end 29 may be determined by external position sensors that detect magnetic fields generated at a fixed position relative to distal end 29.
  • Next, at step 152, an image, for example image 46, is acquired and displayed on display 44.
  • The image may be an image of subject 26, which may be obtained, for example, using catheter 23.
  • Alternatively, the image may be an image of distal end 29 overlaid on a representation of subject 26.
  • Further alternatively, the image may show the position of distal end 29 with respect to a target within the subject. Steps 150 and 152 may be repeated as distal end 29 moves.
  • Next, at step 155, the position of the interface device is determined, for example by position sensor 54 (FIG. 2).
  • Alternatively, position sensors 32, 54 may be replaced by a radiator, which is used as a reference to establish coordinates for the system.
  • In that case, the same external sensors are used to detect the positions of the distal end of the catheter and the interface device.
  • Cursor 48 is then positioned on display 44.
  • The initial position of the cursor may be predefined or random.
  • At step 165, typically performed after a time delay or after an interrupt, the position of the interface device is determined again, as in step 155.
  • At decision step 170, it is determined whether the interface device has moved since the previous iteration of step 165 (or since step 155, if this is the first iteration). If the determination at decision step 170 is negative, then control proceeds to a decision step 175, described below.
  • If the determination at decision step 170 is affirmative, then control proceeds to step 180, where cursor 48 is repositioned on display 44 in response to the displacement of the interface device relative to its previous position. Control then proceeds to decision step 175.
  • Display controls, for example a GUI as described above, appear on display 44.
  • At decision step 175, it is determined whether the cursor is superimposed on one of the display controls. If the determination at decision step 175 is negative, then control returns to step 165.
  • Otherwise, control proceeds to step 185.
  • At step 185, the display control is actuated. This may cause a change in the orientation or scale of the image on display 44, or other changes to the display of the image, or may actuate a function of catheter 23, according to a computer application that is controlled via the GUI.
  • At decision step 190, it is determined whether the procedure is complete. Typically, this is indicated by the actuation of an appropriate display control at step 185. If the determination at decision step 190 is negative, then control returns to step 165.
  • Otherwise, control proceeds to final step 195, and the method ends.
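The control flow of FIG. 5 (steps 155 through 195) amounts to a polling loop. The sketch below paraphrases it; the callables are hypothetical stand-ins for components of system 20, not APIs from the patent:

```python
def virtual_touch_screen_loop(get_device_pos, reposition_cursor,
                              control_under_cursor, actuate,
                              eps=1e-6, max_iters=10000):
    """get_device_pos(): poll the tracked XY position of the interface
    device (steps 155/165).  reposition_cursor(dx, dy): move cursor 48
    by the device's displacement (step 180).  control_under_cursor(pos):
    the display control the cursor is superimposed on, or None
    (decision step 175).  actuate(control): fire the control; returns
    True when the procedure is complete (steps 185/190)."""
    prev = get_device_pos()                                   # step 155
    for _ in range(max_iters):
        pos = get_device_pos()                                # step 165
        # decision step 170: has the interface device moved?
        if abs(pos[0] - prev[0]) > eps or abs(pos[1] - prev[1]) > eps:
            reposition_cursor(pos[0] - prev[0], pos[1] - prev[1])  # step 180
            prev = pos
        control = control_under_cursor(pos)                   # decision step 175
        if control is not None and actuate(control):          # steps 185, 190
            return                                            # final step 195
```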
  • FIG. 6 is a flow chart showing a method for imaging an anatomical structure on the virtual reality display of FIG. 4 , in accordance with a disclosed embodiment of the invention.
  • the process steps are shown in a particular linear sequence in FIG. 6 for clarity of presentation. However, it will be evident that many of them can be performed in parallel, asynchronously, or in different orders. For example, acquiring the image and locating the display devices may be performed in either order, or simultaneously.
  • the method begins at initial step 205 , where an image, typically three-dimensional, of a part of an anatomical structure is acquired.
  • an image typically three-dimensional, of a part of an anatomical structure is acquired.
  • this may be performed as described for example, in U.S. Patent Application Publication No. 2006/0241445, which is incorporated herein by reference.
  • one or more position sensors 132 determine the positions of display devices 105 .
  • the position information is transmitted to console 34 .
  • image processor 43 uses position information from step 220 and standard geometrical techniques to obtain, for each of display devices 105 , a 2-dimensional projection of the image.
  • the projections are transmitted to display devices 105 ( FIG. 4 ) and displayed.

Abstract

Control of an invasive medical instrument during a medical procedure is achieved using a system that includes magnetic field-based location facilities. Magnetic field sensors are placed in a medical instrument, e.g., a probe, and in an interface device to enable respective positions of the probe and the interface device to be ascertained by a location processor when the sensors are exposed to a magnetic field. The interface device is disposed such that an operator can control the medical instrument and the interface device concurrently. A display device, which can comprise a virtual reality display, is responsive to movements of the interface device as determined by the location processor to control the medical instrument, invoke various functions of the system, e.g., image manipulation, and otherwise facilitate the medical procedure via a graphical user interface.

Description

    BACKGROUND OF THE INVENTION
  • 1. Field of the Invention
  • This invention relates to systems for invasive medical procedures. More particularly, this invention relates to using magnetic fields to track a medical instrument within a living body.
  • 2. Description of the Related Art
  • Magnetic tracking systems for medical applications use magnetic fields to detect locations both of points in the patient's body and of invasive devices, such as catheters and surgical tools, that are in proximity to or inside the body. For this purpose, a magnetic field generator produces a field in and around an area of the body, and sensors in the body and in the invasive device detect the field. A system console receives the sensor signals and displays the location of the invasive device relative to the body.
  • For example, commonly assigned U.S. Pat. No. 7,174,201, issued to Govari, et al., and which is herein incorporated by reference, discloses apparatus for performing a medical procedure within a subject, which includes a wireless tag fixed to the tissue and which includes a first sensor coil. A second sensor coil is fixed to a medical device for use in performing the procedure.
  • An integral processing and display unit includes a plurality of radiator coils, along with processing circuitry and a display. The radiator coils generate electromagnetic fields in a vicinity of the tissue, thereby causing currents to flow in the sensor coils. The processing circuitry processes the currents to determine coordinates of the tag relative to the medical device. The display is driven by the processing circuitry so as to present a visual indication to an operator of the medical device of an orientation of the device relative to the tag.
  • U.S. Pat. No. 5,913,820, issued to Bladen, et al., and which is herein incorporated by reference, discloses methods and apparatus for locating the position, preferably in three dimensions, of a sensor by generating magnetic fields, which are detected at the sensor. The magnetic fields are generated from a plurality of locations and, in one embodiment of the invention, enable both the orientation and location of a single coil sensor to be determined. The system allows an operator to wear small, single coil sensors about his body to enable his movements to be detected and interpreted by a machine without requiring physical contact between the operator and the machine. For example, the positioning system could enable an operator to interact with images on a television or computer screen without the use of a conventional keyboard, mouse or stylus.
  • U.S. Pat. No. 6,129,668, issued to Haynor et al., and which is herein incorporated by reference, discloses a device to detect the location of a magnet coupled to an indwelling medical device within a patient using three or more sets of magnetic sensors each having sensor elements arranged in a known fashion. Each sensor element senses the magnetic field strength generated by the magnet and provides data indicative of the direction of the magnet in a three-dimensional space.
  • U.S. Pat. No. 6,427,079, issued to Schneider, et al., and which is herein incorporated by reference, discloses a remote location determination system that uses splines of magnetic field values to determine location parameters. The location determination system is used on a laser catheter that is operable to perform myocardial revascularization. An automatic calibration technique compensates for any variations in gain in a sensor and related components. Methods for reducing the effects of eddy currents in surrounding conductive objects are used in electromagnetic position and orientation measurement systems.
  • SUMMARY OF THE INVENTION
  • In systems such as the one disclosed in the above-noted U.S. Pat. No. 7,174,201, in order to interact with the console, the system operator, such as a physician, must generally use a conventional user interface device, e.g., a keyboard, mouse or touch screen. The operator may have to disengage from manipulating the invasive device, and move to a different position to work the user interface. Alternatively, he must instruct an assistant to take the necessary actions.
  • Embodiments of the present invention provide new methods and devices for user interaction with a system for medical treatment and/or diagnosis that uses magnetic position tracking. These methods and devices permit the system operator to interact with the console without leaving his normal operating position. In some of these embodiments, the operator is provided with a stylus or other user interface device containing a magnetic sensor, which is linked to the console. The interface device may itself have a dual function as an invasive medical instrument. As long as the stylus is near the patient's body, the sensor senses the fields generated by the magnetic field generator. In other embodiments, the interface device and the medical instrument generate magnetic fields, which are sensed by an external position sensor. A position processor in the console is thus able to determine the location of the stylus just as it determines the locations of the other elements of the system. The system console displays a cursor on a screen, which moves as the operator moves the stylus. The operator can use this cursor to actuate on-screen controls, to draw lines on the screen, and to mark points and otherwise interact with images and maps that are displayed on the screen.
  • In other words, the effect of the stylus and magnetic tracking system is to provide a “virtual touch screen” that the system operator can use conveniently while operating on the patient.
  • Some embodiments of the present invention permit the system operator to view a virtual image of an anatomical structure, in the actual location of the structure, using a “virtual reality” or “augmented reality” display, and to use the stylus to interact with the image. For example, the display with which the operator interacts using the stylus may be presented on goggles worn by the system operator. The goggles contain a position sensor, so that the display is registered with the body of the patient.
  • An embodiment of the invention provides apparatus for invasive medical operations in the body of a living subject. The apparatus includes one or more field generating elements disposed at known locations for generating magnetic fields at respective frequencies, and a medical instrument adapted for insertion into the body. The medical instrument has a first magnetic position sensor coupled thereto that emits first signals responsively to the magnetic fields. An interface device has a second magnetic position sensor coupled thereto that emits second signals responsively to the magnetic fields. The apparatus includes a position processor operative to receive the first signals and the second signals and to determine respective positions of the interface device and the medical instrument relative to the known locations, responsively to the first signals and the second signals, and a display device operative to display an image responsively to the position of the medical instrument. The display device has a cursor moveable thereon under control of the position processor responsively to changes in the position of the interface device.
  • According to an aspect of the apparatus, the display device has a display control that is actuated responsively to a superimposition of the cursor thereon.
  • According to another aspect of the apparatus, the display device has a display control that is actuated responsively to a displacement of the interface device generally toward the display device while the cursor is superimposed on the display control.
  • According to one aspect of the apparatus, the display device is a virtual reality display device having a third magnetic position sensor that emits third signals responsively to the magnetic fields.
  • In yet another aspect of the apparatus, positioning controls are provided for the medical instrument, and the interface device is disposed within reach of an operator of the positioning controls.
  • According to a further aspect of the apparatus, the first magnetic position sensor and the second magnetic position sensor comprise at least two sensor coils.
  • Other embodiments of the invention provide methods that are carried out by the above-described apparatus.
  • BRIEF DESCRIPTION OF THE DRAWINGS
  • For a better understanding of the present invention, reference is made to the detailed description of the invention, by way of example, which is to be read in conjunction with the following drawings, wherein like elements are given like reference numerals, and wherein:
  • FIG. 1 is a pictorial illustration of a system for medical imaging using a virtual touch screen, in accordance with a disclosed embodiment of the invention;
  • FIG. 2 is a pictorial illustration of a catheter that may be used in the system shown in FIG. 1, in accordance with an embodiment of the present invention;
  • FIG. 3 is a pictorial illustration of an interface device that may be used in the system shown in FIG. 1, in accordance with an alternate embodiment of the invention;
  • FIG. 4 is a pictorial illustration of a device that produces a virtual reality display that may be used in the system shown in FIG. 1, in accordance with another alternate embodiment of the invention;
  • FIG. 5 is a flow chart showing a method for performing invasive medical operations with the assistance of a virtual touch screen, in accordance with a disclosed embodiment of the invention; and
  • FIG. 6 is a flow chart showing a method for imaging an anatomical structure on the virtual reality display of FIG. 4, in accordance with a disclosed embodiment of the invention.
  • DETAILED DESCRIPTION OF THE INVENTION
  • In the following description, numerous specific details are set forth in order to provide a thorough understanding of the present invention. It will be apparent to one skilled in the art, however, that the present invention may be practiced without these specific details. In other instances, well-known circuits, control logic, and the details of computer program instructions for conventional algorithms and processes have not been shown in detail in order not to obscure the present invention unnecessarily.
  • Software programming code, which embodies aspects of the present invention, is typically maintained in permanent storage, such as a computer readable medium. In a client/server environment, such software programming code may be stored on a client or a server. The software programming code may be embodied on any of a variety of known media for use with a data processing system, such as a diskette, or hard drive, or CD-ROM. The code may be distributed on such media, or may be distributed to users from the memory or storage of one computer system over a network of some type to other computer systems for use by users of such other systems.
  • Turning now to the drawings, reference is initially made to FIG. 1, which is a pictorial illustration of a system 20 that tracks and operates a medical instrument within a living body using a virtual touch screen, which is constructed and operative in accordance with a disclosed embodiment of the invention. An operator, for example a physician 22, may use system 20 to obtain medical images using a probe, such as a catheter 23, which may be inserted into an internal body cavity, such as a chamber of a heart 24 of a subject 26. Typically, catheter 23 is used for diagnostic or therapeutic medical procedures, such as mapping electrical potentials in the heart or performing ablation of heart tissue. The catheter or other intra-body device may alternatively be used for other purposes, by itself or in conjunction with other treatment devices. The cardiac application described with respect to FIG. 1 is exemplary. The principles of the invention are applicable to many invasive medical and surgical procedures throughout the body.
  • Reference is now made to FIG. 2, which is a pictorial illustration of catheter 23, in accordance with an embodiment of the present invention. The catheter shown is exemplary; many other types of catheters may be used as catheter 23. Catheter 23 typically comprises positioning controls 27 on a handle 28 to enable the physician to steer, locate and orient, and operate a distal end 29 of catheter 23 as desired.
  • A pointing device, e.g., joystick 52, is attached to handle 28. In some embodiments, handle 28 comprises one or more touch-activated switches, shown as buttons 56. Alternatively, buttons 56 may be located on joystick 52. Joystick 52 and buttons 56 are used for controlling system 20, as described in detail herein below.
  • Distal end 29 and joystick 52 include position sensors 32 and 54 respectively, each comprising sensor coils 35 as described herein below.
  • In some embodiments, distal end 29 comprises an ultrasonic imaging sensor 39. Ultrasonic imaging sensor 39 typically transmits a short burst of ultrasound energy and converts the reflected ultrasound into electrical signals, which are transmitted via cables 33 to console 34 (FIG. 1), as is known in the art.
  • In some embodiments, distal end 29 also comprises at least one electrode 42 for performing diagnostic functions, therapeutic functions, or both, such as electro-physiological mapping and radiofrequency (RF) ablation. In one embodiment, electrode 42 is used for sensing local electrical potentials. The electrical potentials measured by electrode 42 may be used in mapping the local electrical activity on the endocardial surface. When electrode 42 is brought into contact or proximity with a point on the inner surface of heart 24 (FIG. 1), the electrode measures the local electrical potential at that point. The measured potentials are converted into electrical signals and sent through catheter 23 to an image processor 43 (FIG. 1), which converts the signals into an electro-anatomical map.
  • Alternatively, electrode 42 may be used to measure parameters different from the electrical potentials described above, such as various tissue characteristics, temperature, and blood flow.
  • Referring again to FIG. 1, system 20 comprises a positioning subsystem 30 that measures location and orientation coordinates of distal end 29 of catheter 23. As used herein, the term “location” refers to the spatial coordinates of an object, the term “orientation” refers to angular coordinates of the object, and the term “position” refers to the full positional information of the object, comprising both location and orientation coordinates.
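The distinction among the three terms above can be illustrated with a small data structure. This is an illustrative sketch only; the class and field names are assumptions and are not part of the disclosed system:

```python
from dataclasses import dataclass

@dataclass
class Position:
    """Full positional information: location plus orientation.

    Hypothetical illustration -- field names are not taken from the
    patent text.
    """
    x: float       # location: spatial coordinates of the object
    y: float
    z: float
    pitch: float   # orientation: angular coordinates of the object
    yaw: float
    roll: float

    @property
    def location(self):
        """Spatial coordinates only."""
        return (self.x, self.y, self.z)

    @property
    def orientation(self):
        """Angular coordinates only."""
        return (self.pitch, self.yaw, self.roll)

p = Position(10.0, -4.5, 22.0, 0.1, 0.0, 1.2)
```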
  • In one embodiment, positioning subsystem 30 comprises a magnetic position tracking system that determines the position of distal end 29 of catheter 23. Positioning subsystem 30 typically comprises a set of external radiators, such as field generating elements, e.g., coils 31, which are in fixed, known locations external to the subject. Coils 31 generate fields, typically magnetic fields, in the vicinity of heart 24.
  • Referring again to FIG. 2, position sensor 32 senses the fields generated by coils 31 and transmits, in response to the sensed fields, position-related electrical signals over cables 33 running through catheter 23 to console 34 (FIG. 1). Alternatively, position sensor 32 may transmit signals to the console over a wireless link.
  • In order to determine six positional coordinates (X, Y, Z directions and pitch, yaw and roll orientations), position sensor 32 comprises at least two, and preferably three, sensor coils 35, adapted to the frequency of one of coils 31 as is known in the art. Sensor coils 35 are wound on either air cores or cores of material. The axes of sensor coils 35 should be non-parallel and preferably mutually orthogonal.
  • In some applications, where fewer position coordinates are required, only a single sensor coil 35 may be necessary in position sensor 32.
  • Position sensor 54, which is located in the joystick 52, preferably in the handle, is similar to position sensor 32. Position sensor 54 senses the fields generated by coils 31, and is used to determine the position of the handle of joystick 52 including its angular orientation in space. Position sensor 54 requires at least one sensing coil, and preferably has three coils.
  • Referring again to FIG. 1, console 34 comprises a position processor 36 that calculates the location and orientation of distal end 29 of catheter 23 based on the signals sent by position sensor 32 (FIG. 2). Position processor 36 typically receives, amplifies, filters, digitizes, and otherwise processes signals from catheter 23. System 20 and position processor 36 may also be realized as elements of the CARTO XP EP Navigation and Ablation System, available from Biosense Webster, Inc., 3333 Diamond Canyon Road, Diamond Bar, Calif. 91765, and suitably modified to execute the principles of the present invention.
  • Some position tracking systems that may be used in embodiments of the present invention are described, for example, in U.S. Pat. Nos. 6,690,963, 6,618,612 and 6,332,089, and U.S. Patent Application Publications 2004/0147920 and 2004/0068178, all of which are incorporated herein by reference.
  • In some embodiments, image processor 43 uses the electrical signals received from ultrasonic imaging sensor 39 (FIG. 2) and positional information received from position sensor 32 in distal end 29 of catheter 23 to produce an image of a target structure of the subject's heart. The images may be enhanced using electrical information derived from electrode 42.
  • In other embodiments, image processor 43 may not produce a medical image, but may merely produce an image of distal end 29 of catheter 23 overlaid on a representation of subject 26, or may simply show the position of distal end 29 with respect to a target within the subject, in order to assist physician 22 with a medical procedure.
  • Images produced by image processor 43 are output on a display device 44. For example, FIG. 1 shows an image 46 of part of heart 24. System 20 typically provides display controls, for example a GUI (Graphical User Interface), comprising windows, icons and menus, for manipulating and viewing images produced by image processor 43. An interface device is used to move a cursor 48 on display device 44.
  • In one embodiment, the interface device comprises joystick 52 (FIG. 2), which is within reach of physician 22 when he is using positioning controls 27. For example, in a medical procedure involving real-time image processing, rotation of the joystick may continuously control a parameter such as the edge threshold in an edge detection algorithm. Other joystick motions and button commands may be user-assigned in order to control other aspects of the operation of system 20. As physician 22 moves joystick 52, the location of position sensor 54 is tracked, and the position information is transmitted to console 34. Position processor 36 (FIG. 1) translates joystick movements into movements of cursor 48 on display device 44.
  • Alternatively, the interface device may be a separate device, distinct from catheter 23 or any other medical device. Reference is now made to FIG. 3, which is a diagram of an exemplary interface device 60 for use with system 20 (FIG. 1), in accordance with an alternate embodiment of the invention. Interface device 60 may be a wand or stylus, and is shaped to be easily graspable and manipulable by physician 22 (FIG. 1). Interface device 60 comprises position sensor 54 and buttons 56, as described above. Position sensor 54 senses magnetic fields produced by coils 31 (FIG. 1) and transmits, in response to the sensed fields, position-related electrical signals over cables 63 to console 34. Alternatively, position sensor 54 may transmit signals to the console over a wireless link. In this way, system 20 is able to determine the position of interface device 60.
  • A 3-dimensional spatial region 61 including screen 62 of display 40 is mapped by the position processor 36 to a spatial region 67 near or including device 60. A displacement of device 60 in the region 67 that changes its XY coordinates in coordinate system 65 produces a corresponding movement of a cursor on the screen 62. When the device 60 is displaced so as to change its Z-coordinate and intersect virtual plane 70, physical contact with the screen 62 is emulated. This event stimulates the graphical user interface of the display 40 as though a physical touch screen were contacted at a point corresponding to the XY coordinate of the intersection in the plane 70.
  • Icons and menus (not shown) on the display 40 are actuated by superimposing the cursor on them. In an alternate embodiment, the icons and menus are actuated by passing the cursor over them while pressing one of buttons 56. This causes an electrical signal to be transmitted along cables 63 to console 34, where the processor interprets the signal to activate the icon or menu. The tracking of a pointing device for a GUI is well known in the art, and is not described further here.
  • Similarly, physician 22 may move cursor 48 from a first position to a second position, in order to draw a corresponding line via the GUI from the first position to the second position, mark points using buttons 56, and otherwise interact with images and maps that are displayed on the display device.
  • In some embodiments of the invention, the images are displayed on a virtual reality display rather than a conventional display monitor. Reference is now made to FIG. 4, which is a pictorial illustration of a device that produces a virtual reality display, in accordance with an alternate embodiment of the invention.
  • Virtual reality goggles 100 comprise at least one, and typically two, display devices 105, supported by a frame 110, constructed so that physician 22 (FIG. 1) may wear goggles 100 with display devices 105 in front of his eyes. Display devices 105 show virtual images, for example, of a part of heart 24 (FIG. 1) and distal end 29 of catheter 23 (FIG. 2), as described herein below. Alternatively, display devices 105 may be transparent, or partially transparent, in order to provide augmented reality images in which the virtual images are superimposed on the body of subject 26 (FIG. 1).
  • Methods for display of virtual reality and augmented reality images are well known in the art. An exemplary disclosure is U.S. Pat. No. 6,695,779, issued to Sauer et al., which is incorporated herein by reference.
  • Goggles 100 comprise a position sensor 132, similar to position sensor 32, which senses magnetic fields produced by coils 31 (FIG. 1) and transmits, in response to the sensed fields, position-related electrical signals to console 34 (FIG. 1), using a wireless transmitter 140. Wireless transmitter 140 may also be used as a receiver for images to be displayed on display devices 105. Alternatively, the transmitter may be wired to the console.
  • Position sensor 132 is similar to position sensor 32, but may comprise a miniaturized position sensor, for example as described in U.S. Pat. No. 6,201,387, issued to Govari, which is incorporated herein by reference.
  • Alternatively, position sensor 132 may comprise a wireless position sensor. A suitable device is described in U.S. Patent Application Publication No. 2005/0099290, which is incorporated herein by reference. In this case, wireless transmitter 140 acts solely as a receiver for images from image processor 43 (FIG. 1).
  • Further alternatively, position sensor 132 may transmit signals to the console over a cable (not shown). However, this alternative is less convenient. Similarly, images to be displayed on display devices 105 may be received over cables (not shown). Because the positions of display devices 105 are fixed in relation to position sensor 132, system 20 is able to determine the positions of each of display devices 105. Using the information provided by the position sensor 132, the position processor 36 (FIG. 1) can register the virtual reality display with the body of the patient. In this manner, the operator can view an image of an organ superimposed on an image of the patient's body in the proper position and orientation, and can use the device 60 (FIG. 3) to interact with the images as described above.
  • Alternatively, as shown in FIG. 4, each of display devices 105 may be attached to its own position sensor 132. This allows greater flexibility of movement of the goggles, since the relative positions of display devices 105 need not be constant. Although FIG. 4 shows each position sensor 132 connected to a separate wireless transmitter 140, a single wireless transmitter 140 may be used.
  • The virtual reality image may be manipulated using many combinations of interface devices such as joystick 52 or interface device 60, as described above. As conditions of the medical procedure change, some embodiments may become less convenient than others. For example, some phases may be hazardous, e.g., taking place under conditions of radiation exposure, and requiring hands-off actuation of the medical instrument on the part of the physician 22. In such cases the use of goggles 100 may be preferable. In other situations, the lighting conditions in the operatory may be unsuitable for use of goggles 100.
  • In an alternate embodiment, position sensors 32, 54, 132 may be replaced by radiators, e.g., coils, that generate magnetic fields, which are received by sensors outside the subject's body. The external sensors generate the position-related electrical signals.
  • Reference is now made to FIG. 5, which is a flow chart showing a method for performing invasive medical operations with the assistance of a virtual touch screen, in accordance with a disclosed embodiment of the invention.
  • The method begins at an initial step 150, where the position of distal end 29 (FIG. 1) of catheter 23 is determined, typically using the magnetic fields produced by coils 31 and sensed by position sensor 32 (FIG. 2). Alternatively, as described above, the position of distal end 29 may be determined by external position sensors that detect magnetic fields generated at a fixed position relative to distal end 29.
  • Next, at step 152, an image, for example image 46, is acquired and displayed on display 44. The image may be an image of subject 26, which may be obtained, for example, using catheter 23. Alternatively, the image may be an image of distal end 29 overlaid on a representation of subject 26. Further alternatively, the image may show the position of distal end 29 with respect to a target within the subject. Steps 150 and 152 may be repeated as distal end 29 moves.
  • At step 155, typically performed concurrently with steps 150 and 152, the position of the interface device is determined, for example by position sensor 54 (FIG. 2). Alternatively, one of position sensors 32, 54 may be replaced by a radiator, which is used as a reference to establish coordinates for the system. In this case, the same external sensors are used to detect the positions of the distal end of the catheter and the interface device.
  • Next, at step 160, cursor 48 is positioned on display 44. The initial position may be predefined or random.
  • At step 165, typically performed after a time delay, or after an interrupt, the position of the interface device is determined, as in step 155.
  • Next, at decision step 170, it is determined whether the interface device has moved since the previous iteration of step 165, or step 155 if this is the first iteration. If the determination at decision step 170 is negative, then control proceeds to a decision step 175, described below.
  • If the determination at decision step 170 is affirmative, then control proceeds to step 180. Cursor 48 is repositioned on display 44 in response to the displacement of the interface device relative to its previous position. Control proceeds to decision step 175.
  • In some embodiments of the invention, display controls, for example a GUI as described above, appear on display 44. At decision step 175, it is determined whether the cursor is superimposed on one of the display controls. If the determination at decision step 175 is negative, then control returns to step 165.
  • If the determination at decision step 175 is affirmative, then control proceeds to step 185. The display control is actuated. This may cause a change in the orientation or scale of the image on display 44, or other changes to the display of the image, or may actuate a function of catheter 23, according to a computer application that is controlled via the GUI.
  • Next, at decision step 190, it is determined whether the procedure is complete. Typically, this is indicated by the actuation of an appropriate display control at step 185. If the determination at decision step 190 is negative, then control returns to step 165.
  • If the determination at decision step 190 is affirmative, then control proceeds to final step 195, where the method ends.
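The flow of FIG. 5 from step 155 onward can be paraphrased as an event loop. This is a sketch only; the callback names stand in for system facilities described above (position sensing, cursor redisplay, GUI hit-testing, control actuation) and are not taken from the disclosure:

```python
def virtual_touch_screen_loop(get_device_position, move_cursor,
                              control_under_cursor, actuate,
                              procedure_complete):
    """Event loop paraphrasing FIG. 5, steps 155-195 (sketch only)."""
    prev = get_device_position()           # step 155
    while True:
        pos = get_device_position()        # step 165 (after delay/interrupt)
        if pos != prev:                    # decision step 170
            move_cursor(pos, prev)         # step 180: reposition cursor 48
            prev = pos
        control = control_under_cursor()   # decision step 175
        if control is not None:
            actuate(control)               # step 185
            if procedure_complete():       # decision step 190
                return                     # final step 195
        # otherwise control returns to step 165
```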
  • Reference is now made to FIG. 6, which is a flow chart showing a method for imaging an anatomical structure on the virtual reality display of FIG. 4, in accordance with a disclosed embodiment of the invention. The process steps are shown in a particular linear sequence in FIG. 6 for clarity of presentation. However, it will be evident that many of them can be performed in parallel, asynchronously, or in different orders. For example, acquiring the image and locating the display devices may be performed in either order, or simultaneously.
  • The method begins at initial step 205, where an image, typically three-dimensional, of a part of an anatomical structure is acquired. For an ultrasound image, this may be performed as described, for example, in U.S. Patent Application Publication No. 2006/0241445, which is incorporated herein by reference.
  • Next, at step 220, one or more position sensors 132 (FIG. 4) determine the positions of display devices 105. The position information is transmitted to console 34.
  • Next, at step 222, image processor 43 uses position information from step 220 and standard geometrical techniques to obtain, for each of display devices 105, a two-dimensional projection of the image.
  • At final step 225, the projections are transmitted to display devices 105 (FIG. 4) and displayed.
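One of the "standard geometrical techniques" referred to at step 222 can be sketched as an orthographic projection onto the view plane of each tracked display device. The sketch below is an assumption for illustration only: the function names, the choice of orthographic (rather than perspective) projection, and the representation of the image as a 3-D point set are hypothetical, not taken from the patent.

```python
# Hypothetical sketch of step 222: project a 3-D point set onto the view
# plane of a display device at a tracked position, looking toward a target.
import math

def _sub(a, b): return (a[0]-b[0], a[1]-b[1], a[2]-b[2])
def _dot(a, b): return a[0]*b[0] + a[1]*b[1] + a[2]*b[2]
def _cross(a, b):
    return (a[1]*b[2]-a[2]*b[1], a[2]*b[0]-a[0]*b[2], a[0]*b[1]-a[1]*b[0])
def _unit(a):
    n = math.sqrt(_dot(a, a))
    return (a[0]/n, a[1]/n, a[2]/n)

def project(points, device_pos, target, up=(0.0, 0.0, 1.0)):
    """Orthographic projection of 3-D points onto the view plane of a
    display device at device_pos, oriented toward target.

    Builds an orthonormal view basis (forward, right, true_up) and returns
    each point's (right, up) coordinates in that basis."""
    forward = _unit(_sub(target, device_pos))
    right = _unit(_cross(forward, up))
    true_up = _cross(right, forward)
    return [(_dot(_sub(q, device_pos), right),
             _dot(_sub(q, device_pos), true_up)) for q in points]
```

Running `project` once per tracked display device, with that device's position from step 220, yields the per-device two-dimensional projections that final step 225 then transmits and displays.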
  • It will be appreciated by persons skilled in the art that the present invention is not limited to what has been particularly shown and described hereinabove. Rather, the scope of the present invention includes both combinations and sub-combinations of the various features described hereinabove, as well as variations and modifications thereof that are not in the prior art, which would occur to persons skilled in the art upon reading the foregoing description.

Claims (25)

1. Apparatus for invasive medical operations in the body of a living subject, comprising:
one or more field generating elements disposed at known locations for generating magnetic fields at respective frequencies;
a medical instrument adapted for insertion into said body, and having a first magnetic position sensor coupled thereto that emits first signals responsively to said magnetic fields;
an interface device having a second magnetic position sensor coupled thereto that emits second signals responsively to said magnetic fields;
a position processor operative to receive said first signals and said second signals and to determine respective positions of said interface device and said medical instrument relative to said known locations, responsively to said first signals and said second signals; and
a display device operative to display an image responsively to said position of said medical instrument, said display device having a cursor moveable thereon under control of said position processor responsively to changes in said position of said interface device.
2. The apparatus according to claim 1, wherein said display device has a display control that is actuated responsively to a superimposition of said cursor thereon.
3. The apparatus according to claim 1, wherein said display device has a display control that is actuated responsively to a displacement of said interface device generally toward said display device while said cursor is superimposed on said display control.
4. The apparatus according to claim 1, wherein said display device comprises a virtual reality display device having a third magnetic position sensor that emits third signals responsively to said magnetic fields.
5. The apparatus according to claim 1, further comprising positioning controls for said medical instrument, and wherein said interface device is within reach of an operator of said positioning controls.
6. The apparatus according to claim 1, wherein said first magnetic position sensor and said second magnetic position sensor comprise at least two sensor coils.
7. Apparatus for invasive medical operations in the body of a living subject, comprising:
a medical instrument adapted for insertion into an anatomical structure in said body, and having one or more first field generating elements for generating first magnetic fields at respective frequencies;
an interface device having one or more second field generating elements for generating second magnetic fields at respective frequencies;
a magnetic position sensor at a known location that emits first signals responsively to said first magnetic fields, and that emits second signals responsively to said second magnetic fields;
a position processor operative to receive said first signals and said second signals and to determine respective positions of said interface device and said medical instrument relative to said known location, responsively to said first signals and said second signals; and
a display device operative to display an image responsively to said position of said medical instrument, said display device having a cursor moveable thereon under control of said position processor, movements of said cursor being controlled by said position processor responsively to changes in said position of said interface device.
8. The apparatus according to claim 7, wherein said display device has a display control that is actuated responsively to a superimposition of said cursor thereon.
9. The apparatus according to claim 7, wherein said display device has a display control that is actuated responsively to a displacement of said interface device generally toward said display device while said cursor is superimposed on said display control.
10. The apparatus according to claim 7, wherein said display device comprises a virtual reality display device having one or more third field generating elements for generating third magnetic fields, said magnetic position sensor being responsive to said third magnetic fields.
11. The apparatus according to claim 7, wherein said display device is operative to draw a line from a first position to a second position on said display device when said cursor is moved from said first position to said second position.
12. The apparatus according to claim 7, further comprising positioning controls for said medical instrument, and wherein said interface device is within reach of an operator of said positioning controls.
13. The apparatus according to claim 7, wherein said magnetic position sensor comprises at least two sensor coils.
14. A computer aided method for performing invasive medical operations in the body of a living subject, comprising the steps of:
generating magnetic fields in the vicinity of said body;
inserting a medical instrument into said body;
determining a position of said medical instrument, responsively to said magnetic fields;
determining a position of an interface device, responsively to said magnetic fields;
displaying an image on a display device, responsively to said position of said medical instrument; and
displaying a cursor, moveable responsively to changes in said position of said interface device, on said display device.
15. The method according to claim 14, wherein said step of generating comprises generating said magnetic fields in known locations, wherein said step of determining said position of said medical instrument comprises the steps of:
sensing said magnetic fields at fixed positions relative to said medical instrument; and
determining said position of said medical instrument relative to said known locations, and wherein said step of determining said position of said interface device comprises the steps of:
sensing said magnetic fields at fixed positions relative to said interface device; and
determining said position of said interface device relative to said known locations.
16. The method according to claim 14, wherein said step of generating comprises the steps of:
generating first magnetic fields at fixed positions relative to said medical instrument; and
generating second magnetic fields at fixed positions relative to said interface device, wherein said step of determining said position of said medical instrument comprises the steps of:
sensing said first magnetic fields at known locations; and
determining said position of said medical instrument relative to said known locations, and wherein said step of determining said position of said interface device comprises the steps of:
sensing said second magnetic fields at said known locations; and
determining said position of said interface device relative to said known locations.
17. The method according to claim 14, further comprising the steps of:
displaying a display control on said display device;
detecting that said cursor is superimposed on said display control; and
actuating said display control, responsively to said step of detecting.
18. The method according to claim 14, further comprising the steps of:
displaying a display control on said display device;
detecting that said cursor is superimposed on said display control;
responsively to said step of detecting, displacing said interface device to intersect a predefined plane; and
thereafter actuating said display control.
19. The method according to claim 14, wherein said display device comprises a virtual reality display device having a third magnetic position sensor coupled thereto that emits third signals responsively to said magnetic fields.
20. A method for imaging an anatomical structure on virtual reality goggles, comprising the steps of:
inserting a medical instrument into said anatomical structure, in proximity to one or more field generating elements disposed at known locations for generating magnetic fields at respective frequencies, wherein said medical instrument has a first magnetic position sensor coupled thereto that emits first signals responsively to said magnetic fields;
determining a position of said medical instrument, relative to said known locations, responsively to said first signals;
acquiring an image of said anatomical structure using said medical instrument, responsively to said position of said medical instrument;
receiving second signals from one or more second magnetic position sensors, coupled to at least one display device, wherein said virtual reality goggles comprise said at least one display device;
determining positions of said at least one display device, relative to said known locations, responsively to said second signals;
obtaining two-dimensional projections of said image responsively to said positions of said at least one display device; and
displaying said two-dimensional projections on said at least one display device.
21. A system for imaging an anatomical structure, comprising:
one or more field generating elements disposed at known locations for generating magnetic fields at respective frequencies;
a medical instrument adapted for insertion into said anatomical structure, and having a first magnetic position sensor coupled thereto that emits first signals responsively to said magnetic fields;
virtual reality goggles that comprise one or more second magnetic position sensors, that emit second signals responsively to said magnetic fields, and at least one display device; and
a position processor operative to receive said first signals and said second signals, to determine a position of said medical instrument relative to said known locations, responsively to said first signals, to acquire an image of said anatomical structure using said medical instrument, responsively to said position of said medical instrument, to determine positions of said at least one display device, relative to said known locations, responsively to said second signals, to obtain two-dimensional projections of said image responsively to said positions of said at least one display device, and to display said two-dimensional projections on said at least one display device.
22. The system according to claim 21, further comprising an interface device having a third magnetic position sensor coupled thereto that emits third signals responsively to said magnetic fields, and wherein said position processor is operative to receive said third signals and to determine a position of said interface device, relative to said known locations, responsively to said third signals, and wherein said at least one display device has a cursor moveable thereon, movements of said cursor being controlled by said position processor responsively to changes in said position of said interface device.
23. The system according to claim 22, wherein said at least one display device has a display control that is actuated responsively to a superimposition of said cursor thereon.
24. The system according to claim 22, wherein said at least one display device has a display control that is actuated responsively to a displacement of said interface device generally toward said display device while said cursor is superimposed on said display control.
25. The system according to claim 22, further comprising positioning controls for said medical instrument, and wherein said interface device is within reach of an operator of said positioning controls.
US12/039,779 2008-02-29 2008-02-29 Location system with virtual touch screen Active 2031-11-23 US8926511B2 (en)

Priority Applications (11)

Application Number Priority Date Filing Date Title
US12/039,779 US8926511B2 (en) 2008-02-29 2008-02-29 Location system with virtual touch screen
IL197318A IL197318A (en) 2008-02-29 2009-02-26 Location system with virtual touch screen
AU2009200770A AU2009200770B2 (en) 2008-02-29 2009-02-26 Location system with virtual touch screen
EP09250552.8A EP2096523B1 (en) 2008-02-29 2009-02-27 Location system with virtual touch screen
CA2656309A CA2656309C (en) 2008-02-29 2009-02-27 Location system with virtual touch screen
KR1020090016871A KR101612278B1 (en) 2008-02-29 2009-02-27 Location system with virtual touch screen
CN201510029456.5A CN104605855B (en) 2008-02-29 2009-02-27 Alignment system with virtual touch screen
CN200910130767A CN101530325A (en) 2008-02-29 2009-02-27 Location system with virtual touch screen
JP2009045364A JP5436886B2 (en) 2008-02-29 2009-02-27 Positioning system with virtual touch screen
BRPI0901476A BRPI0901476B8 (en) 2008-02-29 2009-02-27 apparatus for medical operations
MX2009002363A MX350265B (en) 2008-02-29 2009-03-02 Location system with virtual touch screen.

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
US12/039,779 US8926511B2 (en) 2008-02-29 2008-02-29 Location system with virtual touch screen

Publications (2)

Publication Number Publication Date
US20090221907A1 true US20090221907A1 (en) 2009-09-03
US8926511B2 US8926511B2 (en) 2015-01-06

Family

ID=40672316

Family Applications (1)

Application Number Title Priority Date Filing Date
US12/039,779 Active 2031-11-23 US8926511B2 (en) 2008-02-29 2008-02-29 Location system with virtual touch screen

Country Status (10)

Country Link
US (1) US8926511B2 (en)
EP (1) EP2096523B1 (en)
JP (1) JP5436886B2 (en)
KR (1) KR101612278B1 (en)
CN (2) CN104605855B (en)
AU (1) AU2009200770B2 (en)
BR (1) BRPI0901476B8 (en)
CA (1) CA2656309C (en)
IL (1) IL197318A (en)
MX (1) MX350265B (en)

Cited By (21)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20090264966A1 (en) * 2004-11-02 2009-10-22 Pixeloptics, Inc. Device for Inductive Charging of Implanted Electronic Devices
US20110218550A1 (en) * 2010-03-08 2011-09-08 Tyco Healthcare Group Lp System and method for determining and adjusting positioning and orientation of a surgical device
US20110238034A1 (en) * 2010-03-25 2011-09-29 Medtronic, Inc. Method and Apparatus for Guiding an External Needle to an Implantable Device
US20110237937A1 (en) * 2010-03-25 2011-09-29 Medtronic, Inc. Method and Apparatus for Guiding an External Needle to an Implantable Device
US20110237936A1 (en) * 2010-03-25 2011-09-29 Medtronic, Inc. Method and Apparatus for Guiding an External Needle to an Implantable Device
US20110237935A1 (en) * 2010-03-25 2011-09-29 Medtronic, Inc. Method and Apparatus for Guiding an External Needle to an Implantable Device
DE102010027526A1 (en) * 2010-07-16 2012-01-19 Gottfried Wilhelm Leibniz Universität Hannover Hand-guided measurement and projection system for projecting images of patient e.g. human, has data processing system correcting given images regarding surface course and projecting corrected image on surface
US20140171785A1 (en) * 2012-12-17 2014-06-19 Biosense Webster (Israel), Ltd. Recognizing which instrument is currently active
US8778022B2 (en) 2004-11-02 2014-07-15 E-Vision Smart Optics Inc. Electro-active intraocular lenses
CN103989520A (en) * 2014-04-30 2014-08-20 西安云合生物科技有限公司 Multifunctional touch electrotome
EP2875780A1 (en) 2013-11-21 2015-05-27 Biosense Webster (Israel), Ltd. Tracking of catheter using impedance measurements
US9743835B2 (en) 2010-08-12 2017-08-29 Heartflow, Inc. Method and system for image processing to determine patient-specific blood flow characteristics
US9801709B2 (en) 2004-11-02 2017-10-31 E-Vision Smart Optics, Inc. Electro-active intraocular lenses
CN107485388A (en) * 2016-06-09 2017-12-19 韦伯斯特生物官能(以色列)有限公司 The dual-functional sensor of basket catheter
US10354050B2 (en) 2009-03-17 2019-07-16 The Board Of Trustees Of Leland Stanford Junior University Image processing method for determining patient-specific cardiovascular information
CN110533999A (en) * 2018-09-05 2019-12-03 南京阿波罗机器人科技有限公司 A kind of teaching robot's calibration method and teaching robot
USD882633S1 (en) 2017-07-06 2020-04-28 Biosense Webster (Israel) Ltd. Display screen or portion thereof with icon
US20210196105A1 (en) * 2019-12-31 2021-07-01 Biosense Webster (Israel) Ltd. Wiring of trocar having movable camera and fixed position sensor
US11107587B2 (en) 2008-07-21 2021-08-31 The Board Of Trustees Of The Leland Stanford Junior University Method for tuning patient-specific cardiovascular simulations
US11103174B2 (en) 2013-11-13 2021-08-31 Biosense Webster (Israel) Ltd. Reverse ECG mapping
US11786319B2 (en) 2017-12-14 2023-10-17 Verb Surgical Inc. Multi-panel graphical user interface for a robotic surgical system

Families Citing this family (71)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
EP2316328B1 (en) 2003-09-15 2012-05-09 Super Dimension Ltd. Wrap-around holding device for use with bronchoscopes
ES2432616T3 (en) 2003-09-15 2013-12-04 Covidien Lp Accessory system for use with bronchoscopes
US8764725B2 (en) 2004-02-09 2014-07-01 Covidien Lp Directional anchoring mechanism, method and applications thereof
US8905920B2 (en) 2007-09-27 2014-12-09 Covidien Lp Bronchoscope adapter and method
US8641663B2 (en) 2008-03-27 2014-02-04 St. Jude Medical, Atrial Fibrillation Division, Inc. Robotic catheter system input device
US8684962B2 (en) 2008-03-27 2014-04-01 St. Jude Medical, Atrial Fibrillation Division, Inc. Robotic catheter device cartridge
US9241768B2 (en) 2008-03-27 2016-01-26 St. Jude Medical, Atrial Fibrillation Division, Inc. Intelligent input device controller for a robotic catheter system
US8317744B2 (en) 2008-03-27 2012-11-27 St. Jude Medical, Atrial Fibrillation Division, Inc. Robotic catheter manipulator assembly
US9161817B2 (en) 2008-03-27 2015-10-20 St. Jude Medical, Atrial Fibrillation Division, Inc. Robotic catheter system
US8343096B2 (en) 2008-03-27 2013-01-01 St. Jude Medical, Atrial Fibrillation Division, Inc. Robotic catheter system
US8641664B2 (en) 2008-03-27 2014-02-04 St. Jude Medical, Atrial Fibrillation Division, Inc. Robotic catheter system with dynamic response
WO2009122273A2 (en) 2008-04-03 2009-10-08 Superdimension, Ltd. Magnetic interference detection system and method
US8473032B2 (en) 2008-06-03 2013-06-25 Superdimension, Ltd. Feature-based registration method
US8218847B2 (en) 2008-06-06 2012-07-10 Superdimension, Ltd. Hybrid registration method
US8932207B2 (en) 2008-07-10 2015-01-13 Covidien Lp Integrated multi-functional endoscopic tool
US8611984B2 (en) 2009-04-08 2013-12-17 Covidien Lp Locatable catheter
US9439736B2 (en) 2009-07-22 2016-09-13 St. Jude Medical, Atrial Fibrillation Division, Inc. System and method for controlling a remote medical device guidance system in three-dimensions using gestures
US9330497B2 (en) 2011-08-12 2016-05-03 St. Jude Medical, Atrial Fibrillation Division, Inc. User interface devices for electrophysiology lab diagnostic and therapeutic equipment
KR101113219B1 (en) * 2009-12-08 2012-02-20 삼성메디슨 주식회사 Augmented reality ultrasonograph system and image forming method of the same
US20120249797A1 (en) 2010-02-28 2012-10-04 Osterhout Group, Inc. Head-worn adaptive display
US10180572B2 (en) 2010-02-28 2019-01-15 Microsoft Technology Licensing, Llc AR glasses with event and user action control of external applications
US20120194553A1 (en) * 2010-02-28 2012-08-02 Osterhout Group, Inc. Ar glasses with sensor and user action based control of external devices with feedback
US20150309316A1 (en) 2011-04-06 2015-10-29 Microsoft Technology Licensing, Llc Ar glasses with predictive control of external device based on event input
EP2539759A1 (en) 2010-02-28 2013-01-02 Osterhout Group, Inc. Local advertising content on an interactive head-mounted eyepiece
US9888973B2 (en) 2010-03-31 2018-02-13 St. Jude Medical, Atrial Fibrillation Division, Inc. Intuitive user interface control for remote catheter navigation and 3D mapping and visualization systems
WO2011159834A1 (en) 2010-06-15 2011-12-22 Superdimension, Ltd. Locatable expandable working channel and method
TWI501130B (en) * 2010-10-18 2015-09-21 Ind Tech Res Inst Virtual touch control system
CN102920509A (en) * 2012-10-30 2013-02-13 华南理工大学 Real-time wireless surgical navigation device based on ultrasonic
RU2695598C2 (en) * 2013-08-23 2019-07-24 ЭТИКОН ЭНДО-СЕРДЖЕРИ, ЭлЭлСи Interactive displays
US10952593B2 (en) 2014-06-10 2021-03-23 Covidien Lp Bronchoscope adapter
JP6429618B2 (en) * 2014-12-22 2018-11-28 オリンパス株式会社 Endoscope insertion shape observation device
US10181219B1 (en) * 2015-01-21 2019-01-15 Google Llc Phone control and presence in virtual reality
US10180734B2 (en) 2015-03-05 2019-01-15 Magic Leap, Inc. Systems and methods for augmented reality
US10838207B2 (en) 2015-03-05 2020-11-17 Magic Leap, Inc. Systems and methods for augmented reality
WO2016141373A1 (en) * 2015-03-05 2016-09-09 Magic Leap, Inc. Systems and methods for augmented reality
US10426555B2 (en) 2015-06-03 2019-10-01 Covidien Lp Medical instrument with sensor for use in a system and method for electromagnetic navigation
KR101647467B1 (en) * 2015-06-05 2016-08-11 주식회사 메드릭스 3d surgical glasses system using augmented reality
US9947091B2 (en) * 2015-11-16 2018-04-17 Biosense Webster (Israel) Ltd. Locally applied transparency for a CT image
KR20180090355A (en) 2015-12-04 2018-08-10 매직 립, 인코포레이티드 Recirculation systems and methods
CN105395252A (en) * 2015-12-10 2016-03-16 哈尔滨工业大学 Wearable three-dimensional image navigation device for vascular intervention operation and realizing man-machine interaction
JP6509374B2 (en) * 2015-12-17 2019-05-08 オリンパス株式会社 Ultrasonic observation apparatus, processing apparatus, operation method of ultrasonic observation apparatus, and operation program of ultrasonic observation apparatus
EP3411779A4 (en) * 2016-02-05 2019-02-20 Magic Leap, Inc. Systems and methods for augmented reality
CA3016176A1 (en) * 2016-03-17 2017-09-21 Becton, Dickinson And Company Medical record system using a patient avatar
CN105788390A (en) * 2016-04-29 2016-07-20 吉林医药学院 Medical anatomy auxiliary teaching system based on augmented reality
US10478254B2 (en) 2016-05-16 2019-11-19 Covidien Lp System and method to access lung tissue
EP3494549A4 (en) 2016-08-02 2019-08-14 Magic Leap, Inc. Fixed-distance virtual and augmented reality systems and methods
CN106251752A (en) * 2016-10-25 2016-12-21 深圳市科创数字显示技术有限公司 The medical science training system that AR and VR combines
US10751126B2 (en) 2016-10-28 2020-08-25 Covidien Lp System and method for generating a map for electromagnetic navigation
US10517505B2 (en) 2016-10-28 2019-12-31 Covidien Lp Systems, methods, and computer-readable media for optimizing an electromagnetic navigation system
US10638952B2 (en) 2016-10-28 2020-05-05 Covidien Lp Methods, systems, and computer-readable media for calibrating an electromagnetic navigation system
US10418705B2 (en) 2016-10-28 2019-09-17 Covidien Lp Electromagnetic navigation antenna assembly and electromagnetic navigation system including the same
US10615500B2 (en) 2016-10-28 2020-04-07 Covidien Lp System and method for designing electromagnetic navigation antenna assemblies
US10446931B2 (en) 2016-10-28 2019-10-15 Covidien Lp Electromagnetic navigation antenna assembly and electromagnetic navigation system including the same
US10792106B2 (en) 2016-10-28 2020-10-06 Covidien Lp System for calibrating an electromagnetic navigation system
US10722311B2 (en) 2016-10-28 2020-07-28 Covidien Lp System and method for identifying a location and/or an orientation of an electromagnetic sensor based on a map
KR101908016B1 (en) 2016-12-08 2018-12-11 제주대학교 산학협력단 System and method for providing online to offline fugitive pursuit game based on internet of things
US10918445B2 (en) * 2016-12-19 2021-02-16 Ethicon Llc Surgical system with augmented reality display
US10812936B2 (en) 2017-01-23 2020-10-20 Magic Leap, Inc. Localization determination for mixed reality systems
KR102366781B1 (en) 2017-03-17 2022-02-22 매직 립, 인코포레이티드 Mixed reality system with color virtual content warping and method for creating virtual content using same
CA3054617A1 (en) 2017-03-17 2018-09-20 Magic Leap, Inc. Mixed reality system with multi-source virtual content compositing and method of generating virtual content using same
JP7055815B2 (en) 2017-03-17 2022-04-18 マジック リープ, インコーポレイテッド A mixed reality system that involves warping virtual content and how to use it to generate virtual content
EP3612126A1 (en) * 2017-04-20 2020-02-26 The Cleveland Clinic Foundation System and method for holographic image-guided non-vascular percutaneous procedures
US10390891B2 (en) 2017-06-13 2019-08-27 Biosense Webster (Israel) Ltd. Hologram lens for positioning an orthopedic implant
US11219489B2 (en) 2017-10-31 2022-01-11 Covidien Lp Devices and systems for providing sensors in parallel with medical tools
WO2019117926A1 (en) * 2017-12-14 2019-06-20 Verb Surgical Inc. Multi-panel graphical user interface for a robotic surgical system
EP3827299A4 (en) 2018-07-23 2021-10-27 Magic Leap, Inc. Mixed reality system with virtual content warping and method of generating virtual content using same
CN117711284A (en) 2018-07-23 2024-03-15 奇跃公司 In-field subcode timing in a field sequential display
EP3840645A4 (en) * 2018-08-22 2021-10-20 Magic Leap, Inc. Patient viewing system
US10832392B2 (en) * 2018-12-19 2020-11-10 Siemens Healthcare Gmbh Method, learning apparatus, and medical imaging apparatus for registration of images
CN113180574A (en) * 2021-04-06 2021-07-30 重庆博仕康科技有限公司 Endoscope insert structure soon and endoscope
CN113610853B (en) * 2021-10-11 2022-01-28 北京工业大学 Emotional state display method, device and system based on resting brain function image

Citations (21)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US5417210A (en) * 1992-05-27 1995-05-23 International Business Machines Corporation System and method for augmentation of endoscopic surgery
US5729129A (en) * 1995-06-07 1998-03-17 Biosense, Inc. Magnetic location system with feedback adjustment of magnetic field generator
US5913820A (en) * 1992-08-14 1999-06-22 British Telecommunications Public Limited Company Position location system
US6129668A (en) * 1997-05-08 2000-10-10 Lucent Medical Systems, Inc. System and method to determine the location and orientation of an indwelling medical device
US6201387B1 (en) * 1997-10-07 2001-03-13 Biosense, Inc. Miniaturized position sensor having photolithographic coils for tracking a medical probe
US6332089B1 (en) * 1996-02-15 2001-12-18 Biosense, Inc. Medical procedures and apparatus using intrabody probes
US20020049375A1 (en) * 1999-05-18 2002-04-25 Mediguide Ltd. Method and apparatus for real time quantitative three-dimensional image reconstruction of a moving organ and intra-body navigation
US6427079B1 (en) * 1999-08-09 2002-07-30 Cormedica Corporation Position and orientation measuring with magnetic fields
US6618612B1 (en) * 1996-02-15 2003-09-09 Biosense, Inc. Independently positionable transducers for location system
US6690963B2 (en) * 1995-01-24 2004-02-10 Biosense, Inc. System for determining the location and orientation of an invasive medical instrument
US20040068178A1 (en) * 2002-09-17 2004-04-08 Assaf Govari High-gradient recursive locating system
US20040147920A1 (en) * 2002-10-21 2004-07-29 Yaron Keidar Prediction and assessment of ablation of cardiac tissue
US20040193006A1 (en) * 1995-07-24 2004-09-30 Chen David T. Anatomical visualization system
US20040267125A1 (en) * 2003-06-26 2004-12-30 Skyba Danny M. Adaptive processing of contrast enhanced ultrasonic diagnostic images
US20050099290A1 (en) * 2003-11-11 2005-05-12 Biosense Webster Inc. Digital wireless position sensor
US20060100505A1 (en) * 2004-10-26 2006-05-11 Viswanathan Raju R Surgical navigation using a three-dimensional user interface
US20060241445A1 (en) * 2005-04-26 2006-10-26 Altmann Andres C Three-dimensional cardial imaging using ultrasound contour reconstruction
US7174201B2 (en) * 1999-03-11 2007-02-06 Biosense, Inc. Position sensing system with integral location pad and position display
US7285117B2 (en) * 2002-03-15 2007-10-23 Boston Scientific Scimed, Inc. Medical device control systems
US20090080738A1 (en) * 2007-05-01 2009-03-26 Dror Zur Edge detection in ultrasound images
US7697973B2 (en) * 1999-05-18 2010-04-13 MediGuide, Ltd. Medical imaging and navigation system

Family Cites Families (13)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US6233476B1 (en) * 1999-05-18 2001-05-15 Mediguide Ltd. Medical positioning system
US9572519B2 (en) 1999-05-18 2017-02-21 Mediguide Ltd. Method and apparatus for invasive device tracking using organ timing signal generated from MPS sensors
JP3520062B2 (en) 2001-08-10 2004-04-19 日清食品株式会社 Raw type instant noodles with reduced acidity and method for producing the same
US6695779B2 (en) 2001-08-16 2004-02-24 Siemens Corporate Research, Inc. Method and apparatus for spatiotemporal freezing of ultrasound images in augmented reality visualization
US7324085B2 (en) * 2002-01-25 2008-01-29 Autodesk, Inc. Techniques for pointing to locations within a volumetric display
US7769427B2 (en) * 2002-07-16 2010-08-03 Magnetics, Inc. Apparatus and method for catheter guidance control and imaging
CN1747679B (en) * 2003-02-04 2012-10-03 奥林巴斯株式会社 Medical apparatus guiding system and control method thereof
US7280863B2 (en) * 2003-10-20 2007-10-09 Magnetecs, Inc. System and method for radar-assisted catheter guidance and control
CN1901835A (en) * 2003-11-14 2007-01-24 通用电气公司 System and method for distortion reduction in an electromagnetic tracker
US8473869B2 (en) 2004-11-16 2013-06-25 Koninklijke Philips Electronics N.V. Touchless manipulation of images for regional enhancement
EP1895899A4 (en) * 2005-05-06 2009-10-28 Stereotaxis Inc User interfaces and navigation methods for vascular navigation
CN101375173A (en) * 2006-01-30 2009-02-25 皇家飞利浦电子股份有限公司 Automated system for interventional breast magnetic resonance imaging
JP4533863B2 (en) 2006-03-28 2010-09-01 本田技研工業株式会社 Work positioning table and machine tool provided with the work positioning table

Cited By (66)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20090264966A1 (en) * 2004-11-02 2009-10-22 Pixeloptics, Inc. Device for Inductive Charging of Implanted Electronic Devices
US9801709B2 (en) 2004-11-02 2017-10-31 E-Vision Smart Optics, Inc. Electro-active intraocular lenses
US10729539B2 (en) 2004-11-02 2020-08-04 E-Vision Smart Optics, Inc. Electro-chromic ophthalmic devices
US8778022B2 (en) 2004-11-02 2014-07-15 E-Vision Smart Optics Inc. Electro-active intraocular lenses
US11107587B2 (en) 2008-07-21 2021-08-31 The Board Of Trustees Of The Leland Stanford Junior University Method for tuning patient-specific cardiovascular simulations
US10354050B2 (en) 2009-03-17 2019-07-16 The Board Of Trustees Of Leland Stanford Junior University Image processing method for determining patient-specific cardiovascular information
US20110218550A1 (en) * 2010-03-08 2011-09-08 Tyco Healthcare Group Lp System and method for determining and adjusting positioning and orientation of a surgical device
US20110237936A1 (en) * 2010-03-25 2011-09-29 Medtronic, Inc. Method and Apparatus for Guiding an External Needle to an Implantable Device
US8475407B2 (en) 2010-03-25 2013-07-02 Medtronic, Inc. Method and apparatus for guiding an external needle to an implantable device
US8483802B2 (en) 2010-03-25 2013-07-09 Medtronic, Inc. Method and apparatus for guiding an external needle to an implantable device
US20110237935A1 (en) * 2010-03-25 2011-09-29 Medtronic, Inc. Method and Apparatus for Guiding an External Needle to an Implantable Device
US20110237937A1 (en) * 2010-03-25 2011-09-29 Medtronic, Inc. Method and Apparatus for Guiding an External Needle to an Implantable Device
US9113812B2 (en) 2010-03-25 2015-08-25 Medtronic, Inc. Method and apparatus for guiding an external needle to an implantable device
US9216257B2 (en) 2010-03-25 2015-12-22 Medtronic, Inc. Method and apparatus for guiding an external needle to an implantable device
US9339601B2 (en) * 2010-03-25 2016-05-17 Medtronic, Inc. Method and apparatus for guiding an external needle to an implantable device
US9586012B2 (en) 2010-03-25 2017-03-07 Medtronic, Inc. Method and apparatus for guiding an external needle to an implantable device
US20110238034A1 (en) * 2010-03-25 2011-09-29 Medtronic, Inc. Method and Apparatus for Guiding an External Needle to an Implantable Device
DE102010027526B4 (en) * 2010-07-16 2012-04-19 Gottfried Wilhelm Leibniz Universität Hannover Hand-held surveying and projection system and method
DE102010027526A1 (en) * 2010-07-16 2012-01-19 Gottfried Wilhelm Leibniz Universität Hannover Hand-guided measurement and projection system for projecting images onto a patient, with a data processing system that corrects the images for the surface contour and projects the corrected image onto the surface
US10154883B2 (en) 2010-08-12 2018-12-18 Heartflow, Inc. Method and system for image processing and patient-specific modeling of blood flow
US10327847B2 (en) 2010-08-12 2019-06-25 Heartflow, Inc. Method and system for patient-specific modeling of blood flow
US9743835B2 (en) 2010-08-12 2017-08-29 Heartflow, Inc. Method and system for image processing to determine patient-specific blood flow characteristics
US9839484B2 (en) 2010-08-12 2017-12-12 Heartflow, Inc. Method and system for image processing and patient-specific modeling of blood flow
US11793575B2 (en) 2010-08-12 2023-10-24 Heartflow, Inc. Method and system for image processing to determine blood flow
US9855105B2 (en) 2010-08-12 2018-01-02 Heartflow, Inc. Method and system for image processing to determine patient-specific blood flow characteristics
US9861284B2 (en) 2010-08-12 2018-01-09 Heartflow, Inc. Method and system for image processing to determine patient-specific blood flow characteristics
US9888971B2 (en) 2010-08-12 2018-02-13 Heartflow, Inc. Method and system for image processing to determine patient-specific blood flow characteristics
US10052158B2 (en) 2010-08-12 2018-08-21 Heartflow, Inc. Method and system for image processing to determine patient-specific blood flow characteristics
US10080613B2 (en) 2010-08-12 2018-09-25 Heartflow, Inc. Systems and methods for determining and visualizing perfusion of myocardial muscle
US10080614B2 (en) 2010-08-12 2018-09-25 Heartflow, Inc. Method and system for image processing to determine patient-specific blood flow characteristics
US10092360B2 (en) 2010-08-12 2018-10-09 Heartflow, Inc. Method and system for image processing and patient-specific modeling of blood flow
US10149723B2 (en) 2010-08-12 2018-12-11 Heartflow, Inc. Method and system for image processing and patient-specific modeling of blood flow
US11583340B2 (en) 2010-08-12 2023-02-21 Heartflow, Inc. Method and system for image processing to determine blood flow
US10159529B2 (en) 2010-08-12 2018-12-25 Heartflow, Inc. Method and system for patient-specific modeling of blood flow
US10166077B2 (en) 2010-08-12 2019-01-01 Heartflow, Inc. Method and system for image processing to determine patient-specific blood flow characteristics
US10321958B2 (en) 2010-08-12 2019-06-18 Heartflow, Inc. Method and system for image processing to determine patient-specific blood flow characteristics
US11090118B2 (en) 2010-08-12 2021-08-17 Heartflow, Inc. Method and system for image processing and patient-specific modeling of blood flow
US11298187B2 (en) 2010-08-12 2022-04-12 Heartflow, Inc. Method and system for image processing to determine patient-specific blood flow characteristics
US10376317B2 (en) 2010-08-12 2019-08-13 Heartflow, Inc. Method and system for image processing and patient-specific modeling of blood flow
US10441361B2 (en) 2010-08-12 2019-10-15 Heartflow, Inc. Method and system for image processing and patient-specific modeling of blood flow
US10478252B2 (en) 2010-08-12 2019-11-19 Heartflow, Inc. Method and system for patient-specific modeling of blood flow
US10492866B2 (en) 2010-08-12 2019-12-03 Heartflow, Inc. Method and system for image processing to determine blood flow
US11154361B2 (en) 2010-08-12 2021-10-26 Heartflow, Inc. Method and system for image processing to determine blood flow
US10531923B2 (en) 2010-08-12 2020-01-14 Heartflow, Inc. Method and system for image processing to determine blood flow
US11135012B2 (en) 2010-08-12 2021-10-05 Heartflow, Inc. Method and system for image processing to determine patient-specific blood flow characteristics
US10682180B2 (en) 2010-08-12 2020-06-16 Heartflow, Inc. Method and system for patient-specific modeling of blood flow
US10702340B2 (en) 2010-08-12 2020-07-07 Heartflow, Inc. Image processing and patient-specific modeling of blood flow
US10702339B2 (en) 2010-08-12 2020-07-07 Heartflow, Inc. Method and system for patient-specific modeling of blood flow
US11116575B2 (en) 2010-08-12 2021-09-14 Heartflow, Inc. Method and system for image processing to determine blood flow
US11033332B2 (en) 2010-08-12 2021-06-15 Heartflow, Inc. Method and system for image processing to determine blood flow
US9801689B2 (en) 2010-08-12 2017-10-31 Heartflow, Inc. Method and system for patient-specific modeling of blood flow
US11083524B2 (en) 2010-08-12 2021-08-10 Heartflow, Inc. Method and system for patient-specific modeling of blood flow
US20140171785A1 (en) * 2012-12-17 2014-06-19 Biosense Webster (Israel), Ltd. Recognizing which instrument is currently active
US11103174B2 (en) 2013-11-13 2021-08-31 Biosense Webster (Israel) Ltd. Reverse ECG mapping
EP2875780A1 (en) 2013-11-21 2015-05-27 Biosense Webster (Israel), Ltd. Tracking of catheter using impedance measurements
US9629570B2 (en) 2013-11-21 2017-04-25 Biosense Webster (Israel) Ltd. Tracking of catheter from insertion point to heart using impedance measurements
CN103989520A (en) * 2014-04-30 2014-08-20 西安云合生物科技有限公司 Multifunctional touch electrotome
CN107485388A (en) * 2016-06-09 2017-12-19 韦伯斯特生物官能(以色列)有限公司 The dual-functional sensor of basket catheter
USD931335S1 (en) 2017-07-06 2021-09-21 Biosense Webster (Israel) Ltd. Display screen or portion thereof with icon
USD976956S1 (en) 2017-07-06 2023-01-31 Biosense Webster (Israel) Ltd. Display screen or portion thereof with icon
USD882633S1 (en) 2017-07-06 2020-04-28 Biosense Webster (Israel) Ltd. Display screen or portion thereof with icon
USD1000466S1 (en) 2017-07-06 2023-10-03 Biosense Webster (Israel) Ltd. Display screen or portion thereof with icon
US11786319B2 (en) 2017-12-14 2023-10-17 Verb Surgical Inc. Multi-panel graphical user interface for a robotic surgical system
CN110533999A (en) * 2018-09-05 2019-12-03 南京阿波罗机器人科技有限公司 Calibration method for a teaching robot, and teaching robot
US11723517B2 (en) * 2019-12-31 2023-08-15 Biosense Webster (Israel) Ltd. Wiring of trocar having movable camera and fixed position sensor
US20210196105A1 (en) * 2019-12-31 2021-07-01 Biosense Webster (Israel) Ltd. Wiring of trocar having movable camera and fixed position sensor

Also Published As

Publication number Publication date
EP2096523B1 (en) 2013-07-10
CN104605855B (en) 2017-09-08
MX2009002363A (en) 2009-08-31
CA2656309A1 (en) 2009-08-29
CN101530325A (en) 2009-09-16
BRPI0901476B1 (en) 2019-11-26
CN104605855A (en) 2015-05-13
CA2656309C (en) 2016-11-22
KR20090093877A (en) 2009-09-02
IL197318A0 (en) 2009-12-24
BRPI0901476B8 (en) 2021-06-22
JP5436886B2 (en) 2014-03-05
US8926511B2 (en) 2015-01-06
EP2096523A1 (en) 2009-09-02
AU2009200770A1 (en) 2009-09-17
IL197318A (en) 2015-06-30
MX350265B (en) 2017-08-31
JP2009207895A (en) 2009-09-17
AU2009200770B2 (en) 2014-11-27
KR101612278B1 (en) 2016-04-14
BRPI0901476A2 (en) 2010-01-26

Similar Documents

Publication Publication Date Title
US8926511B2 (en) Location system with virtual touch screen
US20220346886A1 (en) Systems and methods of pose estimation and calibration of perspective imaging system in image guided surgery
EP2928408B1 (en) Medical device navigation system
US8213693B1 (en) System and method to track and navigate a tool through an imaged subject
EP2769689B1 (en) Computer-implemented technique for calculating a position of a surgical device
US20060116576A1 (en) System and use thereof to provide indication of proximity between catheter and location of interest in 3-D space
US8364242B2 (en) System and method of combining ultrasound image acquisition with fluoroscopic image acquisition
US8527032B2 (en) Imaging system and method of delivery of an instrument to an imaged subject
CA2586818C (en) Enhanced ultrasound image display
CA2644886C (en) Flashlight view of an anatomical structure
US20080287805A1 (en) System and method to guide an instrument through an imaged subject
US20080287783A1 (en) System and method of tracking delivery of an imaging probe
US7940972B2 (en) System and method of extended field of view image acquisition of an imaged subject
WO2006060421A2 (en) System for registering an image with a navigational reference catheter
JP2011515178A (en) Target localization of X-ray images
US8948476B2 (en) Determination of cardiac geometry responsive to doppler based imaging of blood flow characteristics
US20230263580A1 (en) Method and system for tracking and visualizing medical devices
WO2012001550A1 (en) Method and system for creating physician-centric coordinate system

Legal Events

Date Code Title Description
AS Assignment

Owner name: BIOSENSE WEBSTER, INC., CALIFORNIA

Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNOR:BAR-TAL, MEIR;REEL/FRAME:020901/0016

Effective date: 20080413

STCF Information on status: patent grant

Free format text: PATENTED CASE

MAFP Maintenance fee payment

Free format text: PAYMENT OF MAINTENANCE FEE, 4TH YEAR, LARGE ENTITY (ORIGINAL EVENT CODE: M1551)

Year of fee payment: 4

MAFP Maintenance fee payment

Free format text: PAYMENT OF MAINTENANCE FEE, 8TH YEAR, LARGE ENTITY (ORIGINAL EVENT CODE: M1552); ENTITY STATUS OF PATENT OWNER: LARGE ENTITY

Year of fee payment: 8