US20080226148A1 - Method of image quality assessment to produce standardized imaging data - Google Patents

Method of image quality assessment to produce standardized imaging data

Info

Publication number
US20080226148A1
Authority
US
United States
Prior art keywords
image
region
interest
area
assessment
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
US12/075,910
Other versions
US8295565B2
Inventor
Jia Gu
Wenjing Li
John Hargrove
Rolf Holger Wolters
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
STI Medical Systems LLC
Original Assignee
STI Medical Systems LLC
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by STI Medical Systems LLC filed Critical STI Medical Systems LLC
Priority to US12/075,910
Publication of US20080226148A1
Assigned to STI MEDICAL SYSTEMS, LLC reassignment STI MEDICAL SYSTEMS, LLC ASSIGNMENT OF ASSIGNORS INTEREST (SEE DOCUMENT FOR DETAILS). Assignors: GU, JIA, HARGROVE, JOHN TAYLOR, LI, WENJING, WOLTERS, ROLF HOLGER
Application granted granted Critical
Publication of US8295565B2
Assigned to CADES SCHUTTE A LIMITED LIABILITY LAW PARTNERSHIP LLP reassignment CADES SCHUTTE A LIMITED LIABILITY LAW PARTNERSHIP LLP UCC FINANCING STATEMENT Assignors: STI MEDICAL SYSTEMS, LLC
Legal status: Expired - Fee Related (adjusted expiration)

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T5/00Image enhancement or restoration
    • G06T5/40Image enhancement or restoration by the use of histogram techniques
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T5/00Image enhancement or restoration
    • G06T5/50Image enhancement or restoration by the use of more than one image, e.g. averaging, subtraction
    • G06T5/90
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00Image analysis
    • G06T7/0002Inspection of images, e.g. flaw detection
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06VIMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V10/00Arrangements for image or video recognition or understanding
    • G06V10/40Extraction of image or video features
    • G06V10/56Extraction of image or video features relating to colour
    • AHUMAN NECESSITIES
    • A61MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61BDIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B5/00Measuring for diagnostic purposes; Identification of persons
    • A61B5/0059Measuring for diagnostic purposes; Identification of persons using light, e.g. diagnosis by transillumination, diascopy, fluorescence
    • A61B5/0082Measuring for diagnostic purposes; Identification of persons using light, e.g. diagnosis by transillumination, diascopy, fluorescence adapted for particular medical purposes
    • A61B5/0084Measuring for diagnostic purposes; Identification of persons using light, e.g. diagnosis by transillumination, diascopy, fluorescence adapted for particular medical purposes for introduction into the body, e.g. by catheters
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00Indexing scheme for image analysis or image enhancement
    • G06T2207/10Image acquisition modality
    • G06T2207/10016Video; Image sequence
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00Indexing scheme for image analysis or image enhancement
    • G06T2207/10Image acquisition modality
    • G06T2207/10024Color image
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00Indexing scheme for image analysis or image enhancement
    • G06T2207/10Image acquisition modality
    • G06T2207/10141Special mode during image acquisition
    • G06T2207/10152Varying illumination
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00Indexing scheme for image analysis or image enhancement
    • G06T2207/30Subject of image; Context of image processing
    • G06T2207/30004Biomedical image processing
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00Indexing scheme for image analysis or image enhancement
    • G06T2207/30Subject of image; Context of image processing
    • G06T2207/30168Image quality inspection

Definitions

  • This invention generally relates to medical imaging and more specifically to image processing to achieve high-quality standardized digital imagery for use in archive-quality medical records and Computer-Aided-Diagnosis (CAD) systems.
  • CAD: Computer-Aided-Diagnosis
  • Uterine cervical cancer is the second most common cancer in women worldwide, with nearly 500,000 new cases and over 270,000 deaths annually (http://www-depdb.iarc.fr/globocan2002.htm, incorporated herein by reference).
  • Colposcopy is a diagnostic method used to detect cancer precursors and cancer of the uterine cervix (B. S. Apgar, Brotzman, G. L. and Spitzer, M., Colposcopy: Principles and Practice, W. B. Saunders Company: Philadelphia, 2002, incorporated herein by reference).
  • CAD for colposcopy represents a new application of medical image processing.
  • the inventors have developed a CAD system that mimics or emulates the diagnostic process used by colposcopists to assess the severity of abnormalities (Lange H. and Ferris, Daron G.; Computer-Aided-Diagnosis (CAD) for colposcopy; SPIE Medical Imaging 2005; SPIE Proc. 5747, 2005, incorporated herein by reference). Scoring schemes, like Reid's colposcopic index, are an aid for making colposcopic diagnoses (Reid R, Scalzi P. Genital warts and cervical cancer. VII. An improved colposcopic index for differentiating benign papillomaviral infection from high-grade cervical intraepithelial neoplasia. Am J Obstet Gynecol 1985;153:611-618, incorporated herein by reference).
  • This invention includes a systematic framework of algorithms that automatically assesses cervical images acquired from a digital colposcope. This assessment results in a filtered dataset of images that can then be used for CAD algorithms. This invention can be used to control image acquisition, which guarantees quality of input imagery for CAD systems and archive-quality medical records, and can also be used in telemedicine cervical cancer diagnosis.
  • the limited quality of cervical imagery can be attributed to several factors, including: incorrect instrument settings, incorrect instrument positioning, glint, blur due to poor focus, and physical contaminants.
  • Glint: specular reflection
  • Specular reflection is perfect, mirror-like reflection of light from a surface, in which light from a single incoming direction (a ray) is reflected into a single outgoing direction.
  • a pixel is a single point in a graphic image and is the smallest single element of an image. Each pixel in an image has its own value that correlates to its brightness or intensity.
  • each pixel can be described using its hue, saturation, and value (HSV) or hue, saturation, lightness (HSL), but is usually represented instead as the red, green, and blue (RGB) intensities.
  • Hue, saturation, and intensity (HSI) and hue, saturation, and brightness (HSB) are alternative names for HSV and HSL.
  • HSL and HSV can be used to represent colors as points in a cylinder whose central axis ranges from black at the bottom to white at the top with neutral colors between them, where the angle around the axis corresponds to “hue”, distance from the axis corresponds to “saturation”, and distance along the axis corresponds to “lightness”, “value”, and “brightness”.
  • Instrument settings that result in an inadequate dynamic range (defined below) or overly constrained (too small) region of interest can reduce or eliminate pixel information and thus make image analysis algorithms unreliable. Poor focus causes image blur with a consequent loss of texture information.
  • a variety of physical contaminants, such as blood, can obscure the desired scene, and reduce or eliminate diagnostic information from affected areas.
  • the present invention proposes a series of image quality assessment algorithms called Active Image Quality Assessment (AIQA), which include locating a region of interest, region assessment of the image, contrast assessment of the image, blur assessment of the image and contamination detection.
  • AIQA: Active Image Quality Assessment
  • These algorithms are specifically designed for cervical imaging, but can be applied to other types of tissue imaging as well. While many of the algorithms described herein are well-known in the art, the inventors are unaware of any other image processing method that uses the specific blur assessment algorithm of this invention, or its application to CAD technology.
  • the following patents and patent applications may be considered relevant to the field of the invention:
  • U.S. Pat. No. 7,298,883 to Giger et al. discloses a computer-aided diagnosis (CAD) scheme to aid in the detection, characterization, diagnosis, and/or assessment of normal and diseased states (including lesions and/or images).
  • the scheme employs lesion features for characterizing the lesion and includes a non-parametric classification, to aid in the development of CAD methods in a limited database scenario to distinguish between malignant and benign lesions.
  • the non-parametric classification is robust to kernel size.
  • U.S. Pat. No. 7,272,252 to De La Torre-Bueno and McBride discloses a method and apparatus for automated analysis of transmitted and fluorescently labeled biological samples, wherein the apparatus automatically scans at a low magnification to acquire images which are analyzed to determine candidate cell objects of interest. Once candidate objects of interest are identified, further analysis is conducted automatically to process and collect data from samples having different staining agents.
  • a method for automated decision support for medical imaging includes obtaining image data, extracting feature data from the image data, and automatically performing anatomy identification, view identification and/or determining a diagnostic quality of the image data, using the extracted feature data.
  • CAD: computer-aided diagnosis
  • the CAD systems implement machine-learning techniques that use a set of training data obtained (learned) from a database of labeled patient cases in one or more relevant clinical domains and/or expert interpretations of such data to enable the CAD systems to “learn” to analyze patient data and make proper diagnostic assessments and decisions for assisting physician workflow.
  • U.S. Pat. No. 6,687,329 to Hsieh et al. discloses a method for processing image data comprising: (a) acquiring first image data via an imaging system; (b) processing the first image data in accordance with a CAD algorithm, the CAD algorithm performing at least one of segmenting, identifying and classifying a feature of interest in the first image data; and (c) prescribing acquisition of at least second image data based upon results of the CAD algorithm.
  • U.S. Patent Publication No. 2004/0068167 to Hsieh, Jiang et al., incorporated herein by reference, discloses a method and system for generating processed image data based on the analysis of an initial image by a CAD algorithm which may perform various analyses such as segmentation, edge and structure identification.
  • the post-processing may enhance a feature of interest in the image as identified by the CAD analysis.
  • Image enhancement may include highlighting a feature of interest and changing the spatial resolution (e.g. zoom).
  • U.S. Pat. No. 6,147,705 to Krauter et al. discloses an apparatus and method for a video colposcope with electronic green filter.
  • a video camera obtains a subject electronic image of a subject object, and using algorithm-driven digital signal processing circuitry (DSP), color saturation, hue, and intensity levels of the subject electronic image are modified according to DSP reference filter algorithm and reference color balance levels as stored, thus producing a modified electronic image corresponding to the subject electronic image.
  • the modified electronic image is outputted to a display in continuous real time as the corresponding subject image is obtained by the video camera.
  • This modified electronic image emulates that obtained through an optical green filter and incorporates a simulated white balance.
  • U.S. Pat. No. 5,982,917 to Clarke, et al. discloses a computer-assisted diagnostic (CAD) method and apparatus are described for the enhancement and detection of suspicious regions in digital X-ray images, with particular emphasis on early cancer detection using digital mammography.
  • One objective is to improve the sensitivity of detection of suspicious areas such as masses, while maintaining a low false positive detection rate, and to classify masses as benign or malignant.
  • a modular CAD technique has been developed as a potentially automatic and/or second-opinion method for mass detection and classification in digital mammography that may in turn be readily modified for application with different digital X-ray detectors with varying gray-scale and resolution characteristics.
  • the method consists of using a plurality of CAD modules to preprocess and enhance image features in the gray-level, the directional texture, and the morphological domains.
  • U.S. Pat. No. 5,740,801 to Branson discloses a system for acquiring images during a medical procedure and using the acquired images which includes a storage device for storing, for each one of a plurality of users of the system, or for each one of a plurality of medical procedures, or for each one of a plurality of input or output devices, information that indicates one or more processing operations to be performed on images obtained by an input device.
  • a system processor responds to the identity of the user who is currently using the system by performing processing operations on the obtained images and applying the images to an output device based on the stored information that corresponds to the current user.
  • the present invention consists of a system framework of assessment algorithms to produce standardized imaging data suitable for archive-quality electronic medical records and for CAD systems.
  • glare free image data is collected using a digital imager.
  • a region of interest is then located within the image using an image classification algorithm.
  • Region assessment is applied to the located region of interest to evaluate instrument settings, preferably checking for incorrect camera zooming, incorrect camera positioning, and the existence of any obstructions in the image.
  • the region of interest is then assessed for proper contrast using a histogram-based algorithm.
  • a blur assessment is performed without the use of a reference image.
  • the region of interest is divided into non-overlapping blocks.
  • a local measurement is computed for each of the blocks based on frequency information, using an image power spectrum to create a two-dimensional display.
  • the two-dimensional display is then converted to a one-dimensional display that is separated into three corresponding areas: low frequency, high frequency and noise.
  • the degree of blur for each of the blocks is then calculated as the ratio of the low-frequency to high-frequency areas.
  • the degree of blur is then used to determine if the block is a blurred block.
  • the total percentage of blurred blocks in the image is then used to determine whether the image is blurry.
  • the invention detects contamination (obstructions) in the image using a training stage and a classification algorithm.
  • FIG. 1 shows the system framework of automatic image quality assessment algorithms.
  • FIG. 2 shows the algorithm framework of probability based cervix region detection.
  • FIG. 3( a ), FIG. 3( b ) and FIG. 3( c ) show the HSI transformation: FIG. 3( a ) is the original RGB image, FIG. 3( b ) is the transformed hue image, and FIG. 3( c ) is the histogram of hue values.
  • FIG. 4( a ) and FIG. 4( b ) show the hue shifting: FIG. 4( a ) is the histogram of shifted hue values and FIG. 4( b ) is the shifted hue image.
  • FIG. 5( a ), FIG. 5( b ), FIG. 5( c ), FIG. 5( d ), FIG. 5( e ) and FIG. 5( f ) show the histogram smoothing process: FIG. 5( a ) shows an RGB color image after each red, green, and blue component is smoothed separately; FIG. 5( b ) shows the corresponding hue image after each red, green, and blue component is smoothed separately; FIG. 5( c ) shows the histogram of hue values of the RGB image before each red, green, and blue component is smoothed separately; FIG. 5( d ) shows the histogram of the hue values of the RGB image after smoothing; FIG. 5( e ) shows the histogram of hue values before the whole histogram itself is smoothed, FIG. 5( f ) shows the histogram of hue values after the whole histogram itself is smoothed.
  • FIG. 6 shows the expected histogram of hue values wherein the histogram has a peak (light gray) in the region corresponding to the cervix and vaginal sidewalls, and a very close peak (dark gray) to the right of it when the body parts outside the cervix and vaginal sidewalls are visible.
  • FIG. 7( a ) and FIG. 7( b ) show the fitted Gaussian model and the classification result: FIG. 7( a ) Fitted Gaussian model (2 classes); FIG. 7( b ) Segmentation result.
  • FIG. 8( a ), FIG. 8( b ), and FIG. 8( c ) show post-processing results: FIG. 8( a ) Cervix region—obtained by classification, FIG. 8 ( b ) Cervix region—closed holes. FIG. 8( c ) Cervix region—cleaned up small regions.
  • FIG. 9( a ) and FIG. 9( b ) show the region assessment examples. Circles represent the ellipse fitting, rectangles represent the surrounding box and the crosses represent the mass center of the cervix region.
  • FIG. 10( a ), FIG. 10( b ), FIG. 10( c )( 1 ), FIG. 10( c )( 2 ), FIG. 10( d )( 1 ), FIG. 10( d )( 2 ), FIG. 10( e )( 1 ) and FIG. 10( e )( 2 ) show some results of region assessment: FIG. 10( a ) Incorrect camera zooming (the cervical region is too small).
  • FIG. 10( b ) Incorrect camera positioning (the cervical region is not well centered).
  • FIG. 10( c )-FIG. 10( e ) show 3 examples of improper existence of obstacles or a partially visible cervix region: FIG. 10( c )( 1 ) and FIG. 10( c )( 2 ) show the existence of a cotton swab
  • FIG. 10( d )( 1 ) and FIG. 10( d )( 2 ) show the existence of a speculum
  • FIG. 10( e )( 1 ) and FIG. 10( e )( 2 ) show extraordinarily strong illumination caused by an abnormal camera setting, which impaired the image visibility. All 3 examples above indicate poor image quality, in that the entire cervix region cannot be seen clearly.
  • FIG. 11( a ) and FIG. 11( b ) show an example of image contrast analysis: FIG. 11( a ) An example of good contrast image, and FIG. 11( b ) its corresponding histogram in red (R) channel.
  • FIG. 12( a ), FIG. 12( b ) and FIG. 12( c ) show an example of false peak removal: FIG. 12( a ) Input image; FIG. 12( b ) the corresponding histogram in R Channel (a false peak exists); FIG. 12( c ) Gaussian fitted histogram (preferably keeping only the rightmost peak).
  • FIG. 13 shows a flowchart of the blur assessment.
  • FIG. 14( a ) and FIG. 14( b ) show the image power spectrum: FIG. 14( a ) 2D display of image power spectrum; FIG. 14( b ) Conversion of image power spectrum into a 1D diagram.
  • FIG. 15 shows the blur detection result (black squares represent areas detected as blurred).
  • FIG. 16( a ), FIG. 16( b ), FIG. 16( c ) and FIG. 16( d ) show 2 examples of contamination detection results: FIG. 16( a ): A portion of cervical image; FIG. 16( b ): Contamination detection (gray area) for the cervical image in FIG. 16( a ); FIG. 16( c ): A portion of cervical image; FIG. 16( d ): Contamination detection (gray area) results for the image in FIG. 16( c ).
  • the presently preferred embodiment of the invention described herein preferably starts from an RGB (Red-Green-Blue) color space image from a digital colposcope.
  • the input image is a glare free RGB image of a uterine cervix.
  • Glare free imagery can be obtained either by cross-polarized (XP) image acquisition or glare removal pre-processing (Lange H.; Automatic glare removal in reflectance imagery of the uterine cervix; SPIE Medical Imaging 2005; SPIE Proc. 5747, 2005, incorporated herein by reference).
  • the invention preferably comprises a framework of robust, real-time, industry-oriented algorithms to carry out the invention using statistical, morphological and signal processing methods.
  • a Region-Of-Interest (ROI), preferably the cervix, is first detected.
  • an adaptive peak-removing histogram equalization algorithm is used to assess the contrast.
  • a frequency-based method is applied to fulfill the blur assessment.
  • the contamination detection algorithm is accomplished by machine learning and a classification algorithm.
  • The framework for the invention is shown in FIG. 1.
  • ROI: Region of Interest
  • the ROI is detected using a hue color classifier that discriminates between the cervix and the background.
  • the glare free RGB image is transformed into an HSI (Hue-Saturation-Intensity) image through HSI transformation.
  • a histogram of the hue values is created and histogram smoothing is performed to reduce the inherent noise of the hue values.
  • EM: Expectation-Maximization
  • segmentation is performed based on the likelihood of a pixel belonging to one of the peaks.
  • a Gaussian peak is the peak of a fitted curve, assuming that the histogram is an approximated Gaussian distribution (also known as a normal distribution).
  • Segmentation refers to the process of partitioning a digital image into multiple regions (sets of pixels). The goal of segmentation is to simplify and/or change the representation of an image into something that is more meaningful and easier to analyze.
  • An EM algorithm is used in statistics for finding maximum likelihood estimates of parameters in probabilistic models, where the model depends on unobserved latent variables. EM alternates between performing an expectation (E) step, which computes an expectation of the likelihood by including the latent variables as if they were observed, and a maximization (M) step, which computes the maximum likelihood estimates of the parameters by maximizing the expected likelihood found on the E step. The parameters found on the M step are then used to begin another E step, and the process is repeated. Once this is completed, post-processing on the ROI is performed. The ROI is then analyzed using region assessment and contrast assessment.
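  • As an illustration of the two-class EM fit described above, the following is a minimal Python sketch (not the patent's implementation; the quartile initialization and fixed iteration count are assumptions) that fits a two-component 1D Gaussian mixture to hue samples and labels each value by the more likely peak:

```python
import numpy as np

def em_two_gaussians(samples, n_iter=50):
    """Fit a two-component 1D Gaussian mixture to `samples` via EM."""
    # Crude initialization from the sample quartiles (an assumption).
    mu = np.percentile(samples, [25, 75]).astype(float)
    sigma = np.full(2, samples.std() + 1e-6)
    pi = np.array([0.5, 0.5])
    for _ in range(n_iter):
        # E step: responsibility of each component for each sample.
        lik = np.stack([
            pi[k] / (sigma[k] * np.sqrt(2 * np.pi))
            * np.exp(-0.5 * ((samples - mu[k]) / sigma[k]) ** 2)
            for k in range(2)
        ])
        resp = lik / (lik.sum(axis=0, keepdims=True) + 1e-12)
        # M step: maximum-likelihood re-estimates of the parameters.
        nk = resp.sum(axis=1)
        pi = nk / nk.sum()
        mu = (resp * samples).sum(axis=1) / nk
        sigma = np.sqrt((resp * (samples - mu[:, None]) ** 2).sum(axis=1) / nk) + 1e-6
    return pi, mu, sigma

def segment_by_likelihood(hue, pi, mu, sigma):
    """Assign each hue value to the Gaussian peak it most likely belongs to."""
    lik = np.stack([
        pi[k] / (sigma[k] * np.sqrt(2 * np.pi))
        * np.exp(-0.5 * ((hue - mu[k]) / sigma[k]) ** 2)
        for k in range(2)
    ])
    return lik.argmax(axis=0)  # 0/1 class label per pixel
```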
  • the preferred algorithm framework of ROI detection is shown in FIG. 2 .
  • the invention preferably uses a combination of various color-based algorithms. Those algorithms are supervised by a training algorithm in order to be directed specifically to the cervical tissue.
  • a matching-filter algorithm, using a spectral angle in RGB space, is employed to classify the image pixels into background and cervix.
  • the algorithm first defines a target spectrum (corresponding to the cervix) using a grid box sampling method.
  • the target spectrum and all samples of spectra can be represented in multi-dimensional space. In this space, the red pixel value of a spectrum corresponds to the value in the red dimension, the green pixel value of a spectrum corresponds to the value in the green dimension, and the blue pixel value of a spectrum corresponds to the value in the blue dimension.
  • the red, green, and blue values together define a vector in this space.
  • the target spectrum and all samples of spectra can be represented as vectors in this space.
  • the spectral angle between the target spectrum vector and each sample spectrum vector can be assessed. Pixels for which this angle is less than some minimum threshold are deemed sufficiently similar to the cervical target pixels to pass filtering. In high-resolution image acquisition, a more accurate EM algorithm using HSV space is used.
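  • A minimal sketch of the spectral-angle filtering step, under the assumption that the target spectrum is a single mean RGB vector obtained from the grid-box samples; the threshold angle is illustrative:

```python
import numpy as np

def spectral_angle_mask(rgb, target, max_angle=0.10):
    """Pass pixels whose RGB vector lies within `max_angle` radians of the
    target spectrum vector; everything else is treated as background."""
    pixels = rgb.reshape(-1, 3).astype(float)
    t = np.asarray(target, dtype=float)
    cos = pixels @ t / (np.linalg.norm(pixels, axis=1) * np.linalg.norm(t) + 1e-12)
    angle = np.arccos(np.clip(cos, -1.0, 1.0))
    return (angle < max_angle).reshape(rgb.shape[:2])

# Illustrative target: the mean RGB over grid-box samples of the cervix,
# e.g. target = cervix_samples.mean(axis=0) for an (N, 3) sample array.
```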
  • the algorithm can be easily extended to alternate color spaces, such as LAB and LUV.
  • the hue color feature is used to characterize the color of the pixels.
  • the input RGB image is transformed from the RGB color space to the HSI color space, keeping the hue component for calculations.
  • the original RGB image and transformed hue image are shown in FIG. 3( a ) and FIG. 3( b ) respectively.
  • the hue values of interest are preferably located using a roll-over spectrum, which can easily be seen in the histogram of the hue values in FIG. 3( c ).
  • the roll-over spectrum is a technique used to generate the hue histogram. It involves shifting the phase of the histogram by 180 degrees.
  • all hue values are shifted by the median value (180 degrees) of the entire dynamic range of all hue values (360 degrees).
  • the dynamic range is a term used to describe the ratio between the smallest and largest possible values of a variable quantity.
  • the shifted histogram is shown in FIG. 4( a ).
  • the red color now corresponds to the middle value of the hue channel.
  • Color digital images are made of pixels, and pixels are made of combinations of primary colors.
  • a channel in this context is the grayscale image of the same size as a color image, made of just one of these primary colors. For instance, an image from a standard digital camera will have a red, green and blue channel.
  • a grayscale image has just one channel.
  • FIG. 4( b ) shows the shifted hue image.
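  • A minimal sketch of the hue extraction and roll-over shift (assuming a glare-free RGB image with values scaled to [0, 1]; the vectorized hue formula stands in for a full HSI transformation):

```python
import numpy as np

def hue_image(rgb):
    """Hue channel in degrees (0-360) of an RGB image scaled to [0, 1]."""
    r, g, b = rgb[..., 0], rgb[..., 1], rgb[..., 2]
    mx, mn = rgb.max(axis=-1), rgb.min(axis=-1)
    d = mx - mn + 1e-12
    h = np.where(mx == r, ((g - b) / d) % 6.0, 0.0)
    h = np.where(mx == g, (b - r) / d + 2.0, h)
    h = np.where(mx == b, (r - g) / d + 4.0, h)
    return (h * 60.0) % 360.0

def rolled_hue(rgb):
    """Roll-over shift: move every hue by half the 360-degree dynamic
    range, so that red ends up in the middle of the channel."""
    return (hue_image(rgb) + 180.0) % 360.0
```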
  • the histogram of the hue values is inherently very noisy. Therefore, it is preferable to smooth the histogram.
  • Smoothers, or smoothing filters, are algorithms for time-series processing that reduce abrupt changes in the time-series and make it look smoother. Smoothers constitute a broad subclass of filters. Like all filters, smoothers may be subdivided into linear and nonlinear. Linear filters reduce the power of higher frequencies in the spectrum and preserve the power of lower frequencies.
  • the image in each R, G, B component is smoothed separately, with a 3×3 median filter followed by a 7×7 Gaussian filter with a sigma (the standard deviation) of 1.5.
  • the median filter is preferably a non-linear digital filtering technique, often used to remove noise from images or other signals.
  • the Gaussian filter is preferably a linear filter that is also used as a smoother.
  • the output of the Gaussian filter at the moment t is the weighted mean of the input values, and the weights are defined by a Gaussian function.
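  • A sketch of the per-channel smoothing step using SciPy as a stand-in (the `truncate` value is chosen so the Gaussian kernel spans 7 taps, matching the 7×7 filter described above):

```python
import numpy as np
from scipy import ndimage

def smooth_rgb(rgb):
    """Per-channel smoothing: a 3x3 median filter followed by a 7x7
    Gaussian with sigma 1.5, applied to each R, G, B plane separately."""
    out = np.empty(rgb.shape, dtype=float)
    for c in range(rgb.shape[-1]):
        med = ndimage.median_filter(rgb[..., c], size=3)
        # truncate=2.0 with sigma=1.5 gives a kernel radius of 3 (7 taps).
        out[..., c] = ndimage.gaussian_filter(med, sigma=1.5, truncate=2.0)
    return out
```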
  • The results for the RGB color image, the hue characteristics image and the histogram, before and after each R, G, B component is smoothed, are shown in FIG. 5( a ), 5( b ), 5( c ), and 5( d ) respectively.
  • the histogram itself is smoothed, preferably using an opening-closing Alternating Sequential Filter (ASF) with a horizontal line structuring element of size 2 (SDC Morphology Toolbox for MATLAB® Version 1.3 of 21 Apr. 2004 http://www.mmorph.com/—SDC Information Systems, incorporated herein by reference).
  • FIG. 5( e ) shows the histogram before smoothing
  • FIG. 5( f ) shows the histogram after smoothing.
  • the expected hue value histogram has a peak (light gray) for the region corresponding to the cervix and vaginal sidewalls, and a very close peak (dark gray) to the right of it when the body parts outside the cervix and the vaginal sidewalls are visible, as shown in FIG. 6.
  • Hue can be expressed as an angle vector in color space. Therefore, the histogram can be split by using the value of this angle.
  • hard thresholding (heuristic-based thresholding)
  • an EM algorithm (Dempster, A., Laird, N., and Rubin, D., Maximum Likelihood from Incomplete Data via the EM Algorithm, Journal of the Royal Statistical Society, Series B, 39(1):1-38, 1977) is preferably used to split the histogram into two classes.
  • FIG. 7( a ) shows the fitted Gaussian model (2 classes) and FIG. 7( b ) depicts the segmentation result.
  • the cervix region is preferably cleaned up by post-processing. First, holes are closed by filling them in using a morphological operation. Then, small regions are deleted using an opening-closing ASF with reconstruction, using a cross of size 8 as the structuring element. The results are shown in FIG. 8( a )-( c ), where FIG. 8( a ) shows the cervix region obtained by classification, FIG. 8( b ) shows the cervix region with closed holes, and FIG. 8( c ) shows the cervix region with small regions cleaned up.
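  • The hole closing and small-region removal can be approximated with standard morphology routines; the sketch below uses hole filling plus a connected-component size filter as a simple stand-in for the ASF with reconstruction (the `min_size` value is an assumption):

```python
import numpy as np
from scipy import ndimage

def clean_mask(mask, min_size=500):
    """Close holes in the classified cervix mask, then drop small
    connected components. `min_size` (in pixels) is an assumption."""
    filled = ndimage.binary_fill_holes(mask)
    labels, n = ndimage.label(filled)
    sizes = ndimage.sum(filled, labels, index=np.arange(1, n + 1))
    big = 1 + np.flatnonzero(sizes >= min_size)  # labels of regions to keep
    return np.isin(labels, big)
```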
  • this information is then used to assess some of the instrument settings. Poor image quality can be attributed to several factors, such as incorrect camera zooming, incorrect camera positioning, and the existence of obstructions in the image.
  • Region assessment preferably consists of the following steps: (1) Fitting the detected ROI into a surrounding box; (2) calculating the dimensions and area of the surrounding box; (3) calculating the mass center of the ROI (the geometric center of the ROI) and (4) fitting the ROI to an ellipse (See FIG. 9( a ) and FIG. 9( b ) for examples of a cervix fitted with a box and an ellipse).
  • the mass center of the ROI is calculated using the following integral equation: R = ∫ ρ(r) r dA / ∫ ρ(r) dA, where ρ(r) is the density function and r is the position vector.
  • the density of a two-dimensional image of the ROI can be easily understood by considering a binary ROI image (white where the cervix is located, black where the background is located): the density function is simply 1 in the white area and 0 in the black area.
  • the surrounding box is used to compute parameters of the fitted ellipse. The above information is then used to do region assessment based on the following criteria:
  • the ratio of the ROI area to the entire image area is preferably used. If that ratio is larger than a threshold value between approximately 0.25 and 0.45, with 0.35 being the preferred value, the camera zooming is deemed satisfactory; otherwise an error message is displayed to notify the operator of incorrect camera zooming (See FIG. 10( a ) for an example where the cervical region is too small).
  • the distance between the cervix's mass center and the image's center is preferably used. If the distance is smaller than approximately 0.1 to 0.3 times the image width, with 0.2 being the preferred value, we deem that the cervix region is centered. Otherwise the camera positioning is not satisfactory (See FIG. 10( b ) for an example where the cervical region is not centered).
  • the ROI is compared to the fitted ellipse (the algorithm assumes that a fully visible cervix region is elliptical or near elliptical in shape). If the difference between the ROI's area and the area of the ellipse is greater than approximately 0.2 to 0.3 times the area of the ROI (cervix region), with 0.25 being the preferred value, the cervix is deemed obstructed by the improper existence of obstacles (see FIGS. 10( c )- 10 ( e ) for examples).
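  • Putting the three criteria together, a hedged sketch (thresholds default to the preferred values quoted above; the ellipse is fitted from the mask's second moments, which is one of several possible fitting methods):

```python
import numpy as np

def assess_region(mask, zoom_t=0.35, center_t=0.2, ellipse_t=0.25):
    """Apply the three region-assessment criteria to a non-empty
    binary ROI mask."""
    h, w = mask.shape
    ys, xs = np.nonzero(mask)
    area = xs.size
    # 1. Camera zooming: ROI area vs. whole image area.
    ok_zoom = area / (h * w) > zoom_t
    # 2. Camera positioning: mass center vs. image center, in image widths.
    dist = np.hypot(xs.mean() - w / 2.0, ys.mean() - h / 2.0)
    ok_center = dist < center_t * w
    # 3. Obstruction: compare ROI area with a moment-fitted ellipse
    #    (semi-axes = 2 * std of the coordinates; one possible fit).
    ellipse_area = np.pi * (2.0 * xs.std()) * (2.0 * ys.std())
    ok_shape = abs(area - ellipse_area) <= ellipse_t * area
    return ok_zoom, ok_center, ok_shape
```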
  • Contrast is a measure of the gradation (difference) in luminance (brightness) that provides information. Generally speaking, contrast tells us the smallest difference two nearby signals can have and still be distinguishable.
  • a simple but robust way to assess image contrast is to do histogram analysis over the ROI. Empirically, the ROI (the cervix) is usually pinkish. The reflectance of the cervix is highest in the range of the optical spectrum corresponding to red. Therefore the quality of contrast can be assessed by analyzing the dynamic range of the red channel.
  • the optical spectrum also known as the visible spectrum is the portion of the electromagnetic spectrum that is visible to the human eye.
  • Electromagnetic radiation in this range of wavelengths is called visible light or simply light.
  • a typical human eye will respond to wavelengths in air from about 380 to 750 nanometers (a nanometer is one billionth of a meter).
  • Red is any of a number of similar colors evoked by light consisting predominantly of the longest wavelengths discernible by the human eye, in the wavelength range of roughly 625-740 nanometers.
  • FIG. 11( a ) and FIG. 11( b ) show an image in good contrast and its corresponding histogram.
  • the assessment employs the histogram smoothing and Gaussian fitting, described in section 2.3 to smooth and remove any false peaks to provide a single-peak histogram.
  • the assessment preferably analyzes the dynamic range using the peak farthest to the right in the R (red) channel histogram, because this provides robust results. See FIG. 12( a ), FIG. 12( b ), and FIG. 12( c ) for an example.
  • the dynamic range is deemed satisfactory if the peak of the histogram lies above 4/5 of the full dynamic range; otherwise an error message is displayed to notify the operator of an incorrect camera contrast setting.
  • histogram analysis can be done by calculating the maximal acceptable digital number (DN) of the image compared to the optimal digital number of the camera.
  • a DN is a positive integer value representing the relative brightness of a pixel in a digital image.
  • the exposure time can be adjusted accordingly to make sure the image taken is of good contrast.
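  • A simplified sketch of the red-channel contrast check (moving-average smoothing and a global peak stand in for the Gaussian fitting and rightmost-peak selection described above; an 8-bit red channel is assumed):

```python
import numpy as np

def contrast_ok(red_channel, roi_mask, frac=0.8):
    """Check that the dominant histogram peak of the red channel over
    the ROI lies above `frac` (4/5) of the full 8-bit dynamic range."""
    hist, _ = np.histogram(red_channel[roi_mask], bins=256, range=(0, 255))
    # Light smoothing in place of the Gaussian fitting in the text.
    smoothed = np.convolve(hist, np.ones(9) / 9.0, mode="same")
    return int(smoothed.argmax()) > frac * 255
```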
  • the detection of blur in images is well investigated only for the case in which a reference image is available for comparison with the input image.
  • various measurements, such as entropy and Pixel Signal to Noise Ratio (PSNR), are used for determining how blurry the image is by comparing it to a reference image.
  • PSNR Pixel Signal to Noise Ratio
  • the presently preferred embodiment of this invention performs the blur assessment in a unique perceptual manner without the use of a reference image.
  • the following limitations are important in the selection of a preferred algorithm: (1) no reference image is used; and (2) image blur can have various causes (e.g. the camera can be out of focus, there can be motion blur, or a combination of the two).
  • Frequency-based methods make real time application possible because of their fast processing speed.
  • the problem with most frequency-based methods is that they are sensitive to structure changes, so an image that contains more structures but looks blurry may reflect higher quality than an image that contains fewer structures but has no blur at all.
  • the presently preferred embodiment of this invention performs its blur assessment by evaluating the distribution of wavelengths, using a spatial-frequency method based on wavenumber.
  • Wavenumber is the spatial analog of frequency, that is, it is the measurement of the number of repeating units of a propagating wave (the number of occurrences that a wave has the same phase) per unit of space instead of per unit of time.
  • the final step of blur assessment preferably uses a normalized image power spectrum method as the quality measure.
  • A flow chart of the algorithm is depicted in FIG. 13.
  • the blocks are preferably square, but can be of any size and spacing, as long as the distance between two blocks is not greater than approximately 20% of the length of each block.
  • the local measurement of blur is calculated by image power spectrum (Nill N., Bouzas B., Objective Image quality measure derived from digital image power spectra, Optical engineering, April 1992, Vol. 31, 813-825, incorporated herein by reference) and then is normalized by the zero components, which is shown as FIG. 14( a ).
  • the term “local” refers to the fact that the calculation is limited to a particular area (block) of the image.
  • the 2D image power spectrum is then transformed into a 1D diagram, by calculating the integral of all the pixels for each radius.
  • polar coordinate integration is used according to each radial value. See FIG. 14( b ).
  • the polar coordinate system is a two-dimensional coordinate system in which each point on a plane is determined by an angle and a distance from an origin.
  • the polar coordinate system is especially useful in situations where the relationship between two points is more easily expressed in terms of angle and distance than in the more familiar Cartesian (rectangular) coordinate system, in which such a relationship can only be found through trigonometric formulae.
  • radius refers to distance.
  • the blur assessment uses intrinsic information within the image itself.
  • the inventors recognized that an image power spectrum can be a statistical tool for texture analysis and that the high frequency wavenumber information of the texture is always damaged in a blurred image, so the power spectrum is separated into three parts: (a) low frequency area; (b) high frequency area; and (c) noise area.
  • the assessment assumes that the low frequency area represents structure information invariant to blur and the high frequency area represents detailed information that is more sensitive to blur (See FIG. 14( b ) for an example, where the image power spectrum 1D diagram is separated into 3 parts: (a), (b), and (c)).
  • the degree of blur is then calculated by analyzing the ratio between the low frequency and high frequency areas. When the ratio is smaller than a threshold value, the whole block is considered blurry.
  • the threshold is preferably 0.4 but can range from 0.3 to 0.5. Note that the noise spectrum has been discarded.
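  • A sketch of the local blur measure for one grayscale block: the power spectrum is normalized by its zero-frequency component, radially integrated into a 1D profile, and split into low-frequency, high-frequency and noise areas. The band boundaries are assumptions, and the ratio is taken high-to-low here so that a value below the 0.4 threshold indicates blur, matching the decision rule above:

```python
import numpy as np

def block_blur_ratio(block, low_cut=0.1, high_cut=0.4):
    """Radially integrate the normalized power spectrum of one block
    into a 1D profile and return the high/low area ratio."""
    f = np.fft.fftshift(np.fft.fft2(block))
    power = np.abs(f) ** 2
    power /= power.max() + 1e-12          # normalize by the zero component
    h, w = block.shape
    yy, xx = np.indices((h, w))
    r = np.hypot(yy - h / 2.0, xx - w / 2.0).astype(int)
    radial = np.bincount(r.ravel(), weights=power.ravel())  # 1D profile
    n = radial.size
    low = radial[1:int(low_cut * n)].sum()                   # structure
    high = radial[int(low_cut * n):int(high_cut * n)].sum()  # detail
    # Radii beyond high_cut are treated as the noise area and discarded.
    return high / (low + 1e-12)

def is_blurred(block, thresh=0.4):
    """Flag the block as blurred when the ratio falls below the
    threshold (0.4 preferred, 0.3-0.5 acceptable)."""
    return block_blur_ratio(block) < thresh
```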
  • the global measurement makes a decision for the image as a whole by using the percentage of blurred blocks in the entire image.
  • more weight is given to those blocks in the center of the image than those in the periphery because the invention is concerned with the quality of the image at the image center (where the ROI is located).
  • the weight is used to calculate the coverage rate.
  • the preferred weighted method of calculating the coverage rate is to multiply the number of blurry blocks in the periphery by 0.8, multiply the number of blurry blocks in the center by 1.2, add the two together, and then divide the sum by the total number of blocks.
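  • The weighted coverage rate then reduces to a one-liner (weights 1.2 for center blocks and 0.8 for peripheral blocks, per the preferred method above):

```python
def coverage_rate(blur_center, blur_periphery, total_blocks):
    """Weighted percentage of blurred blocks: center blocks weighted
    1.2, peripheral blocks 0.8, divided by the total block count."""
    return (1.2 * blur_center + 0.8 * blur_periphery) / total_blocks
```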
  • FIG. 15 shows an example of blurred block detection in the region of interest.
  • Determining the focus of the cervical images can be done by two categories of methods, and they are employed for different purposes.
  • the first method relies on a laser auto-focus system in the hardware.
  • a laser dot is projected onto the surface of the cervix at a tilted angle, and the distance between the camera and the cervix is determined using the offset of the laser dot from the center of the image.
  • the greater the offset the more blurry the image is.
  • the ideal case is that the laser is at the center.
  • This method is preferably used in low-resolution image acquisition, which gives us a rough estimation about focus.
  • the second method is based on image content only, utilizing a frequency-based algorithm to detect blurred areas on the images. This method can be used in high-resolution image acquisition providing more detailed information, which can be employed to improve the performance of the CAD system.
  • PP: parallel-polarized
  • XP: cross-polarized
  • the contamination detection in this invention preferably includes a training stage (i.e. machine learning) and a classification stage.
  • the training stage is accomplished by a series of annotations, wherein a physician will manually mark the contamination region.
  • a joint texture/color model is employed to generate the feature vector per image pixel (Chad Carson, Serge Belongie, Hayit Greenspan and Jitendra Malik, Blobworld: Image Segmentation Using Expectation-Maximization and Its Application to Image Querying; IEEE Trans. on Pattern Analysis and Machine Intelligence, 24(8), 1026-1038, August 2002, incorporated herein by reference).
  • SVM: Support Vector Machine
  • SVMs generally are a set of related supervised learning methods used for classification and regression.
  • the training process uses 3 color features and 3 texture features.
  • the 3 color components are preferably the L*a*b* coordinates found after spatial averaging using a Gaussian filter, and the 3 texture components are preferably anisotropy, polarity and contrast (Zhang J., Liu Y., Zhao T., SVM Based Feature Screening Applied to Hierarchical Cervical Cancer Detection, International Conference on Diagnostic Imaging and Analysis (ICDIA 2002), August, 2002, incorporated herein by reference).
  • an SVM algorithm is preferably utilized for supervised classification, given the adequate, accurate annotation.
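  • A hedged sketch of the contamination classifier using scikit-learn as a stand-in: six features per pixel (three spatially averaged L*a*b* color components and three texture components), trained on physician-annotated pixels; the kernel and parameter choices are illustrative assumptions:

```python
import numpy as np
from sklearn.svm import SVC

def train_contamination_svm(features, labels):
    """`features`: (n_pixels, 6) array of 3 color (spatially averaged
    L*a*b*) + 3 texture (anisotropy, polarity, contrast) features;
    `labels`: 1 = contaminated, 0 = clean, from physician annotations."""
    clf = SVC(kernel="rbf", C=1.0, gamma="scale")  # illustrative settings
    clf.fit(features, labels)
    return clf

def contamination_mask(clf, features, image_shape):
    """Per-pixel contamination prediction reshaped to the image grid."""
    return clf.predict(features).reshape(image_shape).astype(bool)
```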
  • the assessment algorithms of the invention may also be suitable for image quality assessment for other tissue diagnosis such as colorectal cancer and skin cancer, and could be used for telemedicine applications. They may also be combined with other instruments and methods for systems that automatically analyze and adjust the quality of acquired images.

Abstract

Automated image quality assessment methods, which include locating a region of interest, region assessment, contrast assessment, blur assessment, and contaminant detection, operating on video data and high-resolution imagery. The blur assessment is performed without a reference image by dividing the region into non-overlapping blocks, measuring the wavenumber frequency content of the blocks, and calculating the ratio of the low-frequency to high-frequency areas.

Description

  • This application claims priority to U.S. provisional patent application 60/918,527 filed on Mar. 16, 2007.
  • TECHNICAL FIELD
  • This invention generally relates to medical imaging and more specifically to image processing to achieve high-quality standardized digital imagery for use in archive-quality medical records and Computer-Aided-Diagnosis (CAD) systems.
  • BACKGROUND ART
  • Although this invention is being disclosed in connection with cervical cancer, it is applicable to many other areas of medicine. Uterine cervical cancer is the second most common cancer in women worldwide, with nearly 500,000 new cases and over 270,000 deaths annually (http://www-depdb.iarc.fr/globocan2002.htm, incorporated herein by reference). Colposcopy is a diagnostic method used to detect cancer precursors and cancer of the uterine cervix (B. S. Apgar, Brotzman, G. L. and Spitzer, M., Colposcopy: Principles and Practice, W. B. Saunders Company: Philadelphia, 2002, incorporated herein by reference). CAD for colposcopy represents a new application of medical image processing. The inventors have developed a CAD system that mimics or emulates the diagnostic process used by colposcopists to assess the severity of abnormalities (Lange H. and Ferris, Daron G.; Computer-Aided-Diagnosis (CAD) for colposcopy; SPIE Medical Imaging 2005; SPIE Proc. 5747, 2005, incorporated herein by reference). Scoring schemes, like Reid's colposcopic index, are an aid for making colposcopic diagnoses (Reid R, Scalzi P. Genital warts and cervical cancer. VII. An improved colposcopic index for differentiating benign papillomaviral infection from high-grade cervical intraepithelial neoplasia. Am J Obstet Gynecol 1985;153:611-618, incorporated herein by reference) based on various features, including acetowhitening, vessel patterns and lesion margins. These features are individually assessed and scored before the scores of all features are combined to yield a composite score that grades disease severity. However, the quality of the images must be assessed before further analysis, to ensure reliable scoring. This invention includes a systematic framework of algorithms that automatically assesses cervical images acquired from a digital colposcope. This assessment results in a filtered dataset of images that can then be used for CAD algorithms. This invention can be used to control image acquisition, which guarantees quality of input imagery for CAD systems and archive-quality medical records, and can also be used in telemedicine cervical cancer diagnosis.
  • The limited quality of cervical imagery can be attributed to several factors, including: incorrect instrument settings, incorrect instrument positioning, glint, blur due to poor focus, and physical contaminants. Glint (specular reflection) eliminates the color information in affected pixels and can therefore introduce artifacts in feature extraction algorithms. Specular reflection is perfect, mirror-like reflection of light from a surface, in which light from a single incoming direction (a ray) is reflected into a single outgoing direction. A pixel is a single point in a graphic image and is the smallest single element of an image. Each pixel in an image has its own value that correlates to its brightness or intensity. In a color image, each pixel can be described using its hue, saturation, and value (HSV) or hue, saturation, lightness (HSL), but is usually represented instead as the red, green, and blue (RGB) intensities. Hue, saturation, and intensity (HSI) and hue, saturation, and brightness (HSB) are alternative names for HSV and HSL. HSL and HSV can be used to represent colors as points in a cylinder whose central axis ranges from black at the bottom to white at the top with neutral colors between them, where the angle around the axis corresponds to “hue”, distance from the axis corresponds to “saturation”, and distance along the axis corresponds to “lightness”, “value”, and “brightness”. Instrument settings that result in an inadequate dynamic range (defined below) or overly constrained (too small) region of interest can reduce or eliminate pixel information and thus make image analysis algorithms unreliable. Poor focus causes image blur with a consequent loss of texture information. In addition, a variety of physical contaminants, such as blood, can obscure the desired scene, and reduce or eliminate diagnostic information from affected areas.
  • The present invention proposes a series of image quality assessment algorithms called Active Image Quality Assessment (AIQA), which include locating a region of interest, region assessment of the image, contrast assessment of the image, blur assessment of the image and contamination detection. These algorithms are specifically designed for cervical imaging, but can be applied to other types of tissue imaging as well. While many of the algorithms described herein are well-known in the art, the inventors are unaware of any other image processing method that uses the specific blur assessment algorithm of this invention, or its application to CAD technology. The following patents and patent applications may be considered relevant to the field of the invention:
  • U.S. Pat. No. 7,298,883 to Giger et al., incorporated herein by reference, discloses a computer-aided diagnosis (CAD) scheme to aid in the detection, characterization, diagnosis, and/or assessment of normal and diseased states (including lesions and/or images). The scheme employs lesion features for characterizing the lesion and includes a non-parametric classification, to aid in the development of CAD methods in a limited database scenario to distinguish between malignant and benign lesions. The non-parametric classification is robust to kernel size.
  • U.S. Pat. No. 7,272,252 to De La Torre-Bueno and McBride, incorporated herein by reference, discloses a method and apparatus for automated analysis of transmitted and fluorescently labeled biological samples, wherein the apparatus automatically scans at a low magnification to acquire images which are analyzed to determine candidate cell objects of interest. Once candidate objects of interest are identified, further analysis is conducted automatically to process and collect data from samples having different staining agents.
  • U.S. Patent Publication No. 2007/0019854 to Gholap; Abhijeet S. et al., incorporated herein by reference, discloses a method and system of automated digital image analysis of prostate neoplasms using morphologic patterns. The method and system provide automated screening of prostate needle biopsy specimens in a digital image and automated diagnosis of prostatectomy specimens.
  • U.S. Patent Publication No. 2005/0251013 to Krishnan, et al., incorporated herein by reference, discloses systems and methods for processing a medical image to automatically identify the anatomy and view (or pose) from the medical image and automatically assess the diagnostic quality of the medical image. In one aspect a method for automated decision support for medical imaging includes obtaining image data, extracting feature data from the image data, and automatically performing anatomy identification, view identification and/or determining a diagnostic quality of the image data, using the extracted feature data.
  • U.S. Patent Publication No. 2005/0049497 to Krishnan, et al., incorporated herein by reference, discloses CAD (computer-aided diagnosis) systems and applications for breast imaging, which implement methods to automatically extract and analyze features from a collection of patient information (including image data and/or non-image data) of a subject patient, to provide decision support for various aspects of physician workflow including, for example, automated diagnosis of breast cancer and other automated decision support functions that enable decision support for, e.g., screening and staging for breast cancer. The CAD systems implement machine-learning techniques that use a set of training data obtained (learned) from a database of labeled patient cases in one or more relevant clinical domains and/or expert interpretations of such data to enable the CAD systems to “learn” to analyze patient data and make proper diagnostic assessments and decisions for assisting physician workflow.
  • U.S. Pat. No. 6,813,374 to Karimi et al., incorporated herein by reference, discloses a method and apparatus to assess the image quality of a CT scanner and verify that a CT scanner meets its performance specifications.
  • U.S. Pat. No. 6,687,329 to Hsieh et al. discloses a method for processing image data comprising: (a) acquiring first image data via an imaging system; (b) processing the first image data in accordance with a CAD algorithm, the CAD algorithm performing at least one of segmenting, identifying and classifying a feature of interest in the first image data; and (c) prescribing acquisition of at least second image data based upon results of the CAD algorithm.
  • U.S. Patent Publication No. 2004/0068167 to Hsieh, Jiang et al., incorporated herein by reference, discloses a method and system for generating processed image data based on the analysis of an initial image by a CAD algorithm which may perform various analyses such as segmentation, edge and structure identification. The post-processing may enhance a feature of interest in the image as identified by the CAD analysis. Image enhancement may include highlighting a feature of interest and changing the spatial resolution (e.g. zoom).
  • U.S. Pat. No. 6,147,705 to Krauter et al., incorporated herein by reference, discloses an apparatus and method for a video colposcope with electronic green filter. A video camera obtains a subject electronic image of a subject object, and using algorithm-driven digital signal processing circuitry (DSP), color saturation, hue, and intensity levels of the subject electronic image are modified according to DSP reference filter algorithm and reference color balance levels as stored, thus producing a modified electronic image corresponding to the subject electronic image. The modified electronic image is outputted to a display in continuous real time as the corresponding subject image is obtained by the video camera. This modified electronic image emulates that obtained through an optical green filter and incorporates a simulated white balance.
  • U.S. Pat. No. 5,982,917 to Clarke, et al., incorporated herein by reference, discloses a computer-assisted diagnostic (CAD) method and apparatus for the enhancement and detection of suspicious regions in digital X-ray images, with particular emphasis on early cancer detection using digital mammography. One objective is to improve the sensitivity of detection of suspicious areas such as masses, while maintaining a low false positive detection rate, and to classify masses as benign or malignant. A modular CAD technique has been developed as a potentially automatic and/or second-opinion method for mass detection and classification in digital mammography that may in turn be readily modified for application with different digital X-ray detectors with varying gray-scale and resolution characteristics. The method consists of using a plurality of CAD modules to preprocess and enhance image features in the gray-level, the directional texture, and the morphological domains.
  • U.S. Pat. No. 5,740,801 to Branson, incorporated herein by reference, discloses a system for acquiring images during a medical procedure and using the acquired images which includes a storage device for storing, for each one of a plurality of users of the system, or for each one of a plurality of medical procedures, or for each one of a plurality of input or output devices, information that indicates one or more processing operations to be performed on images obtained by an input device. A system processor responds to the identity of the user who is currently using the system by performing processing operations on the obtained images and applying the images to an output device based on the stored information that corresponds to the current user.
  • DISCLOSURE OF INVENTION
  • The present invention, described herein and more fully below, consists of a system framework of assessment algorithms to produce standardized imaging data suitable for archive-quality electronic medical records and for CAD systems. First, glare free image data is collected using a digital imager. A region of interest is then located within the image using an image classification algorithm. Region assessment is applied to the located region of interest to evaluate instrument settings, preferably checking for incorrect camera zooming, incorrect camera positioning, and the existence of any obstructions in the image. The region of interest is then assessed for proper contrast using a histogram-based algorithm. Next, a blur assessment is performed without the use of a reference image. The region of interest is divided into non-overlapping blocks. A local measurement is computed for each of the blocks based on frequency information, using an image power spectrum to create a two-dimensional display. The two-dimensional display is then converted to a one-dimensional display that is separated into three corresponding areas: low frequency, high frequency and noise. The degree of blur for each of the blocks is then calculated as the ratio of the low-frequency to high-frequency areas. The degree of blur is then used to determine if the block is a blurred block. The total percentage of blurred blocks in the image is then used to determine whether the image is blurry. Finally, the invention detects contamination (obstructions) in the image using a training stage and a classification algorithm.
  • BRIEF DESCRIPTION OF DRAWINGS
  • FIG. 1 shows the system framework of automatic image quality assessment algorithms.
  • FIG. 2 shows the algorithm framework of probability based cervix region detection.
  • FIG. 3( a), FIG. 3( b) and FIG. 3( c) show the HSI transformation: FIG. 3( a) is the original RGB image, FIG. 3( b) is the transformed hue image, and FIG. 3( c) is the histogram of hue values.
  • FIG. 4( a) and FIG. 4( b) show the hue shifting: FIG. 4( a) is the histogram of shifted hue values and FIG. 4( b) is the shifted hue image.
  • FIG. 5( a), FIG. 5( b), FIG. 5( c), FIG. 5( d), FIG. 5( e) and FIG. 5( f) show the histogram smoothing process: FIG. 5( a) shows an RGB color image after each red, green, and blue component is smoothed separately; FIG. 5( b) shows the corresponding hue image after each red, green, and blue component is smoothed separately; FIG. 5( c) shows the histogram of hue values of the RGB image before each red, green, and blue component is smoothed separately; FIG. 5( d) shows the histogram of the hue values of the RGB image after smoothing; FIG. 5( e) shows the histogram of hue values before the whole histogram itself is smoothed, FIG. 5( f) shows the histogram of hue values after the whole histogram itself is smoothed.
  • FIG. 6 shows the expected histogram of hue values wherein the histogram has a peak (light gray) in the region corresponding to the cervix and vaginal sidewalls, and a very close peak (dark gray) to the right of it when the body parts outside the cervix and vaginal sidewalls are visible.
  • FIG. 7(a) and FIG. 7(b) show the fitted Gaussian model and the classification result: FIG. 7(a) shows the fitted Gaussian model (2 classes); FIG. 7(b) shows the segmentation result.
  • FIG. 8(a), FIG. 8(b), and FIG. 8(c) show post-processing results: FIG. 8(a) cervix region obtained by classification; FIG. 8(b) cervix region with closed holes; FIG. 8(c) cervix region with small regions cleaned up.
  • FIG. 9(a) and FIG. 9(b) show the region assessment examples. Circles represent the ellipse fitting, rectangles represent the surrounding box and the crosses represent the mass center of the cervix region.
  • FIG. 10(a), FIG. 10(b), FIG. 10(c)(1), FIG. 10(c)(2), FIG. 10(d)(1), FIG. 10(d)(2), FIG. 10(e)(1) and FIG. 10(e)(2) show some results of region assessment: FIG. 10(a) incorrect camera zooming (the cervical region is too small); FIG. 10(b) incorrect camera positioning (the cervical region is not well centered). FIG. 10(c)-FIG. 10(e) show three examples of the improper existence of obstacles or a partially visible cervix region: FIG. 10(c)(1) and FIG. 10(c)(2) show the existence of a cotton swab; FIG. 10(d)(1) and FIG. 10(d)(2) show the existence of a speculum; FIG. 10(e)(1) and FIG. 10(e)(2) show extraordinarily strong illumination caused by an abnormal camera setting, which impaired the image visibility. All three examples indicate poor image quality and that the entire cervix region cannot be seen clearly.
  • FIG. 11(a) and FIG. 11(b) show an example of image contrast analysis: FIG. 11(a) an example of a good contrast image, and FIG. 11(b) its corresponding histogram in the red (R) channel.
  • FIG. 12(a), FIG. 12(b) and FIG. 12(c) show an example of false peak removal: FIG. 12(a) input image; FIG. 12(b) the corresponding histogram in the R channel (a false peak exists); FIG. 12(c) Gaussian-fitted histogram (preferably keeping only the rightmost peak).
  • FIG. 13 shows a flowchart of the blur assessment.
  • FIG. 14(a) and FIG. 14(b) show the image power spectrum: FIG. 14(a) 2D display of the image power spectrum; FIG. 14(b) conversion of the image power spectrum into a 1D diagram.
  • FIG. 15 shows the blur detection result (black squares represent areas detected as blurred).
  • FIG. 16(a), FIG. 16(b), FIG. 16(c) and FIG. 16(d) show 2 examples of contamination detection results: FIG. 16(a) a portion of a cervical image; FIG. 16(b) contamination detection (gray area) for the cervical image in FIG. 16(a); FIG. 16(c) a portion of a cervical image; FIG. 16(d) contamination detection (gray area) results for the image in FIG. 16(c).
  • BEST MODES FOR CARRYING OUT INVENTION
  • 1. System Framework
  • The presently preferred embodiment of the invention described herein preferably starts from an RGB (Red-Green-Blue) color space image from a digital colposcope. The input image is a glare free RGB image of a uterine cervix. Glare free imagery can be obtained either by cross-polarized (XP) image acquisition or glare removal pre-processing (Lange H.; Automatic glare removal in reflectance imagery of the uterine cervix; SPIE Medical Imaging 2005; SPIE Proc. 5747, 2005, incorporated herein by reference).
  • The invention preferably comprises a framework of robust, real-time, industry-oriented algorithms using statistical, morphological and signal processing methods. First, a Region-Of-Interest (ROI), preferably the cervix, is detected using a hue color classifier that discriminates between cervix and background. Then an adaptive peak-removing histogram equalization algorithm is used to assess the contrast. Following the contrast assessment, a frequency-based method is applied to perform the blur assessment. Finally, contamination detection is accomplished by machine learning and a classification algorithm. The framework for the invention is shown in FIG. 1.
  • 2. Region of Interest (ROI) Detection
  • The ROI is detected using a hue color classifier that discriminates between the cervix and the background. First, the glare-free RGB image is transformed into an HSI (Hue-Saturation-Intensity) image through HSI transformation. A histogram of the hue values is created and histogram smoothing is performed to reduce the inherent noise of the hue values. Then an Expectation-Maximization (EM) clustering algorithm is employed to fit the two Gaussian peaks in the histogram, and segmentation is performed based on the likelihood of a pixel belonging to one of the peaks. A Gaussian peak is the peak of a fitted curve, assuming that the histogram approximates a Gaussian distribution (also known as a normal distribution). Segmentation refers to the process of partitioning a digital image into multiple regions (sets of pixels). The goal of segmentation is to simplify and/or change the representation of an image into something that is more meaningful and easier to analyze. An EM algorithm is used in statistics for finding maximum likelihood estimates of parameters in probabilistic models, where the model depends on unobserved latent variables. EM alternates between performing an expectation (E) step, which computes an expectation of the likelihood by including the latent variables as if they were observed, and a maximization (M) step, which computes the maximum likelihood estimates of the parameters by maximizing the expected likelihood found in the E step. The parameters found in the M step are then used to begin another E step, and the process is repeated. Once this is completed, post-processing on the ROI is performed. The ROI is then analyzed using region assessment and contrast assessment. The preferred algorithm framework of ROI detection is shown in FIG. 2.
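  • The following is a minimal sketch of the EM-based two-peak segmentation described above, assuming scikit-learn's GaussianMixture as the EM implementation (the patent does not name a particular library, and the choice of the lower-mean component as the cervix class follows the expected left peak of FIG. 6):
```python
# Minimal sketch: fit two Gaussians to the (shifted) hue values by EM and
# keep the pixels most likely to belong to the cervix peak.
import numpy as np
from sklearn.mixture import GaussianMixture

def segment_cervix(hue):
    """hue: 2-D array of shifted hue values in degrees."""
    samples = hue.reshape(-1, 1).astype(float)
    gm = GaussianMixture(n_components=2, random_state=0).fit(samples)
    labels = gm.predict(samples).reshape(hue.shape)
    cervix_component = int(np.argmin(gm.means_.ravel()))  # left (lower-mean) peak
    return labels == cervix_component                     # True = cervix pixel
```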
  • The invention preferably uses a combination of various color-based algorithms. Those algorithms are supervised by a training algorithm in order to be directed specifically to the cervical tissue. In low-resolution image acquisition, a matching-filter algorithm is employed to classify the image pixels into background and cervix using a spectral angle in RGB space. The algorithm first defines a target spectrum (corresponding to the cervix) using a grid box sampling method. The target spectrum and all sample spectra can be represented in a multi-dimensional space: the red pixel value of a spectrum corresponds to the value in the red dimension, the green pixel value corresponds to the value in the green dimension, and the blue pixel value corresponds to the value in the blue dimension, so the red, green, and blue values together define a vector in this space. The spectral angle between the target spectrum vector and each sample spectrum vector can then be assessed. Pixels for which this angle is less than some minimum threshold are deemed sufficiently similar to the cervical target pixels to pass filtering. In high-resolution image acquisition, a more accurate EM algorithm using HSV space is used.
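  • A minimal sketch of the spectral-angle matching filter follows; the target spectrum (sampled from the cervix via grid boxes) and the angle threshold are illustrative assumptions, not values stated in the patent:
```python
# Minimal sketch: spectral-angle matching filter in RGB space
# (low-resolution acquisition path).
import numpy as np

def spectral_angle_mask(rgb, target, max_angle=0.10):
    """rgb: (H, W, 3) image; target: length-3 RGB target spectrum."""
    pixels = rgb.reshape(-1, 3).astype(float)
    t = np.asarray(target, dtype=float)
    cos = pixels @ t / (np.linalg.norm(pixels, axis=1) * np.linalg.norm(t) + 1e-12)
    angles = np.arccos(np.clip(cos, -1.0, 1.0))
    return (angles < max_angle).reshape(rgb.shape[:2])  # True = cervix-like pixel
```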
  • The algorithm can be easily extended to alternate color spaces, such as LAB and LUV.
  • 2.1 HSI Transformation
  • The hue color feature is used to characterize the color of the pixels. Preferably, the input RGB image is transformed from the RGB color space to the HSI color space, keeping the hue component for calculations. The original RGB image and transformed hue image are shown in FIG. 3(a) and FIG. 3(b) respectively. The hue values of interest are preferably located using a roll-over spectrum, which can easily be seen in the histogram of the hue values in FIG. 3(c). The roll-over spectrum is a technique used to generate the hue histogram. It involves shifting the phase of the histogram by 180 degrees.
  • To simplify the calculations and visualization, all hue values are shifted by the median value (180 degrees) of the entire dynamic range of all hue values (360 degrees). The dynamic range is a term used to describe the ratio between the smallest and largest possible values of a variable quantity. The shifted histogram is shown in FIG. 4(a). As a reference, the red color now corresponds to the middle value of the hue channel. Color digital images are made of pixels, and pixels are made of combinations of primary colors. A channel in this context is the grayscale image of the same size as a color image, made of just one of these primary colors. For instance, an image from a standard digital camera will have a red, green and blue channel. A grayscale image has just one channel. FIG. 4(b) shows the shifted hue image.
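  • The hue extraction and 180-degree shift can be sketched as follows, assuming skimage's rgb2hsv (hue in [0, 1]) as a stand-in for the HSI transform:
```python
# Minimal sketch: extract the hue channel and shift it by half the
# 360-degree dynamic range.
from skimage.color import rgb2hsv

def shifted_hue(rgb):
    hue = rgb2hsv(rgb)[..., 0] * 360.0   # hue in degrees, [0, 360)
    return (hue + 180.0) % 360.0         # shift by the 180-degree median value
```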
  • 2.2 Histogram Smoothing
  • The histogram of the hue values is inherently very noisy. Therefore, it is preferable to smooth the histogram. In image processing, it is usually necessary to perform noise reduction on an image before performing higher-level processing steps. Smoothers, or smoothing filters, are algorithms for time-series processing that reduce abrupt changes in the time-series and make it look smoother. Smoothers constitute a broad subclass of filters. Like all filters, smoothers may be subdivided into linear and nonlinear. Linear filters reduce the power of higher frequencies in the spectrum and preserve the power of lower frequencies. For this invention, preferably the image in each R, G, B component is smoothed separately, with a 3×3 median filter followed by a 7×7 Gaussian filter with a sigma (the standard deviation) of 1.5. The median filter is a non-linear digital filtering technique, often used to remove noise from images or other signals. The Gaussian filter is a linear filter that is also used as a smoother. The output of the Gaussian filter at time t is the weighted mean of the input values, with the weights defined by the Gaussian function.
  • The results for the RGB color image, the hue characteristics image and the histogram, before and after each R, G, B component is smoothed, are shown in FIG. 5(a), 5(b), 5(c), and 5(d) respectively. Next, the histogram itself is smoothed, preferably using an opening-closing Alternating Sequential Filter (ASF) with a horizontal line structuring element of size 2 (SDC Morphology Toolbox for MATLAB® Version 1.3 of 21 Apr. 2004, http://www.mmorph.com/—SDC Information Systems, incorporated herein by reference). FIG. 5(e) shows the histogram before smoothing and FIG. 5(f) shows the histogram after smoothing.
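  • A minimal sketch of this smoothing chain, assuming scipy.ndimage as the filtering library, is given below; the filter sizes follow the text (with truncate=2.0 chosen so that sigma 1.5 yields a 7-tap Gaussian kernel):
```python
# Minimal sketch: 3x3 median filter then 7x7 Gaussian (sigma 1.5) per
# R, G, B channel, followed by a size-2 opening-closing of the histogram.
import numpy as np
from scipy import ndimage

def smooth_channels(rgb):
    out = np.empty_like(rgb)
    for c in range(3):  # R, G, B smoothed separately
        med = ndimage.median_filter(rgb[..., c], size=3)
        out[..., c] = ndimage.gaussian_filter(med, sigma=1.5, truncate=2.0)
    return out

def smooth_histogram(hist):
    # Grayscale opening-closing with a flat structuring element of size 2.
    opened = ndimage.grey_opening(hist, size=2)
    return ndimage.grey_closing(opened, size=2)
```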
  • 2.3 Classification and Post-Processing
  • The expected hue value histogram has a peak (light gray) for the region corresponding to the cervix and vaginal sidewalls, and a very close peak (dark gray) to the right of it when body parts outside the cervix and the vaginal sidewalls are visible, as shown in FIG. 6. Hue can be expressed as an angle vector in color space. Therefore, the histogram can be split by using the value of this angle. Instead of heuristic-based thresholding (hard thresholding), which forces segmentation of all values that are larger or smaller than a threshold value, an EM algorithm (Dempster, A., Laird, N., and Rubin, D. (1977), Maximum likelihood from incomplete data via the EM algorithm, Journal of the Royal Statistical Society, Series B, 39(1):1-38, incorporated herein by reference) is preferably used as a probability-based method to separate the two peaks by fitting the histogram to a mixture of two Gaussian models. The fitted Gaussian model and the classification result are shown in FIGS. 7(a) and 7(b), where FIG. 7(a) shows the fitted Gaussian model (2 classes) and FIG. 7(b) depicts the segmentation result.
  • After classification, the cervix region is preferably cleaned up by post-processing. First, holes are closed by filling them in using a morphological operation. Then, small regions are deleted using an opening-closing ASF with reconstruction, using a cross structuring element of size 8. The results are shown in FIG. 8(a)-(c), where FIG. 8(a) shows the cervix region obtained by classification, FIG. 8(b) shows the cervix region with closed holes, and FIG. 8(c) shows the cervix region with small regions cleaned up.
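  • The post-processing step can be sketched as follows; remove_small_objects' minimum size is an illustrative stand-in for the size-8 cross ASF named above:
```python
# Minimal sketch: fill holes in the binary cervix mask, then drop small
# spurious regions.
from scipy.ndimage import binary_fill_holes
from skimage.morphology import remove_small_objects

def clean_mask(mask, min_size=500):
    filled = binary_fill_holes(mask)                     # close holes
    return remove_small_objects(filled, min_size=min_size)  # delete small regions
```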
  • 3. Region Assessment
  • After the ROI has been detected, preferably this information is then used to assess some of the instrument settings. Poor image quality can be attributed to several factors such as:
  • 1. Incorrect camera zooming, which makes the cervix region too small in comparison to the entire image size.
  • 2. Incorrect camera positioning, in which the cervix region is not centered in the image.
  • 3. Improper existence of other obstructions such as speculum or cotton swab, which will block the view of the cervix region. Certain parts of the cervix may be excluded from the field of view due to a patient's movement as well.
  • When the above circumstances occur, this system should automatically detect them and block them from entering a subsequent CAD system. Region assessment preferably consists of the following steps: (1) fitting the detected ROI into a surrounding box; (2) calculating the dimensions and area of the surrounding box; (3) calculating the mass center of the ROI (the geometric center of the ROI); and (4) fitting the ROI to an ellipse (see FIG. 9(a) and FIG. 9(b) for examples of a cervix fitted with a box and an ellipse). The mass center of the ROI is calculated using the following integral equation:
  • $\mathbf{r}_{cm} = \dfrac{\int_V \rho(\mathbf{r})\,\mathbf{r}\,dV}{\int_V \rho(\mathbf{r})\,dV}$
  • where ρ(r) is the density function, and r is the position vector. The density of a two-dimensional image of the ROI can be easily understood by considering a binary ROI image in which a white area indicates where the cervix is located and a black area indicates where the background is located; the density function is then simply 1 in the white area and 0 in the black area. The surrounding box is used to compute the parameters of the fitted ellipse. The above information is then used to perform region assessment based on the following criteria (a minimal code sketch of the three checks follows the list):
  • 1. For the assessment of incorrect camera zooming, the ratio of the ROI area to the entire image area is preferably used. If that ratio is larger than a threshold value between approximately 0.25 and 0.45, with 0.35 being the preferred value, the camera zooming is deemed satisfactory; otherwise an error message will be displayed notifying the operator of incorrect camera zooming (see FIG. 10(a) for an example where the cervical region is too small).
  • 2. For the assessment of incorrect camera positioning, the distance between the cervix's mass center and the image's center (the geometric center) is preferably used. If the distance is smaller than approximately 0.1 to 0.3 times the image width, with 0.2 being the preferred value, the cervix region is deemed centered; otherwise the camera positioning is not satisfactory (see FIG. 10(b) for an example where the cervical region is not centered).
  • 3. For assessment of the improper existence of obstructions or a partially visible cervix region, the ROI is compared to the fitted ellipse (the algorithm assumes that a fully visible cervix region is elliptical or nearly elliptical in shape). If the difference between the ROI's area and the area of the ellipse is greater than approximately 0.2 to 0.3 times the area of the ROI (cervix region), with 0.25 being the preferred value, the cervix is deemed obstructed by the improper existence of obstacles (see FIGS. 10(c)-10(e) for examples).
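  • The three checks can be sketched as follows, assuming skimage.measure for the bounding box, centroid and fitted-ellipse axes; the thresholds are the preferred values stated above:
```python
# Minimal sketch of the three region-assessment checks on a binary ROI mask.
import math
from skimage.measure import label, regionprops

def assess_region(mask, zoom_thresh=0.35, center_thresh=0.2, obstruct_thresh=0.25):
    h, w = mask.shape
    props = regionprops(label(mask))[0]        # assume one dominant region
    # 1. Camera zooming: ROI area vs. entire image area.
    zoom_ok = props.area / float(h * w) > zoom_thresh
    # 2. Camera positioning: mass-center distance vs. image center.
    cy, cx = props.centroid
    position_ok = math.hypot(cx - w / 2.0, cy - h / 2.0) < center_thresh * w
    # 3. Obstruction: ROI area vs. fitted-ellipse area.
    ellipse_area = math.pi * (props.major_axis_length / 2.0) * (props.minor_axis_length / 2.0)
    unobstructed = abs(ellipse_area - props.area) < obstruct_thresh * props.area
    return zoom_ok, position_ok, unobstructed
```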
  • 4. Contrast Assessment
  • The purpose of the contrast assessment is to make sure the images that are taken have satisfactory contrast. Contrast is a measure of the gradation (difference) in luminance (brightness) that provides information. Generally speaking, contrast tells us the smallest difference two nearby signals can have and still be distinguishable. A simple but robust way to assess image contrast is to perform histogram analysis over the ROI. Empirically, the ROI (the cervix) is usually pinkish. The reflectance of the cervix is highest in the range of the optical spectrum corresponding to red. Therefore the quality of contrast can be assessed by analyzing the dynamic range of the red channel. The optical spectrum (also known as the visible spectrum) is the portion of the electromagnetic spectrum that is visible to the human eye. Electromagnetic radiation in this range of wavelengths is called visible light or simply light. A typical human eye will respond to wavelengths in air from about 380 to 750 nanometers (one nanometer equals one billionth of a meter). Red is any of a number of similar colors evoked by light consisting predominantly of the longest wavelengths discernible by the human eye, in the wavelength range of roughly 625-740 nanometers.
  • From our experiments, if the peak of the histogram in the red channel is greater than ⅘ of the full dynamic range (note that all the glints have been removed by preprocessing), the images are deemed to have satisfactory or good contrast. FIG. 11(a) and FIG. 11(b) show an image with good contrast and its corresponding histogram.
  • If multiple peaks exist, the assessment employs the histogram smoothing and Gaussian fitting described in section 2.3 to smooth and remove any false peaks and provide a single-peak histogram. Thus, the assessment preferably analyzes the dynamic range using the peak farthest to the right in the R (red) channel histogram, because it provides robust results. See FIG. 12(a), FIG. 12(b), and FIG. 12(c) for an example. After the single-peak histogram is obtained by Gaussian fitting, the dynamic range is deemed satisfactory if the peak of the histogram is larger than ⅘ of the full dynamic range; otherwise an error message will be displayed notifying the operator of an incorrect camera contrast setting.
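  • A minimal sketch of the peak check (after false-peak removal, so a single-peak histogram is assumed, on 8-bit data) follows:
```python
# Minimal sketch: locate the red-channel histogram peak and require it to
# lie above 4/5 of the full dynamic range.
import numpy as np

def contrast_ok(red_channel):
    hist, edges = np.histogram(red_channel, bins=256, range=(0, 256))
    peak_value = edges[int(np.argmax(hist))]   # intensity at the histogram peak
    return peak_value > (4.0 / 5.0) * 255.0
```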
  • Alternatively, histogram analysis can be done by calculating the maximal acceptable digital number (DN) of the image compared to the optimal digital number of the camera. A DN is a positive integer value representing the relative brightness of a pixel in a digital image. The exposure time can be adjusted accordingly to make sure the image taken is of good contrast.
  • 5. Blur Assessment
  • The detection of blur in images is well investigated only where there is a reference image to compare to an input image. For example, various measurements such as entropy and Peak Signal to Noise Ratio (PSNR) are used for determining how blurry an image is by comparing it to a reference image. The presently preferred embodiment of this invention performs the blur assessment in a unique perceptual manner without the use of a reference image. The following constraints are important in the selection of a preferred algorithm: (1) no reference image is used; and (2) image blur can have various causes (e.g. the camera can be out of focus, there can be motion blur, or a combination of the two).
  • Frequency-based methods make real-time application possible because of their fast processing speed. The problem with most frequency-based methods is that they are sensitive to structure changes, so an image that contains more structures but looks blurry may score higher in quality than an image that contains fewer structures but has no blur at all. The presently preferred embodiment of this invention performs its blur assessment by evaluating the distribution of wavelengths using a spatial frequency method that evaluates wavenumber frequency. Wavenumber is the spatial analog of frequency; that is, it is the measurement of the number of repeating units of a propagating wave (the number of occurrences at which a wave has the same phase) per unit of space instead of per unit of time. The final step of blur assessment preferably uses a normalized image power spectrum method as the quality measure.
  • The algorithm can be described as the following steps:
  • 1. Divide the image into multiple non-overlapping blocks.
  • 2. For each block, compute a local measurement based on wave number frequency information using an image power spectrum.
  • 3. Compute global measurement for the entire image based on the local measurements obtained from Step 2.
  • 4. Determine whether the image is blurry or not from the global measurement.
  • A flow chart of the algorithm is depicted in FIG. 13. Note that the blocks are preferably square, but they can be of any size and spacing, as long as the distance between two blocks is not greater than approximately 20% of the length of each block.
  • The local measurement of blur is calculated by an image power spectrum (Nill N., Bouzas B., Objective image quality measure derived from digital image power spectra, Optical Engineering, April 1992, Vol. 31, 813-825, incorporated herein by reference) and is then normalized by the zero component, as shown in FIG. 14(a). The term “local” refers to the fact that the calculation is limited to the area around a particular pixel in an image. The 2D image power spectrum is then transformed into a 1D diagram by calculating the integral of all the pixels at each radius. In order to analyze the energy property in each frequency band, polar coordinate integration is used according to each radial value. See FIG. 14(b). In mathematics, the polar coordinate system is a two-dimensional coordinate system in which each point on a plane is determined by an angle and a distance from an origin. The polar coordinate system is especially useful in situations where the relationship between two points is more easily expressed in terms of angles and distance than in the more familiar Cartesian or rectangular coordinate system, in which such a relationship can only be found through trigonometric formulae. Here, radius refers to distance.
  • In order to determine the degree of blur without the use of a reference image, the blur assessment uses intrinsic information within the image itself. The inventors recognized that an image power spectrum can be a statistical tool for texture analysis and that the high-frequency wavenumber information of the texture is always damaged in a blurred image, so the power spectrum is separated into three parts: (a) a low frequency area; (b) a high frequency area; and (c) a noise area. The assessment assumes that the low frequency area represents structure information invariant to blur and the high frequency area represents detailed information that is more sensitive to blur (see FIG. 14(b), where the image power spectrum 1D diagram is separated into the 3 parts (a), (b), and (c)). The degree of blur is then calculated by analyzing the ratio between the low frequency and high frequency areas. When the ratio is smaller than a threshold value, the whole block is considered blurry. The threshold is preferably 0.4 but can range from 0.3 to 0.5. Note that the noise spectrum has been discarded.
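  • A minimal sketch of the per-block measure follows. The band boundaries are illustrative assumptions (the patent gives no exact cut-offs), and the ratio is oriented as high-to-low energy so that a value below the 0.4 threshold indicates damaged high frequencies, matching the thresholding direction described above:
```python
# Minimal sketch: power spectrum normalized by its zero (DC) component,
# integrated over polar radius into a 1-D profile, then split into
# low-frequency, high-frequency and noise bands.
import numpy as np

def block_is_blurred(block, low_cut=0.25, high_cut=0.75, thresh=0.4):
    h, w = block.shape
    spectrum = np.abs(np.fft.fftshift(np.fft.fft2(block))) ** 2
    spectrum /= spectrum[h // 2, w // 2]            # normalize by the DC component
    yy, xx = np.indices((h, w))
    r = np.hypot(yy - h // 2, xx - w // 2).astype(int)
    profile = np.bincount(r.ravel(), weights=spectrum.ravel())  # polar integration
    r_max = min(h, w) // 2
    low = profile[1: int(low_cut * r_max)].sum()    # skip the DC bin itself
    high = profile[int(low_cut * r_max): int(high_cut * r_max)].sum()
    # Radii beyond high_cut form the noise band and are discarded.
    return (high / (low + 1e-12)) < thresh
```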
  • After each block is evaluated for blur, the global measurement makes a decision for the image as a whole by using the percentage of blurred blocks in the entire image. Preferably, more weight is given to blocks in the center of the image than to those in the periphery, because the invention is concerned with the quality of the image at the image center (where the ROI is located). The weight is used to calculate the coverage rate. The preferred weighted method of calculating the coverage rate is to multiply the number of blurry blocks in the periphery by 0.8, multiply the number of blurry blocks in the center by 1.2, add the two together, and then divide the sum by the total number of blocks.
  • Thus, if the blurred blocks cover less than approximately 20-30% of the entire region of interest, with 25% being the preferred value, the image is deemed satisfactory (i.e. not blurry); otherwise an error message will be displayed as feedback to the operator. FIG. 15 shows an example of blurred block detection in the region of interest.
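  • The weighted global decision reduces to a few lines, sketched here with the weights and preferred threshold stated above:
```python
# Minimal sketch: weighted coverage rate of blurred blocks; the image is
# flagged as blurry when coverage reaches the preferred 25% threshold.
def image_is_blurry(center_blurred, periphery_blurred, total_blocks, thresh=0.25):
    coverage = (1.2 * center_blurred + 0.8 * periphery_blurred) / float(total_blocks)
    return coverage >= thresh
```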
  • Determining the focus of the cervical images can be done by two categories of methods, which are employed for different purposes. The first method relies on a laser auto-focus system in the hardware. A laser dot is projected onto the surface of the cervix at a tilted angle, and the distance between the camera and the cervix is determined using the offset of the laser dot from the center of the image. The greater the offset, the more blurry the image. The ideal case is that the laser dot is at the center. This method is preferably used in low-resolution image acquisition, which gives a rough estimation of focus.
  • The second method is based on image content only, utilizing a frequency-based algorithm to detect blurred areas on the images. This method can be used in high-resolution image acquisition providing more detailed information, which can be employed to improve the performance of the CAD system.
  • 6. Contamination Detection
  • Different types of contamination detection algorithms have been used in the prior art. Color and texture in both parallel-polarized (PP) and cross-polarized (XP) images are crucial information in detecting blood spots, mucus, and other types of obstruction, such as cotton swabs. XP means having 2 polarizations at angles which are substantially perpendicular to each other. PP means singly-polarized, or multiple polarizations having polarization angles which are substantially parallel to each other.
  • The contamination detection in this invention preferably includes a training stage (i.e. machine learning) and a classification stage. The training stage is accomplished by a series of annotations, wherein a physician manually marks the contamination region. Then a joint texture/color model is employed to generate the feature vector per image pixel (Chad Carson, Serge Belongie, Hayit Greenspan and Jitendra Malik, Blobworld: Image Segmentation Using Expectation-Maximization and Its Application to Image Querying; IEEE Trans. on Pattern Analysis and Machine Intelligence, 24(8), 1026-1038, August 2002, incorporated herein by reference). Finally a Support Vector Machine (SVM) algorithm is used to detect contamination in the cervical images (Chang C. and Lin J., Training nu-support vector regression: theory and algorithms, Neural Computation, 14 (2002), 1959-1977, incorporated herein by reference). SVMs generally are a set of related supervised learning methods used for classification and regression.
  • The training process uses 3 color features and 3 texture features. The 3 color components are preferably the L*a*b* coordinates found after spatial averaging using a Gaussian filter, and the 3 texture components are preferably anisotropy, polarity and contrast (Zhang J., Liu Y., Zhao T., SVM Based Feature Screening Applied to Hierarchical Cervical Cancer Detection, International Conference on Diagnostic Imaging and Analysis (ICDIA 2002), August 2002, incorporated herein by reference). However, instead of using an EM algorithm for unsupervised clustering, an SVM algorithm is preferably utilized for supervised classification, because adequate, accurate annotations are available. Some preliminary results of the blood detection are provided here. Furthermore, for other kinds of contamination detection, such as mucus and purulence, similar algorithms can be designed if the annotation is precise and adequate. FIG. 16 shows some experimental results on blood detection.
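  • The classifier stage can be sketched as follows; feature extraction (the smoothed L*a*b* and anisotropy/polarity/contrast values) is assumed to be done elsewhere, and scikit-learn's SVC stands in for the nu-SVM cited above:
```python
# Minimal sketch: train an SVM on 6-element per-pixel feature vectors
# (3 color + 3 texture) and produce a per-pixel contamination map.
import numpy as np
from sklearn.svm import SVC

def train_contamination_svm(features, annotations):
    """features: (n_pixels, 6); annotations: 1 = contaminated, 0 = clean,
    taken from the physician's manual markings."""
    clf = SVC(kernel="rbf", gamma="scale")
    return clf.fit(features, annotations)

def detect_contamination(clf, features, image_shape):
    return clf.predict(features).reshape(image_shape)  # binary contamination map
```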
  • While the present invention has been particularly shown and described with reference to embodiments described in the detailed description and illustrated in the figures, it will be understood by one skilled in the art that various changes in detail may be effected therein without departing from the spirit and scope of the invention, as defined by the claims. Accordingly, no limitations are to be implied or inferred except as explicitly set forth in the claims.
  • INDUSTRIAL APPLICABILITY
  • The assessment algorithms of the invention may also be suitable for image quality assessment in the diagnosis of other tissues, such as for colorectal cancer and skin cancer, and could be used for telemedicine applications. They may also be combined with other instruments and methods in systems that automatically analyze and adjust the quality of acquired images.

Claims (12)

1. A method of image quality assessment to produce standardized images for use in archive-quality electronic medical records and in CAD systems comprising:
collecting a raw image free from glare during an examination with a digital imager, wherein said raw image contains a region of interest having borders, a border length and a border width, that define an image area within said borders, and an image center, and wherein said region of interest contains a region of interest area and a region of interest mass center;
locating said region of interest using an image classification algorithm;
applying a region assessment to said region of interest to detect incorrect camera zooming, incorrect camera positioning, and obstructions;
performing a contrast assessment on said region of interest using a histogram-based algorithm;
running a blur assessment without a reference image, wherein said blur assessment step comprises dividing said region of interest into non-overlapping blocks, computing a local measurement for each of said blocks based on frequency information using an image power spectrum to produce a two-dimensional display of said image power spectrum, converting said two-dimensional display to a one-dimensional display, separating said one-dimensional display into a low frequency area, a high frequency area and a noise area, and determining a degree of blur for each of said blocks by calculating the ratio of said low-frequency area to said high-frequency area, and using said degree of blur to determine if said block is a blurred block;
determining the percentage of blurred blocks to determine if said image is blurry; and
using contamination detection to detect obstructions using a training stage and a classification algorithm to produce said standardized images.
2. A method according to claim 1, wherein said locating step comprises transforming said raw image into an HSI image having hue values, creating a hue histogram of said hue values wherein said hue histogram contains peaks, ranges for said peaks, a dynamic range, and a median value for said dynamic range; shifting said hue histogram by the median value of the dynamic range; smoothing said hue histogram using filters; performing classification using an EM algorithm; and applying post processing.
3. A method according to claim 1, wherein said region assessment step comprises fitting said region of interest into a surrounding box; calculating said region of interest mass center; fitting said region of interest area to an ellipse having an ellipse area; detecting said incorrect camera zooming by calculating a zoom ratio between said region of interest area to said image area; detecting said incorrect camera positioning by calculating a distance between said region of interest mass center and said image center; and detecting said obstructions by comparing differences between said region of interest area and said ellipse area.
4. A method according to claim 1, wherein said contrast assessment is performed using a histogram based algorithm that generates a contrast histogram having peaks, ranges, and channels, including red channels, and wherein said contrast assessment is satisfactory if said contrast histogram peak farthest to the right in said red channel is greater than approximately ⅘ of said contrast histogram's range.
5. A method according to claim 1, wherein each of said blocks is a blurred block if said ratio is less than approximately 0.3-0.5, and wherein said blur assessment is satisfactory if each of said blurred blocks covers less than approximately 20-30% of said region of interest area.
6. A method according to claim 1, wherein said training stage comprises the steps of machine learning with a series of annotations; applying a joint texture/color model to produce a feature vector per image pixel; and applying a classification algorithm to detect contamination.
7. A method according to claim 3, wherein said camera zooming is satisfactory when said zoom ratio of said region of interest area to said image area is greater than approximately 0.25-0.45.
8. A method according to claim 3, wherein said camera positioning is satisfactory when said distance between said region of interest mass center and said image center is less than approximately 0.1-0.3 times said border width.
9. A method according to claim 3, wherein said obstructions exist if said difference between said region of interest area and said ellipse area is larger than 0.2-0.3 times said region of interest area.
10. A method according to claim 6, wherein said joint texture/color model uses three color features and three texture features, and wherein said classification algorithm is a Support Vector Machine.
11. A method of image quality assessment to produce standardized images for use in archive-quality electronic medical records and in CAD systems comprising:
collecting a raw image free from glare during an examination with a digital imager, wherein said raw image contains a region of interest having borders, a border length and a border width, that define an image area within said borders, and an image center, and wherein said region of interest contains a region of interest area and a region of interest mass center;
locating said region of interest using an image classification algorithm by transforming said raw image into an HSI image having hue values, creating a hue histogram of said hue values wherein said hue histogram contains peaks, ranges for said peaks, a dynamic range, and a median value for said dynamic range; shifting said hue histogram by the median value of the dynamic range; smoothing said hue histogram using filters; performing classification using an EM algorithm; and applying post processing;
applying a region assessment to said region of interest to detect incorrect camera zooming, incorrect camera positioning, and obstructions by fitting said region of interest into a surrounding box; calculating said region of interest mass center; fitting said region of interest area to an ellipse having an ellipse area; detecting said incorrect camera zooming by calculating a zoom ratio between said region of interest area to said image area; detecting said incorrect camera positioning by calculating a distance between said region of interest mass center and said image center; and detecting said obstructions by comparing differences between said region of interest area and said ellipse area;
performing a contrast assessment on said region of interest using a histogram-based algorithm that generates a contrast histogram having peaks, ranges, and channels, including red channels, and wherein said contrast assessment is satisfactory if said contrast histogram peak farthest to the right in said red channel is greater than approximately ⅘ of said contrast histogram's range;
running a blur assessment without a reference image, wherein said blur assessment step comprises dividing said region of interest into non-overlapping blocks, computing a local measurement for each of said blocks based on frequency information using an image power spectrum to produce a two-dimensional display of said image power spectrum, converting said two-dimensional display to a one-dimensional display, separating said one-dimensional display into a low frequency area, a high frequency area and a noise area, and determining a degree of blur for each of said blocks by calculating the ratio of said low-frequency area to said high-frequency area, and using said degree of blur to determine if said block is a blurred block;
determining the percentage of blurred blocks to determine if said image is blurry; and
using contamination detection to detect obstructions using a training stage that utilizes machine learning with a series of annotations; applying a joint texture/color model to produce a feature vector per image pixel; and applying a Support Vector Machine to detect contamination and to produce said standardized images.
12. A method of blur assessment without a reference image comprising:
dividing said region of interest into non-overlapping blocks, computing a local measurement for each of said blocks based on frequency information using an image power spectrum to produce a two-dimensional display of said image power spectrum, converting said two-dimensional display to a one-dimensional display, separating said one-dimensional display into a low frequency area, a high frequency area and a noise area, and determining a degree of blur for each of said blocks by calculating the ratio of said low-frequency area to said high-frequency area, and using said degree of blur to determine if said block is a blurred block; and
determining the percentage of blurred blocks to determine if said image is blurry.
US12/075,910 2007-03-16 2008-03-14 Method of image quality assessment to produce standardized imaging data Expired - Fee Related US8295565B2 (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
US12/075,910 US8295565B2 (en) 2007-03-16 2008-03-14 Method of image quality assessment to produce standardized imaging data

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
US91852707P 2007-03-16 2007-03-16
US12/075,910 US8295565B2 (en) 2007-03-16 2008-03-14 Method of image quality assessment to produce standardized imaging data

Publications (2)

Publication Number Publication Date
US20080226148A1 true US20080226148A1 (en) 2008-09-18
US8295565B2 US8295565B2 (en) 2012-10-23

Family

ID=39762742

Family Applications (2)

Application Number Title Priority Date Filing Date
US12/075,910 Expired - Fee Related US8295565B2 (en) 2007-03-16 2008-03-14 Method of image quality assessment to produce standardized imaging data
US12/075,890 Expired - Fee Related US8401258B2 (en) 2007-03-16 2008-03-14 Method to provide automated quality feedback to imaging devices to achieve standardized imaging data

Family Applications After (1)

Application Number Title Priority Date Filing Date
US12/075,890 Expired - Fee Related US8401258B2 (en) 2007-03-16 2008-03-14 Method to provide automated quality feedback to imaging devices to achieve standardized imaging data

Country Status (4)

Country Link
US (2) US8295565B2 (en)
EP (1) EP2137696A2 (en)
JP (1) JP2010521272A (en)
WO (2) WO2008115405A2 (en)

Cited By (56)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20090116713A1 (en) * 2007-10-18 2009-05-07 Michelle Xiao-Hong Yan Method and system for human vision model guided medical image quality assessment
US20100086189A1 (en) * 2008-10-07 2010-04-08 Xiaohui Wang Automated quantification of digital radiographic image quality
US20100226547A1 (en) * 2009-03-03 2010-09-09 Microsoft Corporation Multi-Modal Tone-Mapping of Images
US20110255759A1 (en) * 2005-12-28 2011-10-20 Olympus Medical Systems Corp. Image processing device and image processing method in image processing device
US20110255762A1 (en) * 2010-04-15 2011-10-20 Harald Deischinger Method and system for determining a region of interest in ultrasound data
US20110274338A1 (en) * 2010-05-03 2011-11-10 Sti Medical Systems, Llc Image analysis for cervical neoplasia detection and diagnosis
US20120288172A1 (en) * 2011-05-10 2012-11-15 General Electric Company Method and system for ultrasound imaging with cross-plane images
CN102955947A (en) * 2011-08-19 2013-03-06 北京百度网讯科技有限公司 Equipment and method for determining image definition
US20130121566A1 (en) * 2011-09-02 2013-05-16 Sylvain Paris Automatic Image Adjustment Parameter Correction
US20130121546A1 (en) * 2010-05-31 2013-05-16 Dvp Technologies Ltd. Inspection of region of interest
WO2013126568A1 (en) * 2012-02-21 2013-08-29 Massachusetts Eye & Ear Infirmary Calculating conjunctival redness
US20130230224A1 (en) * 2010-11-24 2013-09-05 Nocimed, Llc Systems and methods for automated voxelation of regions of interest for magnetic resonance spectroscopy
CN103325128A (en) * 2013-05-16 2013-09-25 深圳市理邦精密仪器股份有限公司 Method and device intelligently identifying characteristics of images collected by colposcope
CN103415869A (en) * 2010-12-13 2013-11-27 巴黎狄德罗大学(巴黎七大) Method of detecting and quantifying blur in a digital image
WO2014118786A1 (en) * 2013-02-04 2014-08-07 Orpheus Medical Ltd. Color reduction in images of human body
US8805112B2 (en) 2010-05-06 2014-08-12 Nikon Corporation Image sharpness classification system
US8897604B2 (en) 2011-09-23 2014-11-25 Alibaba Group Holding Limited Image quality analysis for searches
US8903169B1 (en) 2011-09-02 2014-12-02 Adobe Systems Incorporated Automatic adaptation to image processing pipeline
CN104504676A (en) * 2014-11-07 2015-04-08 嘉兴学院 Full-reference image quality evaluation method based on multi-vision sensitive feature similarity
US9020243B2 (en) 2010-06-03 2015-04-28 Adobe Systems Incorporated Image adjustment
US20150187046A1 (en) * 2012-09-18 2015-07-02 Fujifilm Corporation Still image display device and system, and imaging device
US20150206324A1 (en) * 2011-01-26 2015-07-23 Stmicroelectronics S.R.L. Texture detection in image processing
US20150244946A1 (en) * 2013-11-04 2015-08-27 Sos Agaian Method and systems for thermal image / video measurements and processing
US20150332123A1 (en) * 2014-05-14 2015-11-19 At&T Intellectual Property I, L.P. Image quality estimation using a reference image portion
US9251439B2 (en) 2011-08-18 2016-02-02 Nikon Corporation Image sharpness classification system
US9412039B2 (en) 2010-11-03 2016-08-09 Nikon Corporation Blur detection system for night scene images
US20160345021A1 (en) * 2015-05-20 2016-11-24 Texas Instruments Incorporated Still Block Detection in a Video Sequence
US20170147858A1 (en) * 2015-11-19 2017-05-25 Microsoft Technology Licensing, Llc Eye feature identification
US20170178320A1 (en) * 2015-12-21 2017-06-22 Koninklijke Philips N.V. Device, system and method for quality assessment of medical images
CN106971386A (en) * 2016-01-14 2017-07-21 广州市动景计算机科技有限公司 Judge method, device and the client device of image integrity degree and page loading degree
US9724013B2 (en) 2009-10-14 2017-08-08 Nocimed, Inc. MR spectroscopy system and method for diagnosing painful and non-painful intervertebral discs
US9760830B2 (en) 2011-12-28 2017-09-12 Siemens Aktiengesellschaft Control method and control system
CN107203991A (en) * 2017-04-19 2017-09-26 山西农业大学 A kind of half reference image quality appraisement method based on spectrum residual error
KR101789513B1 (en) * 2016-07-11 2017-10-26 주식회사 인피니트헬스케어 Method of determining image quality in digital pathology system
CN107451959A (en) * 2016-05-31 2017-12-08 宇龙计算机通信科技(深圳)有限公司 Image processing method and system
US20180018762A1 (en) * 2016-07-18 2018-01-18 Xiaomi Inc. Method, device and medium for enhancing saturation
US20180174284A1 (en) * 2016-12-20 2018-06-21 Fujitsu Limited Biometric image processing device, biometric image processing method and computer-readable non-transitory medium
US10004395B2 (en) 2014-05-02 2018-06-26 Massachusetts Eye And Ear Infirmary Grading corneal fluorescein staining
US10045711B2 (en) 2012-04-14 2018-08-14 Nocimed, Inc. Magnetic resonance spectroscopy pulse sequence, acquisition, and processing system and method
CN109473149A (en) * 2018-11-09 2019-03-15 天津开心生活科技有限公司 Data Quality Assessment Methodology, device, electronic equipment and computer-readable medium
US10251578B2 (en) 2009-10-14 2019-04-09 Nocimed, Inc. MR spectroscopy system and method for diagnosing painful and non-painful intervertebral discs
US10289940B2 (en) * 2015-06-26 2019-05-14 Here Global B.V. Method and apparatus for providing classification of quality characteristics of images
US20190164271A1 (en) * 2017-11-24 2019-05-30 Ficosa Adas, S.L.U. Determining clean or dirty captured images
US10482633B2 (en) * 2016-09-12 2019-11-19 Zebra Medical Vision Ltd. Systems and methods for automated detection of an indication of malignancy in a mammographic image
US20200160498A1 (en) * 2018-11-19 2020-05-21 Vision Guided Robotics LLC Inspection system
US20200257888A1 (en) * 2017-10-20 2020-08-13 Nec Corporation Three-dimensional facial shape estimating device, three-dimensional facial shape estimating method, and non-transitory computer-readable medium
RU2734575C1 (en) * 2020-04-17 2020-10-20 Общество с ограниченной ответственностью "АЙРИМ" (ООО "АЙРИМ") Method and system for identifying new growths on x-ray images
US20210118122A1 (en) * 2019-10-22 2021-04-22 Canon U.S.A., Inc. Apparatus and Method for Inferring Contrast Score of an Image
US20210192729A1 (en) * 2019-12-20 2021-06-24 PAIGE.AI, Inc. Systems and methods for processing electronic images to detect contamination in specimen preparations
CN113393461A (en) * 2021-08-16 2021-09-14 北京大学第三医院(北京大学第三临床医学院) Method and system for screening metaphase chromosome image quality based on deep learning
US11288771B2 (en) * 2020-04-29 2022-03-29 Adobe Inc. Texture hallucination for large-scale image super-resolution
US11288800B1 (en) * 2018-08-24 2022-03-29 Google Llc Attribution methodologies for neural networks designed for computer-aided diagnostic processes
US11330238B2 (en) * 2015-05-17 2022-05-10 Endochoice, Inc. Endoscopic image enhancement using contrast limited adaptive histogram equalization (CLAHE) implemented in a processor
US11538204B2 (en) * 2019-02-27 2022-12-27 Daikin Industries, Ltd. Information providing system
US11564619B2 (en) 2016-06-19 2023-01-31 Aclarion, Inc. Magnetic resonance spectroscopy system and method for diagnosing pain or infection associated with propionic acid
US11798151B1 (en) * 2022-04-25 2023-10-24 Rivian Ip Holdings, Llc Systems and methods for determining image capture degradation of a camera sensor

Families Citing this family (53)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2007011708A2 (en) 2005-07-15 2007-01-25 Micell Technologies, Inc. Stent with polymer coating containing amorphous rapamycin
KR101406415B1 (en) 2005-07-15 2014-06-19 미셀 테크놀로지즈, 인코포레이티드 Polymer coatings containing drug powder of controlled morphology
WO2007127363A2 (en) 2006-04-26 2007-11-08 Micell Technologies, Inc. Coatings containing multiple drugs
US11426494B2 (en) 2007-01-08 2022-08-30 MT Acquisition Holdings LLC Stents having biodegradable layers
WO2008086369A1 (en) 2007-01-08 2008-07-17 Micell Technologies, Inc. Stents having biodegradable layers
JP5608160B2 (en) 2008-04-17 2014-10-15 ミセル テクノロジーズ、インコーポレイテッド Stent with bioabsorbable layer
CA2946195A1 (en) 2008-07-17 2010-01-21 Micell Technologies, Inc. Drug delivery medical device
EP2344982A4 (en) * 2008-10-10 2012-09-19 Sti Medical Systems Llc Methods for tissue classification in cervical imagery
CN101448174A (en) * 2008-12-26 2009-06-03 深圳华为通信技术有限公司 Image quality evaluation device and method thereof
EP2413847A4 (en) 2009-04-01 2013-11-27 Micell Technologies Inc Coated stents
EP3366326A1 (en) 2009-04-17 2018-08-29 Micell Technologies, Inc. Stents having controlled elution
EP2453834A4 (en) 2009-07-16 2014-04-16 Micell Technologies Inc Drug delivery medical device
CA2797110C (en) 2010-04-22 2020-07-21 Micell Technologies, Inc. Stents and other devices having extracellular matrix coating
JP5761601B2 (en) * 2010-07-01 2015-08-12 株式会社リコー Object identification device
CA2805631C (en) 2010-07-16 2018-07-31 Micell Technologies, Inc. Drug delivery medical device
US8773577B2 (en) 2010-10-27 2014-07-08 Qualcomm Incorporated Region of interest extraction
JP4831259B1 (en) * 2011-03-10 2011-12-07 オムロン株式会社 Image processing apparatus, image processing method, and control program
US8854041B2 (en) * 2011-05-20 2014-10-07 Kabushiki Kaisha Toshiba Spatially shaped pre-saturation profile for enhanced non-contrast MRA
US10117972B2 (en) 2011-07-15 2018-11-06 Micell Technologies, Inc. Drug delivery medical device
JP5950196B2 (en) * 2011-08-30 2016-07-13 株式会社リコー Imaging apparatus, and image analysis apparatus and moving apparatus using the same
WO2013052824A1 (en) * 2011-10-05 2013-04-11 Cireca Theranostics, Llc Method and system for analyzing biological specimens by spectral imaging
US10188772B2 (en) 2011-10-18 2019-01-29 Micell Technologies, Inc. Drug delivery medical device
CN102708567B (en) * 2012-05-11 2014-12-10 宁波大学 Visual perception-based three-dimensional image quality objective evaluation method
CN102708568B (en) * 2012-05-11 2014-11-05 宁波大学 Stereoscopic image objective quality evaluation method on basis of structural distortion
US9798918B2 (en) * 2012-10-05 2017-10-24 Cireca Theranostics, Llc Method and system for analyzing biological specimens by spectral imaging
CN104937935A (en) * 2012-11-16 2015-09-23 Vid拓展公司 Perceptual preprocessing filter for viewing-conditions-aware video coding
JP6330024B2 (en) 2013-03-12 2018-05-23 マイセル・テクノロジーズ,インコーポレイテッド Bioabsorbable biomedical implant
EP2986199B1 (en) 2013-04-18 2018-01-31 Koninklijke Philips N.V. Acquiring cervical images
JP2016519965A (en) 2013-05-15 2016-07-11 マイセル・テクノロジーズ,インコーポレイテッド Bioabsorbable biomedical implant
KR20140137715A (en) * 2013-05-23 2014-12-03 삼성디스플레이 주식회사 Apparatus and method for detecting x-ray
WO2015059613A2 (en) * 2013-10-22 2015-04-30 Koninklijke Philips N.V. Image visualization
EP2887266A1 (en) * 2013-12-23 2015-06-24 Koninklijke Philips N.V. Automatic extraction of a region of interest based on speculum detection
US10045050B2 (en) * 2014-04-25 2018-08-07 Vid Scale, Inc. Perceptual preprocessing filter for viewing-conditions-aware video coding
US9741107B2 (en) * 2015-06-05 2017-08-22 Sony Corporation Full reference image quality assessment based on convolutional neural network
US10275876B2 (en) 2015-06-12 2019-04-30 International Business Machines Corporation Methods and systems for automatically selecting an implant for a patient
CN106022354B (en) * 2016-05-07 2018-09-18 浙江大学 Image MTF measurement methods based on SVM
US10713537B2 (en) 2017-07-01 2020-07-14 Algolux Inc. Method and apparatus for joint image processing and perception
US10832808B2 (en) 2017-12-13 2020-11-10 International Business Machines Corporation Automated selection, arrangement, and processing of key images
CN108230314B (en) * 2018-01-03 2022-01-28 天津师范大学 Image quality evaluation method based on deep activation pooling
US20210042915A1 (en) * 2018-01-23 2021-02-11 Mobileodt Ltd. Automated monitoring of medical imaging procedures
DE102018201794B3 (en) * 2018-02-06 2019-04-11 Heidelberger Druckmaschinen Ag Adaptive image smoothing
CN108830823B (en) * 2018-03-14 2021-10-26 西安理工大学 Full-reference image quality evaluation method based on spatial domain combined frequency domain analysis
CN108447059B (en) * 2018-04-09 2021-06-29 华侨大学 Full-reference light field image quality evaluation method
US11436720B2 (en) * 2018-12-28 2022-09-06 Shanghai United Imaging Intelligence Co., Ltd. Systems and methods for generating image metric
US11210554B2 (en) 2019-03-21 2021-12-28 Illumina, Inc. Artificial intelligence-based generation of sequencing metadata
US11676685B2 (en) 2019-03-21 2023-06-13 Illumina, Inc. Artificial intelligence-based quality scoring
US11593649B2 (en) 2019-05-16 2023-02-28 Illumina, Inc. Base calling using convolutions
US11048971B1 (en) * 2019-12-24 2021-06-29 Ping An Technology (Shenzhen) Co., Ltd. Method for training image generation model and computer device
IL295560A (en) 2020-02-20 2022-10-01 Illumina Inc Artificial intelligence-based many-to-many base calling
WO2022014258A1 (en) * 2020-07-17 2022-01-20 富士フイルム株式会社 Processor device and processor device operation method
US11688041B2 (en) 2021-03-02 2023-06-27 International Business Machines Corporation System and method of automatic image enhancement using system generated feedback mechanism
US20220336054A1 (en) 2021-04-15 2022-10-20 Illumina, Inc. Deep Convolutional Neural Networks to Predict Variant Pathogenicity using Three-Dimensional (3D) Protein Structures
CN116309379B (en) * 2023-02-24 2023-11-03 飞燕航空遥感技术有限公司 Automatic aerial photography quality inspection method based on multi-data fusion

Family Cites Families (13)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2746615B2 (en) * 1988-04-22 1998-05-06 オリンパス光学工業株式会社 Endoscope image processing device
GB9408923D0 (en) * 1994-05-05 1994-06-22 Eaton Corp Power synchronizer for a compound transmission
US5565678A (en) 1995-06-06 1996-10-15 Lumisys, Inc. Radiographic image quality assessment utilizing a stepped calibration target
US6147705A (en) 1996-08-20 2000-11-14 Welch Allyn Inc. Apparatus and method for video colposcope with electronic green filter
EP1267709B1 (en) * 2000-03-28 2009-04-29 Board of Regents, The University of Texas System Method and apparatus for diagnostic multispectral digital imaging
JP4694051B2 (en) * 2000-07-11 2011-06-01 Hoya株式会社 Electronic endoscope
US6713978B2 (en) * 2001-07-18 2004-03-30 Texas A&M University System Method and system for determining induction motor speed
JP5259033B2 (en) * 2001-08-03 2013-08-07 オリンパス株式会社 Endoscope system
US7038820B1 (en) * 2002-04-03 2006-05-02 Eastman Kodak Company Automatic exposure control for an image sensor
US6674885B2 (en) * 2002-06-04 2004-01-06 Amersham Biosciences Corp Systems and methods for analyzing target contrast features in images of biological samples
JP4311959B2 (en) * 2003-03-24 2009-08-12 Hoya株式会社 Electronic endoscope device
CA2581656A1 (en) * 2003-09-26 2005-04-07 Tidal Photonics, Inc. Apparatus and methods relating to color imaging endoscope systems
US7272207B1 (en) * 2006-03-24 2007-09-18 Richard Aufrichtig Processes and apparatus for variable binning of data in non-destructive imaging

Patent Citations (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US6081612A (en) * 1997-02-28 2000-06-27 Electro Optical Sciences Inc. Systems and methods for the multispectral imaging and characterization of skin tissue
US6738494B1 (en) * 2000-06-23 2004-05-18 Eastman Kodak Company Method for varying an image processing path based on image emphasis and appeal
US20030095197A1 (en) * 2001-09-20 2003-05-22 Eastman Kodak Company System and method for deciding when to correct image-specific defects based on camera, scene, display and demographic data
US20040156559A1 (en) * 2002-11-25 2004-08-12 Sarnoff Corporation Method and apparatus for measuring quality of compressed video sequences without references
US20060147125A1 (en) * 2003-06-27 2006-07-06 Caviedes Jorge E Sharpness metric for asymmetrically enhanced image and video
US7711174B2 (en) * 2004-05-13 2010-05-04 The Charles Stark Draper Laboratory, Inc. Methods and systems for imaging cells
US20060159325A1 (en) * 2005-01-18 2006-07-20 Trestle Corporation System and method for review in studies including toxicity and risk assessment studies
US7940970B2 (en) * 2006-10-25 2011-05-10 Rcadia Medical Imaging, Ltd Method and system for automatic quality control used in computerized analysis of CT angiography

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
Nill et al ("Objective Image quality measure derived from digital image power spectra, Optical engineering, April 1992, Vol. 31, 813-825) *

Cited By (102)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20110255759A1 (en) * 2005-12-28 2011-10-20 Olympus Medical Systems Corp. Image processing device and image processing method in image processing device
US8300955B2 (en) * 2005-12-28 2012-10-30 Olympus Medical Systems Corp. Image processing device and image processing method in image processing device for identifying features in an image
US8086007B2 (en) 2007-10-18 2011-12-27 Siemens Aktiengesellschaft Method and system for human vision model guided medical image quality assessment
US20090116713A1 (en) * 2007-10-18 2009-05-07 Michelle Xiao-Hong Yan Method and system for human vision model guided medical image quality assessment
US20100086189A1 (en) * 2008-10-07 2010-04-08 Xiaohui Wang Automated quantification of digital radiographic image quality
US8571290B2 (en) * 2008-10-07 2013-10-29 Carestream Health, Inc. Automated quantification of digital radiographic image quality
US20100226547A1 (en) * 2009-03-03 2010-09-09 Microsoft Corporation Multi-Modal Tone-Mapping of Images
US8290295B2 (en) * 2009-03-03 2012-10-16 Microsoft Corporation Multi-modal tone-mapping of images
US11844601B2 (en) 2009-10-14 2023-12-19 Aclarion, Inc. MR spectroscopy system and method for diagnosing painful and non-painful intervertebral discs
US10285622B2 (en) 2009-10-14 2019-05-14 Nocimed, Inc. MR spectroscopy system and method for diagnosing painful and non-painful intervertebral discs
US10251578B2 (en) 2009-10-14 2019-04-09 Nocimed, Inc. MR spectroscopy system and method for diagnosing painful and non-painful intervertebral discs
US9724013B2 (en) 2009-10-14 2017-08-08 Nocimed, Inc. MR spectroscopy system and method for diagnosing painful and non-painful intervertebral discs
US20110255762A1 (en) * 2010-04-15 2011-10-20 Harald Deischinger Method and system for determining a region of interest in ultrasound data
US20110274338A1 (en) * 2010-05-03 2011-11-10 Sti Medical Systems, Llc Image analysis for cervical neoplasia detection and diagnosis
US8503747B2 (en) * 2010-05-03 2013-08-06 Sti Medical Systems, Llc Image analysis for cervical neoplasia detection and diagnosis
US8805112B2 (en) 2010-05-06 2014-08-12 Nikon Corporation Image sharpness classification system
US9082165B2 (en) * 2010-05-31 2015-07-14 Dvp Technologies Ltd. Inspection of region of interest
US20130121546A1 (en) * 2010-05-31 2013-05-16 Dvp Technologies Ltd. Inspection of region of interest
US9020243B2 (en) 2010-06-03 2015-04-28 Adobe Systems Incorporated Image adjustment
US9070044B2 (en) 2010-06-03 2015-06-30 Adobe Systems Incorporated Image adjustment
US9412039B2 (en) 2010-11-03 2016-08-09 Nikon Corporation Blur detection system for night scene images
US9280718B2 (en) * 2010-11-24 2016-03-08 Nocimed, Llc Systems and methods for automated voxelation of regions of interest for magnetic resonance spectroscopy
US20130230224A1 (en) * 2010-11-24 2013-09-05 Nocimed, Llc Systems and methods for automated voxelation of regions of interest for magnetic resonance spectroscopy
US9808177B2 (en) 2010-11-24 2017-11-07 Nocimed, Inc. Systems and methods for automated voxelation of regions of interest for magnetic resonance spectroscopy
US10517504B2 (en) 2010-11-24 2019-12-31 Nocimed, Inc. Systems and methods for automated voxelation of regions of interest for magnetic resonance spectroscopy
US20140023266A1 (en) * 2010-12-13 2014-01-23 Université Paris Diderot-Paris 7 Method of Detecting and Quantifying Blur in a Digital Image
CN103415869A (en) * 2010-12-13 2013-11-27 Université Paris Diderot (Paris 7) Method of detecting and quantifying blur in a digital image
US9076192B2 (en) * 2010-12-13 2015-07-07 Université Paris Diderot—Paris 7 Method of detecting and quantifying blur in a digital image
US20150206324A1 (en) * 2011-01-26 2015-07-23 Stmicroelectronics S.R.L. Texture detection in image processing
US9959633B2 (en) * 2011-01-26 2018-05-01 Stmicroelectronics S.R.L. Texture detection in image processing
US8798342B2 (en) * 2011-05-10 2014-08-05 General Electric Company Method and system for ultrasound imaging with cross-plane images
US20120288172A1 (en) * 2011-05-10 2012-11-15 General Electric Company Method and system for ultrasound imaging with cross-plane images
US9251439B2 (en) 2011-08-18 2016-02-02 Nikon Corporation Image sharpness classification system
CN102955947A (en) * 2011-08-19 2013-03-06 Beijing Baidu Netcom Science and Technology Co., Ltd. Device and method for determining image sharpness
US8903169B1 (en) 2011-09-02 2014-12-02 Adobe Systems Incorporated Automatic adaptation to image processing pipeline
US9292911B2 (en) 2011-09-02 2016-03-22 Adobe Systems Incorporated Automatic image adjustment parameter correction
US9008415B2 (en) * 2011-09-02 2015-04-14 Adobe Systems Incorporated Automatic image adjustment parameter correction
US20130121566A1 (en) * 2011-09-02 2013-05-16 Sylvain Paris Automatic Image Adjustment Parameter Correction
US8897604B2 (en) 2011-09-23 2014-11-25 Alibaba Group Holding Limited Image quality analysis for searches
US9760830B2 (en) 2011-12-28 2017-09-12 Siemens Aktiengesellschaft Control method and control system
US10548474B2 (en) 2012-02-21 2020-02-04 Massachusetts Eye & Ear Infirmary Calculating conjunctival redness
US11298018B2 (en) 2012-02-21 2022-04-12 Massachusetts Eye And Ear Infirmary Calculating conjunctival redness
US9854970B2 (en) 2012-02-21 2018-01-02 Massachusetts Eye & Ear Infirmary Calculating conjunctival redness
WO2013126568A1 (en) * 2012-02-21 2013-08-29 Massachusetts Eye & Ear Infirmary Calculating conjunctival redness
US11179057B2 (en) 2012-04-14 2021-11-23 Nocimed, Inc. Magnetic resonance spectroscopy pulse sequence, acquisition, and processing system and method
US10045711B2 (en) 2012-04-14 2018-08-14 Nocimed, Inc. Magnetic resonance spectroscopy pulse sequence, acquisition, and processing system and method
US10646135B2 (en) 2012-04-14 2020-05-12 Nocimed, Inc. Magnetic resonance spectroscopy pulse sequence, acquisition, and processing system and method
US11633124B2 (en) 2012-04-14 2023-04-25 Aclarion, Inc. Magnetic resonance spectroscopy pulse sequence, acquisition, and processing system and method
US9754346B2 (en) * 2012-09-18 2017-09-05 Fujifilm Corporation Still image generation device for providing a first and second mixture of an in-focus image and an out-of-focus image
US20150187046A1 (en) * 2012-09-18 2015-07-02 Fujifilm Corporation Still image display device and system, and imaging device
US20150359413A1 (en) * 2013-02-04 2015-12-17 Orpheus Medical Ltd. Color reduction in images of an interior of a human body
WO2014118786A1 (en) * 2013-02-04 2014-08-07 Orpheus Medical Ltd. Color reduction in images of human body
US9936858B2 (en) * 2013-02-04 2018-04-10 Orpheus Medical Ltd Color reduction in images of an interior of a human body
CN103325128A (en) * 2013-05-16 2013-09-25 Shenzhen Edan Precision Instruments Co., Ltd. Method and device for intelligently identifying characteristics of images collected by a colposcope
US20150244946A1 (en) * 2013-11-04 2015-08-27 Sos Agaian Method and systems for thermal image/video measurements and processing
US11844571B2 (en) 2014-05-02 2023-12-19 Massachusetts Eye And Ear Infirmary Grading corneal fluorescein staining
US11350820B2 (en) 2014-05-02 2022-06-07 Massachusetts Eye And Ear Infirmary Grading corneal fluorescein staining
US10004395B2 (en) 2014-05-02 2018-06-26 Massachusetts Eye And Ear Infirmary Grading corneal fluorescein staining
US10492674B2 (en) 2014-05-02 2019-12-03 Massachusetts Eye And Ear Infirmary Grading corneal fluorescein staining
US10026010B2 (en) * 2014-05-14 2018-07-17 At&T Intellectual Property I, L.P. Image quality estimation using a reference image portion
US20150332123A1 (en) * 2014-05-14 2015-11-19 At&T Intellectual Property I, L.P. Image quality estimation using a reference image portion
CN104504676A (en) * 2014-11-07 2015-04-08 Jiaxing University Full-reference image quality evaluation method based on similarity of multiple visually sensitive features
US11750782B2 (en) * 2015-05-17 2023-09-05 Endochoice, Inc. Endoscopic image enhancement using contrast limited adaptive histogram equalization (CLAHE) implemented in a processor
US20220239878A1 (en) * 2015-05-17 2022-07-28 Endochoice, Inc. Endoscopic image enhancement using contrast limited adaptive histogram equalization (clahe) implemented in a processor
US11330238B2 (en) * 2015-05-17 2022-05-10 Endochoice, Inc. Endoscopic image enhancement using contrast limited adaptive histogram equalization (CLAHE) implemented in a processor
US9832484B2 (en) * 2015-05-20 2017-11-28 Texas Instruments Incorporated Still block detection in a video sequence
US20160345021A1 (en) * 2015-05-20 2016-11-24 Texas Instruments Incorporated Still Block Detection in a Video Sequence
US10289940B2 (en) * 2015-06-26 2019-05-14 Here Global B.V. Method and apparatus for providing classification of quality characteristics of images
US20170147858A1 (en) * 2015-11-19 2017-05-25 Microsoft Technology Licensing, Llc Eye feature identification
US10043075B2 (en) * 2015-11-19 2018-08-07 Microsoft Technology Licensing, Llc Eye feature identification
US10402967B2 (en) * 2015-12-21 2019-09-03 Koninklijke Philips N.V. Device, system and method for quality assessment of medical images
US20170178320A1 (en) * 2015-12-21 2017-06-22 Koninklijke Philips N.V. Device, system and method for quality assessment of medical images
CN106971386A (en) * 2016-01-14 2017-07-21 Guangzhou UCWeb Computer Technology Co., Ltd. Method, device and client device for judging image integrity and page loading progress
CN107451959A (en) * 2016-05-31 2017-12-08 Yulong Computer Telecommunication Scientific (Shenzhen) Co., Ltd. Image processing method and system
US11564619B2 (en) 2016-06-19 2023-01-31 Aclarion, Inc. Magnetic resonance spectroscopy system and method for diagnosing pain or infection associated with propionic acid
KR101789513B1 (en) * 2016-07-11 2017-10-26 Infinitt Healthcare Co., Ltd. Method of determining image quality in digital pathology system
US10204403B2 (en) * 2016-07-18 2019-02-12 Xiaomi Inc. Method, device and medium for enhancing saturation
US20180018762A1 (en) * 2016-07-18 2018-01-18 Xiaomi Inc. Method, device and medium for enhancing saturation
US10482633B2 (en) * 2016-09-12 2019-11-19 Zebra Medical Vision Ltd. Systems and methods for automated detection of an indication of malignancy in a mammographic image
US10957079B2 (en) * 2016-09-12 2021-03-23 Zebra Medical Vision Ltd. Systems and methods for automated detection of an indication of malignancy in a mammographic image
US20180174284A1 (en) * 2016-12-20 2018-06-21 Fujitsu Limited Biometric image processing device, biometric image processing method and computer-readable non-transitory medium
US10643317B2 (en) * 2016-12-20 2020-05-05 Fujitsu Limited Biometric image processing device, biometric image processing method and computer-readable non-transitory medium
CN107203991A (en) * 2017-04-19 2017-09-26 Shanxi Agricultural University Reduced-reference image quality assessment method based on spectral residual
US20200257888A1 (en) * 2017-10-20 2020-08-13 Nec Corporation Three-dimensional facial shape estimating device, three-dimensional facial shape estimating method, and non-transitory computer-readable medium
US11488415B2 (en) * 2017-10-20 2022-11-01 Nec Corporation Three-dimensional facial shape estimating device, three-dimensional facial shape estimating method, and non-transitory computer-readable medium
US10922803B2 (en) * 2017-11-24 2021-02-16 Ficosa Adas, S.L.U. Determining clean or dirty captured images
US20190164271A1 (en) * 2017-11-24 2019-05-30 Ficosa Adas, S.L.U. Determining clean or dirty captured images
US11288800B1 (en) * 2018-08-24 2022-03-29 Google Llc Attribution methodologies for neural networks designed for computer-aided diagnostic processes
CN109473149A (en) * 2018-11-09 2019-03-15 Tianjin Happy Life Technology Co., Ltd. Data quality assessment method, device, electronic equipment and computer-readable medium
US11024023B2 (en) * 2018-11-19 2021-06-01 Vision Guided Robotics LLC Inspection system
US20200160498A1 (en) * 2018-11-19 2020-05-21 Vision Guided Robotics LLC Inspection system
US11538204B2 (en) * 2019-02-27 2022-12-27 Daikin Industries, Ltd. Information providing system
US20210118122A1 (en) * 2019-10-22 2021-04-22 Canon U.S.A., Inc. Apparatus and Method for Inferring Contrast Score of an Image
US11669949B2 (en) * 2019-10-22 2023-06-06 Canon U.S.A., Inc. Apparatus and method for inferring contrast score of an image
US11823378B2 (en) * 2019-12-20 2023-11-21 PAIGE.AI, Inc. Systems and methods for processing electronic images to detect contamination in specimen preparations
US20210192729A1 (en) * 2019-12-20 2021-06-24 PAIGE.AI, Inc. Systems and methods for processing electronic images to detect contamination in specimen preparations
US11182899B2 (en) * 2019-12-20 2021-11-23 PAIGE.AI, Inc. Systems and methods for processing electronic images to detect contamination
RU2734575C1 * 2020-04-17 2020-10-20 Limited Liability Company "AIRIM" (LLC "AIRIM") Method and system for identifying neoplasms in X-ray images
US11288771B2 (en) * 2020-04-29 2022-03-29 Adobe Inc. Texture hallucination for large-scale image super-resolution
CN113393461A (en) * 2021-08-16 2021-09-14 Peking University Third Hospital (Peking University Third Clinical Medical College) Method and system for screening metaphase chromosome image quality based on deep learning
US20230342899A1 (en) * 2022-04-25 2023-10-26 Rivian Ip Holdings, Llc Systems and methods for determining image capture degradation of a camera sensor
US11798151B1 (en) * 2022-04-25 2023-10-24 Rivian Ip Holdings, Llc Systems and methods for determining image capture degradation of a camera sensor

Also Published As

Publication number Publication date
US8295565B2 (en) 2012-10-23
WO2008115410A4 (en) 2009-10-15
WO2008115405A3 (en) 2009-08-13
US20080226147A1 (en) 2008-09-18
EP2137696A2 (en) 2009-12-30
JP2010521272A (en) 2010-06-24
WO2008115405A2 (en) 2008-09-25
US8401258B2 (en) 2013-03-19
WO2008115410A3 (en) 2009-08-13
WO2008115410A2 (en) 2008-09-25

Similar Documents

Publication Publication Date Title
US8295565B2 (en) Method of image quality assessment to produce standardized imaging data
US10531825B2 (en) Thresholding methods for lesion segmentation in dermoscopy images
US10192099B2 (en) Systems and methods for automated screening and prognosis of cancer from whole-slide biopsy images
US8131054B2 (en) Computerized image analysis for acetic acid induced cervical intraepithelial neoplasia
US8483454B2 (en) Methods for tissue classification in cervical imagery
Fauzi et al. Computerized segmentation and measurement of chronic wound images
US20150379712A1 (en) Medical image processing
Siddalingaswamy et al. Automatic localization and boundary detection of optic disc using implicit active contours
Vécsei et al. Automated Marsh-like classification of celiac disease in children using local texture operators
Ramakanth et al. Approximate nearest neighbour field based optic disk detection
Maghsoudi et al. A computer aided method to detect bleeding, tumor, and disease regions in Wireless Capsule Endoscopy
JP6578058B2 (en) Image processing apparatus, method for operating image processing apparatus, and operation program for image processing apparatus
Garnavi Computer-aided diagnosis of melanoma
Souaidi et al. A fully automated ulcer detection system for wireless capsule endoscopy images
US20210209755A1 (en) Automatic lesion border selection based on morphology and color features
CN113689424B (en) Ultrasonic inspection system capable of automatically identifying image features and identification method
WO2023272395A1 (en) Classification and improvement of quality of vascular images
Isavand Rahmani et al. Retinal blood vessel segmentation using Gabor filter and morphological reconstruction
Chen et al. Saliency-based bleeding localization for wireless capsule endoscopy diagnosis
Zhang et al. Abnormal region detection in gastroscopic images by combining classifiers on neighboring patches
Gu et al. Automatic image quality assessment for uterine cervical imagery
Zebari et al. Skin Lesion Segmentation Using K-means Clustering with Removal of Unwanted Regions
Breneman Towards early-stage malignant melanoma detection using consumer mobile devices
Gómez et al. Finding regions of interest in pathological images: An attentional model approach
Pewton et al. Dermoscopic dark corner artifacts removal: Friend or foe?

Legal Events

Date Code Title Description
AS Assignment

Owner name: STI MEDICAL SYSTEMS, LLC, HAWAII

Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNORS:GU, JIA;LI, WENJING;HARGROVE, JOHN TAYLOR;AND OTHERS;REEL/FRAME:022275/0442

Effective date: 20090210

AS Assignment

Owner name: CADES SCHUTTE A LIMITED LIABILITY LAW PARTNERSHIP LLP

Free format text: UCC FINANCING STATEMENT;ASSIGNOR:STI MEDICAL SYSTEMS, LLC;REEL/FRAME:030744/0957

Effective date: 20130701

REMI Maintenance fee reminder mailed
LAPS Lapse for failure to pay maintenance fees
STCH Information on status: patent discontinuation

Free format text: PATENT EXPIRED DUE TO NONPAYMENT OF MAINTENANCE FEES UNDER 37 CFR 1.362

FP Lapsed due to failure to pay maintenance fee

Effective date: 20161023