US20100008546A1 - Pattern identification method, registration device, verification device and program - Google Patents


Info

Publication number
US20100008546A1
Authority
US
United States
Prior art keywords
pattern
living body
distribution
center
blood vessel
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Abandoned
Application number
US12/445,519
Inventor
Hiroshi Abe
Current Assignee
Sony Corp
Original Assignee
Sony Corp
Priority date
Filing date
Publication date
Application filed by Sony Corp filed Critical Sony Corp
Assigned to SONY CORPORATION. Assignment of assignors interest (see document for details). Assignors: ABE, HIROSHI
Publication of US20100008546A1

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F 18/00 Pattern recognition
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06V IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V 10/00 Arrangements for image or video recognition or understanding
    • G06V 10/40 Extraction of image or video features
    • G06V 10/44 Local feature extraction by analysis of parts of the pattern, e.g. by detecting edges, contours, loops, corners, strokes or intersections; Connectivity analysis, e.g. of connected components
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 7/00 Image analysis
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06V IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V 40/00 Recognition of biometric, human-related or animal-related patterns in image or video data
    • G06V 40/10 Human or animal bodies, e.g. vehicle occupants or pedestrians; Body parts, e.g. hands
    • G06V 40/12 Fingerprints or palmprints
    • G06V 40/1365 Matching; Classification
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06V IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V 40/00 Recognition of biometric, human-related or animal-related patterns in image or video data
    • G06V 40/10 Human or animal bodies, e.g. vehicle occupants or pedestrians; Body parts, e.g. hands
    • G06V 40/14 Vascular patterns

Definitions

  • the present invention relates to a pattern identification method, registration device, verification device and program, and is preferably applied to biometrics authentication.
  • a blood vessel has been among the subjects of biometrics authentication.
  • a blood vessel image of a registrant is usually registered in an authentication device as registration data.
  • the authentication device makes a determination as to whether a person is the registrant according to how much verification data, which is input as data to be verified, resembles the registration data.
  • the authentication device may obtain a pattern (referred to as pseudo blood vessel pattern, hereinafter) that resembles a pattern of blood vessels (referred to as blood vessel pattern, hereinafter) because tubes inside the radish such as vessels, sieve tubes, and fascicles, look like the blood vessels of a living body: the use of radish or the like allows identity theft.
  • Patent Document 1: Japanese Patent Publication No. 2002-259345. Non-Patent Document 1: Tsutomu Matsumoto, “Biometrics Authentication for Financial Transaction,” [online], Apr. 15, 2005, the 9th study group of the Financial Services Agency for the issues on forged cash cards, (searched on Aug. 21, 2006), Internet <URL: http://www.fsa.go.jp/singi/singi_fccsg/gaiyou/f-20050415-singi_fccsg/02.pdf>
  • the coordinates and other factors of the pseudo blood vessel pattern cannot be exactly the same as those of the registrant's blood vessel pattern. So even if the above identity theft prevention method is applied, a person presenting a pseudo blood vessel pattern may still be identified as the registrant, allowing identity theft and lowering the accuracy of authentication.
  • the present invention has been made in view of the above points and is intended to provide a pattern identification method, registration device, verification device and program that can improve the accuracy of authentication.
  • a pattern identification method of the present invention includes the steps of: calculating, for each of living body's patterns obtained from a plurality of living body's samples, two or more form values representing the shape of the pattern; calculating the center of the distribution of the two or more form values and a value representing the degree of the spread from the center; calculating a distance between the two or more form values of a pattern obtained from those to be registered or to be compared with registered data and the center of the distribution of the two or more form values using the value representing the degree of the spread from the center; and disposing of the pattern if the distance is greater than a predetermined threshold.
  • this pattern identification method can recognize where the pattern obtained from those to be either registered or compared with the registered data exists in the distribution having a plurality of dimensions (pattern form values) regarding each living body's pattern, and whether it exists within a range extending from the center of the distribution to a boundary (threshold): existing inside the range means that it is a living body's pattern.
  • this pattern identification method can increase the possibility that it eliminates a pseudo pattern resembling the living body's pattern before registering or comparing them, even if the pattern obtained from those to be either registered or compared with the registered data is the pseudo pattern.
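The steps above can be illustrated with a minimal sketch: the center and spread are estimated per form-value dimension from many genuine samples, and a pattern is discarded when its spread-normalized distance from the center exceeds the threshold. The diagonal (per-dimension) spread and all names below are illustrative assumptions, not the patent's implementation:

```python
import math

def fit_distribution(samples):
    """Estimate, per form-value dimension, the center (mean) and the
    spread (standard deviation) over many genuine-pattern samples."""
    dims, n = len(samples[0]), len(samples)
    center = [sum(s[d] for s in samples) / n for d in range(dims)]
    spread = [
        math.sqrt(sum((s[d] - center[d]) ** 2 for s in samples) / n)
        for d in range(dims)
    ]
    return center, spread

def normalized_distance(form_values, center, spread):
    """Distance from the distribution center, scaled by the spread in
    each dimension (a Mahalanobis-like distance under a diagonal
    covariance assumption)."""
    return math.sqrt(sum(
        ((v - c) / s) ** 2 for v, c, s in zip(form_values, center, spread)
    ))

def accept_pattern(form_values, center, spread, threshold):
    """Keep the pattern only if it falls inside the genuine-pattern
    region; otherwise discard it as a suspected pseudo pattern."""
    return normalized_distance(form_values, center, spread) <= threshold
```

A pattern near the center of the genuine distribution passes the gate, while one far outside it is rejected before registration or comparison.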
  • a registration device of the present invention includes: storage means for storing, for each of living body's patterns obtained from a plurality of living body's samples, the center of the distribution of two or more form values representing the shape of the pattern and a value representing the degree of the spread from the center; calculation means for calculating a distance between the two or more form values of a pattern obtained from those to be registered and the center of the distribution of the two or more form values stored in the storage means using the value; and registration means for disposing of the pattern if the distance is greater than a predetermined threshold while registering the pattern in a storage medium if the distance is within the threshold.
  • this registration device can recognize where the pattern obtained from those to be registered exists in the distribution having a plurality of dimensions (pattern form values) regarding each living body's pattern, and whether it exists within a range extending from the center of the distribution to a boundary (threshold): existing inside the range means that it is a living body's pattern.
  • this registration device can increase the possibility that it eliminates a pseudo pattern resembling the living body's pattern before registering them, even if the pattern obtained from those to be registered is the pseudo pattern.
  • a verification device of the present invention includes: storage means for storing, for each of living body's patterns obtained from a plurality of living body's samples, the center of the distribution of two or more form values representing the shape of the pattern and a value representing the degree of the spread from the center; calculation means for calculating a distance between the two or more form values of a pattern obtained from those to be compared with registered data and the center of the distribution of the two or more form values stored in the storage means; and verification means for disposing of the pattern if the distance is greater than a predetermined threshold while comparing the pattern with registered data registered in a storage medium if the distance is within the threshold.
  • this verification device can recognize where the pattern obtained from those to be compared exists in the distribution having a plurality of dimensions (pattern form values) regarding each living body's pattern, and whether it exists within a range extending from the center of the distribution to a boundary (threshold): existing inside the range means that it is a living body's pattern.
  • this verification device can increase the possibility that it eliminates a pseudo pattern resembling the living body's pattern before comparing them, even if the pattern obtained from those to be compared is the pseudo pattern.
  • a program of the present invention causing a computer that stores, for each of living body's patterns obtained from a plurality of living body's samples, the center of the distribution of two or more form values representing the shape of the pattern and a value representing the degree of the spread from the center, executes: a first process of calculating a distance between the two or more form values of a pattern obtained from those to be registered and the center of the distribution of the two or more form values stored in the storage means using the value; and a second process of disposing of the pattern if the distance is greater than a predetermined threshold while registering the pattern in a storage medium if the distance is within the threshold, or a second process of disposing of the pattern if the distance is greater than a predetermined threshold while comparing the pattern with registered data registered in a storage medium if the distance is within the threshold.
  • this program can recognize where the pattern obtained from those to be either registered or compared with the registered data exists in the distribution having a plurality of dimensions (pattern form values) regarding each living body's pattern, and whether it exists within a range extending from the center of the distribution to a boundary (threshold): existing inside the range means that it is a living body's pattern.
  • this program can increase the possibility that it eliminates a pseudo pattern resembling the living body's pattern before registering or comparing them, even if the pattern obtained from those to be either registered or compared with the registered data is the pseudo pattern.
  • the present invention can recognize where the pattern obtained from those to be either registered or compared with the registered data exists in the distribution having a plurality of dimensions (pattern form values) regarding each living body's pattern, and whether it exists within a range extending from the center of the distribution to a boundary (threshold): existing inside the range means that it is a living body's pattern. Accordingly, it can increase the possibility that the pseudo pattern is eliminated before registration or comparison, by assuming that a pattern outside the range is not the living body's pattern.
  • in this way, a pattern identification method, registration device, verification device and program that are able to improve the accuracy of authentication can be realized.
  • FIG. 1 is a block diagram illustrating the configuration of a data generation device according to an embodiment of the present invention.
  • FIG. 2 is a functional block diagram illustrating the image process of a control section.
  • FIG. 3 is a schematic diagram illustrating images before and after a preprocessing process.
  • FIG. 4 is a schematic diagram as to a description of an emerging pattern of an end point, a diverging point, and an isolated point.
  • FIG. 5 is a schematic diagram illustrating a tracking of a blood vessel line between a diverging point and a diverging or end point.
  • FIG. 6 is a schematic diagram as to a description of a tracking of a blood vessel pixel.
  • FIG. 7 is a schematic diagram illustrating an emerging pattern of a point on a line and an inflection point.
  • FIG. 8 is a schematic diagram as to a description of the detection of an inflection point.
  • FIG. 9 is a schematic diagram as to a description of the determination of an overlap ratio of a segment's pixel with respect to an original blood vessel pixel.
  • FIG. 10 is a flowchart illustrating the procedure of a removal process.
  • FIG. 11 is a schematic diagram illustrating an inflection point before and after removal.
  • FIG. 12 is a schematic diagram illustrating the connection of segment blood vessel lines (three diverging points).
  • FIG. 13 is a schematic diagram illustrating the connection of segment blood vessel lines (four diverging points).
  • FIG. 14 is a schematic diagram illustrating characteristic points obtained from a characteristic point extraction process.
  • FIG. 15 is a schematic diagram illustrating a blood vessel pattern and a pseudo blood vessel pattern.
  • FIG. 16 is a schematic diagram as to the calculation of an angle of a segment with respect to a horizontal axis passing through the end point of the segment.
  • FIG. 17 is a schematic diagram illustrating an angle distribution of a blood vessel pattern.
  • FIG. 18 is a schematic diagram illustrating an angle distribution of a pseudo blood vessel pattern.
  • FIG. 19 is a schematic diagram illustrating the length of a segment resembling a straight line.
  • FIG. 20 is a schematic diagram illustrating the distribution of distinguishing indicators.
  • FIG. 21 is a schematic diagram illustrating the distribution of distinguishing indicators on a ⁇ -C plane.
  • FIG. 22 is a flowchart illustrating the procedure of a data generation process.
  • FIG. 23 is a block diagram illustrating the configuration of an authentication device according to an embodiment of the present invention.
  • FIG. 24 is a schematic diagram illustrating the procedure of a distinguishing process (1).
  • FIG. 25 is a schematic diagram illustrating the procedure of a distinguishing process (2).
  • An authentication system of the present embodiment includes a data generation device and an authentication device.
  • the data generation device generates data (referred to as blood vessel pattern range data, hereinafter) representing a range: a determination is to be made about blood vessel patterns based on this range.
  • the data generation device records the data in an internal memory of the authentication device.
  • the authentication device is equipped with a function that makes a determination as to whether a pattern of an image data obtained as a result of taking a picture of an object is a pseudo blood vessel pattern according to the blood vessel pattern range data.
  • FIG. 1 shows the configuration of the data generation device.
  • the data generation device 1 includes a control section 10 to which an operation section 11, an image pickup section 12, a flash memory 13, and an interface (referred to as external interface, hereinafter) 14 that exchanges data with an external section are connected via a bus 15.
  • the control section 10 is a microcomputer including a CPU (Central Processing Unit) that takes overall control of the data generation device 1, a ROM (Read Only Memory) that stores various programs and setting information, and a RAM (Random Access Memory) that serves as a work memory for the CPU.
  • an image pickup command COM 1 or a command COM 2 that orders the generation of the blood vessel pattern range data is given to the control section 10 from the operation section 11 .
  • based on the execution commands COM 1 and COM 2, the control section 10 makes a determination as to which mode it should start. Using a program corresponding to the determination, the control section 10 appropriately controls the image pickup section 12, the flash memory 13, and the external interface 14 to run in image pickup mode or data generation mode.
  • the control section 10 enters the image pickup mode, which is an operation mode, and controls the image pickup section 12 accordingly.
  • a drive control section 12 a of the image pickup section 12 drives and controls one or more near infrared beam sources LS that emit a near infrared beam toward a predetermined position of the data generation device 1 , and an image pickup element ID that is for example CCD (Charge Coupled Device).
  • after the emission of the near infrared beam to an object placed at the predetermined position, the image pickup element ID receives the near infrared beam from the object via an optical system OP and an aperture diaphragm DH, converts it into electric signals, and transmits them to the drive control section 12 a as image signals S 1.
  • the near infrared beam emitted from the near infrared beam source LS gets into the finger, and, after being reflected and scattered inside the finger, emerges from the finger as a blood vessel representation beam to enter the image pickup element ID: the blood vessel representation beam represents the finger's blood vessels.
  • the blood vessel representation beam is then transmitted to the drive control section 12 a as the image signals S 1 .
  • the drive control section 12 a adjusts the position of an optical lens of the optical system OP according to the pixel values of the image signals S 1 , so that the object is in focus.
  • the drive control section 12 a also adjusts the aperture of the aperture diaphragm DH so that the amount of light entering the image pickup element ID becomes appropriate. After the adjustment, an image signal S 2 output from the image pickup element ID is supplied to the control section 10 .
  • the control section 10 performs a predetermined image process on the image signals S 2 to extract the characteristics of an object pattern from the image, and stores the extracted image in the flash memory 13 as image data D 1.
  • in this manner, the control section 10 performs the image pickup mode.
  • the image process can be divided into a preprocessing section 21 and a characteristic point extraction section 22 .
  • the following provides a detailed description of the preprocessing section 21 and the characteristic point extraction section 22 .
  • the image signals S 2 supplied from the image pickup section 12 are those obtained as a result of taking a picture of a living body's finger.
  • the preprocessing section 21 sequentially performs an A/D (Analog/Digital) conversion process, a predetermined outline extraction process such as Sobel filtering, a predetermined smoothing process such as Gaussian filtering, a binarization process, and a thinning process for the image signals S 2 supplied from the image pickup section 12 .
  • when an image (the image signals S 2) shown in FIG. 3(A) is input into the preprocessing section 21, the preprocessing by the preprocessing section 21 converts it into the image shown in FIG. 3(B), with the blood vessel pattern of the image emphasized.
  • the preprocessing section 21 outputs data (referred to as image data, hereinafter) D 21 whose image shows the extracted blood vessel pattern to the characteristic point extraction section 22 .
  • the blood vessel lines (the blood vessel pattern) included in the image of the image data D 21 are converted by the binarization process into white pixels; as a result of the thinning process, their width (or thickness) is represented as “1,” that is, one pixel.
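The preprocessing chain (outline extraction, smoothing, binarization) might be sketched roughly in NumPy as follows; the kernels, the threshold value, and the hand-rolled convolution are illustrative assumptions, and the final thinning step that reduces lines to one-pixel width is omitted for brevity:

```python
import numpy as np

def convolve2d(img, kernel):
    """Minimal same-size 2-D convolution with zero padding."""
    kh, kw = kernel.shape
    ph, pw = kh // 2, kw // 2
    padded = np.pad(img, ((ph, ph), (pw, pw)))
    out = np.zeros(img.shape, dtype=float)
    for y in range(img.shape[0]):
        for x in range(img.shape[1]):
            out[y, x] = np.sum(padded[y:y + kh, x:x + kw] * kernel)
    return out

def preprocess(img, threshold=128):
    """Outline extraction (Sobel), smoothing (Gaussian), binarization.
    Thinning of the vessel lines to one-pixel width is not shown."""
    sobel_x = np.array([[-1, 0, 1], [-2, 0, 2], [-1, 0, 1]], dtype=float)
    gx = convolve2d(img, sobel_x)
    gy = convolve2d(img, sobel_x.T)
    edges = np.hypot(gx, gy)                      # gradient magnitude
    gauss = np.array([[1, 2, 1], [2, 4, 2], [1, 2, 1]], dtype=float) / 16.0
    smooth = convolve2d(edges, gauss)
    return (smooth >= threshold).astype(np.uint8)  # white (1) = vessel pixel
```

Running this on an 8-bit image yields a binary image in which the emphasized outline pixels are white, matching the description above.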
  • the characteristic point extraction section 22 detects end points, diverging points, and inflection points from the white pixels (referred to as blood vessel pixels, hereinafter) that constitute a blood vessel pattern of the input image, and appropriately removes the inflection points with reference to the end points and the diverging points.
  • the characteristic point extraction section 22 detects the end and diverging points from the blood vessel lines in the first stage of the process.
  • the characteristic point extraction section 22 recognizes the blood vessel pixels as attention pixels in a predetermined order, and examines the eight pixels around the attention pixel to count the number of the blood vessel pixels.
  • FIG. 4 shows a pattern of how the end, diverging and isolated points of the blood vessel lines appear.
  • a hatched area represents the attention pixel; a black area represents the blood vessel pixel (the white pixel) for ease of explanation. It is obvious from FIG. 4 that if the width of the blood vessel line is represented as one pixel, the correlation between the type of the attention pixel and the number of surrounding blood vessel pixels is uniquely determined; as for the diverging pattern, it must have three or four branches.
  • if the count is one, the characteristic point extraction section 22 detects the attention pixel as the end point; if the count is three or four, it detects the attention pixel as the diverging point; if the count is zero, it detects the attention pixel as the isolated point.
  • the characteristic point extraction section 22 removes the isolated points, which do not constitute the blood vessel line, from the detected end, diverging and isolated points.
  • in this way, the characteristic point extraction section 22 detects the end and diverging points from the blood vessel lines in the first stage of the process.
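The first-stage detection can be sketched as a neighbour count over a width-1 binary skeleton; the list-of-lists image representation and the function name below are illustrative assumptions:

```python
def classify_points(img):
    """Classify each blood-vessel pixel (value 1) of a width-1 binary
    skeleton by counting vessel pixels among its eight neighbours:
    0 -> isolated point, 1 -> end point, 3 or 4 -> diverging point."""
    h, w = len(img), len(img[0])
    end_pts, diverging_pts, isolated_pts = [], [], []
    for y in range(h):
        for x in range(w):
            if not img[y][x]:
                continue
            n = sum(
                img[y + dy][x + dx]
                for dy in (-1, 0, 1) for dx in (-1, 0, 1)
                if (dy or dx) and 0 <= y + dy < h and 0 <= x + dx < w
            )
            if n == 0:
                isolated_pts.append((y, x))
            elif n == 1:
                end_pts.append((y, x))
            elif n in (3, 4):
                diverging_pts.append((y, x))
    return end_pts, diverging_pts, isolated_pts
```

The isolated points returned here would then be discarded, as in the description above, since they do not constitute a blood vessel line.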
  • the characteristic point extraction section 22 detects the inflection points in the second stage of the process.
  • the characteristic point extraction section 22 recognizes the diverging point DP 1 as a starting point, and other characteristic points (the ending points EP 1 and EP 2 , and the diverging point DP 2 ), which appear after the starting point (or the diverging point DP 1 ), as a terminal point; it then tracks a segment of the blood vessel line (referred to as segment blood vessel line, hereinafter) extending from the starting point to the terminal point.
  • the characteristic point extraction section 22 recognizes the diverging point DP 2 as a starting point, and other characteristic points (the ending points EP 3 and EP 4 ), which appear after the starting point (or the diverging point DP 2 ), as a terminal point; it then tracks the segment blood vessel line.
  • the starting points are the diverging points DP 1 and DP 2 , but the end points can also be the starting points.
  • the end points can only be either starting or terminal points, while there is another diverging or end point before or after (or at both sides of) a diverging point, regardless of whether it is the starting or terminal point.
  • FIG. 6 illustrates a specific method of tracking.
  • the characteristic point extraction section 22 sequentially tracks the blood vessel pixels of the segment blood vessel line from the starting point to the terminal point by performing a process of excluding the previous attention pixel (a pixel filled with horizontal lines) from the blood vessel pixels around the current attention pixel (a hatched pixel) and choosing from them the next attention pixel until the blood vessel pixels around the current attention pixel include the terminal point.
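The tracking loop above might be sketched as follows, assuming a width-1 line so that excluding the previous attention pixel always leaves exactly one way forward (the function and argument names are illustrative):

```python
def track_segment(img, start, terminal_points):
    """Walk a width-1 vessel line pixel by pixel from `start`, excluding
    the previous attention pixel when choosing the next one, until a
    terminal point appears among the current pixel's eight neighbours."""
    h, w = len(img), len(img[0])
    terminals = set(terminal_points)
    path = [start]
    prev, cur = None, start
    while True:
        vessel = [
            (cur[0] + dy, cur[1] + dx)
            for dy in (-1, 0, 1) for dx in (-1, 0, 1)
            if (dy or dx)
            and 0 <= cur[0] + dy < h and 0 <= cur[1] + dx < w
            and img[cur[0] + dy][cur[1] + dx]
            and (cur[0] + dy, cur[1] + dx) != prev
        ]
        hit = [p for p in vessel if p in terminals]
        if hit:
            path.append(hit[0])      # terminal reached: tracking ends
            return path
        prev, cur = cur, vessel[0]   # width-1 line: one way forward
        path.append(cur)
```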
  • FIG. 7 shows a pattern of how the points on the line and the inflection points appear.
  • a hatched area represents the attention pixel;
  • a black area represents the blood vessel pixel (the white pixel) for ease of explanation.
  • when such an inflection pattern appears, the characteristic point extraction section 22 detects the current attention pixel as the inflection point (a pixel hatched in a grid-like pattern).
  • after reaching the terminal point, the characteristic point extraction section 22 recognizes the series of characteristic points extending from the starting point of the segment blood vessel line to the terminal point as one group.
  • the characteristic point extraction section 22 detects the inflection points of each segment blood vessel line extending from one diverging or end point to the next diverging or end point in the second stage of the process.
  • the characteristic point extraction section 22 recognizes the group of the characteristic points, or the series of the characteristic points extending from the segment blood vessel line's starting point to the terminal point, as one processing unit (referred to as segment blood vessel constituting-points row, hereinafter), and removes the inflection points from the segment blood vessel line.
  • a square area represents a pixel (referred to as original blood vessel pixel, hereinafter) constituting the original blood vessel line; a hatched area represents the end or diverging point of the original blood vessel pixel.
  • in the segment blood vessel constituting-points row, there are the original blood vessel pixels from the characteristic point (referred to as reference point, hereinafter) GP bs, which was selected as a point of reference, to the removal candidate points GP cd (GP cd1 to GP cd3); there are also the segments SG (SG 1 to SG 3) extending from the reference point GP bs to the removal candidate points GP cd.
  • the characteristic point extraction section 22 counts the number of the segment SG's pixels (referred to as segment pixels, hereinafter) overlapped with the original blood vessel pixels, and gradually moves the removal candidate point GP cd toward the terminal point until the ratio of the number of the overlapped pixels to the number of pixels existing between the reference point GP bs and the removal candidate point GP cd becomes less than a predetermined threshold (referred to as overlap ratio threshold).
  • four of the segment pixels (seven pixels) of the segment SG 2 overlap the original blood vessel pixels (seven pixels) existing between the reference point GP bs and the corresponding removal candidate point GP cd2, which means that the overlap ratio is “4/7.”
  • two of the segment pixels (nine pixels) of the segment SG 3 overlap the original blood vessel pixels (nine pixels) existing between the reference point GP bs and the corresponding removal candidate point GP cd3, which means that the overlap ratio is “2/9.”
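The overlap-ratio computation can be sketched with a Bresenham rasterization of the straight segment; here `original_pixels` stands for the set of original blood vessel pixels existing between the two points (inclusive), and the (y, x) coordinate convention is an assumption:

```python
def bresenham(p0, p1):
    """Pixels of the straight segment from p0 to p1 (inclusive)."""
    (y0, x0), (y1, x1) = p0, p1
    dy, dx = abs(y1 - y0), abs(x1 - x0)
    sy = 1 if y1 >= y0 else -1
    sx = 1 if x1 >= x0 else -1
    err = dx - dy
    pts, y, x = [], y0, x0
    while True:
        pts.append((y, x))
        if (y, x) == (y1, x1):
            break
        e2 = 2 * err
        if e2 > -dy:
            err -= dy
            x += sx
        if e2 < dx:
            err += dx
            y += sy
    return pts

def overlap_ratio(reference, candidate, original_pixels):
    """Ratio of straight-segment pixels that coincide with the original
    blood vessel pixels between the reference and candidate points."""
    segment = bresenham(reference, candidate)
    overlap = sum(1 for p in segment if p in original_pixels)
    return overlap / len(original_pixels)
```

A segment lying exactly on the original line yields a ratio of 1.0; a segment that shortcuts a bend yields a lower ratio, as in the “4/7” and “2/9” examples above.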
  • in this case, the characteristic point extraction section 22 removes the characteristic point GP cd1 between the reference point GP bs and the characteristic point that was selected as the removal candidate point GP cd2 immediately before the removal candidate point (characteristic point) GP cd3. Accordingly, even if the characteristic point GP cd1 is removed, the segment SG 2 extending from the reference point GP bs to the remaining characteristic point GP cd2 can substantially represent the original blood vessel line.
  • if the overlap ratio threshold is too small, the characteristic point GP cd1 may be removed even when the segment SG 3 does not resemble the series of original blood vessel pixels (a segment blood vessel line) extending from the reference point GP bs to the removal candidate point GP cd3; if the overlap ratio threshold is too large, the characteristic point GP cd1 may be left.
  • therefore, if the length of the segment is greater than or equal to a predetermined segment length threshold, a first overlap ratio threshold is set; if it is less than the segment length threshold, a second overlap ratio threshold, which is larger than the first overlap ratio threshold, is set.
  • FIG. 10 shows a procedure of this removal process.
  • the characteristic point extraction section 22 selects the starting point of the segment blood vessel constituting-points row as the reference point, and selects the first characteristic point from the reference point as the removal candidate point (step SP 31).
  • the characteristic point extraction section 22 makes a determination as to whether this is a case in which it calculates the overlap ratio for the first time after starting the removal process of the inflection points, or a case in which the length of the previous segment GP J+(α−1)−GP J+α, which appeared immediately before the segment GP J−GP J+α extending from the current reference point GP J to the removal candidate point GP J+α, is less than the segment length threshold (step SP 32).
  • the characteristic point extraction section 22 sets the first overlap ratio threshold as the overlap ratio threshold (step SP 33), calculates the overlap ratio of the current segment GP J−GP J+α extending from the reference point GP J to the removal candidate point GP J+α with respect to the original blood vessel pixels (step SP 34), and makes a determination as to whether this overlap ratio is greater than or equal to the first overlap ratio threshold (step SP 35).
  • the characteristic point extraction section 22 sets the second overlap ratio threshold as the overlap ratio threshold (step SP 36), calculates the overlap ratio of the current segment GP J−GP J+α extending from the reference point GP J to the removal candidate point GP J+α with respect to the original blood vessel pixels (step SP 34), and makes a determination as to whether this overlap ratio is greater than or equal to the second overlap ratio threshold (step SP 35).
  • if the overlap ratio is greater than or equal to the overlap ratio threshold, this means that the current segment GP J−GP J+α extending from the reference point GP J to the removal candidate point GP J+α resembles, or is the same as, the original blood vessel line extending from the reference point GP J to the removal candidate point GP J+α.
  • the characteristic point extraction section 22 makes a determination as to whether the current removal candidate point GP J+α is the terminal point of the segment blood vessel constituting-points row (step SP 37); if it is not the terminal point, the characteristic point extraction section 22 selects the next characteristic point, which is closer to the terminal point than the current removal candidate point GP J+α is, as a new removal candidate point GP J+α (step SP 38) before returning to the above-described process (step SP 32).
  • if the overlap ratio is less than the overlap ratio threshold, this means that the current segment GP J−GP J+α extending from the reference point GP J to the removal candidate point GP J+α is completely different from the original blood vessel line extending from the reference point GP J to the removal candidate point GP J+α.
  • the characteristic point extraction section 22 removes all the characteristic points between the characteristic point, which was selected as the removal candidate point GP J+α immediately before the current one, and the current reference point (characteristic point) GP J (step SP 39).
  • the characteristic point extraction section 22 makes a determination as to whether the current removal candidate point GP J+ ⁇ is the terminal point of the segment blood vessel constituting-points row (step SP 40 ); if it is not the terminal point, the characteristic point extraction section 22 selects the current removal candidate point GP J+ ⁇ as the reference point GP J and the next characteristic point, which is closer to the terminal point than the reference point GP J is, as a new removal candidate point GP J+ ⁇ (step SP 41 ) before returning to the above-noted process (step SP 32 ).
  • the characteristic point extraction section 22 removes all the characteristic points between the current removal candidate point (characteristic point) GP J+ ⁇ and the current reference point (characteristic point) GP J (step SP 42 ) before ending this removal process of the inflection points.
  • the characteristic point extraction section 22 performs the removal process of the inflection points.
  • FIG. 11 shows the characteristic points before and after the removal process.
  • the segment length threshold for the removal process is 5 [mm]; the first overlap ratio threshold is 0.5 (50[%]); the second overlap ratio threshold is 0.7 (70[%]).
  • a square area represents an original blood vessel pixel; a circular area represents a pixel constituting the segment; a hatched area represents an end or inflection point of the original blood vessel line.
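the removal process above (steps SP 32 to SP 42 ) can be sketched in Python as follows. This is a simplified greedy illustration, not the exact two-threshold procedure of the embodiment: `overlap_ratio`, its pixel-sampling scheme, and the single 0.7 threshold are assumptions made for the sketch.

```python
def overlap_ratio(p0, p1, vessel_pixels):
    """Fraction of the sampled pixels on the straight segment p0-p1 that
    coincide with the original blood vessel pixels (assumed sampling)."""
    (x0, y0), (x1, y1) = p0, p1
    n = max(abs(x1 - x0), abs(y1 - y0), 1)
    hits = 0
    for i in range(n + 1):
        x = round(x0 + (x1 - x0) * i / n)
        y = round(y0 + (y1 - y0) * i / n)
        hits += (x, y) in vessel_pixels
    return hits / (n + 1)

def remove_inflection_points(points, vessel_pixels, threshold=0.7):
    """Greedy sketch of the removal process: extend the removal candidate
    along the constituting-points row while the straight segment from the
    reference point still overlaps the vessel line by at least `threshold`;
    the characteristic points skipped over are removed."""
    if len(points) < 3:
        return list(points)
    kept = [points[0]]
    ref, cand = 0, 1
    while cand < len(points):
        good = overlap_ratio(points[ref], points[cand], vessel_pixels) >= threshold
        if good and cand < len(points) - 1:
            cand += 1                      # try to extend the segment further
        elif good:                         # terminal point reached, still good
            kept.append(points[cand])
            break
        else:                              # overlap broke: keep previous point
            keep = max(cand - 1, ref + 1)
            kept.append(points[keep])
            ref, cand = keep, keep + 1
    return kept
```

with a straight vessel line every intermediate characteristic point is removed, while a right-angle corner survives as an inflection point.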
  • the characteristic point extraction section 22 chooses, from among three or four segment blood vessel lines extending from the diverging point on the blood vessel line, the two segment blood vessel lines that, if combined, resemble a straight line, and connects them as one segment blood vessel line, thereby removing the starting or terminal point, which was the end point of the two segment blood vessel lines.
  • since the width of the blood vessel line is one pixel, the number of segment blood vessel lines extending from the diverging point must be three or four, as described above with reference to FIG. 4 .
  • the characteristic point extraction section 22 calculates the cosines (cos (θ A-B ), cos (θ A-C ), cos (θ B-C )) of the crossing angles θ A-B , θ A-C , and θ B-C of each pair of the segment blood vessel lines PBL A , PBL B , and PBL C .
  • the characteristic point extraction section 22 recognizes the pair of the segment blood vessel lines' segment blood vessel constituting-points rows GP A1 , GP A2 , . . . , GP A-END and GP B1 , GP B2 , . . . , GP B-END corresponding to the cosine cos (θ A-B ); recognizes both ends of these segment blood vessel constituting-points rows; regards the points GP A-END and GP B-END , which are not overlapped with each other, as the starting and end points; and recognizes the characteristic points between the starting and end points as one group.
  • the pair of the segment blood vessel lines PBL A and PBL B is combined.
  • the number of points in the segment blood vessel constituting-points row GP AB-first , . . . , GP AB10 , GP AB11 , GP AB12 , . . . , GP AB-end of the combined segment blood vessel line PBL AB is one less than the total number of points in the pair of the segment blood vessel lines' segment blood vessel constituting-points rows before they are combined.
  • if no pair of the segment blood vessel lines has a crossing angle whose cosine is less than the second cosine threshold, the characteristic point extraction section 22 does not recognize any group. If there are other diverging points left, the characteristic point extraction section 22 recognizes the next diverging point as a processing target; if not, the characteristic point extraction section 22 ends the process.
  • the characteristic point extraction section 22 calculates the cosines (cos (θ A-B ), cos (θ A-C ), cos (θ A-D ), cos (θ B-C ), cos (θ B-D ), cos (θ C-D )) of the crossing angles θ A-B , θ A-C , θ A-D , θ B-C , θ B-D and θ C-D of each pair of the segment blood vessel lines PBL A , PBL B , PBL C and PBL D .
  • the characteristic point extraction section 22 recognizes the pair of the segment blood vessel lines' segment blood vessel constituting-points rows GP B1 , GP B2 , . . . , GP B-END and GP D1 , GP D2 , . . . , GP D-END corresponding to the cosine cos (θ B-D ); recognizes both ends of these segment blood vessel constituting-points rows; regards the points GP B-END and GP D-END , which are not overlapped with each other, as the starting and end points; and recognizes the characteristic points between the starting and end points as one group.
  • the pair of the segment blood vessel lines PBL B and PBL D is combined.
  • the number of points in the segment blood vessel constituting-points row GP BD-first , . . . , GP BD10 , GP BD11 , GP BD12 , . . . , GP BD-end of the combined segment blood vessel line PBL BD is one less than the total number of points in the pair of the segment blood vessel lines' segment blood vessel constituting-points rows before they are combined.
  • the characteristic point extraction section 22 transforms the segment blood vessel constituting-points rows of the segment blood vessel lines PBL A and PBL C into one segment blood vessel constituting-points row GP AC-first , . . . , GP AC10 , GP AC11 , GP AC12 , . . . , GP AC-end in the same way as it has done for the segment blood vessel constituting-points rows of the segment blood vessel lines PBL B and PBL D ; and removes one of the starting points GP A1 and GP C1 , which were the end points of the original segment blood vessel constituting-points rows.
  • if no pair of the segment blood vessel lines has a crossing angle whose cosine is less than the second cosine threshold, the characteristic point extraction section 22 does not recognize any group. If there are other diverging points left, the characteristic point extraction section 22 recognizes the next diverging point as a processing target; if not, the characteristic point extraction section 22 ends the process.
  • the overlapping points have the same positional (or coordinate) information. But since each belongs to a different group, they are distinguished for ease of explanation.
  • the characteristic point extraction section 22 recognizes the blood vessel lines extending from the diverging points on the blood vessel line; recognizes the pair of the blood vessel lines whose crossing angle's cosine is less than the second cosine threshold; and combines the segment blood vessel lines' segment blood vessel constituting-points rows into one segment blood vessel constituting-points row, thereby removing either the starting or terminal point, which was the end point of the pair of the segment blood vessel constituting-points rows.
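the pairing step above can be sketched as follows; representing each segment blood vessel line by its first constituting point after the diverging point, the function name `straightest_pair` and the cosine threshold of −0.9 are illustrative assumptions (the embodiment's second cosine threshold value is not given here).

```python
import math

def straightest_pair(diverging_point, first_points, cosine_threshold=-0.9):
    """For the segment blood vessel lines leaving `diverging_point` (each
    represented by its first constituting point after the diverging point),
    return the index pair whose outgoing directions are closest to opposite,
    i.e. whose crossing angle's cosine is smallest; None if no pair falls
    below `cosine_threshold`."""
    dirs = []
    for (x, y) in first_points:
        dx, dy = x - diverging_point[0], y - diverging_point[1]
        n = math.hypot(dx, dy)
        dirs.append((dx / n, dy / n))
    best, best_cos = None, 1.0
    for i in range(len(dirs)):
        for j in range(i + 1, len(dirs)):
            c = dirs[i][0] * dirs[j][0] + dirs[i][1] * dirs[j][1]
            if c < best_cos:
                best, best_cos = (i, j), c
    return best if best_cos < cosine_threshold else None
```

two lines whose outgoing directions are nearly opposite (cosine near −1) combine into a nearly straight line, which is exactly the pair the section connects into one segment blood vessel line.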
  • the characteristic point extraction section 22 detects the end, diverging and inflection points (the first and second stages); and extracts, from among these points, the blood vessel lines' characteristic points on a group (segment blood vessel line row) basis, with each group being based on the end and diverging points, so that a line passing through the characteristic points resembles both a blood vessel line and a straight line (the third and fourth stages).
  • the characteristic point extraction process of the characteristic point extraction section 22 extracts the characteristic points from the image, as shown in FIG. 14 , so that a line passing through the characteristic points resembles both a blood vessel line and a straight line.
  • the characteristic point extraction section 22 stores the data (the image data D 1 ) of the image of the extracted characteristic points in the flash memory 13 .
  • the control section 10 starts a data generation process using these image data sets D 1 i.
  • the pseudo blood vessel pattern is obtained as a result of taking a picture of a gummi candy (an elastic snack, like rubber, made of gelatin, sugar, and thick malt syrup) or a radish.
  • FIG. 15 shows the blood vessel pattern obtained from a living body's finger and the pseudo blood vessel patterns obtained from the gummi candy and the radish. As shown in FIG. 15 , the blood vessel pattern ( FIG. 15(A) ) and the pseudo blood vessel patterns ( FIG. 15(B) ) look like the same pattern overall.
  • the distribution of the angles of the image's horizontal direction with respect to the segments connecting the characteristic points of the pattern is represented with the length of the segment (the number of pixels constituting the segment) used as frequency.
  • for the blood vessel pattern, the concentration is observed at the 90-degree point and around it; as for the pseudo blood vessel pattern ( FIG. 18(A) ) obtained from the gummi candy and the pseudo blood vessel pattern ( FIG. 18(B) ) obtained from the radish, the distribution spreads between 0 and 180 degrees, showing a lack of regularity. This is because the blood vessel pattern does not spread but has certain directivity (along the length of the finger).
  • the blood vessel pattern's segments resembling a straight line ( FIG. 19 (A)) are longer than those of the pseudo blood vessel pattern ( FIG. 19(B) ) obtained from the gummi candy and the pseudo blood vessel pattern ( FIG. 19(C) ) obtained from the radish. Therefore, the number of the segments (the segment blood vessel lines) recognized as groups by the above characteristic point extraction process is less than that of the pseudo blood vessel patterns.
  • the distinguishing indicators of the blood vessel pattern and the pseudo blood vessel pattern may be: first, the spread of the angle distribution; second, the intensity of the angle distribution at the 90-degree point and around it; and, third, the number of the segments recognized as groups.
  • the spread of the angle distribution can be represented by the variance of the distribution (or standard deviation).
  • the angles of the image's horizontal direction with respect to the segments are represented by θ K
  • the length of the segments is represented by L K
  • the average of the distribution of the angles θ K of the segments l K is represented, with the lengths L K of the segments used as weights, as follows:

      θ AVE = Σ K ( L K ×θ K ) / Σ K L K

  • the intensity of the distribution can be represented by a ratio of the size of the distribution existing within a predetermined angular range around the 90-degree point to the size of the total distribution. This means that if the angular range is from lower [degree] to upper [degree] and the size of the total distribution is S, the intensity of the distribution P is represented as follows:

      P = ( Σ lower≤θ K ≤upper L K ) / S
  • the number of segments recognized as groups is the number of groups allocated after the above characteristic point extraction process, i.e. the number of the remaining groups (the segment blood vessel constituting-points rows) after the characteristic point extraction section 22 's inflection point detection process of recognizing the rows of the characteristic points (the segment blood vessel constituting-points rows) extending from the starting points through the inflection points to the terminal points and combining the groups (the segment blood vessel constituting-points rows) as one group so that it resembles a straight line.
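the first two distinguishing indicators can be sketched as follows; `angle_indicators`, the use of Euclidean length as a stand-in for the pixel count, and the 80-100 degree angular range are assumptions for illustration.

```python
import math

def angle_indicators(segments, lower=80.0, upper=100.0):
    """Computes the length-weighted mean and variance of the segment angles
    measured against the image's horizontal axis in [0, 180) degrees, and
    the intensity P: the weighted share of the distribution inside the
    angular range [lower, upper] around the 90-degree point."""
    angles, lengths = [], []
    for (x0, y0), (x1, y1) in segments:
        angles.append(math.degrees(math.atan2(y1 - y0, x1 - x0)) % 180.0)
        lengths.append(math.hypot(x1 - x0, y1 - y0))  # stand-in for pixel count
    total = sum(lengths)
    mean = sum(l * a for l, a in zip(lengths, angles)) / total
    var = sum(l * (a - mean) ** 2 for l, a in zip(lengths, angles)) / total
    intensity = sum(l for l, a in zip(lengths, angles) if lower <= a <= upper) / total
    return mean, var, intensity
```

a pattern with strong directivity along the finger (angles clustered near 90 degrees) yields a small variance and a large intensity, matching the tendencies described above.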
  • FIG. 20 shows the result of distinguishing between the blood vessel pattern and the pseudo blood vessel pattern obtained from the gummi candy using the three distinguishing indicators.
  • the lightly plotted points are those obtained from the pseudo blood vessel pattern of the gummi candy; the number of samples is 635.
  • the darkly plotted points are those obtained from the blood vessel pattern, which is selected from the five blood vessel patterns generated as a result of taking a picture of a finger five times: the selected blood vessel pattern has the greatest Mahalanobis distance from the center of the distribution of the lightly plotted points, and the number of samples is 127.
  • Rf G represents a boundary (referred to as pseudo blood vessel boundary, hereinafter) based on which the pseudo blood vessel pattern is determined; specifically, its Mahalanobis distance from the center of the distribution of the lightly plotted points is 2.5.
  • Rf F represents a boundary (referred to as blood vessel boundary, hereinafter) based on which the blood vessel pattern is determined; specifically, its Mahalanobis distance from the center of the distribution of the darkly plotted points is 2.1.
  • the blood vessel pattern can substantially be distinguished from the pseudo blood vessel pattern; as far as the σ-C plane of the three-dimensional distribution of FIG. 20 is concerned, the blood vessel pattern can be completely distinguished from the pseudo blood vessel pattern, as shown in FIG. 21 .
  • the spread of the angle distribution is represented by the standard deviation.
  • the data generation process is performed according to a flowchart shown in FIG. 22 .
  • the control section 10 reads out a plurality of samples of the image data sets D 1 i from the flash memory 13 , and calculates the three distinguishing indicators for each blood vessel pattern of the image data sets D 1 i (i.e. the variance of the angle distribution, the intensity of the angle distribution, and the number of the segments recognized as groups) (a loop of step SP 1 to SP 5 ).
  • the control section 10 then forms a matrix in which each sample's blood vessel pattern and the blood vessel pattern's distinguishing indicators are expressed in columns and rows respectively, where:
  • σ represents the variance of the angle distribution
  • P represents the intensity of the angle distribution
  • C represents the number of the segments recognized as groups (step SP 6 ).
  • the control section 10 calculates from the matrix of the distinguishing indicators the center of the distribution of the distinguishing indicators of each sample as follows (step SP 7 ):
  • the covariance matrix represents the degree of the spread of the distribution of the distinguishing indicators of each sample; its inverse matrix is used for the calculation of the Mahalanobis distance.
  • the control section 10 generates the blood vessel pattern range data (which are data representing a range for which the determination of the blood vessel pattern should be made) by using the center of the distribution of the distinguishing indicators, which was calculated at step SP 7 , the inverse matrix of the covariance matrix, which was calculated at step SP 8 , and a predetermined blood vessel boundary number (whose Mahalanobis distance is “2.1” in the case of FIG. 20 ) (step SP 9 ); stores the data in the internal memory of the authentication device (step SP 10 ); and then ends the data generation process.
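steps SP 7 and SP 8 amount to a mean vector and a sample covariance matrix over the indicator rows; a minimal sketch (function names are assumptions):

```python
def distribution_center(samples):
    """Center of the distribution (step SP 7): the mean of each indicator
    column over all sample rows [variance, intensity, group count]."""
    n = len(samples)
    return [sum(row[k] for row in samples) / n for k in range(len(samples[0]))]

def covariance_matrix(samples):
    """Sample covariance matrix of the indicator rows (step SP 8); its
    inverse is what the Mahalanobis distance calculation consumes."""
    n = len(samples)
    c = distribution_center(samples)
    d = len(c)
    return [[sum((row[i] - c[i]) * (row[j] - c[j]) for row in samples) / (n - 1)
             for j in range(d)] for i in range(d)]
```

the blood vessel pattern range data then bundle this center, the inverse of the covariance matrix, and the blood vessel boundary number.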
  • the control section 10 uses the following tendencies as the distinguishing indicators for the blood vessel pattern and the pseudo blood vessel pattern to generate the data (the center of the distribution of the distinguishing indicators, the inverse matrix of the covariance matrix, and the blood vessel boundary number) representing the range for which the determination of the blood vessel pattern should be made: the tendency that the blood vessel pattern does not spread but has certain directivity (along the length of the finger), and the tendency that of all the segments of the blood vessel pattern, the one resembling a straight line is longer than the others.
  • FIG. 23 illustrates the configuration of the authentication device.
  • the authentication device 2 includes a control section 30 to which an operation section 31 , an image pickup section 32 , a flash memory 33 , an external interface 34 and a notification section 35 are connected via a bus 36 .
  • the control section 30 is a microcomputer including a CPU that takes overall control of the authentication device 2 , a ROM that stores various programs and setting information, and a RAM that serves as a work memory for the CPU.
  • the blood vessel pattern range data generated by the data generation device 1 are stored in ROM.
  • an execution command COM 10 of a mode (referred to as blood vessel registration mode, hereinafter) in which the blood vessels of a registration-target user (referred to as registrant, hereinafter) are registered or an execution command COM 20 of a mode (referred to as authentication mode, hereinafter) in which a determination as to whether a person is the registrant or not is made is given to the control section 30 from the operation section 31 .
  • based on the execution commands COM 10 and COM 20 , the control section 30 makes a determination as to which mode it should start. Using a program corresponding to the determination, the control section 30 appropriately controls the image pickup section 32 , the flash memory 33 , the external interface 34 and the notification section 35 to run in blood vessel registration mode or authentication mode.
  • the control section 30 enters the blood vessel registration mode, which is an operation mode, to control the image pickup section 32 .
  • the image pickup section 32 drives and controls a near infrared beam source LS and an image pickup element ID.
  • the image pickup section 32 also adjusts the position of an optical lens of an optical system OP and the aperture of an aperture diaphragm DH based on an image signal S 10 a that the image pickup element ID output as a result of taking a picture of an object put at a predetermined position of the authentication device 2 . After the adjustment, the image pickup section 32 supplies an image signal S 20 a output from the image pickup element ID to the control section 30 .
  • the control section 30 sequentially performs the same preprocessing process and characteristic point extraction process as those of the preprocessing section 21 and characteristic point extraction section 22 ( FIG. 2 ) of the data generation device 1 for the image signals S 20 a, in order to extract an object pattern from the image and to extract a series of characteristic points on group (segment blood vessel constituting-points row) basis, which extends from the starting point to the terminal point via the inflection point.
  • the control section 30 performs a process (referred to as distinguishing process, hereinafter) to distinguish the object pattern as a blood vessel pattern or a pseudo blood vessel pattern; if it recognizes the object pattern as a blood vessel pattern, the control section 30 stores the characteristic points of the object pattern in the flash memory 33 as information (referred to as registrant identification data, hereinafter) DIS, which will be used for identifying the registrant, thereby completing the registration.
  • the control section 30 performs the blood vessel registration mode in this way.
  • the control section 30 then determines whether it should perform the authentication mode. If the determination by the control section 30 is that it should perform the authentication mode, the control section 30 enters the authentication mode and controls the image pickup section 32 in a similar way to when it performs the blood vessel registration mode.
  • the image pickup section 32 drives and controls the near infrared beam source LS and the image pickup element ID.
  • the image pickup section 32 also adjusts the position of the optical lens of the optical system OP and the aperture of the aperture diaphragm DH based on an image signal S 10 b that the image pickup element ID output. After the adjustment, the image pickup section 32 supplies an image signal S 20 b output from the image pickup element ID to the control section 30 .
  • the control section 30 sequentially performs the same preprocessing process and characteristic point extraction process as those of the above-described blood vessel registration mode for the image signals S 20 b and reads out the registrant identification data DIS from the flash memory 33 , in which the data DIS has been registered.
  • the control section 30 performs the same distinguishing process as that of the above-described blood vessel registration mode; if it distinguishes an object pattern extracted from the image signals S 20 b as the blood vessel pattern, the control section 30 then compares each of the characteristic points extracted from the object pattern as a group (segment blood vessel constituting-points row) extending from the starting point to the terminal point via the inflection point with the characteristic points of the registrant identification data DIS read out from the flash memory 33 , thereby making a determination as to whether the person is the registrant (an authorized user) according to the degree of congruence.
  • if the determination by the control section 30 is that the person is the registrant, the control section 30 generates an execution command COM 30 in order to let an operation processing device (not shown), which is connected to the external interface 34 , perform a predetermined operation.
  • the control section 30 supplies this execution command COM 30 to the operation processing device via the external interface 34 .
  • the authentication device 2 may contain the software and hardware of the operation processing device.
  • if the determination is that the person is not the registrant, the control section 30 displays information to that effect on a display section 35 a of the notification section 35 , and outputs sound through a sound output section 35 b of the notification section 35 , visually and auditorily notifying the user of the fact that he or she is not the registrant.
  • the control section 30 performs the authentication mode in this way.
  • the following provides a detailed description of the distinguishing process by the control section 30 .
  • the distinguishing process is performed according to a flowchart shown in FIG. 24 .
  • after having sequentially performed the preprocessing process and the characteristic point extraction process for the image signals S 20 a or S 20 b that are input during the blood vessel registration mode or the authentication mode, the control section 30 starts the procedure of the distinguishing process. At step SP 11 , the control section 30 detects the variance of the angle distribution, the intensity of the angle distribution and the number of the segments recognized as groups from the object pattern extracted from the image signals S 20 a or S 20 b.
  • This detection determines the position of the object pattern, whose object is the current target of image capturing, in the three-dimensional space ( FIG. 20 ) of the distinguishing indicators of the plurality of sample patterns recognized as the authorized blood vessel patterns.
  • the control section 30 calculates the Mahalanobis distance between the center of the three-dimensional distribution of the distinguishing indicators and the position of the object pattern based on the blood vessel pattern range data (the center of the distribution of the distinguishing indicators, the inverse matrix of the covariance matrix, and the blood vessel boundary number) stored in ROM.
  • the Mahalanobis distance D CP is calculated by:

      D CP = √( ( P −CT ) T ·Cov −1 ·( P −CT ) )  (6)
  • CT is the center of the distribution of the distinguishing indicators
  • Cov −1 is the inverse matrix of the covariance matrix
  • P is the position of the object pattern.
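formula (6) and the boundary check of step SP 13 can be sketched as follows, assuming the vectors and the inverse covariance matrix are given as plain Python lists (function names are assumptions):

```python
import math

def mahalanobis_distance(P, CT, Cov_inv):
    """Formula (6): D_CP = sqrt((P - CT)^T * Cov^-1 * (P - CT)), with P the
    object pattern's indicator vector, CT the distribution center, and
    Cov_inv the inverse covariance matrix as nested lists."""
    d = [p - c for p, c in zip(P, CT)]
    q = sum(d[i] * Cov_inv[i][j] * d[j]
            for i in range(len(d)) for j in range(len(d)))
    return math.sqrt(q)

def within_vessel_range(P, CT, Cov_inv, boundary=2.1):
    """Step SP 13: the object pattern is treated as a blood vessel pattern
    only if its distance from the center is within the boundary number."""
    return mahalanobis_distance(P, CT, Cov_inv) <= boundary
```

with an identity inverse covariance matrix the Mahalanobis distance reduces to the Euclidean distance, which makes the boundary check easy to verify by hand.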
  • the control section 30 makes a determination as to whether the Mahalanobis distance calculated at step SP 12 is less than the blood vessel boundary number of the blood vessel pattern range data stored in ROM.
  • the blood vessel boundary number represents the value of the boundary Rf F with respect to the center of the distribution of the distinguishing indicators: the determination of the blood vessel pattern should be made based on the boundary Rf F . Accordingly, if the Mahalanobis distance is greater than the blood vessel boundary number, this means that the extracted object pattern should not be recognized as an appropriate blood vessel pattern since it may be a pseudo blood vessel pattern or a completely different pattern from the blood vessel pattern.
  • in this case, the control section 30 proceeds to step SP 14 and disposes of the object pattern extracted from the image signals S 20 a or S 20 b and its characteristic points, and informs the user, through the notification section 35 ( FIG. 23 ), that a picture should be taken again, before ending the distinguishing process.
  • if the Mahalanobis distance is less than or equal to the blood vessel boundary number, this means that the extracted object pattern should be recognized as an appropriate blood vessel pattern.
  • in this case, the control section 30 proceeds to step SP 15 and, if it is running in blood vessel registration mode, recognizes the characteristic points extracted as a group (segment blood vessel constituting-points row), which extends from the object pattern's starting point to the terminal point through the inflection point, as those to be registered; if it is running in authentication mode, the control section 30 recognizes them as those to be compared with the characteristic points already registered as the registrant identification data DIS. The control section 30 subsequently ends the distinguishing process.
  • as described above, using the following tendencies as the distinguishing indicators for the blood vessel pattern and the pseudo blood vessel pattern, the control section 30 generates the blood vessel pattern range data (the center of the distribution of the distinguishing indicators, the inverse matrix of the covariance matrix, and the blood vessel boundary number): the tendency that the blood vessel pattern does not spread but has certain directivity (along the length of the finger), and the tendency that, of all the segments of the blood vessel pattern, the one resembling a straight line is longer than the others. Based on the blood vessel pattern range data, the control section 30 eliminates the pseudo blood vessel patterns and the like.
  • the data generation device 1 of the authentication system calculates form values representing the shape of the pattern.
  • the form values are determined so as to reflect the following tendencies: the tendency that the blood vessel pattern does not spread but has certain directivity (along the length of the finger), and the tendency that the segment resembling a straight line is longer than the others.
  • the data generation device 1 calculates the following values as the form values ( FIG. 22 : step SP 1 to step SP 5 ): firstly, the degree of the spread of the weighted distribution ( FIG. 17 ), with the length of the segment used as frequency, for the distribution of the angles ( FIG. 16 ) of the reference axis (perpendicular to the direction of the circulation of blood) with respect to the segments connecting the characteristic points of the blood vessel pattern; secondly, the ratio of the size of the distribution existing within the predetermined angular range whose center is equal to the angle of the direction of the blood circulation (90 degrees) to the size of the total distribution; thirdly, the number of the segments ( FIG. 19(A) ).
  • the data generation device 1 then calculates the center of the three-dimensional distribution ( FIG. 20 ) of those form values and the inverse matrix of the covariance matrix, which represents the degree of the spread from the center, and stores them in the internal memory of the authentication device 2 .
  • the authentication device 2 of the authentication system calculates the above-noted three form values for the pattern obtained from the image signals S 20 a or S 20 b that were input as those to be either registered or compared with the registered data. Then, using the inverse matrix of the covariance matrix, the authentication device 2 calculates the Mahalanobis distance between the position identified by the three form values in the three-dimensional distribution and the center of the three-dimensional distribution ( FIG. 20 ) stored in the internal memory. If the Mahalanobis distance is greater than the predetermined threshold (the blood vessel boundary number "Rf F " of FIG. 20 ), the authentication device 2 disposes of the pattern ( FIG. 24 ).
  • the authentication system recognizes where the pattern obtained from those to be either registered or compared with the registered data exists in the three-dimensional distribution ( FIG. 20 ) corresponding to the three indicators representing the characteristics of the blood vessel patterns, and whether it exists within the range extending from the center of the distribution to the boundary (the blood vessel boundary number "Rf F " of FIG. 20 ): existing inside the range means that it is a living body's pattern.
  • the authentication system thus assumes that the pseudo blood vessel pattern is not the blood vessel pattern. This increases the possibility that the authentication system eliminates the pseudo blood vessel pattern before registering or comparing it.
  • the data generation device 1 and the authentication device 2 calculate the form values after extracting the characteristic points of the blood vessel pattern so that the line passing through these characteristic points resembles both the blood vessel pattern and the straight line.
  • from the extracted characteristic points, the authentication system calculates the form values representing the shape of the pattern. This allows the authentication system to calculate the form values precisely, and increases the possibility that the authentication system eliminates the pseudo blood vessel pattern after assuming that it is not the blood vessel pattern.
  • the authentication system recognizes where the pattern obtained from those to be either registered or compared with the registered data exists in the three-dimensional distribution corresponding to the three indicators representing the characteristics of the blood vessel patterns, and whether it exists within the range extending from the center of the distribution to the boundary: existing inside the range means that it is a living body's pattern. This increases the possibility that the authentication system eliminates the pseudo blood vessel pattern after assuming that it is not the blood vessel pattern. Thus, the authentication system that is able to improve the accuracy of authentication can be realized.
  • the determination is made as to whether the input pattern is the blood vessel pattern or not based on the data representing the distribution of the blood vessel pattern obtained from the plurality of samples and the data (threshold) representing the boundary of the distribution, which is used for the determination of the blood vessel pattern.
  • the present invention is not limited to this.
  • the distribution of the pseudo blood vessel pattern may also be used when the determination is made as to whether the input pattern is the blood vessel pattern or not.
  • the above-noted data generation process ( FIG. 22 ) of the data generation device 1 stores the center of the distribution of the three distinguishing indicators of each blood vessel pattern obtained from the living body's samples, the inverse matrix of the covariance matrix, and the blood vessel boundary number (“Rf F ,” or “2.1” of the Mahalanobis distance, in the case of FIG. 20 ) in ROM of the authentication device 2 as the blood vessel pattern range data.
  • the authentication device 2 calculates the Mahalanobis distance (referred to as living body distribution-related distance, hereinafter) between the position of the input pattern (the object pattern whose object is the current target of image capturing) in the three distinguishing indicators' distribution and the center of the distribution; at the same time, based on the pseudo blood vessel pattern range data, the authentication device 2 calculates the Mahalanobis distance (referred to as non-living body distribution-related distance, hereinafter) between the position of the input pattern in the three distinguishing indicators' distribution and the center of the distribution (step SP 22 ).
  • the authentication device 2 makes a determination as to whether the non-living body distribution-related distance is less than or equal to the pseudo blood vessel boundary number (step SP 23 ). If the non-living body distribution-related distance is less than or equal to the pseudo blood vessel boundary number, this means that, as indicated by the σ-P plane of the three-dimensional distribution of FIG. 20 , for example, the input pattern exists in an area where the range in which things should be determined as the blood vessel patterns overlaps the range in which things should be determined as the pseudo blood vessel patterns.
  • the authentication device 2 therefore disposes of the input pattern (the object pattern whose object is the current target of image capturing) and the like even when the living body distribution-related distance is less than or equal to the blood vessel boundary number (step SP 14 ).
  • the authentication device 2 recognizes the characteristic points extracted as a group (segment blood vessel constituting-points row) extending from the object pattern's starting point to the terminal point via the inflection point as those to be either registered or compared (step SP 15 ).
  • the distribution of the pseudo blood vessel pattern can also be used when the determination is made as to whether the input pattern is the blood vessel pattern. This increases the possibility that the authentication system eliminates the pseudo blood vessel pattern.
  • the authentication device 2 then makes a determination as to whether the non-living body distribution-related distance is less than or equal to the pseudo blood vessel boundary number (step SP 23 ).
  • the following is also possible: for example, in such a case, a determination is made as to whether the living body distribution-related distance calculated at step SP 22 is greater than the non-living body distribution-related distance.
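The decision logic described in the steps above — compare the living body distribution-related distance against the blood vessel boundary number, then handle the overlapping area using the pseudo blood vessel range data — can be sketched as follows. This is an illustrative Python reconstruction, not the patent's own code; the `(center, inv_cov, boundary)` triples stand in for the blood vessel and pseudo blood vessel pattern range data, and the tie-breaking rule in the overlap area (keep the pattern only if it is closer to the living body's distribution) is the alternative mentioned for step SP 23.

```python
import numpy as np

def mahalanobis(x, center, inv_cov):
    """Mahalanobis distance of an indicator vector x from a distribution center."""
    d = np.asarray(x, dtype=float) - np.asarray(center, dtype=float)
    return float(np.sqrt(d @ inv_cov @ d))

def classify(x, living, pseudo):
    """Return True if x should be treated as a living body's (blood vessel) pattern.

    `living` and `pseudo` are (center, inv_cov, boundary) triples standing in for
    the blood vessel / pseudo blood vessel pattern range data.
    """
    d_living = mahalanobis(x, *living[:2])  # living body distribution-related distance
    d_pseudo = mahalanobis(x, *pseudo[:2])  # non-living body distribution-related distance
    if d_living > living[2]:
        return False                        # outside the blood vessel range: dispose of it
    if d_pseudo <= pseudo[2]:
        # the input falls where the two ranges overlap; one possible rule is to
        # keep it only when it is closer to the living body's distribution
        return d_living < d_pseudo
    return True
```

With well-separated distributions the first threshold test decides; the comparison of the two distances only matters in the overlap area shown on the σ-P plane of FIG. 20.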
  • the form pattern (the blood vessel pattern) of the blood vessels is applied as the living body's pattern.
  • the present invention is not limited to this.
  • Other things, such as a form pattern of fingerprints, vocal prints, mouth prints, or nerves, can be applied if a corresponding acquisition means is used based on an applied living body's pattern.
  • the above-noted three distinguishing indicators can be used as the form values representing the shape of the pattern if the applied living body's pattern, like the blood vessel pattern or the nerve pattern, has the tendency that it does not spread but has certain directivity (along the length of the finger), or the tendency that the segment resembling a straight line is long.
  • the form values may need to be changed according to the characteristics of the applied living body's pattern.
  • the following values are used as the three distinguishing indicators: firstly, the degree of the spread of the weighted distribution with the length of the segment used as frequency, as for the distribution of the angles of the reference axis with respect to the segments connecting the characteristic points of the pattern; secondly, the ratio of the size of the distribution existing within the predetermined angular range whose center is equal to the angle of the direction perpendicular to the reference axis to the size of the total angular range of the distribution; thirdly, the number of the segments.
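The three distinguishing indicators just listed might be computed from a set of segments as sketched below. This is an assumed reading, not the patent's code: the spread is taken as the length-weighted standard deviation of the segment angles, and the ±10-degree window around the perpendicular direction is an illustrative parameter, not a value the patent specifies.

```python
import math

def distinguishing_indicators(segments, half_range_deg=10.0):
    """Three form values for a list of segments ((x1, y1), (x2, y2)).

    Angles are measured against a horizontal reference axis and folded into
    [0, 180); 90 degrees is the direction perpendicular to the reference axis.
    """
    angles, lengths = [], []
    for (x1, y1), (x2, y2) in segments:
        dx, dy = x2 - x1, y2 - y1
        angles.append(math.degrees(math.atan2(dy, dx)) % 180.0)
        lengths.append(math.hypot(dx, dy))
    total = sum(lengths)
    # (1) spread of the weighted angle distribution, segment length as frequency
    mean = sum(a * w for a, w in zip(angles, lengths)) / total
    sigma = math.sqrt(sum(w * (a - mean) ** 2
                          for a, w in zip(angles, lengths)) / total)
    # (2) share of the distribution within 90 +/- half_range_deg degrees
    near_perp = sum(w for a, w in zip(angles, lengths)
                    if abs(a - 90.0) <= half_range_deg)
    ratio = near_perp / total
    # (3) the number of segments
    return sigma, ratio, len(segments)
```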
  • the present invention is not limited to this.
  • only two of those distinguishing indicators may be used, or another, new distinguishing indicator, such as one used for a determination as to whether the top three peaks, of all the peaks of the angle distribution, include the 90-degree point, can be added to those three distinguishing indicators. In short, as long as there are two or more distinguishing indicators, they can be used as the values representing the shape of the pattern.
  • the blood vessel pattern range data stored in ROM of the authentication device 2 contains the center of the distribution of the three distinguishing indicators of each blood vessel pattern obtained from the living body's samples, the inverse matrix of the covariance matrix, and the blood vessel boundary number (“RfF,” or a Mahalanobis distance of “2.1,” in the case of FIG. 20 ).
  • the blood vessel boundary number may be previously set in the authentication device 2 ; if the inverse matrix of the covariance matrix is calculated only during the calculation of the Mahalanobis distance ( FIG. 24 ( FIG. 25 ): step SP 12 ), the blood vessel pattern range data may contain only the center of the distribution of the three distinguishing indicators and the covariance matrix.
  • the preprocessing section 21 and the characteristic point extraction section 22 are applied as extraction means that extracts the characteristic points from the living body's pattern so that the line connecting these characteristic points resembles the living body's pattern and the straight line.
  • the present invention is not limited to this. The process of the preprocessing section 21 and the characteristic point extraction section 22 may be changed if necessary.
  • the preprocessing section 21 performs the A/D conversion process, the outline extraction process, the smoothing process, the binarization process, and the thinning process in that order.
  • some of the processes may be omitted or replaced, or another process may be added to the series of processes. Incidentally, the order of the processes can be changed if necessary.
  • the process of the characteristic point extraction section 22 can be replaced by a well-known point extraction process, such as Harris corner detection or the one disclosed in Japanese Patent Publication No. 2006-207033 ([0036] to [0163]).
  • the authentication device 2 including the image-capturing function, the verification function and the registration function is applied.
  • the present invention is not limited to this. Various applications are possible according to purposes and the like: those functions may be implemented in different devices.
  • the present invention can be applied to the field of biometrics authentication.

Abstract

A pattern identification method and other things to improve the accuracy of authentication are proposed. For each of living body's patterns obtained from a plurality of living body's samples, two or more form values representing the shape of the pattern are calculated; the center of the distribution of the two or more form values and a value representing the degree of the spread from the center are calculated; a distance between the two or more form values of a pattern obtained from those to be registered or to be compared with registered data and the center of the distribution of the two or more form values is calculated with the use of the value representing the degree of the spread from the center; the pattern is disposed of if the distance is greater than a predetermined threshold.

Description

    TECHNICAL FIELD
  • The present invention relates to a pattern identification method, registration device, verification device and program, and is preferably applied to biometrics authentication.
  • BACKGROUND ART
  • A blood vessel has been among the subjects of biometrics authentication. A blood vessel image of a registrant is usually registered in an authentication device as registration data. The authentication device makes a determination as to whether a person is the registrant according to how much verification data, which is input as data to be verified, resembles the registration data.
  • There are various proposals for such authentication devices to prevent identity theft. For example, one method focuses on the fact that the coordinates and other factors of the input verification data cannot be exactly the same as those of the previously input verification data: when the device finds that these verification data are all the same, it does not authenticate even if they are the same as the registration data (see Patent Document 1, for example). This identity theft prevention method works well when the registrants' blood vessel image data are stolen.
  • By the way, there is a report that if a picture of a root crop, such as radish, is taken instead of that of the finger, the authentication device may obtain a pattern (referred to as pseudo blood vessel pattern, hereinafter) that resembles a pattern of blood vessels (referred to as blood vessel pattern, hereinafter) because tubes inside the radish such as vessels, sieve tubes, and fascicles, look like the blood vessels of a living body: the use of radish or the like allows identity theft.
  • Patent Document 1: Japanese Patent Publication No. 2002-259345
  • Non Patent Document 1: Tsutomu Matsumoto, “Biometrics Authentication for Financial Transaction,” [online], Apr. 15, 2005, the 9th study group of the Financial Services Agency for the issues on forged cash cards, (searched on Aug. 21, 2006), Internet <URL: http://www.fsa.go.jp/singi/singi_fccsg/gaiyou/f-20050415-singi_fccsg/02.pdf>
  • In this case, the coordinates and other factors of the pseudo blood vessel pattern cannot be exactly the same as those of the registrant's blood vessel pattern. So even if the above identity theft prevention method is applied, an impostor can be identified as the registrant, allowing identity theft and lowering the accuracy of authentication.
  • DISCLOSURE OF THE INVENTION
  • The present invention has been made in view of the above points and is intended to provide a pattern identification method, registration device, verification device and program that can improve the accuracy of authentication.
  • To solve the above problem, a pattern identification method of the present invention includes the steps of: calculating, for each of living body's patterns obtained from a plurality of living body's samples, two or more form values representing the shape of the pattern; calculating the center of the distribution of the two or more form values and a value representing the degree of the spread from the center; calculating a distance between the two or more form values of a pattern obtained from those to be registered or to be compared with registered data and the center of the distribution of the two or more form values using the value representing the degree of the spread from the center; and disposing of the pattern if the distance is greater than a predetermined threshold.
  • Accordingly, this pattern identification method can recognize where the pattern obtained from those to be either registered or compared with the registered data exists in the distribution having a plurality of dimensions (pattern form values) regarding each living body's pattern, and whether it exists within a range extending from the center of the distribution to a boundary (threshold): existing inside the range means that it is a living body's pattern.
  • Accordingly, this pattern identification method can increase the possibility that it eliminates a pseudo pattern resembling the living body's pattern before registering or comparing them, even if the pattern obtained from those to be either registered or compared with the registered data is the pseudo pattern.
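The stored statistics this method relies on — the center of the distribution of the form values, a value representing the spread (here, the inverse covariance matrix used for the Mahalanobis distance), and a boundary — might be derived from sample form values as sketched below. Taking the boundary as a quantile of the samples' own Mahalanobis distances is an assumption for illustration; the embodiment simply uses a fixed boundary number such as 2.1.

```python
import numpy as np

def pattern_range_data(samples, quantile=1.0):
    """Build (center, inv_cov, boundary) from form-value rows of living samples.

    `samples` is an (n, k) array: one row of k form values per sample pattern.
    """
    X = np.asarray(samples, dtype=float)
    center = X.mean(axis=0)
    inv_cov = np.linalg.inv(np.cov(X, rowvar=False))
    # Mahalanobis distance of each sample from the center of the distribution
    diffs = X - center
    dists = np.sqrt(np.einsum('ij,jk,ik->i', diffs, inv_cov, diffs))
    boundary = float(np.quantile(dists, quantile))
    return center, inv_cov, boundary
```

A pattern obtained at registration or verification time is then disposed of when its Mahalanobis distance from `center` exceeds `boundary`.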
  • Moreover, a registration device of the present invention includes: storage means for storing, for each of living body's patterns obtained from a plurality of living body's samples, the center of the distribution of two or more form values representing the shape of the pattern and a value representing the degree of the spread from the center; calculation means for calculating a distance between the two or more form values of a pattern obtained from those to be registered and the center of the distribution of the two or more form values stored in the storage means using the value; and registration means for disposing of the pattern if the distance is greater than a predetermined threshold while registering the pattern in a storage medium if the distance is within the threshold.
  • Accordingly, this registration device can recognize where the pattern obtained from those to be registered exists in the distribution having a plurality of dimensions (pattern form values) regarding each living body's pattern, and whether it exists within a range extending from the center of the distribution to a boundary (threshold): existing inside the range means that it is a living body's pattern.
  • Accordingly, this registration device can increase the possibility that it eliminates a pseudo pattern resembling the living body's pattern before registering them, even if the pattern obtained from those to be registered is the pseudo pattern.
  • Furthermore, a verification device of the present invention includes: storage means for storing, for each of living body's patterns obtained from a plurality of living body's samples, the center of the distribution of two or more form values representing the shape of the pattern and a value representing the degree of the spread from the center; calculation means for calculating a distance between the two or more form values of a pattern obtained from those to be registered and the center of the distribution of the two or more form values stored in the storage means; and verification means for disposing of the pattern if the distance is greater than a predetermined threshold while comparing the pattern with registered data registered in a storage medium if the distance is within the threshold.
  • Accordingly, this verification device can recognize where the pattern obtained from those to be compared exists in the distribution having a plurality of dimensions (pattern form values) regarding each living body's pattern, and whether it exists within a range extending from the center of the distribution to a boundary (threshold): existing inside the range means that it is a living body's pattern.
  • Accordingly, this verification device can increase the possibility that it eliminates a pseudo pattern resembling the living body's pattern before comparing them, even if the pattern obtained from those to be compared is the pseudo pattern.
  • Furthermore, a program of the present invention causing a computer that stores, for each of living body's patterns obtained from a plurality of living body's samples, the center of the distribution of two or more form values representing the shape of the pattern and a value representing the degree of the spread from the center, executes: a first process of calculating a distance between the two or more form values of a pattern obtained from those to be registered and the center of the distribution of the two or more form values stored in the storage means using the value; and a second process of disposing of the pattern if the distance is greater than a predetermined threshold while registering the pattern in a storage medium if the distance is within the threshold, or a second process of disposing of the pattern if the distance is greater than a predetermined threshold while comparing the pattern with registered data registered in a storage medium if the distance is within the threshold.
  • Accordingly, this program can recognize where the pattern obtained from those to be either registered or compared with the registered data exists in the distribution having a plurality of dimensions (pattern form values) regarding each living body's pattern, and whether it exists within a range extending from the center of the distribution to a boundary (threshold): existing inside the range means that it is a living body's pattern.
  • Accordingly, this program can increase the possibility that it eliminates a pseudo pattern resembling the living body's pattern before registering or comparing them, even if the pattern obtained from those to be either registered or compared with the registered data is the pseudo pattern.
  • According to the present invention, they can recognize where the pattern obtained from those to be either registered or compared with the registered data exists in the distribution having a plurality of dimensions (pattern form values) regarding each living body's pattern, and whether it exists within a range extending from the center of the distribution to a boundary (threshold): existing inside the range means that it is a living body's pattern. Accordingly, they can increase the possibility that it eliminates the pseudo pattern before registering or comparing them by assuming that it is not the living body's pattern. Thus, the registration device, verification device, extraction method and program that are able to improve the accuracy of authentication can be realized.
  • BRIEF DESCRIPTION OF THE DRAWINGS
  • FIG. 1 is a block diagram illustrating the configuration of a data generation device according to an embodiment of the present invention.
  • FIG. 2 is a functional block diagram illustrating the image process of a control section.
  • FIG. 3 is a schematic diagram illustrating images before and after a preprocessing process.
  • FIG. 4 is a schematic diagram as to a description of an emerging pattern of an end point, a diverging point, and an isolated point.
  • FIG. 5 is a schematic diagram illustrating a tracking of a blood vessel line between a diverging point and a diverging or end point.
  • FIG. 6 is a schematic diagram as to a description of a tracking of a blood vessel pixel.
  • FIG. 7 is a schematic diagram illustrating an emerging pattern of a point on a line and an inflection point.
  • FIG. 8 is a schematic diagram as to a description of the detection of an inflection point.
  • FIG. 9 is a schematic diagram as to a description of the determination of an overlap ratio of a segment's pixel with respect to an original blood vessel pixel.
  • FIG. 10 is a flowchart illustrating the procedure of a removal process.
  • FIG. 11 is a schematic diagram illustrating an inflection point before and after removal.
  • FIG. 12 is a schematic diagram illustrating the connection of segment blood vessel lines (three diverging points).
  • FIG. 13 is a schematic diagram illustrating the connection of segment blood vessel lines (four diverging points).
  • FIG. 14 is a schematic diagram illustrating characteristic points obtained from a characteristic point extraction process.
  • FIG. 15 is a schematic diagram illustrating a blood vessel pattern and a pseudo blood vessel pattern.
  • FIG. 16 is a schematic diagram as to the calculation of an angle of a segment with respect to a horizontal axis passing through the end point of the segment.
  • FIG. 17 is a schematic diagram illustrating an angle distribution of a blood vessel pattern.
  • FIG. 18 is a schematic diagram illustrating an angle distribution of a pseudo blood vessel pattern.
  • FIG. 19 is a schematic diagram illustrating the length of a segment resembling a straight line.
  • FIG. 20 is a schematic diagram illustrating the distribution of distinguishing indicators.
  • FIG. 21 is a schematic diagram illustrating the distribution of distinguishing indicators on a σ-C plane.
  • FIG. 22 is a flowchart illustrating the procedure of a data generation process.
  • FIG. 23 is a block diagram illustrating the configuration of an authentication device according to an embodiment of the present invention.
  • FIG. 24 is a schematic diagram illustrating the procedure of a distinguishing process (1).
  • FIG. 25 is a schematic diagram illustrating the procedure of a distinguishing process (2).
  • BEST MODE FOR CARRYING OUT THE INVENTION
  • An embodiment of the present invention will be described in detail with reference to the accompanying drawings.
  • (1) Overall Configuration of an Authentication System According to an Embodiment of the Present Invention
  • An authentication system of the present embodiment includes a data generation device and an authentication device. The data generation device generates data (referred to as blood vessel pattern range data, hereinafter) representing a range: a determination is to be made about blood vessel patterns based on this range. The data generation device records the data in an internal memory of the authentication device.
  • The authentication device is equipped with a function that makes a determination, according to the blood vessel pattern range data, as to whether a pattern in image data obtained as a result of taking a picture of an object is a pseudo blood vessel pattern.
  • (2) Configuration of the Data Generation Device
  • FIG. 1 shows the configuration of the data generation device. The data generation device 1 includes a control section 10 to which an operation section 11, an image pickup section 12, a flash memory 13, and an interface (referred to as external interface, hereinafter) 14 that exchanges data with an external section are connected via a bus 15.
  • The control section 10 is a microcomputer including CPU (Central Processing Unit) that takes overall control of the data generation device 1, ROM (Read Only Memory) that stores various programs and setting information, and RAM (Random Access Memory) that serves as a work memory for CPU.
  • When a user operates the operation section 11, an image pickup command COM1 or a command COM2 that orders the generation of the blood vessel pattern range data is given to the control section 10 from the operation section 11. Based on the command COM1 or COM2, the control section 10 makes a determination as to which mode it should start. Using a program corresponding to the determination, the control section 10 appropriately controls the image pickup section 12, the flash memory 13, and the external interface 14 to run in image pickup mode or data generation mode.
  • (2-1) Image Pickup Mode
  • More specifically, if the determination is that it should start the image pickup mode, the control section 10 enters the image pickup mode, which is an operation mode, to control the image pickup section 12.
  • In this case, a drive control section 12 a of the image pickup section 12 drives and controls one or more near infrared beam sources LS that emit a near infrared beam toward a predetermined position of the data generation device 1, and an image pickup element ID that is for example CCD (Charge Coupled Device).
  • After the emission of the near infrared beam to an object placed at the predetermined position, the image pickup element ID receives the near infrared beam from the object via an optical system OP and an aperture diaphragm DH, converts it into electric signals and transmits them to the drive control section 12 a as image signals S1.
  • If the object is a finger of a living body, the near infrared beam emitted from the near infrared beam source LS gets into the finger, and, after being reflected and scattered inside the finger, emerges from the finger as a blood vessel representation beam to enter the image pickup element ID: the blood vessel representation beam represents the finger's blood vessels. The blood vessel representation beam is then transmitted to the drive control section 12 a as the image signals S1.
  • The drive control section 12 a adjusts the position of an optical lens of the optical system OP according to the pixel values of the image signals S1, so that the object is in focus. The drive control section 12 a also adjusts the aperture of the aperture diaphragm DH so that the amount of light entering the image pickup element ID becomes appropriate. After the adjustment, an image signal S2 output from the image pickup element ID is supplied to the control section 10.
  • The control section 10 performs a predetermined image process for the image signals S2 to extract a characteristic of an object pattern from the image, and stores the extracted image in the flash memory 13 as image data D1.
  • In this manner, the control section 10 can perform the image pickup mode.
  • The following describes how the image process is performed. From a functional point of view, as shown in FIG. 2, the image process can be divided into a preprocessing section 21 and a characteristic point extraction section 22. The following provides a detailed description of the preprocessing section 21 and the characteristic point extraction section 22. By the way, for ease of explanation, the image signals S2 supplied from the image pickup section 12 are those obtained as a result of taking a picture of a living body's finger.
  • (2-1-A) Preprocessing
  • In order to extract a blood vessel pattern, the preprocessing section 21 sequentially performs an A/D (Analog/Digital) conversion process, a predetermined outline extraction process such as Sobel filtering, a predetermined smoothing process such as Gaussian filtering, a binarization process, and a thinning process for the image signals S2 supplied from the image pickup section 12.
  • For example, assume that an image (the image signals S2) shown in FIG. 3(A) is input into the preprocessing section 21: thanks to the preprocessing by the preprocessing section 21, the image is converted into an image shown in FIG. 3(B), with the blood vessel pattern of the image emphasized.
  • The preprocessing section 21 outputs data (referred to as image data, hereinafter) D21 whose image shows the extracted blood vessel pattern to the characteristic point extraction section 22.
  • In the present embodiment, the blood vessel lines (the blood vessel pattern) included in the image of the image data D21 are converted by the binarization process into white pixels; their width (or thickness) is reduced to “1,” i.e. one pixel, as a result of the thinning process.
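The preprocessing chain described above (Sobel outline extraction, Gaussian smoothing, binarization) can be approximated with plain NumPy as below. This is a toy stand-in, not the device's implementation: the A/D conversion happens before this stage, the 3×3 Sobel and 5-tap Gaussian kernels and the mean-based threshold are illustrative choices, and the final thinning step (e.g. a Zhang-Suen style algorithm) that reduces lines to one-pixel width is omitted.

```python
import numpy as np

def filter2(img, k):
    """'Same'-size 2-D filtering (cross-correlation) with zero padding."""
    kh, kw = k.shape
    ph, pw = kh // 2, kw // 2
    padded = np.pad(img, ((ph, ph), (pw, pw)))
    out = np.zeros_like(img, dtype=float)
    for i in range(kh):
        for j in range(kw):
            out += k[i, j] * padded[i:i + img.shape[0], j:j + img.shape[1]]
    return out

def preprocess(img, threshold=None):
    """Outline extraction (Sobel magnitude), smoothing (Gaussian), binarization."""
    sx = np.array([[-1, 0, 1], [-2, 0, 2], [-1, 0, 1]], dtype=float)
    edges = np.hypot(filter2(img, sx), filter2(img, sx.T))  # gradient magnitude
    g = np.array([1, 4, 6, 4, 1], dtype=float) / 16.0       # 5-tap Gaussian
    smooth = filter2(edges, np.outer(g, g))
    t = smooth.mean() if threshold is None else threshold
    return (smooth > t).astype(np.uint8)                    # white = vessel candidate
```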
  • (2-1-B) Characteristic Point Extraction Process
  • The characteristic point extraction section 22 detects end points, diverging points, and inflection points from the white pixels (referred to as blood vessel pixels, hereinafter) that constitute a blood vessel pattern of the input image, and appropriately removes the inflection points with reference to the end points and the diverging points.
  • (B-1) Detection of the End and Diverging Points
  • The characteristic point extraction section 22 detects the end and diverging points from the blood vessel lines in the first stage of the process.
  • More specifically, from among the pixels constituting the input image (the image data D21), the characteristic point extraction section 22 recognizes the blood vessel pixels as attention pixels in a predetermined order, and examines the eight pixels around the attention pixel to count the number of the blood vessel pixels.
  • Here, FIG. 4 shows a pattern of how the end, diverging and isolated points of the blood vessel lines appear. In FIG. 4, a hatched area represents the attention pixel; a black area represents the blood vessel pixel (the white pixel), for ease of explanation. It is obvious from FIG. 4 that if the width of the blood vessel line is represented as one pixel, the type of the attention pixel is determined by the number of the surrounding blood vessel pixels; as for the diverging pattern, a diverging point has three or four surrounding blood vessel pixels.
  • Accordingly, if there is one blood vessel pixel around the attention pixel, the characteristic point extraction section 22 detects this attention pixel as the end point. On the other hand, if there are three or four blood vessel pixels around the attention pixel, the characteristic point extraction section 22 detects this attention pixel as the diverging point. By contrast, if there is no blood vessel pixel around the attention pixel, the characteristic point extraction section 22 detects the attention pixel as the isolated point.
  • Then, the characteristic point extraction section 22 removes the isolated points, which do not constitute the blood vessel line, from the detected end, diverging and isolated points.
  • In this manner, the characteristic point extraction section 22 detects the end and diverging points from the blood vessel lines in the first stage of the process.
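The first-stage classification above — count the blood vessel pixels among the eight neighbours of each attention pixel, then label it as an end point (one neighbour), diverging point (three or four), or isolated point (none) — can be sketched as follows. The function names and return format are illustrative, and a one-pixel-wide thinned image is assumed, as in FIG. 4.

```python
import numpy as np

def classify_points(binary):
    """Classify blood vessel pixels of a thinned binary image (1 = vessel).

    Returns lists of end, diverging, and isolated points as (row, col) tuples,
    following the neighbour counts of FIG. 4: 0 neighbours -> isolated point,
    1 -> end point, 3 or 4 -> diverging point (2 is an ordinary line point).
    """
    img = np.asarray(binary)
    padded = np.pad(img, 1)
    ends, diverging, isolated = [], [], []
    for r, c in zip(*np.nonzero(img)):
        r, c = int(r), int(c)
        window = padded[r:r + 3, c:c + 3]      # the attention pixel's 3x3 patch
        n = int(window.sum()) - 1              # neighbours, excluding the pixel itself
        if n == 0:
            isolated.append((r, c))
        elif n == 1:
            ends.append((r, c))
        elif n in (3, 4):
            diverging.append((r, c))
    return ends, diverging, isolated
```

The isolated points would then be discarded, since they do not constitute a blood vessel line.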
  • (B-2) Detection of the Inflection Point
  • Then, based on the end and diverging points, the characteristic point extraction section 22 detects the inflection points in the second stage of the process.
  • More specifically, for example, as shown in FIG. 5, the characteristic point extraction section 22 recognizes the diverging point DP1 as a starting point, and other characteristic points (the end points EP1 and EP2, and the diverging point DP2), which appear after the starting point (or the diverging point DP1), as terminal points; it then tracks a segment of the blood vessel line (referred to as segment blood vessel line, hereinafter) extending from the starting point to each terminal point. Similarly, the characteristic point extraction section 22 recognizes the diverging point DP2 as a starting point, and other characteristic points (the end points EP3 and EP4), which appear after the starting point (or the diverging point DP2), as terminal points; it then tracks each segment blood vessel line.
  • In this example of FIG. 5, the starting points are the diverging points DP1 and DP2, but the end points can also be the starting points. Incidentally, it is obvious from FIG. 5 that the end points can only be either the starting or terminal points, while there is another diverging point (or points) before or after (or at both sides of) the diverging point regardless of whether it is the starting or terminal point.
  • FIG. 6 illustrates a specific method of tracking. In FIG. 6, the characteristic point extraction section 22 sequentially tracks the blood vessel pixels of the segment blood vessel line from the starting point to the terminal point by performing a process of excluding the previous attention pixel (a pixel filled with horizontal lines) from the blood vessel pixels around the current attention pixel (a hatched pixel) and choosing from them the next attention pixel until the blood vessel pixels around the current attention pixel include the terminal point.
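The tracking step of FIG. 6 — exclude the previous attention pixel, move to the remaining neighbouring blood vessel pixel, stop when a terminal point appears among the neighbours — might be written as below. A one-pixel-wide line with a reachable terminal is assumed, and when the starting point is a diverging point with several branches this sketch simply follows the first candidate; the device would track each branch.

```python
NEIGHBOURS = [(-1, -1), (-1, 0), (-1, 1), (0, -1),
              (0, 1), (1, -1), (1, 0), (1, 1)]

def track_segment(vessel, start, terminals):
    """Walk a segment blood vessel line from `start` until a terminal point.

    `vessel` is a set of (row, col) blood vessel pixels; `terminals` is the set
    of end/diverging points that may close the segment.
    """
    path = [start]
    prev, cur = None, start
    while True:
        nbrs = [(cur[0] + dr, cur[1] + dc) for dr, dc in NEIGHBOURS]
        hits = [p for p in nbrs if p in vessel and p != prev]
        term = [p for p in hits if p in terminals]
        if term:                      # terminal found among the neighbours: done
            path.append(term[0])
            return path
        prev, cur = cur, hits[0]      # exclude the previous attention pixel, advance
        path.append(cur)
```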
  • Since a series of the blood vessel pixels of the segment blood vessel line represents a blood vessel line's segment extending from one diverging or end point to the next diverging or end point, there is no diverging point between them. This means that the attention pixel must be either a point on a line or the inflection point. Incidentally, FIG. 7 shows a pattern of how the points on the line and the inflection points appear. In FIG. 7, like FIG. 4, a hatched area represents the attention pixel; a black area represents the blood vessel pixel (the white pixel), for ease of explanation.
  • For example, as shown in FIG. 8, during the process of the tracking between the starting and terminal points (those hatched in a diagonal grid-like pattern), if the linearity of the series of the previous attention pixels including the current attention pixel ends with the next attention pixel (or the blood vessel pixel), the characteristic point extraction section 22 detects the current attention pixel as the inflection point (a pixel hatched in a grid-like pattern).
  • After reaching the terminal point, the characteristic point extraction section 22 recognizes a series of characteristic points extending from the segment blood vessel line starting point to the terminal point as one group.
  • In this manner, using the end and diverging points as the points of reference, the characteristic point extraction section 22 detects the inflection points of each segment blood vessel line extending from one diverging or end point to the next diverging or end point in the second stage of the process.
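Given a tracked pixel row, the inflection points of the second stage can be approximated as the pixels where the step direction changes. This is a simplification of FIG. 8's test, which examines the linearity of a run of previous attention pixels and therefore tolerates the stair-stepping of digital lines; the per-step version below flags every direction change.

```python
def inflection_points(path):
    """Inflection points of a tracked segment blood vessel line.

    `path` is the ordered (row, col) pixel list from the starting point to the
    terminal point.  A pixel is taken as an inflection point when the step
    direction leading into it differs from the step direction leaving it.
    """
    points = []
    for i in range(1, len(path) - 1):
        into = (path[i][0] - path[i - 1][0], path[i][1] - path[i - 1][1])
        out = (path[i + 1][0] - path[i][0], path[i + 1][1] - path[i][1])
        if into != out:
            points.append(path[i])
    return points
```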
  • (B-3) Removal of the Inflection Points
  • Then, in the third stage of the process, the characteristic point extraction section 22 recognizes the group of the characteristic points, or the series of the characteristic points extending from the segment blood vessel line's starting point to the terminal point, as one processing unit (referred to as segment blood vessel constituting-points row, hereinafter), and removes the inflection points from the segment blood vessel line.
  • The same removal process is applied to all the segment blood vessel constituting-points rows; the following provides a detailed description about the process applied to one segment blood vessel constituting-points row, with reference to FIG. 9. In FIG. 9, a square area represents a pixel (referred to as original blood vessel pixel, hereinafter) constituting the original blood vessel line; a hatched area represents the end or diverging point of the original blood vessel pixel.
  • On the segment blood vessel constituting-points row, there are the original blood vessel pixels from the characteristic point GPbs (referred to as reference point, hereinafter), which was selected as a point of reference, to the removal candidate points GPcd (GPcd1 to GPcd3); there are segments SG (SG1 to SG3) extending from the reference point GPbs to the removal candidate points GPcd. The characteristic point extraction section 22 counts the number of the segment SG's pixels (referred to as segment pixels, hereinafter) overlapped with the original blood vessel pixels, and gradually moves the removal candidate point GPcd toward the terminal point until the ratio of the number of the overlapped pixels to the number of pixels existing between the reference point GPbs and the removal candidate point GPcd becomes less than a predetermined threshold (referred to as overlap ratio threshold, hereinafter).
  • In FIG. 9, all the segment pixels (two pixels) of the segment SG1 are overlapped with the original blood vessel pixels (two pixels) existing between the reference point GPbs and the corresponding removal candidate point GPcd1, and this means that the overlap ratio is “2/2”. Moreover, the segment pixels (seven pixels) of the segment SG2 are overlapped with four of the original blood vessel pixels (seven pixels) existing between the reference point GPbs and the corresponding removal candidate point GPcd2, and this means that the overlap ratio is “4/7”. Moreover, the segment pixels (nine pixels) of the segment SG3 are overlapped with two of the original blood vessel pixels (nine pixels) existing between the reference point GPbs and the corresponding removal candidate point GPcd3, and this means that the overlap ratio is “2/9”.
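One way to compute such an overlap ratio is sketched below. This is an illustrative guess at the counting convention: the segment pixels are produced with a Bresenham line, and the two endpoints are excluded so that only the pixels strictly between the reference point and the removal candidate point are counted. The exact pixel-counting convention in the patent may differ.

```python
def bresenham(p0, p1):
    """Pixels on the straight segment from p0 to p1 (inclusive)."""
    x0, y0 = p0
    x1, y1 = p1
    dx, dy = abs(x1 - x0), -abs(y1 - y0)
    sx = 1 if x0 < x1 else -1
    sy = 1 if y0 < y1 else -1
    err = dx + dy
    pts = []
    while True:
        pts.append((x0, y0))
        if (x0, y0) == (x1, y1):
            break
        e2 = 2 * err
        if e2 >= dy:
            err += dy
            x0 += sx
        if e2 <= dx:
            err += dx
            y0 += sy
    return pts

def overlap_ratio(ref, cand, original_pixels):
    """Ratio of segment pixels that coincide with original blood vessel
    pixels; endpoints are excluded (an assumption of this sketch)."""
    seg = bresenham(ref, cand)[1:-1]      # pixels strictly between
    if not seg:
        return 1.0                        # adjacent points: trivially overlapping
    hits = sum(1 for p in seg if p in original_pixels)
    return hits / len(seg)
```

A segment that runs along the original blood vessel line yields a ratio of 1.0, while a segment cutting across a bend yields a small ratio, in the spirit of the "2/2", "4/7", "2/9" values above.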
  • If the overlap ratio of the segment pixels of the segment SG3 is less than the overlap ratio threshold, the characteristic point extraction section 22 removes the characteristic point GPcd1, which lies between the reference point GPbs and the characteristic point GPcd2 that was selected as the removal candidate point immediately before the current removal candidate point (characteristic point) GPcd3. Accordingly, even though the characteristic point GPcd1 is removed, the segment SG2 extending from the reference point GPbs to the remaining characteristic point GPcd2 can substantially represent the original blood vessel line.
  • Here, if the overlap ratio threshold is too small, the characteristic point GPcd1 may be removed even when the segment SG3 does not resemble a series of original blood vessel pixels (a segment blood vessel line) extending from the reference point GPbs to the removal candidate point GPcd3. If the overlap ratio threshold is too large, the characteristic point GPcd1 may be left.
  • Accordingly, in the present embodiment, the characteristic point extraction section 22 changes the overlap ratio threshold according to the length of the segment. More specifically, assume that the reference point is GPJ (J=1, 2, . . . , M (M: integer)) and that the αth removal candidate point from the reference point is GPJ+α. The following describes a case of calculating the overlap ratio of the segment GPJ−GPJ+α extending from the reference point GPJ to the removal candidate point GPJ+α with respect to the original blood vessel pixels: if the length of the previous segment GPJ+(α−1)−GPJ+α, whose overlap ratio was calculated immediately before the current one, is greater than or equal to a predetermined threshold (referred to as segment length threshold, hereinafter), a first overlap ratio threshold is set; if it is less than the segment length threshold, a second overlap ratio threshold, which is larger than the first overlap ratio threshold, is set.
  • This allows the appropriate selection of the inflection points to be removed, so that a line passing through the inflection points on the segment blood vessel line resembles the segment blood vessel line.
  • More specifically, the removal process of the inflection points starts from the starting point of the segment blood vessel constituting-points row; FIG. 34 shows a procedure of this process. This means that the characteristic point extraction section 22 selects the starting point of the segment blood vessel constituting-points row as the reference point, and selects the first characteristic point from the reference point as the removal candidate point (step SP31).
  • Then, the characteristic point extraction section 22 makes a determination as to whether this is the first time it has calculated the overlap ratio since starting the removal process of the inflection points, or whether the length of the previous segment GPJ+(α−1)−GPJ+α, which appeared immediately before the current segment GPJ−GPJ+α extending from the reference point GPJ to the removal candidate point GPJ+α, is less than the segment length threshold (step SP32).
  • If this is the first time to calculate the overlap ratio since starting the removal process of the inflection points, or if the length of the previous segment GPJ+(α−1)−GPJ+α is less than the segment length threshold, the characteristic point extraction section 22 sets the first overlap ratio threshold as the overlap ratio threshold (step SP33), calculates the overlap ratio of the current segment GPJ−GPJ+α extending from the reference point GPJ to the removal candidate point GPJ+α with respect to the original blood vessel pixels (step SP34), and makes a determination as to whether this overlap ratio is greater than or equal to the first overlap ratio threshold (step SP35).
  • Whereas, if this is not the first time to calculate the overlap ratio since starting the removal process of the inflection points and the length of the previous segment GPJ+(α−1)−GPJ+α is greater than or equal to the segment length threshold, the characteristic point extraction section 22 sets the second overlap ratio threshold as the overlap ratio threshold (step SP36), calculates the overlap ratio of the current segment GPJ−GPJ+α extending from the reference point GPJ to the removal candidate point GPJ+α with respect to the original blood vessel pixels (step SP34), and makes a determination as to whether this overlap ratio is greater than or equal to the second overlap ratio threshold (step SP35).
  • Here, if the overlap ratio is greater than or equal to the overlap ratio threshold, this means that the current segment GPJ−GPJ+α extending from the reference point GPJ to the removal candidate point GPJ+α resembles, or is the same as, the original blood vessel line extending from the reference point GPJ to the removal candidate point GPJ+α.
  • In this case, the characteristic point extraction section 22 makes a determination as to whether the current removal candidate point GPJ+α is the terminal point of the segment blood vessel constituting-points row (step SP37); if it is not the terminal point, the characteristic point extraction section 22 selects the next characteristic point, which is closer to the terminal point than the current removal candidate point GPJ+α is, as a new removal candidate point GPJ+α (step SP38) before returning to the above-described process (step SP32).
  • Whereas, if the overlap ratio is less than the overlap ratio threshold, this means that the current segment GPJ−GPJ+α extending from the reference point GPJ to the removal candidate point GPJ+α is completely different from the original blood vessel line extending from the reference point GPJ to the removal candidate point GPJ+α.
  • In this case, the characteristic point extraction section 22 removes all the characteristic points between the characteristic point, which was selected as the removal candidate point GPJ+α immediately before the current one, and the current reference point (characteristic point) GPJ (step SP39).
  • Then, the characteristic point extraction section 22 makes a determination as to whether the current removal candidate point GPJ+α is the terminal point of the segment blood vessel constituting-points row (step SP40); if it is not the terminal point, the characteristic point extraction section 22 selects the current removal candidate point GPJ+α as the reference point GPJ and the next characteristic point, which is closer to the terminal point than the reference point GPJ is, as a new removal candidate point GPJ+α (step SP41) before returning to the above-noted process (step SP32).
  • Whereas, if the determination by the characteristic point extraction section 22 is that the current removal candidate point GPJ+α is the terminal point of the segment blood vessel constituting-points row (step SP37(Y) or step SP40(Y)), the characteristic point extraction section 22 removes all the characteristic points between the current removal candidate point (characteristic point) GPJ+α and the current reference point (characteristic point) GPJ (step SP42) before ending this removal process of the inflection points.
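The loop described above can be sketched in Python as follows. This is one possible reading of steps SP31 to SP42, not the patent's implementation: `overlap(a, b)` and `seg_len(a, b)` are hypothetical callbacks standing in for the overlap ratio and segment length computations, and the reading of step SP42 after SP40 (keeping the previously selected candidate) is an assumption.

```python
def remove_inflections(points, overlap, seg_len,
                       len_thresh=5.0, thresh1=0.5, thresh2=0.7):
    """Sketch of the inflection-point removal loop (steps SP31-SP42).

    `points` is one segment blood vessel constituting-points row;
    `overlap(a, b)` returns the overlap ratio of the straight segment
    a-b with the original blood vessel pixels, and `seg_len(a, b)` its
    length (both hypothetical callbacks).  Returns the characteristic
    points that survive the removal.
    """
    n = len(points)
    removed = set()
    ref, cand = 0, 1                 # SP31: starting point is the reference
    prev_len = None                  # no previous segment measured yet
    while True:
        # SP32/SP33/SP36: first calculation, or a short previous segment,
        # selects the first threshold; otherwise the second (larger) one
        thresh = thresh1 if prev_len is None or prev_len < len_thresh else thresh2
        ratio = overlap(points[ref], points[cand])            # SP34
        prev_len = seg_len(points[ref], points[cand])
        if ratio >= thresh:          # SP35: segment still resembles the line
            if cand == n - 1:        # SP37: reached the terminal point
                removed.update(range(ref + 1, cand))          # SP42
                break
            cand += 1                # SP38: advance the candidate
        else:
            # SP39: keep the previous candidate; drop the points strictly
            # between it and the reference
            removed.update(range(ref + 1, cand - 1))
            if cand == n - 1:        # SP40: candidate is the terminal point
                break
            ref, cand = cand, cand + 1                        # SP41
            prev_len = None
    return [p for i, p in enumerate(points) if i not in removed]
```

With a perfectly straight row every intermediate inflection point is removed; when the overlap ratio breaks down partway, the last candidate that still represented the line is kept, mirroring the GPcd1/GPcd2 example of FIG. 9.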
  • In that manner, the characteristic point extraction section 22 performs the removal process of the inflection points. Incidentally, FIG. 11 shows the state before and after the removal process. In the case of FIG. 11, the segment length threshold for the removal process is 5 [mm]; the first overlap ratio threshold is 0.5 (50[%]); the second overlap ratio threshold is 0.7 (70[%]). Moreover, in FIG. 35, a square area represents the original blood vessel pixel; a circular area represents the pixel constituting the segment; a hatched area represents the end or inflection point of the original blood vessel pixel.
  • It is obvious from FIG. 11 that the above removal process has appropriately removed the inflection point; therefore, a line passing through the inflection points on the segment blood vessel line resembles the segment blood vessel line.
  • (B-4) Removal of the End Point
  • Then, in the fourth stage of the process, the characteristic point extraction section 22 chooses, from among the three or four segment blood vessel lines extending from a diverging point on the blood vessel line, the two segment blood vessel lines that, if combined, resemble a straight line, and connects them as one segment blood vessel line, thereby removing the starting or terminal point, which was the end point of the two segment blood vessel lines. Incidentally, if the width of the blood vessel (the blood vessel line) is one pixel, the number of segment blood vessel lines extending from the diverging point must be three or four, as described above with reference to FIG. 4.
  • More specifically, for example, as shown in FIG. 12(A), assume that the three segment blood vessel lines PBLA, PBLB, and PBLC are extending from the diverging points GP (GPA1, GPB1, and GPC1). The characteristic point extraction section 22 calculates the cosines (cos (θA-B), cos (θA-C), cos (θB-C)) of the crossing angles θA-B, θA-C, and θB-C of each pair of the segment blood vessel lines PBLA, PBLB, and PBLC.
  • Here, if the smallest cosine cos (θA-B) is less than a predetermined threshold (referred to as cosine threshold, hereinafter), this means that the crossing angle of the segment blood vessel lines is close to 180 degrees. In this case, the characteristic point extraction section 22 recognizes the pair of the segment blood vessel lines' segment blood vessel constituting-points rows GPA1, GPA2, . . . , GPA-END and GPB1, GPB2, . . . , GPB-END corresponding to the cosine cos (θA-B); recognizes both ends of these segment blood vessel constituting-points rows; regards the points GPA-END and GPB-END, which do not overlap with each other, as the starting and end points; and recognizes the characteristic points between the starting and end points as one group.
  • As a result, the pair of the segment blood vessel lines PBLA and PBLB is combined. For example, as shown in FIG. 12(B), the number of points in the segment blood vessel constituting-points row GPAB-first, . . . , GPAB10, GPAB11, GPAB12, . . . , GPAB-end of the combined segment blood vessel line PBLAB is one less than the total number of points in the pair of uncombined segment blood vessel constituting-points rows. This is because the two diverging points GPA1 and GPB1, which were the starting points of the segment blood vessel lines' segment blood vessel constituting-points rows, are replaced by one middle point GPAB11. Incidentally, combining the pair of the segment blood vessel lines PBLA and PBLB does not change the shape of the blood vessel line, or the segment blood vessel line PBLAB.
  • Whereas, if the smallest cosine cos (θA-B) is greater than the cosine threshold, the characteristic point extraction section 22 does not recognize any group. If there are other diverging points left, the characteristic point extraction section 22 recognizes the next diverging point as a processing target; if not, the characteristic point extraction section 22 ends the process.
  • On the other hand, for example, as shown in FIG. 13(A), assume that there are four segment blood vessel lines PBLA, PBLB, PBLC and PBLD extending from the diverging points GP (GPA1, GPB1, GPC1, and GPD1). The characteristic point extraction section 22 calculates the cosines (cos (θA-B), cos (θA-C), cos (θA-D), cos (θB-C), cos (θB-D), cos (θC-D)) of the crossing angles θA-B, θA-C, θA-D, θB-C, θB-D and θC-D of each pair of the segment blood vessel lines PBLA, PBLB, PBLC and PBLD.
  • Here, if the smallest cosine cos (θB-D) is less than a second cosine threshold, this means that the crossing angle of the segment blood vessel lines is close to 180 degrees. In this case, the characteristic point extraction section 22 recognizes the pair of the segment blood vessel lines' segment blood vessel constituting-points rows GPB1, GPB2, . . . , GPB-END and GPD1, GPD2, . . . , GPD-END corresponding to the cosine cos (θB-D); recognizes both ends of these segment blood vessel constituting-points rows; regards the points GPB-END and GPD-END, which do not overlap with each other, as the starting and end points; and recognizes the characteristic points between the starting and end points as one group.
  • As a result, the pair of the segment blood vessel lines PBLB and PBLD is combined. For example, as shown in FIG. 13(B), the number of points in the segment blood vessel constituting-points row GPBD-first, . . . , GPBD10, GPBD11, GPBD12, . . . , GPBD-end of the combined segment blood vessel line PBLBD is one less than the total number of points in the pair of uncombined segment blood vessel constituting-points rows. This is because the two diverging points GPB1 and GPD1, which were the starting points of the segment blood vessel lines' segment blood vessel constituting-points rows, are replaced by one middle point GPBD11. Incidentally, combining the pair of the segment blood vessel lines PBLB and PBLD does not change the shape of the blood vessel line, or the segment blood vessel line PBLBD.
  • In this case with the four segment blood vessel lines, the uncombined segment blood vessel lines PBLA and PBLC remain even after the pair of the segment blood vessel lines PBLB and PBLD is combined. If the cosine cos (θA-C) of the crossing angle θA-C of the remaining pair of the segment blood vessel lines PBLA and PBLC is less than the cosine threshold, for example, as shown in FIG. 13(C), the characteristic point extraction section 22 transforms the segment blood vessel constituting-points rows of the segment blood vessel lines PBLA and PBLC into one segment blood vessel constituting-points row GPAC-first, . . . , GPAC10, GPAC11, GPAC12, . . . , GPAC-end in the same way as it has done for the segment blood vessel constituting-points rows of the segment blood vessel lines PBLB and PBLD, and removes one of the starting points GPA1 and GPC1, which were the end points of the original segment blood vessel constituting-points rows.
  • Whereas, if the smallest cosine cos (θB-D) is greater than the cosine threshold, the characteristic point extraction section 22 does not recognize any group. If there are other diverging points left, the characteristic point extraction section 22 recognizes the next diverging point as a processing target; if not, the characteristic point extraction section 22 ends the process.
  • Incidentally, in FIGS. 12 and 13, the overlapping points have the same positional (or coordinate) information. But since each belongs to a different group, they are distinguished for ease of explanation.
  • In this manner, in the fourth stage of the process, the characteristic point extraction section 22 recognizes the blood vessel lines extending from the diverging points on the blood vessel line; recognizes the pair of the blood vessel lines whose crossing angle's cosine is less than the second cosine threshold; and combines the segment blood vessel lines' segment blood vessel constituting-points rows into one segment blood vessel constituting-points row, thereby removing either the starting or terminal point, which was the end point of the pair of the segment blood vessel constituting-points rows.
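The cosine test at the heart of this fourth stage can be sketched as follows. This is illustrative only: the direction of each segment blood vessel line is approximated here by its first two constituting points starting at the diverging point, and the threshold value of −0.9 is an assumption (a cosine near −1 corresponds to a crossing angle near 180 degrees).

```python
import itertools
import math

def crossing_cosine(seg_a, seg_b):
    """Cosine of the crossing angle of two segment blood vessel lines
    that share a diverging point; each is given by its first two
    constituting points (a hypothetical representation)."""
    ax, ay = seg_a[1][0] - seg_a[0][0], seg_a[1][1] - seg_a[0][1]
    bx, by = seg_b[1][0] - seg_b[0][0], seg_b[1][1] - seg_b[0][1]
    return (ax * bx + ay * by) / (math.hypot(ax, ay) * math.hypot(bx, by))

def best_pair(segments, cos_thresh=-0.9):
    """Indices of the pair with the smallest cosine, or None if even
    the smallest cosine is not below the threshold (no group formed)."""
    pairs = itertools.combinations(range(len(segments)), 2)
    i, j = min(pairs, key=lambda p: crossing_cosine(segments[p[0]], segments[p[1]]))
    if crossing_cosine(segments[i], segments[j]) < cos_thresh:
        return i, j
    return None
```

Two lines leaving the diverging point in opposite directions give a cosine of −1 and are paired for combination; a perpendicular line gives 0 and is left alone, matching the behavior described for FIGS. 12 and 13.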
  • As described above, the characteristic point extraction section 22 detects the end, diverging and inflection points (the first and second stages); and extracts, from among these points, the blood vessel lines' characteristic points on a group (segment blood vessel constituting-points row) basis, with each group being based on the end and diverging points, so that a line passing through the characteristic points resembles both a blood vessel line and a straight line (the third and fourth stages).
  • For example, if the image (the image data D21) shown in FIG. 3(B) is input into the characteristic point extraction section 22, the characteristic point extraction process of the characteristic point extraction section 22 extracts the characteristic points from the image, as shown in FIG. 14, so that a line passing through the characteristic points resembles both a blood vessel line and a straight line.
  • The characteristic point extraction section 22 stores the data (the image data D1) of the image of the extracted characteristic points in the flash memory 13.
  • (2-2) Data Generation Mode
  • On the other hand, if the determination by the control section 10 is that it should start the data generation mode, the control section 10 enters the data generation mode, which is an operation mode, and makes a determination as to whether a plurality of image data sets D1 i (i=1, 2, . . . , n) is stored in the flash memory 13.
  • If there is a plurality of image data sets D1 i in the flash memory 13, the control section 10 starts a data generation process using these image data sets D1 i.
  • The following describes a distinguishing indicator for a blood vessel pattern and a pseudo blood vessel pattern, before the detailed description of the data generation process. In the following example, the pseudo blood vessel pattern is obtained as a result of taking a picture of a gummi candy (an elastic snack, like rubber, made of gelatin, sugar, and thick malt syrup) or radish.
  • (2-2-A) Distinguishing Indicator for the Blood Vessel Pattern and the Pseudo Blood Vessel Pattern
  • FIG. 15 shows the blood vessel pattern obtained from a living body's finger and the pseudo blood vessel patterns obtained from the gummi candy and the radish. As shown in FIG. 15, the blood vessel pattern (FIG. 15(A)) and the pseudo blood vessel patterns (FIG. 15(B)) look like the same pattern overall.
  • Here, as shown in FIG. 16, attention is focused on the angle θ between a segment connecting two characteristic points and a horizontal axis passing through the end point of that segment. The distribution of these angles with respect to the image's horizontal direction is then plotted, with the length of each segment (the number of pixels constituting the segment) serving as the frequency. As for the blood vessel pattern (FIG. 17), the distribution concentrates at and around the 90-degree point; as for the pseudo blood vessel pattern obtained from the gummi candy (FIG. 18(A)) and the pseudo blood vessel pattern obtained from the radish (FIG. 18(B)), it spreads between 0 and 180 degrees, showing a lack of regularity. This is because the blood vessel pattern does not spread but has certain directivity (along the length of the finger).
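The length-weighted angle distribution described above can be sketched like this. It is a simplified sketch: 1-degree bins and the segment representation (pairs of endpoint coordinates) are assumptions not specified by the text.

```python
import math

def angle_histogram(segments):
    """Length-weighted histogram S of segment angles (FIGS. 16-18):
    for each segment connecting two characteristic points, the angle
    theta (0-179 degrees) against the image's horizontal direction is
    binned, weighted by the segment length (the frequency)."""
    S = [0.0] * 180
    for (x0, y0), (x1, y1) in segments:
        theta = math.degrees(math.atan2(y1 - y0, x1 - x0)) % 180.0
        length = math.hypot(x1 - x0, y1 - y0)
        S[int(theta) % 180] += length
    return S
```

A vertical segment contributes its full length to the 90-degree bin; a horizontal one to the 0-degree bin, so a finger's blood vessel pattern piles weight near 90 degrees.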
  • Moreover, there is a tendency that the blood vessel pattern's segments resembling a straight line (FIG. 19(A)) are longer than those of the pseudo blood vessel pattern obtained from the gummi candy (FIG. 19(B)) and the pseudo blood vessel pattern obtained from the radish (FIG. 19(C)). Therefore, the number of segments (segment blood vessel lines) recognized as groups by the above characteristic point extraction process is smaller for the blood vessel pattern than for the pseudo blood vessel patterns.
  • Accordingly, the distinguishing indicators of the blood vessel pattern and the pseudo blood vessel pattern may be: first, the spread of the angle distribution; second, the intensity of the angle distribution at and around the 90-degree point; and, third, the number of the segments recognized as groups.
  • The spread of the angle distribution, for example, can be represented by the variance (or standard deviation) of the distribution. This means that if the segments connecting the characteristic points (those extracted from the pattern to represent the characteristic) are represented by lK (K=1, 2, . . . , N (N: integer)), the angles of the image's horizontal direction with respect to the segments are represented by θK, and the lengths of the segments are represented by LK, the average of the distribution of the angles θK of the segments lK, weighted by the segment lengths LK, is represented as follows:
  • \bar{\theta} = \frac{\sum_{\theta=0}^{179} S_\theta \, \theta}{\sum_{K=1}^{N} L_K} = \frac{\sum_{K=1}^{N} L_K \theta_K}{\sum_{K=1}^{N} L_K} \quad (1)
  • and the variance is represented as follows:
  • \sigma^2 = \frac{\sum_{\theta=0}^{179} S_\theta (\theta - \bar{\theta})^2}{\sum_{K=1}^{N} L_K} = \frac{\sum_{K=1}^{N} L_K (\theta_K - \bar{\theta})^2}{\sum_{K=1}^{N} L_K} \quad (2)
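Equations (1) and (2) amount to a length-weighted mean and variance of the segment angles. As a minimal sketch:

```python
def angle_mean_variance(angles, lengths):
    """Length-weighted mean and variance of the segment angles,
    following equations (1) and (2): each angle theta_K is weighted
    by its segment length L_K."""
    total = sum(lengths)
    mean = sum(L * t for L, t in zip(lengths, angles)) / total
    var = sum(L * (t - mean) ** 2 for L, t in zip(lengths, angles)) / total
    return mean, var
```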
  • Moreover, the intensity of the distribution can be represented by the ratio of the size of the distribution existing within a predetermined angular range around the 90-degree point to the size of the total distribution. This means that if the angular range is “lower [degree]<θ<upper [degree]” and the size of the distribution is S, the intensity of the distribution is represented as follows:
  • P_{\mathrm{lower}}^{\mathrm{upper}} = 100 \times \frac{\sum_{\theta=\mathrm{lower}}^{\mathrm{upper}} S_\theta}{\sum_{\theta=0}^{179} S_\theta} \quad (3)
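Equation (3) can be transcribed directly as follows; inclusive bounds are assumed here, following the summation limits rather than the strict inequalities in the text.

```python
def distribution_intensity(S, lower, upper):
    """Percentage of the angle distribution S (index = degrees) that
    falls between `lower` and `upper` degrees, per equation (3)."""
    inside = sum(S[t] for t in range(lower, upper + 1))
    return 100.0 * inside / sum(S)
```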
  • Moreover, the number of segments recognized as groups is the number of groups allocated after the above characteristic point extraction process, i.e. the number of groups (segment blood vessel constituting-points rows) remaining after the characteristic point extraction section 22 recognizes the rows of characteristic points (the segment blood vessel constituting-points rows) extending from the starting points through the inflection points to the terminal points and combines pairs of groups into one group so that each combined group resembles a straight line.
  • Here, FIG. 20 shows the result of distinguishing between the blood vessel pattern and the pseudo blood vessel pattern obtained from the gummi candy using the three distinguishing indicators. In FIG. 20, the lightly plotted points are those obtained from the pseudo blood vessel pattern of the gummi candy; the number of samples is 635. Meanwhile, the darkly plotted points are those obtained from the blood vessel pattern, which is selected from the five blood vessel patterns generated as a result of taking a picture of a finger five times: the selected blood vessel pattern has the largest Mahalanobis distance from the center of the distribution of the lightly plotted points, and the number of samples is 127.
  • Moreover, in FIG. 20, “RfG” represents a boundary (referred to as pseudo blood vessel boundary, hereinafter) based on which the pseudo blood vessel pattern is determined; its Mahalanobis distance from the center of the distribution of the lightly plotted points is 2.5. On the other hand, “RfF” represents a boundary (referred to as blood vessel boundary, hereinafter) based on which the blood vessel pattern is determined; its Mahalanobis distance from the center of the distribution of the darkly plotted points is 2.1. By the way, a point plotted as “•” lies inside the pseudo blood vessel boundary RfG or the blood vessel boundary RfF, while a point plotted as “*” lies inside neither boundary.
  • It is obvious from FIG. 20 that the blood vessel pattern can substantially be distinguished from the pseudo blood vessel pattern; as far as the σ-C plane of the three-dimensional distribution of FIG. 20 is concerned, the blood vessel pattern can be completely distinguished from the pseudo blood vessel pattern, as shown in FIG. 21. Incidentally, in FIGS. 20 and 21, the spread of the angle distribution is represented by the standard deviation.
  • (2-2-B) Detailed Description of the Data Generation Process
  • The following provides a detailed description of the data generation process. The data generation process is performed according to a flowchart shown in FIG. 22.
  • That is, the control section 10 reads out a plurality of samples of the image data sets D1 i from the flash memory 13, and calculates the three distinguishing indicators for each blood vessel pattern of the image data sets D1 i (i.e. the variance of the angle distribution, the intensity of the angle distribution, and the number of the segments recognized as groups) (a loop of step SP1 to SP5).
  • Moreover, after the calculation of the distinguishing indicators of each sample's blood vessel pattern (step SP5: YES), the control section 10 forms a matrix with the samples and their distinguishing indicators arranged in rows and columns, respectively:
  • R_f = \begin{pmatrix} \sigma_1 & P_1 & C_1 \\ \sigma_2 & P_2 & C_2 \\ \sigma_3 & P_3 & C_3 \\ \vdots & \vdots & \vdots \\ \sigma_{n-1} & P_{n-1} & C_{n-1} \\ \sigma_n & P_n & C_n \end{pmatrix} = \left( \sigma_i \;\; P_i \;\; C_i \right) \quad (4)
  • wherein σ represents the variance of the angle distribution; P represents the intensity of the angle distribution; C represents the number of the segments recognized as groups (step SP6).
  • Then, the control section 10 calculates from the matrix of the distinguishing indicators the center of the distribution of the distinguishing indicators of each sample as follows (step SP7):
  • \bar{R}_f = \frac{1}{n} \left[ \sum_{K=1}^{n} \sigma_K \;\; \sum_{K=1}^{n} P_K \;\; \sum_{K=1}^{n} C_K \right] = \frac{1}{n} \left[ \sum_{K=1}^{n} m_K^{(1)} \;\; \sum_{K=1}^{n} m_K^{(2)} \;\; \sum_{K=1}^{n} m_K^{(3)} \right] \quad (5)
  • and then calculates the inverse matrix of the covariance matrix (step SP8). Incidentally, the covariance matrix represents the degree of the spread of the distribution of the distinguishing indicators of each sample; its inverse matrix is used for the calculation of the Mahalanobis distance.
  • Then, the control section 10 generates the blood vessel pattern range data (which are data representing a range for which the determination of the blood vessel pattern should be made) by using the center of the distribution of the distinguishing indicators, which was calculated at step SP7, the inverse matrix of the covariance matrix, which was calculated at step SP8, and a predetermined blood vessel boundary number (whose Mahalanobis distance is “2.1” in the case of FIG. 20) (step SP9); stores the data in the internal memory of the authentication device (step SP10); and then ends the data generation process.
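Steps SP7 to SP9 can be sketched as follows. This is a minimal sketch in plain Python, assuming three indicators per sample and a population covariance; these numerical details are not specified by the text.

```python
import math

def mean_vector(samples):
    """Center of the distribution of the distinguishing indicators (SP7)."""
    n = len(samples)
    return [sum(s[k] for s in samples) / n for k in range(len(samples[0]))]

def covariance(samples, mu):
    """Population covariance matrix of the indicator distribution."""
    n, d = len(samples), len(samples[0])
    return [[sum((s[i] - mu[i]) * (s[j] - mu[j]) for s in samples) / n
             for j in range(d)] for i in range(d)]

def invert_3x3(m):
    """Inverse via the adjugate (SP8); assumes a non-singular matrix."""
    (a, b, c), (d, e, f), (g, h, i) = m
    det = a * (e * i - f * h) - b * (d * i - f * g) + c * (d * h - e * g)
    return [[(e * i - f * h) / det, (c * h - b * i) / det, (b * f - c * e) / det],
            [(f * g - d * i) / det, (a * i - c * g) / det, (c * d - a * f) / det],
            [(d * h - e * g) / det, (b * g - a * h) / det, (a * e - b * d) / det]]

def mahalanobis(x, mu, inv_cov):
    """Mahalanobis distance of an indicator vector x from the center mu."""
    diff = [xi - mi for xi, mi in zip(x, mu)]
    tmp = [sum(inv_cov[r][c] * diff[c] for c in range(3)) for r in range(3)]
    return math.sqrt(sum(diff[r] * tmp[r] for r in range(3)))
```

An object pattern would then be accepted as a blood vessel pattern when its distance from the center is within the blood vessel boundary number (2.1 in the case of FIG. 20).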
  • In this manner, using the following tendencies as the distinguishing indicators for the blood vessel pattern and the pseudo blood vessel pattern, the control section 10 generates the data (the center of the distribution of the distinguishing indicators, the inverse matrix of the covariance matrix, and the blood vessel boundary number) representing the range for which the determination of the blood vessel pattern should be made: the tendency that the blood vessel pattern does not spread but has certain directivity (along the length of the finger), and the tendency that the blood vessel pattern's segments resembling a straight line are longer than those of a pseudo blood vessel pattern.
  • (3) Configuration of the Authentication Device
  • FIG. 23 illustrates the configuration of the authentication device. The authentication device 2 includes a control section 30 to which an operation section 31, an image pickup section 32, a flash memory 33, an external interface 34 and a notification section 35 are connected via a bus 36.
  • The control section 30 is a microcomputer including a CPU that takes overall control of the authentication device 2, a ROM that stores various programs and setting information, and a RAM that serves as a work memory for the CPU. Incidentally, the blood vessel pattern range data generated by the data generation device 1 are stored in the ROM.
  • When a user operates the operation section 31, an execution command COM10 of a mode (referred to as blood vessel registration mode, hereinafter) in which the blood vessels of a registration-target user (referred to as registrant, hereinafter) are registered or an execution command COM20 of a mode (referred to as authentication mode, hereinafter) in which a determination as to whether a person is the registrant or not is made is given to the control section 30 from the operation section 31.
  • Based on the execution commands COM10 and COM20, the control section 30 makes a determination as to which mode it should start. Using a program corresponding to the determination, the control section 30 appropriately controls the image pickup section 32, the flash memory 33, the external interface 34 and the notification section 35 to run in blood vessel registration mode or authentication mode.
  • (3-1) Blood Vessel Registration Mode
  • More specifically, if the determination is that it should start the blood vessel registration mode, the control section 30 enters the blood vessel registration mode, which is an operation mode, to control the image pickup section 32.
  • In this case, in a similar way to that of the image pickup section 12 (FIG. 1) of the data generation device 1, the image pickup section 32 drives and controls a near infrared beam source LS and an image pickup element ID. The image pickup section 32 also adjusts the position of an optical lens of an optical system OP and the aperture of an aperture diaphragm DH based on an image signal S10 a that the image pickup element ID output as a result of taking a picture of an object put at a predetermined position of the authentication device 2. After the adjustment, the image pickup section 32 supplies an image signal S20 a output from the image pickup element ID to the control section 30.
  • The control section 30 sequentially performs the same preprocessing process and characteristic point extraction process as those of the preprocessing section 21 and characteristic point extraction section 22 (FIG. 2) of the data generation device 1 for the image signals S20 a, in order to extract an object pattern from the image and to extract a series of characteristic points on group (segment blood vessel constituting-points row) basis, which extends from the starting point to the terminal point via the inflection point.
  • Then, based on the blood vessel pattern range data stored in ROM, the control section 30 performs a process (referred to as distinguishing process, hereinafter) to distinguish the object pattern as a blood vessel pattern or a pseudo blood vessel pattern; if it recognizes the object pattern as a blood vessel pattern, the control section 30 stores the characteristic points of the object pattern in the flash memory 33 as information (referred to as registrant identification data, hereinafter) DIS, which will be used for identifying the registrant, thereby completing the registration.
  • In this manner, the control section 30 performs the blood vessel registration mode.
  • (3-2) Authentication Mode
  • On the other hand, if the determination by the control section 30 is that it should perform the authentication mode, the control section 30 enters the authentication mode and controls the image pickup section 32 in a similar way to when it performs the blood vessel registration mode.
  • In this case, the image pickup section 32 drives and controls the near infrared beam source LS and the image pickup element ID. The image pickup section 32 also adjusts the position of the optical lens of the optical system OP and the aperture of the aperture diaphragm DH based on an image signal S10b that the image pickup element ID outputs. After the adjustment, the image pickup section 32 supplies an image signal S20b output from the image pickup element ID to the control section 30.
  • The control section 30 sequentially performs the same preprocessing process and characteristic point extraction process as those of the above-described blood vessel registration mode for the image signals S20b, and reads out the registrant identification data DIS that has been registered in the flash memory 33.
  • Then, the control section 30 performs the same distinguishing process as that of the above-described blood vessel registration mode. If it distinguishes an object pattern extracted from the image signals S20b as the blood vessel pattern, the control section 30 then compares each of the characteristic points extracted from the object pattern as a group (segment blood vessel constituting-points row) extending from the starting point to the terminal point via the inflection point with the characteristic points of the registrant identification data DIS read out from the flash memory 33, thereby making a determination as to whether the person is the registrant (an authorized user) according to the degree of congruence.
  • Here, if the determination by the control section 30 is that the person is the registrant, the control section 30 generates an execution command COM30 in order to let an operation processing device (not shown), which is connected to the external interface 34, perform a predetermined operation. The control section 30 supplies this execution command COM30 to the operation processing device via the external interface 34.
  • The following describes applications of the operation processing device connected to the external interface 34: if the operation processing device is a locked door, the execution command COM30 transmitted from the control section 30 unlocks the door; if it is a computer that has a plurality of operation modes and whose current mode limits the use of some of them, the execution command COM30 transmitted from the control section 30 lifts the limitation.
  • Incidentally, these two examples were described as applications, but other applications are possible. Moreover, in the present embodiment, the operation processing device is connected to the external interface 34; instead, the authentication device 2 may contain the software and hardware of the operation processing device.
  • Whereas, if the determination by the control section 30 is that the person is not the registrant, the control section 30 displays information to that effect on a display section 35a of the notification section 35 and outputs sound through a sound output section 35b of the notification section 35, visually and auditorily notifying the user of the fact that he or she is not the registrant.
  • In that manner, the control section 30 performs the authentication mode.
  • (3-3) Detailed Description of the Distinguishing Process
  • The following provides a detailed description of the distinguishing process by the control section 30. The distinguishing process is performed according to a flowchart shown in FIG. 24.
  • That is, after having sequentially performed the preprocessing process and the characteristic point extraction process for the image signals S20a or S20b that are input during the blood vessel registration mode or the authentication mode, the control section 30 starts the procedure of the distinguishing process. At step SP11, the control section 30 detects the variance of the angle distribution, the intensity of the angle distribution and the number of the segments recognized as groups from the object pattern extracted from the image signals S20a or S20b.
  • This detection determines the position of the object pattern, whose object is the current target of image capturing, in the three-dimensional space (FIG. 20) of the distinguishing indicators of the plurality of sample patterns recognized as the authorized blood vessel patterns.
  • Then, at step SP12, the control section 30 calculates the Mahalanobis distance between the center of the three-dimensional distribution of the distinguishing indicators and the position of the object pattern based on the blood vessel pattern range data (the center of the distribution of the distinguishing indicators, the inverse matrix of the covariance matrix, and the blood vessel boundary number) stored in ROM.
  • More specifically, the Mahalanobis distance DCP is calculated by:

  • D_CP = √( (P − CT)^T · Cov^−1 · (P − CT) )  (6)

  • wherein CT is the center of the distribution of the distinguishing indicators, Cov^−1 is the inverse matrix of the covariance matrix, and P is the position of the object pattern. The result of the calculation reveals where the object pattern, whose object is the current target of image capturing, exists in the distribution (FIG. 20) of the plurality of sample patterns recognized as the authorized blood vessel patterns.
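As a rough illustration, equation (6) can be evaluated with plain Python lists. This is a sketch, not the embodiment's implementation; the three-component vectors and the identity inverse covariance in the example are illustrative values only.

```python
import math

def mahalanobis(p, ct, cov_inv):
    """D_CP = sqrt((P - CT)^T . Cov^-1 . (P - CT)) for plain-list vectors."""
    d = [pi - ci for pi, ci in zip(p, ct)]
    # w = Cov^-1 . (P - CT)
    w = [sum(row[j] * d[j] for j in range(len(d))) for row in cov_inv]
    return math.sqrt(sum(di * wi for di, wi in zip(d, w)))

# Illustrative only: with the identity matrix as Cov^-1, the Mahalanobis
# distance reduces to the ordinary Euclidean distance.
identity = [[1.0, 0.0, 0.0], [0.0, 1.0, 0.0], [0.0, 0.0, 1.0]]
print(mahalanobis([3.0, 4.0, 0.0], [0.0, 0.0, 0.0], identity))  # → 5.0
```

In effect, Cov^−1 whitens the distribution, so the distance measures how many "standard deviations" the object pattern lies from the center CT.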
  • Then, at step SP13, the control section 30 makes a determination as to whether the Mahalanobis distance calculated at step SP12 is less than the blood vessel boundary number of the blood vessel pattern range data stored in ROM.
  • As shown in FIG. 20, the blood vessel boundary number represents the value of the boundary RfF with respect to the center of the distribution of the distinguishing indicators: the determination of the blood vessel pattern should be made based on the boundary RfF. Accordingly, if the Mahalanobis distance is greater than the blood vessel boundary number, this means that the extracted object pattern should not be recognized as an appropriate blood vessel pattern since it may be a pseudo blood vessel pattern or a completely different pattern from the blood vessel pattern.
  • In this case, the control section 30 proceeds to step SP14, disposes of the object pattern extracted from the image signals S20a or S20b and its characteristic points, and prompts the user, through the notification section 35 (FIG. 23), to have a picture taken again, before ending the distinguishing process.
  • Whereas, if the Mahalanobis distance is less than or equal to the blood vessel boundary number, this means that the extracted object pattern should be recognized as an appropriate blood vessel pattern.
  • In this case, the control section 30 proceeds to step SP15 and, if it is running in blood vessel registration mode, recognizes the characteristic points extracted as a group (segment blood vessel constituting-points row), which extends from the object pattern's starting point to the terminal point through the inflection point, as those to be registered; if it is running in authentication mode, the control section 30 recognizes them as those to be compared with the characteristic points already registered as the registrant identification data DIS. The control section 30 subsequently ends the distinguishing process.
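The branch taken at steps SP13 to SP15 amounts to a single threshold comparison. A minimal sketch follows; the function name is hypothetical, and the boundary value 2.1 is the Mahalanobis distance that FIG. 20 associates with "RfF".

```python
BLOOD_VESSEL_BOUNDARY = 2.1  # blood vessel boundary number "RfF" (FIG. 20)

def distinguish(mahalanobis_distance, boundary=BLOOD_VESSEL_BOUNDARY):
    """SP13: treat the object pattern as a blood vessel pattern only when
    its Mahalanobis distance does not exceed the boundary number."""
    if mahalanobis_distance > boundary:
        return "retake"   # SP14: dispose of the pattern, ask for a new picture
    return "accept"       # SP15: register or compare the characteristic points

print(distinguish(1.3))  # → accept
print(distinguish(3.0))  # → retake
```

Note that a distance exactly equal to the boundary number is accepted, matching the "less than or equal to" condition of the text.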
  • In this manner, using the following tendencies as the distinguishing indicators for the blood vessel pattern and the pseudo blood vessel pattern, the control section 30 generates the blood vessel pattern range data (the center of the distribution of the distinguishing indicators, the inverse matrix of the covariance matrix, and the blood vessel boundary number): the tendency that the blood vessel pattern does not spread but has certain directivity (along the length of the finger), and the tendency that of all the segments of the blood vessel pattern, the one resembling a straight line is longer than the others. Based on the blood vessel pattern range data, the control section 30 eliminates the pseudo blood vessel patterns and the like.
  • (4) Operation and Effect
  • With the configuration described above, for each of the blood vessel patterns obtained from the image signals S1 input as a plurality of samples (a living body's finger), the data generation device 1 of the authentication system calculates form values representing the shape of the pattern.
  • According to the present embodiment, using the following tendencies as the indicators, the form value is determined to represent the shape of the pattern: the tendency that the blood vessel pattern does not spread but has certain directivity (along the length of the finger), and the tendency that the segment resembling a straight line is longer than the others.
  • That is, the data generation device 1 calculates the following values as the form values (FIG. 22: step SP1 to step SP5): firstly, the degree of the spread of the weighted distribution (FIG. 17) with the length of the segment used as frequency, as for the distribution of the angles (FIG. 16) of the segments connecting the characteristic points of the blood vessel pattern with respect to the reference axis (perpendicular to the direction of the circulation of blood); secondly, the ratio of the size of the distribution existing within the predetermined angular range whose center is equal to the angle of the direction of the blood circulation (90 degrees) to the size of the total distribution; thirdly, the number of the segments (FIG. 19(A)).
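As a sketch only, the three form values can be computed from a list of (angle, length) segment pairs, the angle being measured against the reference axis in degrees. The 20-degree half-width of the window around 90 degrees is a hypothetical parameter; the embodiment speaks only of a "predetermined angular range".

```python
def form_values(segments, half_width=20.0):
    """Return (variance, intensity, count) for (angle_deg, length) segments."""
    total_length = sum(length for _, length in segments)
    # (1) spread of the angle distribution weighted by segment length
    mean = sum(a * l for a, l in segments) / total_length
    variance = sum(l * (a - mean) ** 2 for a, l in segments) / total_length
    # (2) share of the weighted distribution near the blood-flow direction (90°)
    near_90 = sum(l for a, l in segments if abs(a - 90.0) <= half_width)
    intensity = near_90 / total_length
    # (3) the number of segments recognized as groups
    return variance, intensity, len(segments)
```

A genuine blood vessel pattern would tend toward a small variance, an intensity near 1, and few, long segments.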
  • Then, the data generation device 1 calculates the center of the three-dimensional distribution (FIG. 20) of those form values and the inverse matrix of the covariance matrix, which represents the degree of the spread from the center, and stores them in the internal memory of the authentication device 2.
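The stored quantities — the distribution center and the inverse of the covariance matrix — can be derived from sample form-value vectors as sketched below (pure Python, hypothetical function names; `invert` is a naive Gauss-Jordan elimination suitable only for small, well-conditioned matrices):

```python
def center_and_covariance(samples):
    """Mean vector and covariance matrix of a list of form-value vectors."""
    n, k = len(samples), len(samples[0])
    center = [sum(s[i] for s in samples) / n for i in range(k)]
    cov = [[sum((s[i] - center[i]) * (s[j] - center[j]) for s in samples) / n
            for j in range(k)] for i in range(k)]
    return center, cov

def invert(m):
    """Gauss-Jordan inverse of a small square matrix."""
    k = len(m)
    # Augment m with the identity matrix, then reduce to [I | m^-1].
    a = [row[:] + [1.0 if i == j else 0.0 for j in range(k)]
         for i, row in enumerate(m)]
    for col in range(k):
        pivot = max(range(col, k), key=lambda r: abs(a[r][col]))
        a[col], a[pivot] = a[pivot], a[col]
        p = a[col][col]
        a[col] = [v / p for v in a[col]]
        for r in range(k):
            if r != col and a[r][col]:
                f = a[r][col]
                a[r] = [v - f * w for v, w in zip(a[r], a[col])]
    return [row[k:] for row in a]
```

The authentication device 2 then needs only `center` and `invert(cov)` to evaluate equation (6).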
  • On the other hand, the authentication device 2 of the authentication system calculates the above-noted three form values for the pattern obtained from the image signals S20a or S20b that were input as those to be either registered or compared with the registered data. Then, using the inverse matrix of the covariance matrix, the authentication device 2 calculates the Mahalanobis distance between the position identified by the three form values in the three-dimensional distribution and the center of the three-dimensional distribution (FIG. 20) stored in the internal memory. If the Mahalanobis distance is greater than the predetermined threshold (the blood vessel boundary number "RfF" (FIG. 20)), the authentication device 2 disposes of the pattern (FIG. 24).
  • Accordingly, as for the blood vessel patterns obtained from the plurality of samples, the authentication system recognizes where the pattern obtained from those to be either registered or compared with the registered data exists in the three-dimensional distribution (FIG. 20) corresponding to the three indicators representing the characteristics of the blood vessel patterns, and whether it exists within the range extending from the center of the distribution to the boundary (the blood vessel boundary number "RfF" (FIG. 20)): existing inside the range means that it is a living body's pattern.
  • Accordingly, even if the pattern obtained from the image signals S20a or S20b that were input as those to be either registered or compared with the registered data is a pseudo blood vessel pattern (FIGS. 19(B) and (C)), the authentication system judges, by comparison with the distribution of the blood vessel patterns, that it is not a blood vessel pattern. This increases the possibility that the authentication system eliminates the pseudo blood vessel pattern before registering or comparing it.
  • Moreover, the data generation device 1 and the authentication device 2 calculate the form values after extracting the characteristic points of the blood vessel pattern so that the line passing through these characteristic points resembles both the blood vessel pattern and the straight line.
  • Accordingly, after emphasizing the characteristic of the blood vessel pattern, which has the tendency that the segment resembling the straight line is long, the authentication system calculates the form values representing the shape of the pattern. This allows the authentication system to precisely calculate the form values. This increases the possibility that the authentication system eliminates the pseudo blood vessel pattern after assuming that it is not the blood vessel pattern.
  • According to the above configuration, as for the blood vessel patterns obtained from the plurality of samples, the authentication system recognizes where the pattern obtained from those to be either registered or compared with the registered data exists in the three-dimensional distribution corresponding to the three indicators representing the characteristics of the blood vessel patterns, and whether it exists within the range extending from the center of the distribution to the boundary: existing inside the range means that it is a living body's pattern. This increases the possibility that the authentication system eliminates the pseudo blood vessel pattern after determining that it is not the blood vessel pattern. Thus, an authentication system that is able to improve the accuracy of authentication can be realized.
  • (5) Other Embodiment
  • In the above-noted embodiment, the determination is made as to whether the input pattern is the blood vessel pattern or not based on the data representing the distribution of the blood vessel pattern obtained from the plurality of samples and the data (threshold) representing the boundary of the distribution, which is used for the determination of the blood vessel pattern. However, the present invention is not limited to this. The distribution of the pseudo blood vessel pattern may also be used when the determination is made as to whether the input pattern is the blood vessel pattern or not.
  • That is, the above-noted data generation process (FIG. 22) of the data generation device 1 stores the center of the distribution of the three distinguishing indicators of each blood vessel pattern obtained from the living body's samples, the inverse matrix of the covariance matrix, and the blood vessel boundary number ("RfF," or "2.1" of the Mahalanobis distance, in the case of FIG. 20) in ROM of the authentication device 2 as the blood vessel pattern range data. At the same time, as for each pseudo blood vessel pattern obtained from non-living body's samples, the above-noted data generation process (FIG. 22) stores the center of the distribution of the three distinguishing indicators of the pseudo blood vessel pattern, the inverse matrix of the covariance matrix, and a pseudo blood vessel boundary number ("RfG," or "2.5" of the Mahalanobis distance, in the case of FIG. 20) in ROM of the authentication device 2 as pseudo blood vessel pattern range data.
  • On the other hand, as shown in FIG. 25 whose parts have been designated by the same symbols as the corresponding parts of FIG. 24, based on the blood vessel pattern range data, the authentication device 2 calculates the Mahalanobis distance (referred to as living body distribution-related distance, hereinafter) between the position of the input pattern (the object pattern whose object is the current target of image capturing) in the three distinguishing indicators' distribution and the center of the distribution; at the same time, based on the pseudo blood vessel pattern range data, the authentication device 2 calculates the Mahalanobis distance (referred to as non-living body distribution-related distance, hereinafter) between the position of the input pattern in the three distinguishing indicators' distribution and the center of the distribution (step SP22).
  • If the living body distribution-related distance is less than or equal to the blood vessel boundary number, the authentication device 2 makes a determination as to whether the non-living body distribution-related distance is less than or equal to the pseudo blood vessel boundary number (step SP23). If the non-living body distribution-related distance is less than or equal to the pseudo blood vessel boundary number, this means that, as indicated by the δ-P plane of the three-dimensional distribution of FIG. 20, for example, the input pattern exists in an area where the range to be determined as the blood vessel patterns overlaps the range to be determined as the pseudo blood vessel patterns.
  • In this case, the authentication device 2 therefore disposes of the input pattern (the object pattern whose object is the current target of image capturing) and the like even when the living body distribution-related distance is less than or equal to the blood vessel boundary number (step SP14).
  • Whereas, if the living body distribution-related distance is less than or equal to the blood vessel boundary number and the non-living body distribution-related distance is greater than the pseudo blood vessel boundary number, the authentication device 2 recognizes the characteristic points extracted as a group (segment blood vessel constituting-points row) extending from the object pattern's starting point to the terminal point via the inflection point as those to be either registered or compared (step SP15).
  • In this manner, the distribution of the pseudo blood vessel pattern can also be used when the determination is made as to whether the input pattern is the blood vessel pattern. Compared with the above-noted embodiment, this further increases the possibility that the authentication system eliminates the pseudo blood vessel pattern after determining that it is not the blood vessel pattern.
  • Incidentally, after it determines that the living body distribution-related distance is less than or equal to the blood vessel boundary number, the authentication device 2 then makes a determination as to whether the non-living body distribution-related distance is less than or equal to the pseudo blood vessel boundary number (step SP23). However, instead of this, the following is also possible: for example, in such a case, a determination is made as to whether the living body distribution-related distance calculated at step SP22 is greater than the non-living body distribution-related distance.
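When the pseudo blood vessel distribution is also used, the acceptance test of FIG. 25 combines two comparisons, and the alternative just mentioned compares the two distances directly. A sketch, with the boundary values 2.1 ("RfF") and 2.5 ("RfG") taken from FIG. 20 and hypothetical function names:

```python
RF_F = 2.1  # blood vessel boundary number ("RfF", FIG. 20)
RF_G = 2.5  # pseudo blood vessel boundary number ("RfG", FIG. 20)

def accept(d_living, d_nonliving):
    """Steps SP13/SP23: accept only inside the living-body boundary and
    outside the pseudo-pattern boundary (i.e. outside the overlap area)."""
    return d_living <= RF_F and d_nonliving > RF_G

def accept_by_comparison(d_living, d_nonliving):
    """Alternative to step SP23: require the input pattern to lie no
    farther from the living-body center than from the pseudo-pattern one."""
    return d_living <= RF_F and d_living <= d_nonliving
```

Either way, a pattern falling in the overlap of the two ranges is disposed of rather than risked as a false acceptance.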
  • Moreover, in the above-noted embodiment, as the living body's pattern, the form pattern (the blood vessel pattern) of the blood vessels is applied. However, the present invention is not limited to this. Other things, such as a form pattern of fingerprints, vocal prints, mouth prints, or nerves, can be applied if a corresponding acquisition means is used based on an applied living body's pattern.
  • By the way, the above-noted three distinguishing indicators can be used as the form values representing the shape of the pattern if the applied living body's pattern, like the blood vessel pattern or the nerve pattern, has the tendency that it does not spread but has certain directivity (along the length of the finger), or the tendency that the segment resembling a straight line is long. However, if the applied living body's pattern does not have these characteristics, the form values may need to be changed according to the characteristics of the applied living body's pattern.
  • Incidentally, in the above-noted embodiment, if the applied living body's pattern has the above characteristics, the following values are used as the three distinguishing indicators: firstly, the degree of the spread of the weighted distribution with the length of the segment used as frequency, as for the distribution of the angles of the reference axis with respect to the segments connecting the characteristic points of the pattern; secondly, the ratio of the size of the distribution existing within the predetermined angular range whose center is equal to the angle of the direction perpendicular to the reference axis to the size of the total angular range of the distribution; thirdly, the number of the segments. However, the present invention is not limited to this. Only two of those distinguishing indicators may be used, or another, new distinguishing indicator, such as one based on a determination as to whether the top three peaks of the angle distribution include the 90-degree point, may be added to those three distinguishing indicators. In short, as long as there are two or more distinguishing indicators, they can be used as the values representing the shape of the pattern.
  • Furthermore, in the above-noted embodiment, the blood vessel pattern range data stored in ROM of the authentication device 2 contains the center of the distribution of the three distinguishing indicators of each blood vessel pattern obtained from the living body's samples, the inverse matrix of the covariance matrix and the blood vessel boundary number ("RfF," or "2.1" of the Mahalanobis distance, in the case of FIG. 20). However, the present invention is not limited to this. The blood vessel boundary number may be set in the authentication device 2 in advance; and if the inverse matrix of the covariance matrix is calculated during the calculation of the Mahalanobis distance (FIG. 24 (FIG. 25): step SP12), the blood vessel pattern range data may contain only the center of the distribution of the three distinguishing indicators and the covariance matrix.
  • Furthermore, in the above-noted embodiment, as extraction means that extracts the characteristic points from the living body's pattern so that the line connecting these characteristic points resembles the living body's pattern and the straight line, the preprocessing section 21 and the characteristic point extraction section 22 are applied. However, the present invention is not limited to this. The process of the preprocessing section 21 and the characteristic point extraction section 22 may be changed if necessary.
  • For example, the preprocessing section 21 performs the A/D conversion process, the outline extraction process, the smoothing process, the binarization process, and the thinning process in that order. Alternatively, some of the processes may be omitted or replaced, or another process may be added to the series of processes. Incidentally, the order of the processes can be changed if necessary.
  • Moreover, the process of the characteristic point extraction section 22 can be replaced by a point extraction process known as the Harris corner detector, or by a well-known point extraction process such as the one disclosed in Japanese Patent Publication No. 2006-207033 ([0036] to [0163]).
  • Furthermore, in the above-noted embodiment, the authentication device 2 including the image-capturing function, the verification function and the registration function is applied. However, the present invention is not limited to this. Various applications are possible according to purposes and the like: those functions may be implemented in different devices.
  • INDUSTRIAL APPLICABILITY
  • The present invention can be applied to the field of biometrics authentication.
  • DESCRIPTION OF SYMBOLS
  • 1 . . . DATA GENERATION DEVICE, 2 . . . AUTHENTICATION DEVICE, 10, 30 . . . CONTROL SECTION, 11, 31 . . . OPERATION SECTION, 12, 32 . . . IMAGE PICKUP SECTION, 12 a, 32 a . . . DRIVE CONTROL SECTION, 13, 33 . . . FLASH MEMORY, 14, 34 . . . EXTERNAL INTERFACE, 35 . . . NOTIFICATION SECTION, 35 a . . . DISPLAY SECTION, 35 b . . . SOUND OUTPUT SECTION, 21 . . . PREPROCESSING SECTION, 22 . . . CHARACTERISTIC POINT EXTRACTION SECTION

Claims (17)

1. A pattern identification method comprising:
a first step of calculating, for each of living body's patterns obtained from a plurality of living body's samples, two or more form values representing the shape of the pattern;
a second step of calculating the center of the distribution of the two or more form values and a value representing the degree of the spread from its center;
a third step of calculating a distance between the two or more form values of a pattern obtained from those to be registered or to be compared with registered data and the center of the distribution of the two or more form values using the value; and
a fourth step of disposing of the pattern if the distance is greater than a predetermined threshold.
2. The pattern identification method according to claim 1, wherein
the two or more form values include at least two of the following values:
a degree of the spread of the weighted distribution with the length of a segment used as frequency, as for the distribution of the angles of a reference axis with respect to the segments connecting characteristic points of the pattern;
a ratio of the size of the distribution existing within a predetermined angular range whose center is equal to the angle of a direction perpendicular to the reference axis to the size of the total angular range of the distribution;
the number of the segments.
3. The pattern identification method according to claim 2, further comprising
an extraction step of extracting the characteristic points from the living body's pattern obtained from the plurality of living body's patterns so that a line connecting these characteristic points resembles the living body's pattern and a straight line.
4. The pattern identification method according to claim 1, wherein
the first step calculates, for each of the living body's patterns obtained from the plurality of living body's samples, the two or more form values representing the shape of the pattern, and also calculates, for each of non-living body's patterns obtained from a plurality of non-living body's samples, the two or more form values;
the second step calculates, as for each of the living body's patterns, the center of the distribution of the two or more form values and the value representing the degree of the spread from the center, and also calculates, as for each of the non-living body's patterns, the center of the distribution of the two or more form values and a value representing the degree of the spread from the center;
the third step calculates a first distance between the two or more form values of the pattern and the center of the distribution of the two or more form values of each of the living body's patterns, and also calculates a second distance between the two or more form values of the pattern and the center of the distribution of the two or more form values of each of the non-living body's patterns, using the value representing the degree of the spread from the center; and
the fourth step disposes of the pattern if the first distance is greater than a first threshold used for the determination of the living body's pattern, and also disposes of the pattern when the second distance is within a second threshold used for the determination of the non-living body's pattern, even if the first distance is within the first threshold.
5. The pattern identification method according to claim 1, wherein
the living body's pattern is a form pattern of blood vessels.
6. A registration device comprising:
storage means for storing, for each of living body's patterns obtained from a plurality of living body's samples, the center of the distribution of two or more form values representing the shape of the pattern and a value representing the degree of the spread from the center;
calculation means for calculating a distance between the two or more form values of a pattern obtained from those to be registered and the center of the distribution of the two or more form values stored in the storage means using the value; and
registration means for disposing of the pattern if the distance is greater than a predetermined threshold while registering the pattern in a storage medium if the distance is within the threshold.
7. The registration device according to claim 6, wherein
the two or more form values include at least two of the following values:
a degree of the spread of the weighted distribution with the length of a segment used as frequency, as for the distribution of the angles of a reference axis with respect to the segments connecting characteristic points of the pattern;
a ratio of the size of the distribution existing within a predetermined angular range whose center is equal to the angle of a direction perpendicular to the reference axis to the size of the total angular range of the distribution;
the number of the segments.
8. The registration device according to claim 6, further comprising
extraction means for extracting the characteristic points from the pattern so that a line connecting these characteristic points resembles the pattern and a straight line, wherein
the registration means registers the pattern's characteristic points extracted by the extraction means in the storage medium.
9. The registration device according to claim 6, wherein
the storage means stores, for each of the living body's patterns obtained from the plurality of living body's samples, the center of the distribution of the two or more form values representing the shape of the pattern and the value representing the degree of the spread from the center, and also stores, for each of non-living body's patterns obtained from a plurality of non-living body's samples, the center of the distribution of the two or more form values and a value representing the degree of the spread from the center;
the calculation means calculates a first distance between the two or more form values of the pattern and the center of the distribution of the two or more form values of each of the living body's patterns, and also calculates a second distance between the two or more form values of the pattern and the center of the distribution of the two or more form values of each of the non-living body's patterns, using the value representing the degree of the spread from the center; and
the registration means disposes of the pattern if the first distance is greater than a first threshold used for the determination of the living body's pattern, and also disposes of the pattern when the second distance is within a second threshold used for the determination of the non-living body's pattern, even if the first distance is within the first threshold.
10. The registration device according to claim 6, wherein
the living body's pattern is a form pattern of blood vessels.
11. A verification device comprising:
storage means for storing, for each of living body's patterns obtained from a plurality of living body's samples, the center of the distribution of two or more form values representing the shape of the pattern and a value representing the degree of the spread from the center;
calculation means for calculating a distance between the two or more form values of a pattern obtained from those to be compared with registered data and the center of the distribution of the two or more form values stored in the storage means using the value; and
verification means for disposing of the pattern if the distance is greater than a predetermined threshold while comparing the pattern with registered data registered in a storage medium if the distance is within the threshold.
12. The verification device according to claim 11, wherein
the two or more form values include at least two of the following values:
a degree of the spread of the weighted distribution, in which the length of each segment is used as its frequency, of the angles of the segments connecting characteristic points of the pattern with respect to a reference axis;
a ratio of the size of the distribution existing within a predetermined angular range, whose center is equal to the angle of a direction perpendicular to the reference axis, to the size of the total angular range of the distribution; and
the number of the segments.
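As one possible reading of the three form values in claim 12, the sketch below computes, from the segments joining characteristic points: a length-weighted spread of the segment angles about a reference axis, the share of total segment length lying within a fixed angular window around the perpendicular to that axis, and the segment count. Folding angles modulo 180 degrees, measuring against the x-axis, and the 10-degree window are assumptions for illustration only.

```python
import math

def form_values(segments, half_range_deg=10.0):
    """segments: list of ((x1, y1), (x2, y2)) joining characteristic
    points.  Returns (angle spread, perpendicular ratio, segment count)
    as one hypothetical realization of the claim's form values."""
    angles, lengths = [], []
    for (x1, y1), (x2, y2) in segments:
        dx, dy = x2 - x1, y2 - y1
        # undirected angle vs. the x-axis (assumed reference axis)
        angles.append(math.degrees(math.atan2(dy, dx)) % 180.0)
        lengths.append(math.hypot(dx, dy))
    total = sum(lengths)
    # length-weighted mean and spread of the angle distribution
    mean = sum(a * w for a, w in zip(angles, lengths)) / total
    spread = math.sqrt(sum(w * (a - mean) ** 2
                           for a, w in zip(angles, lengths)) / total)
    # share of segment length within +/- half_range_deg of 90 degrees
    near_perp = sum(w for a, w in zip(angles, lengths)
                    if abs(a - 90.0) <= half_range_deg)
    return spread, near_perp / total, len(segments)
```

Weighting by segment length means a few long vessel segments dominate the distribution, which matches the claim's use of length as the frequency of each angle.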
13. The verification device according to claim 11, further comprising
extraction means for extracting the characteristic points from the pattern so that a line formed by connecting these characteristic points with straight segments approximates the pattern, wherein
the verification means compares the pattern's characteristic points extracted by the extraction means with the registered data.
14. The verification device according to claim 11, wherein
the storage means stores, for each of the living body's patterns obtained from the plurality of living body's samples, the center of the distribution of the two or more form values representing the shape of the pattern and the value representing the degree of the spread from the center, and also stores, for each of non-living body's patterns obtained from a plurality of non-living body's samples, the center of the distribution of the two or more form values and a value representing the degree of the spread from the center;
the calculation means calculates a first distance between the two or more form values of the pattern and the center of the distribution of the two or more form values of each of the living body's patterns, and also calculates a second distance between the two or more form values of the pattern and the center of the distribution of the two or more form values of each of the non-living body's patterns, using the value representing the degree of the spread from the center; and
the verification means disposes of the pattern when the first distance is greater than a first threshold used for the determination of the living body's pattern, and also disposes of the pattern when the second distance is within a second threshold used for the determination of the non-living body's pattern, even if the first distance is within the first threshold.
15. The verification device according to claim 11, wherein
the living body's pattern is a form pattern of blood vessels.
16. A program causing a computer that stores, for each of living body's patterns obtained from a plurality of living body's samples, the center of the distribution of two or more form values representing the shape of the pattern and a value representing the degree of the spread from the center, to execute:
a first process of calculating a distance between the two or more form values of a pattern obtained from those to be registered and the stored center of the distribution of the two or more form values, using the value representing the degree of the spread from the center; and
a second process of disposing of the pattern if the distance is greater than a predetermined threshold, and of registering the pattern in a storage medium if the distance is within the threshold.
17. A program causing a computer that stores, for each of living body's patterns obtained from a plurality of living body's samples, the center of the distribution of two or more form values representing the shape of the pattern and a value representing the degree of the spread from the center, to execute:
a first process of calculating a distance between the two or more form values of a pattern obtained from those to be registered and the stored center of the distribution of the two or more form values, using the value representing the degree of the spread from the center; and
a second process of disposing of the pattern if the distance is greater than a predetermined threshold, and of comparing the pattern with registered data registered in a storage medium if the distance is within the threshold.
US12/445,519 2006-10-19 2007-10-16 Pattern identification method, registration device, verification device and program Abandoned US20100008546A1 (en)

Applications Claiming Priority (3)

Application Number Priority Date Filing Date Title
JP2006285353A JP2008102780A (en) 2006-10-19 2006-10-19 Pattern discrimination method, registration device, collation device, and program
JP2006-285353 2006-10-19
PCT/JP2007/070511 WO2008047935A1 (en) 2006-10-19 2007-10-16 Pattern identifying method, registration device, collating device and program

Publications (1)

Publication Number Publication Date
US20100008546A1 (en) 2010-01-14

Family

ID=39314143

Family Applications (1)

Application Number Title Priority Date Filing Date
US12/445,519 Abandoned US20100008546A1 (en) 2006-10-19 2007-10-16 Pattern identification method, registration device, verification device and program

Country Status (6)

Country Link
US (1) US20100008546A1 (en)
EP (1) EP2075760A1 (en)
JP (1) JP2008102780A (en)
KR (1) KR20090067141A (en)
CN (1) CN101529470A (en)
WO (1) WO2008047935A1 (en)

Families Citing this family (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2008052701A (en) * 2006-07-28 2008-03-06 Sony Corp Image processing method, image processing device, and program
JP5634933B2 (en) * 2011-03-31 2014-12-03 株式会社日立ソリューションズ Biometric authentication system for detecting pseudo fingers
JP6747112B2 (en) * 2016-07-08 2020-08-26 株式会社リコー Information processing system, image processing device, information processing device, and program

Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US5787185A (en) * 1993-04-01 1998-07-28 British Technology Group Ltd. Biometric identification of individuals by use of subcutaneous vein patterns
US20070217660A1 (en) * 2006-03-14 2007-09-20 Fujitsu Limited Biometric authentication method and biometric authentication apparatus
US20070286462A1 (en) * 2006-04-28 2007-12-13 David Usher System and method for biometric retinal identification

Family Cites Families (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP3396680B2 (en) * 2001-02-26 2003-04-14 バイオニクス株式会社 Biometric authentication device
JP2002259345A (en) 2001-02-27 2002-09-13 Nec Corp Method/device for authentication for preventing unauthorized use of physical feature data, and program
JP2002279426A (en) * 2001-03-21 2002-09-27 Id Technica:Kk System for personal authentication
JP4555561B2 (en) * 2003-12-01 2010-10-06 株式会社日立製作所 Personal authentication system and device
JP4428067B2 (en) * 2004-01-28 2010-03-10 ソニー株式会社 Image collation apparatus, program, and image collation method
JP2006207033A (en) 2006-04-22 2006-08-10 Jfe Steel Kk Surface-treated steel sheet excellent in workability and corrosion resistance at worked area

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
Miura et al., "Feature extraction of finger-vein patterns based on repeated line tracking and its application to personal identification", Machine Vision and Applications, 2004, pp. 194-203. *

Cited By (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20110016317A1 (en) * 2009-07-15 2011-01-20 Sony Corporation Key storage device, biometric authentication device, biometric authentication system, key management method, biometric authentication method, and program
US20130114863A1 (en) * 2010-09-30 2013-05-09 Fujitsu Frontech Limited Registration program, registration apparatus, and method of registration
US20130121594A1 (en) * 2011-11-11 2013-05-16 Hirokazu Kawatani Image processing apparatus, line detection method, and computer-readable, non-transitory medium
US9160884B2 (en) * 2011-11-11 2015-10-13 Pfu Limited Image processing apparatus, line detection method, and computer-readable, non-transitory medium
US20130136321A1 (en) * 2011-11-30 2013-05-30 Samsung Electro-Mechanics Co., Ltd. Fingerprint detection sensor and method of detecting fingerprint
US8666126B2 (en) * 2011-11-30 2014-03-04 Samsung Electro-Mechanics Co., Ltd. Fingerprint detection sensor and method of detecting fingerprint
US10469976B2 (en) 2016-05-11 2019-11-05 Htc Corporation Wearable electronic device and virtual reality system

Also Published As

Publication number Publication date
WO2008047935A1 (en) 2008-04-24
JP2008102780A (en) 2008-05-01
EP2075760A1 (en) 2009-07-01
KR20090067141A (en) 2009-06-24
CN101529470A (en) 2009-09-09

Similar Documents

Publication Publication Date Title
US20230351800A1 (en) Fake-finger determination device, fake-finger determination method and fake-finger determination program
US20100008546A1 (en) Pattern identification method, registration device, verification device and program
KR100480781B1 (en) Method of extracting teeth area from teeth image and personal identification method and apparatus using teeth image
US8485559B2 (en) Document authentication using template matching with fast masked normalized cross-correlation
US7885437B2 (en) Fingerprint collation apparatus, fingerprint pattern area extracting apparatus and quality judging apparatus, and method and program of the same
KR20180098443A (en) Method and apparatus for recognizing finger print
EP2068270B1 (en) Authentication apparatus and authentication method
CN110647955A (en) Identity authentication method
US11188771B2 (en) Living-body detection method and apparatus for face, and computer readable medium
US20220392262A1 (en) Iris authentication device, iris authentication method and recording medium
US8325991B2 (en) Device and method for biometrics authentication
Subasic et al. Face image validation system
JP5050642B2 (en) Registration device, verification device, program and data structure
JP2010240215A (en) Vein depth determination apparatus, vein depth determination method and program
JP2008287432A (en) Vein pattern management system, vein pattern registering device, vein pattern authentication device, vein pattern registering method, vein pattern authentication method, program, and vein data structure
US20110007943A1 Registration Apparatus, Checking Apparatus, Data Structure, and Storage Medium (amended)
US8320639B2 (en) Vein pattern management system, vein pattern registration apparatus, vein pattern authentication apparatus, vein pattern registration method, vein pattern authentication method, program, and vein data configuration
JP2898562B2 (en) License plate determination method
KR102316587B1 (en) Method for biometric recognition from irises
EP3702958B1 (en) Method for verifying the identity of a user by identifying an object within an image that has a biometric characteristic of the user and separating a portion of the image comprising the biometric characteristic from other portions of the image
JP2007179267A (en) Pattern matching device
CN110705352A (en) Fingerprint image detection method based on deep learning
Kovac et al. Multimodal biometric system based on fingerprint and finger vein pattern
KR20020038199A (en) Discrimination method for imitative iris in iris recognition system
CN101582115A (en) Authentication apparatus, authentication method, registration apparatus and registration method

Legal Events

Date Code Title Description
AS Assignment

Owner name: SONY CORPORATION, JAPAN

Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNOR:ABE, HIROSHI;REEL/FRAME:022543/0900

Effective date: 20090319

STCB Information on status: application discontinuation

Free format text: ABANDONED -- FAILURE TO RESPOND TO AN OFFICE ACTION