 RESEARCH PAPER
 Open Access
A practical person authentication system using second minor finger knuckles for door security
 Daichi Kusanagi^{1},
 Shoichiro Aoyama^{1},
 Koichi Ito^{1} and
 Takafumi Aoki^{1}
https://doi.org/10.1186/s41074-017-0016-5
© The Author(s) 2017
 Received: 8 April 2016
 Accepted: 1 March 2017
 Published: 24 March 2017
Abstract
This paper proposes a person authentication system using second minor finger knuckles, i.e., metacarpophalangeal (MCP) joints, for door security. The system acquires finger knuckle patterns on MCP joints when a user takes hold of a door handle and recognizes the person from these patterns. The proposed system can be constructed by attaching a camera onto a door handle to capture the MCP joints. Since all the MCP joints face the camera, a region of interest (ROI) image around each MCP joint can be extracted from a single still image. Phase-based correspondence matching is used to calculate matching scores between ROIs so as to handle the deformation of ROIs caused by hand pose changes. Through a set of experiments, we demonstrate that the proposed system exhibits efficient MCP joint recognition performance and also show the potential of second minor finger knuckles for biometric recognition.
Keywords
 Finger knuckle
 Biometrics
 Phase-only correlation
 Door security
 Metacarpophalangeal joint
1 Introduction
Summary of research on finger knuckle recognition
Author  Joint  Finger  Feature  Similarity  Database 

C. Ravikanth and A. Kumar [3]  PIP  I, M, R and L  Subspace (PCA, LDA and ICA)  Distance  Own (Flat plane) 
A. Kumar and C. Ravikanth [4]  PIP  I, M, R and L  Subspace (PCA, LDA and ICA)  Distance  Own (Flat plane) 
L. Zhang et al. [5]  PIP  I and M  BLPOC  Correlation  PolyU FKP DB 
A. Kumar and Y. Zhou [6]  PIP  M  Modified Finite Radon Transform  Distance  Own (Flat plane) 
L. Zhang et al. [7]  PIP  I and M  Improved Competitive Code and Magnitude Code  Distance  PolyU FKP DB 
A. Morales et al. [8]  PIP  I and M  Orientation enhanced SIFT  Distance  PolyU FKP DB 
Z. Leqing [9]  PIP  M  SURF  Distance  PolyU FKP DB 
G.S. Badrinath et al. [10]  PIP  I and M  SIFT and SURF  Distance  PolyU FKP DB 
M. Xiong et al. [11]  PIP  I and M  Log Gabor Binary Patterns  Distance  PolyU FKP DB 
L. Zhang et al. [12]  PIP  I and M  Competitive Code and BLPOC  Distance and correlation  PolyU FKP DB 
L. Zhang et al. [13]  PIP  I and M  Riesz Competitive Code  Distance  PolyU FKP DB 
Z.S. Shariatmadar and K. Faez [14]  PIP  I and M  Average Absolute Deviation and Gabor Filter  Distance  PolyU FKP DB 
L. Zhang et al. [15]  PIP  I and M  Phase Congruency and BLPOC  Distance and correlation  PolyU FKP DB 
G. Gao et al. [16]  PIP  I and M  Sparse Reconstruction and Adaptive Binary Fusion  Distance  PolyU FKP DB 
S. Aoyama et al. [17]  PIP  I and M  Local Block Matching (BLPOC)  Correlation  PolyU FKP DB 
K.Y. Cheng and A. Kumar [19]  PIP  I  1D Log Gabor Filter  Distance  Own (Smartphone) 
S. Aoyama et al. [23]  PIP  I, M and R  Correspondence Matching (BLPOC)  Correlation  Door handle 
D. Kusanagi et al. [24]  PIP  I, M, R and L  Correspondence Matching (BLPOC)  Correlation  Own (Door handle) 
A. Kumar [25]  DIP and PIP  M  Improved Local Binary Patterns and 1D Log Gabor Filter  Distance  Own (Flat plane) 
A. Kumar and Z. Xu [2]  MCP  I, M, R and L  Local Radon Transform, Ordinal Code and BLPOC  Distance and correlation  Own (Flat plane) 
Proposed  MCP  I, M, R and L  Correspondence Matching (BLPOC)  Correlation  Own (Door handle) 
There are also a few works on finger knuckle recognition under practical situations. Kumar et al. [4] have proposed a finger knuckle recognition algorithm using multiple patterns acquired from the index, middle, ring, and little fingers. They demonstrated that a matching score combining the four PIP joints is effective for person authentication. Cheng et al. [19] have proposed a contactless PIP joint recognition system using a camera embedded in a smartphone. Although this was the first attempt to develop a practical person authentication system using PIP joints for smartphones, the recognition performance was not necessarily good. Aoyama et al. [23] have proposed a finger knuckle recognition system for a door handle. This system acquires PIP joint patterns when a user takes hold of a door handle and recognizes the person from the acquired patterns, so users need not pay attention to the authentication process. The system also combines the information of the four knuckles to improve finger knuckle recognition performance. Kusanagi et al. [24] have developed an improved version of Aoyama’s system by using video sequences.
Compared with PIP joints, there are only a few works on finger knuckle recognition using MCP and DIP joints. Kumar [25] has proposed a finger knuckle recognition algorithm using both major and first minor finger knuckle patterns, i.e., PIP and DIP joints. The combination of the two joint patterns improved finger knuckle recognition performance. Kumar et al. [2] have also considered the use of texture patterns around MCP joints to identify persons. Both works provide a fundamental investigation of biometric recognition using minor finger knuckle joints, since the performance was evaluated using images of a hand placed on a flat plane with the fingers and thumb spread apart.
This paper focuses on the use of second minor finger knuckles, i.e., MCP joints, for biometric recognition and develops a practical person authentication system using MCP joints. We consider person authentication using MCP joints for a door handle, inspired by the concept of Aoyama’s system [23]. Aoyama’s system has to embed a camera into a door, since it captures texture patterns on PIP joints when a user takes hold of a door handle, which increases the cost. Moreover, local images around each PIP joint cannot always be extracted from a single still image, as suggested by Kusanagi et al. [24]. On the other hand, our proposed system uses MCP joints for person authentication. Texture patterns on MCP joints can be captured using a camera attached onto a door handle. In this case, the MCP joints face the camera, so a local image around each MCP joint can be extracted from a single still image. Phase-based correspondence matching [26] is used to calculate matching scores between MCP joint patterns, as in the conventional PIP joint recognition systems [23, 24]. Through a set of experiments, we demonstrate that the proposed system exhibits efficient MCP joint recognition performance and also show the potential of minor finger knuckles for biometric recognition.
The contributions of this paper are summarized as follows:
 1.
This is the first attempt to use finger knuckle patterns on MCP joints for person authentication in a practical situation.
 2.
A prototype of a door security system using finger knuckle recognition is developed. The use of MCP joints makes it possible to develop a user-friendly person authentication system for door security.
2 Finger knuckle recognition system for door security
This section describes an overview of the proposed system. We develop the MCP joint recognition system inspired by the concept of finger knuckle recognition systems for door handles [23, 24].
Fingers have three joints, i.e., DIP, PIP, and MCP joints, as shown in Fig. 1. When a user takes hold of a door handle to open a door, it is easy to capture the PIP and MCP joints with a camera. The DIP joints face the floor, and the DIP joints of the index and middle fingers may be hidden behind the thumb. Therefore, DIP joints are not suitable for person authentication for door security.
The conventional systems using PIP joints consist of a handle, a camera, and a light source, where the camera has to be located so as to face the PIP joints. When a user takes hold of the door handle, the system captures an image or a video sequence and recognizes the user from the PIP joint patterns. The advantage is that the image acquisition process is not intrusive; the user only has to open the door by taking hold of the door handle. The disadvantage is that the shape of the PIP joints may vary across image acquisitions due to hand pose variations, degrading the recognition performance. In addition, the camera and the light source have to be embedded into the door. Hence, the door has to be modified, which is costly.
According to the fundamental investigation by Kumar et al. [2], MCP joints have distinctiveness sufficient for person authentication, comparable to PIP joints. MCP joints can be captured by attaching a camera onto a door handle and using ambient light. Therefore, only a little effort is required to build a system for MCP joint recognition compared with the case of PIP joint recognition. Moreover, the variation of MCP joints is smaller than that of PIP joints when a user takes hold of a door handle.
Specification of the developed system
Camera  Point Grey Flea3 FL3-U3-13E4C-C [31] 
Image size  1280 × 960 pixels 
Lens  μtron 0420 
Focal length  15 mm 
Light source  Ambient light 
3 MCP joint recognition
3.1 Image acquisition
3.2 ROI extraction
This step extracts an ROI image from the captured hand image. The positions of the MCP joints are detected from the valleys between fingers. The size of the captured images is 1280 × 960 pixels, as mentioned in Section 2. The captured image is resized to 640 × 480 pixels to reduce memory usage and computation time, assuming that the algorithm is implemented on an embedded system. The input image is denoted by f(n _{1},n _{2}), where 1≤n _{1}≤480 and 1≤n _{2}≤640.
where e _{ l } and e _{ r } indicate the vertical coordinates of the left and right ends of the hand, respectively. d ^{′} is used to detect e _{ r }, since the right-sided end of the handle may be detected as the edge of the hand if d is used.
Figure 4d shows the result of the vertical projection of f ^{′}(n _{1},n _{2}). The three local minima of V(n _{2}) are detected as the boundaries between fingers, denoted by v ^{ m } (m=1,2,3), where the indices m correspond to the boundaries between the index and middle fingers, the middle and ring fingers, and the ring and little fingers, respectively.
where i=1,2,3,4 and each index i corresponds to the index, middle, ring, and little fingers, respectively. The region of 150 × 150 pixels centered on u ^{ i } is extracted as the ROI image.
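The valley-based localization described above can be sketched as follows. This is a minimal illustration, not the authors' implementation; it assumes a binary hand silhouette has already been segmented from the background, and all parameter values are ours:

```python
import numpy as np

def finger_valleys(mask):
    """Detect the three valleys v^m between fingers from a binary hand mask.

    The vertical projection V(n2) counts hand pixels in each column; the
    three deepest local minima of V are taken as the finger boundaries.
    """
    V = mask.sum(axis=0).astype(float)
    idx = np.arange(1, len(V) - 1)
    # Local minima: strictly below the left neighbor, not above the right one.
    minima = idx[(V[idx] < V[idx - 1]) & (V[idx] <= V[idx + 1])]
    # Keep the three deepest minima and return them left to right.
    deepest = minima[np.argsort(V[minima])[:3]]
    return np.sort(deepest)

# Synthetic mask: four 5-pixel-wide "fingers" separated by shallow columns.
heights = [20] * 5 + [5] + [20] * 5 + [5] + [20] * 5 + [5] + [20] * 5
mask = np.zeros((20, len(heights)))
for j, h in enumerate(heights):
    mask[:h, j] = 1
print(finger_valleys(mask))  # valleys at columns 5, 11, 17
```

The MCP centers u^i would then be placed relative to these valleys before cropping each 150 × 150 ROI.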
3.3 ROI matching
Phase-based correspondence matching [26] is used to calculate matching scores between ROI images; it employs (i) a coarse-to-fine strategy using image pyramids for robust correspondence search and (ii) a local block matching method using BLPOC. Image deformation is observed between ROI images captured at different times due to hand rotation, although ROI images extracted from MCP joints exhibit smaller deformation than those from PIP joints. Such deformation can be approximated by small translations in a local area. Intensity variation can also be observed in ROI images due to different illumination conditions, and BLPOC is an image matching method robust against illumination changes. Therefore, we employ phase-based correspondence matching, as in the conventional PIP joint recognition systems [23, 24].
where n _{1}=−K _{1},⋯,K _{1}, n _{2}=−K _{2},⋯,K _{2}, and \(\sum '_{k_{1},k_{2}}\) denotes \(\sum _{k_{1}=-K_{1}}^{K_{1}}\sum _{k_{2}=-K_{2}}^{K_{2}}\). Note that the maximum value of the correlation peak of the BLPOC function is always normalized to 1 and does not depend on L _{1} and L _{2}.
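As a concrete illustration of the BLPOC function described above, the following sketch computes the band-limited phase-only correlation between two equal-sized image blocks. It is a simplified NumPy version for illustration only; the function and parameter names are ours, not the authors':

```python
import numpy as np

def blpoc(f, g, k_ratio=0.5):
    """Band-Limited Phase-Only Correlation between two same-size blocks.

    Only the low-frequency band with K1/M1 = K2/M2 = k_ratio of the
    normalized cross-phase spectrum is kept before the inverse DFT, which
    discards unreliable high-frequency phase components. The height of
    the correlation peak (at most 1) serves as the matching score.
    """
    F, G = np.fft.fft2(f), np.fft.fft2(g)
    R = F * np.conj(G)
    R /= np.abs(R) + 1e-12                  # keep phase only
    R = np.fft.fftshift(R)                  # move DC to the center
    M1, M2 = f.shape
    K1, K2 = int(M1 / 2 * k_ratio), int(M2 / 2 * k_ratio)
    c1, c2 = M1 // 2, M2 // 2
    band = R[c1 - K1:c1 + K1 + 1, c2 - K2:c2 + K2 + 1]   # band limitation
    return np.fft.fftshift(np.real(np.fft.ifft2(np.fft.ifftshift(band))))

rng = np.random.default_rng(0)
a = rng.random((32, 32))
b = rng.random((32, 32))
print(blpoc(a, a).max())  # identical blocks: peak value ~1.0
print(blpoc(a, b).max())  # unrelated blocks: peak value near 0
```

The peak location additionally gives the translational displacement between the two blocks, which is what the correspondence search below exploits.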

Step 1: For l=1,2,⋯,l _{max}−1, create the lth layer images I ^{ l }(n _{1},n _{2}) and J ^{ l }(n _{1},n _{2}), i.e., coarser versions of I ^{0}(n _{1},n _{2}) and J ^{0}(n _{1},n _{2}), recursively as follows:$$\begin{array}{@{}rcl@{}} I^{l}(n_{1},n_{2}) &=& \frac{1}{4}\sum_{i_{1}=0}^{1} \sum_{i_{2}=0}^{1} I^{l-1}(2n_{1}+i_{1},2n_{2}+i_{2}),\\ J^{l}(n_{1},n_{2}) &=& \frac{1}{4}\sum_{i_{1}=0}^{1} \sum_{i_{2}=0}^{1} J^{l-1}(2n_{1}+i_{1},2n_{2}+i_{2}). \end{array} $$

Step 2: For every layer l=1,2,⋯,l _{max}, calculate the coordinate \(\mathbf {p}^{l}=(p^{l}_{1},p^{l}_{2})\) corresponding to the original reference point p ^{0} recursively as follows:$$\begin{array}{@{}rcl@{}} \begin{array}{rcl} \mathbf{p}^{l} &=& \lfloor\frac{1}{2}\mathbf{p}^{l-1}\rfloor = \left(\lfloor\frac{1}{2}p^{l-1}_{1}\rfloor, \lfloor\frac{1}{2}p^{l-1}_{2}\rfloor\right), \end{array} \end{array} $$(12)
where ⌊z⌋ denotes the operation to round the element of z to the nearest integer toward minus infinity.

Step 3: We assume that \(\mathbf {q}^{l_{\text {max}}}=\mathbf {p}^{l_{\text {max}}}\) in the coarsest layer. Let l=l _{max}−1.

Step 4: From the lth layer images I ^{ l }(n _{1},n _{2}) and J ^{ l }(n _{1},n _{2}), extract two small images f ^{ l }(n _{1},n _{2}) and g ^{ l }(n _{1},n _{2}) with their centers on p ^{ l } and 2q ^{ l+1}, respectively. The size of image blocks is W×W pixels.

Step 5: Estimate the displacement between f ^{ l }(n _{1},n _{2}) and g ^{ l }(n _{1},n _{2}) using BLPOC. Let the estimated displacement vector be δ ^{ l }. The lth layer correspondence q ^{ l } is determined as follows:$$\begin{array}{@{}rcl@{}} \begin{array}{rcl} \mathbf{q}^{l} &=& 2\mathbf{q}^{l+1}+\boldsymbol{\delta}^{l}. \end{array} \end{array} $$(13)

Step 6: Decrement the counter by 1 as l←l−1 and repeat from Step 4 to Step 6 while l≥0.

Step 7: From the original images I ^{0}(n _{1},n _{2}) and J ^{0}(n _{1},n _{2}), extract two image blocks with their centers on p ^{0} and q ^{0}, respectively. Calculate the BLPOC function between the two blocks. The peak value of the BLPOC function is obtained as a measure of reliability in local block matching. Finally, we obtain the corresponding point pairs and their reliability.
In this paper, we use the following BLPOC parameters: l _{max}=2, W=48, and K _{1}/M _{1}=K _{2}/M _{2}=0.5.
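Steps 1 through 7 above can be sketched as follows. This is an illustrative toy version under our own assumptions: it substitutes plain phase-only correlation for BLPOC in the displacement estimation, uses a small block size (the paper uses W = 48), and the function names are ours:

```python
import numpy as np

def build_pyramid(img, l_max):
    """Step 1: recursive 2x2 averaging gives the coarser layers I^l."""
    layers = [img.astype(float)]
    for _ in range(l_max):
        p = layers[-1]
        p = p[:p.shape[0] // 2 * 2, :p.shape[1] // 2 * 2]
        layers.append(0.25 * (p[0::2, 0::2] + p[0::2, 1::2]
                              + p[1::2, 0::2] + p[1::2, 1::2]))
    return layers

def poc_shift(f, g):
    """Estimate s with g(n) ~ f(n - s) from the phase-correlation peak."""
    R = np.fft.fft2(f) * np.conj(np.fft.fft2(g))
    r = np.real(np.fft.ifft2(R / (np.abs(R) + 1e-12)))
    peak = np.array(np.unravel_index(np.argmax(r), r.shape))
    M = np.array(f.shape)
    peak = np.where(peak > M // 2, peak - M, peak)   # wrap to signed shift
    return -peak

def block(img, center, W):
    """Extract a W x W block centered on the given point."""
    r, c = center
    return img[r - W // 2:r + W // 2, c - W // 2:c + W // 2]

def correspondence(I, J, p0, l_max=2, W=24):
    """Steps 2-7: coarse-to-fine search for the point q^0 in J matching p^0 in I."""
    I_pyr, J_pyr = build_pyramid(I, l_max), build_pyramid(J, l_max)
    ps = [np.asarray(p0)]
    for _ in range(l_max):                 # Step 2: p^l = floor(p^{l-1} / 2)
        ps.append(ps[-1] // 2)
    q = ps[l_max].copy()                   # Step 3: q^{l_max} = p^{l_max}
    for l in range(l_max - 1, -1, -1):     # Steps 4-6
        center = 2 * q                     # Step 5: q^l = 2 q^{l+1} + delta^l
        delta = poc_shift(block(I_pyr[l], ps[l], W),
                          block(J_pyr[l], center, W))
        q = center + delta
    return q  # Step 7 would re-match at q^0 to obtain the reliability score

rng = np.random.default_rng(1)
I = rng.random((64, 64))
J = np.roll(I, (4, 8), axis=(0, 1))        # J is I translated by (4, 8)
print(correspondence(I, J, (32, 32)))      # expected near (36, 40)
```

In the actual system, the BLPOC peak value from Step 7 is kept per corresponding point pair and aggregated into the per-finger matching score.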
where i=1,2,3,4 and each index i corresponds to the index, middle, ring, and little fingers, respectively.
3.4 Score fusion
4 Experiments and discussion
This section describes experiments to evaluate performance of MCP joint recognition using the proposed system.
A hand image database is created using the proposed system shown in Fig. 2. Images are collected from 28 subjects in two separate sessions, where the time interval between the first and second sessions is more than 1 week. The size of the images is 1280 × 960 pixels, as mentioned in Section 2. In each session, five images are captured from each of the left and right hands. To increase the number of combinations, we treat the left- and right-hand images taken from the same subject as belonging to different subjects. The mirror-reversed images of the left-hand images are used in the experiments. As a result, the database contains 560 images of 56 subjects with 10 different images per subject. The number of genuine pairs is 2520 (=_{10} C _{2}×56), and the number of impostor pairs is 154,000 (=_{56} C _{2}×10×10).
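The pair counts above follow directly from the combinatorics; a quick check (our own arithmetic, not code from the paper):

```python
from math import comb

subjects = 56            # 28 subjects x 2 hands, treated as distinct subjects
images_per_subject = 10  # 5 images x 2 sessions

# Genuine pairs: all image pairs within one subject, over all subjects.
genuine = comb(images_per_subject, 2) * subjects
# Impostor pairs: all subject pairs, any image of one vs. any of the other.
impostor = comb(subjects, 2) * images_per_subject ** 2

print(genuine, impostor)  # 2520 154000
```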
The performance of the proposed method is compared with conventional finger knuckle matching methods: BLPOC [2, 5], CompCode [29], and LGIC [12]. BLPOC is used for PIP joints in [5] and MCP joints in [2]. The BLPOC function between two ROI images is calculated by Eq. (11), and its maximum peak value is taken as the matching score. CompCode (competitive code), proposed by Kong et al. [30], is generated by applying a bank of Gabor filters with different orientation parameters; ROI images are coded by the orientation having the maximum response at each pixel, and the matching score is calculated by the Hamming distance. LGIC, i.e., local-global information combination, is a combination of BLPOC and CompCode: BLPOC is used to extract global features, while CompCode is used to extract local features. A translational displacement between ROI images is estimated by BLPOC, and the common areas are extracted according to the estimated displacement. A global matching score between the common areas is calculated by BLPOC, while a local matching score is calculated by CompCode; the final matching score is a weighted sum of the global and local matching scores. Kumar et al. [2] suggested from their fundamental investigation that BLPOC exhibits the best performance in finger knuckle recognition of MCP joints, while Zhang et al. [12] demonstrated that LGIC outperforms BLPOC in finger knuckle recognition of PIP joints. Hence, we compare the performance of the proposed method with BLPOC, CompCode, and LGIC.
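The CompCode idea can be sketched as follows. This is our simplified illustrative version, not the implementation evaluated here: each pixel is coded with the index of the Gabor orientation giving the strongest (most negative, line-like) response, and two codes are compared by the fraction of disagreeing pixels as a Hamming-style distance; all filter parameters are assumptions:

```python
import numpy as np

def gabor_kernel(theta, ksize=9, sigma=2.0, lam=6.0):
    """Real Gabor filter oriented at angle theta (illustrative parameters)."""
    half = ksize // 2
    y, x = np.mgrid[-half:half + 1, -half:half + 1]
    xr = x * np.cos(theta) + y * np.sin(theta)
    return (np.exp(-(x ** 2 + y ** 2) / (2 * sigma ** 2))
            * np.cos(2 * np.pi * xr / lam))

def conv_same(img, k):
    """'Same'-size convolution via FFT."""
    H, W = img.shape
    kh, kw = k.shape
    s = (H + kh - 1, W + kw - 1)
    full = np.fft.irfft2(np.fft.rfft2(img, s=s) * np.fft.rfft2(k, s=s), s=s)
    return full[kh // 2:kh // 2 + H, kw // 2:kw // 2 + W]

def compcode(img, n_orient=6):
    """Code each pixel with the index of the dominant Gabor orientation."""
    thetas = [np.pi * t / n_orient for t in range(n_orient)]
    responses = np.stack([conv_same(img, gabor_kernel(t)) for t in thetas])
    return np.argmin(responses, axis=0)   # competitive rule: minimum response

def hamming_distance(code_a, code_b):
    """Fraction of pixels whose orientation codes disagree."""
    return np.mean(code_a != code_b)

rng = np.random.default_rng(2)
a, b = rng.random((64, 64)), rng.random((64, 64))
print(hamming_distance(compcode(a), compcode(a)))  # 0.0 for identical ROIs
```

The published CompCode additionally uses an angular (rather than exact-match) distance between orientation indices, which this sketch omits.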
4.1 Experiment 1
4.2 Experiment 2
EERs [%] for each matching algorithm in the 1st and 2nd sessions
Algorithm (session)  I  M  R  L  I+M+R+L 

BLPOC (1st)  6.93  4.10  7.32  8.67  3.31 
BLPOC (2nd)  7.61  4.09  4.97  10.47  2.98 
CompCode (1st)  7.29  5.26  6.76  8.59  4.06 
CompCode (2nd)  6.39  6.84  7.24  8.17  3.36 
LGIC (1st)  5.13  3.46  6.48  6.78  3.09 
LGIC (2nd)  4.73  3.09  4.08  7.81  2.43 
Proposed (1st)  1.61  2.01  2.85  3.03  0.77 
Proposed (2nd)  2.22  1.24  1.40  4.02  0.78 
4.3 Computation time
The computation time of the proposed algorithm is evaluated using MATLAB R2013a on an Intel Core i5-4250U (1.3 GHz). The computation times for ROI extraction and ROI matching are 141 and 91 ms, respectively.
5 Conclusion
This paper proposed a person authentication system using MCP joints for door security. The proposed system can be constructed by attaching a camera onto a door handle, so it can be applied to existing doors with simple construction, in contrast to the conventional door-handle systems using PIP joints, which need to embed a camera into the door. ROI images around each MCP joint can be extracted from a single still image, since the MCP joints face the camera. ROI images captured at different times include deformation due to hand pose changes; the use of phase-based correspondence matching makes it possible to calculate reliable matching scores for such deformed ROI images compared with conventional methods. Through a set of experiments, we demonstrated that the proposed system exhibits efficient MCP recognition performance. Person authentication using finger knuckles may be difficult to introduce into high-security access applications such as border control, since further investigation is required to demonstrate the uniqueness and distinctiveness of finger knuckle patterns. On the other hand, this paper presented the potential of minor finger knuckles for biometric recognition, and the proposed system should be acceptable for commercial applications such as building access control due to its convenience. In the future, we will develop a multiple finger knuckle recognition system that employs both major and minor finger knuckles.
A preliminary version of this paper was presented at ACPR 2015 [32].
Declarations
Acknowledgements
This work was supported, in part, by JSPS KAKENHI Grant Number 15H02721.
Authors’ contributions
DK carried out this study, made a database, performed the experiments, and drafted the manuscript. SA carried out this study, performed the experiments and their analysis, and helped to draft the manuscript. KI conceived of the study, performed the analysis of the experimental results, and drafted the manuscript. TA participated in the design and coordination of this study and helped to draft the manuscript. All authors read and approved the final manuscript.
Competing interests
The authors declare that they have no competing interests.
Publisher’s Note
Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.
Open Access This article is distributed under the terms of the Creative Commons Attribution 4.0 International License(http://creativecommons.org/licenses/by/4.0/), which permits unrestricted use, distribution, and reproduction in any medium, provided you give appropriate credit to the original author(s) and the source, provide a link to the Creative Commons license, and indicate if changes were made.
References
 Jain AK, Flynn P, Ross AA (2008) Handbook of biometrics. Springer, US.
 Kumar A, Xu Z (2014) Can we use second minor finger knuckle patterns to identify humans? Proc IEEE Comput Soc Conf Comput Vis Pattern Recognit Workshops: 106–112.
 Ravikanth C, Kumar A (2007) Biometric authentication using finger-back surface. Proc IEEE Comput Soc Conf Comput Vis Pattern Recognit: 1–6.
 Kumar A, Ravikanth C (2009) Personal authentication using finger knuckle surface. IEEE Trans Inf Forensic Secur 4(1): 98–110.
 Zhang L, Zhang L, Zhang D (2009) Finger-knuckle-print verification based on band-limited phase-only correlation. Lect Notes Comput Sci (CAIP 2009) 5702: 141–148.
 Kumar A, Zhou Y (2009) Personal identification using finger knuckle orientation features. Electron Lett 45(20): 1023–1025.
 Zhang L, Zhang L, Zhang D, Zhu H (2010) Online finger-knuckle-print verification for personal authentication. Pattern Recog 43: 2560–2571.
 Morales A, Travieso CM, Ferrer MA, Alonso JB (2011) Improved finger-knuckle-print authentication based on orientation enhancement. Electron Lett 47(6): 380–381.
 Leqing Z (2011) Finger knuckle print recognition based on SURF algorithm. Proc Int’l Conf Fuzzy Syst Knowl Discov: 1879–1883.
 Badrinath GS, Nigam A, Gupta P (2011) An efficient finger-knuckle-print based recognition system fusing SIFT and SURF matching scores. Proc Int’l Conf Inf Commun Secur: 374–387.
 Xiong M, Yang W, Sun C (2011) Finger-knuckle-print recognition using LGBP. Proc Int’l Conf Adv Neural Netw Part II: 270–277.
 Zhang L, Zhang L, Zhang D, Zhu H (2011) Ensemble of local and global information for finger-knuckle-print recognition. Pattern Recognit 44: 1990–1998.
 Zhang L, Li H, Shen Y (2011) A novel Riesz transforms based coding scheme for finger-knuckle-print recognition. Proc Int’l Conf Hand-Based Biom: 1–6.
 Shariatmadar ZS, Faez K (2011) An efficient method for finger-knuckle-print recognition based on information fusion. Proc Int’l Conf Signal Image Process Appl: 210–215.
 Zhang L, Zhang L, Zhang D, Guo Z (2012) Phase congruency induced local features for finger-knuckle-print recognition. Pattern Recog 45: 2522–2531.
 Gao G, Zhang L, Yang Y, Zhang L, Zhang D (2013) Reconstruction based finger-knuckle-print verification with score level adaptive binary fusion. IEEE Trans Image Process 22(12): 5050–5062.
 Aoyama S, Ito K, Aoki T (2014) A finger-knuckle-print recognition algorithm using phase-based local block matching. Inform Sci 268: 53–64.
 PolyU FKP Database. http://www4.comp.polyu.edu.hk/~biometrics/. Accessed 8 Apr 2016.
 Cheng KY, Kumar A (2012) Contactless finger knuckle identification using smartphones. Proc Int’l Conf Biom Spec Interest Group: 1–6.
 Burge MJ, Bowyer K (2013) Handbook of iris recognition. Springer-Verlag, London.
 Kong A, Zhang D, Kamel M (2009) A survey of palmprint recognition. Pattern Recog 42(7): 1408–1418.
 Ito K, Nakajima H, Kobayashi K, Aoki T, Higuchi T (2004) A fingerprint matching algorithm using phase-only correlation. IEICE Trans Fundam E87-A(3): 682–691.
 Aoyama S, Ito K, Aoki T (2013) A multi-finger knuckle recognition system for door handle. Proc Int’l Conf Biom Theory Appl Syst O18: 1–7.
 Kusanagi D, Aoyama S, Ito K, Aoki T (2014) Multi-finger knuckle recognition from video sequence: extracting accurate multiple finger knuckle regions. Proc Int’l Joint Conf Biom: 1–8.
 Kumar A (2012) Can we use minor finger knuckle images to identify humans? Proc Int’l Conf Biom Theory Appl Syst: 55–60.
 Ito K, Iitsuka S, Aoki T (2009) A palmprint recognition algorithm using phase-based correspondence matching. Proc Int’l Conf Image Process: 1977–1980.
 Gonzalez RC, Woods RE (1992) Digital image processing. Pearson Education, New Jersey.
 Ross AA, Nandakumar K, Jain AK (2006) Handbook of multibiometrics. Springer, US.
 Zhang L, Zhang L, Zhang D (2009) Finger-knuckle-print: a new biometric identifier. Proc Int’l Conf Image Process: 1981–1984.
 Kong AWK, Zhang D (2004) Competitive coding scheme for palmprint verification. Proc Int’l Conf Pattern Recog 1: 520–523.
 Flea3 1.3 MP Color USB3 Vision, Point Grey Research Inc. https://www.ptgrey.com/flea3-13-mp-color-usb3-vision-e2v-ev76c560-camera. Accessed 8 Apr 2016.
 Kusanagi D, Aoyama S, Ito K, Aoki T (2015) A person authentication system using second minor finger knuckles for door handle. Proc Asian Conf Pattern Recog OS9-01: 1–5.