This paper proposes an accurate and efficient multimodal authentication system based on palmprint and knuckleprint samples. Biometric images are transformed using the proposed sign of local gradient (SLG) operator to obtain vcode and hcode representations. Corner features are extracted from the vcode and hcode images and are tracked using a geometrically and statistically constrained Lucas-Kanade tracking algorithm. The proposed highly uncorrelated features (HUF) measure is used to match a pair of query images. The system is evaluated on the publicly available PolyU and CASIA palmprint databases along with the PolyU knuckleprint database. Several chimeric bimodal and multimodal databases are created to test the proposed system. Experimental results reveal that the proposed multimodal system achieves a correct recognition rate (CRR) of 100% with an equal error rate (EER) as low as 0.01% on all of the created chimeric multimodal datasets.
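As a rough illustration of the described pipeline, the minimal Python/OpenCV sketch below encodes the sign of horizontal and vertical Sobel gradients as hcode/vcode images and tracks corner features between two code images with pyramidal Lucas-Kanade optical flow. It is a sketch under stated assumptions: the paper's exact SLG definition, the geometric and statistical tracking constraints, and the HUF matching measure are not reproduced, and the file names and displacement-based score are purely hypothetical.

```python
import cv2
import numpy as np


def gradient_sign_codes(img):
    """Encode the sign of local gradients as binary images.

    NOTE: this is only an illustrative reading of the paper's SLG transform
    (assumed here to be the sign of horizontal/vertical Sobel derivatives),
    not the authors' exact definition.
    """
    gx = cv2.Sobel(img, cv2.CV_32F, 1, 0, ksize=3)  # horizontal derivative
    gy = cv2.Sobel(img, cv2.CV_32F, 0, 1, ksize=3)  # vertical derivative
    hcode = np.where(gx >= 0, 255, 0).astype(np.uint8)
    vcode = np.where(gy >= 0, 255, 0).astype(np.uint8)
    return vcode, hcode


def track_corners(code_a, code_b, max_corners=200):
    """Detect corners on one code image and track them into the other
    using pyramidal Lucas-Kanade optical flow (plain, unconstrained OpenCV
    version; the paper adds geometric and statistical constraints)."""
    corners = cv2.goodFeaturesToTrack(code_a, maxCorners=max_corners,
                                      qualityLevel=0.01, minDistance=5)
    tracked, status, _ = cv2.calcOpticalFlowPyrLK(code_a, code_b, corners, None)
    keep = status.ravel() == 1
    return corners[keep].reshape(-1, 2), tracked[keep].reshape(-1, 2)


if __name__ == "__main__":
    # File names are hypothetical placeholders for two grayscale samples.
    a = cv2.imread("palm_query.png", cv2.IMREAD_GRAYSCALE)
    b = cv2.imread("palm_enrolled.png", cv2.IMREAD_GRAYSCALE)
    vcode_a, _ = gradient_sign_codes(a)
    vcode_b, _ = gradient_sign_codes(b)
    old_pts, new_pts = track_corners(vcode_a, vcode_b)
    # Crude dissimilarity proxy: mean displacement of successfully tracked
    # corners (the paper's HUF measure is not reproduced here).
    score = float(np.mean(np.linalg.norm(new_pts - old_pts, axis=1)))
    print(f"tracked corners: {len(new_pts)}, mean displacement: {score:.2f}")
```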