To build the SNN, we consider a 26-class problem in which the uppercase and lowercase forms of each character are merged into a single class, called a metaclass (e.g., “A” and “a” form the metaclass “Aa”). The main reason for this choice is the weakness of the HRS in distinguishing uppercase from lowercase characters (only 45 percent of character cases are recognized correctly). The network takes a 108-dimensional feature vector as input and has 100 units in its hidden layer and 26 outputs, one per character class. Isolated characters are represented in the feature space by 108-dimensional feature vectors formed by combining three types of features: a projection histogram of the whole character, profiles of the whole character, and directional histograms from six zones. These features were chosen from among others through an empirical evaluation in which the recognition rate and the feature-vector dimension were used as criteria [33].
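The metaclass mapping and the network dimensions above can be sketched as follows. This is a minimal illustration, not the trained SNN: the weights are random placeholders, and the sigmoid/softmax activations are assumptions, since the text specifies only the layer sizes (108 inputs, 100 hidden units, 26 outputs).

```python
import numpy as np

# The 26 metaclasses: "Aa", "Bb", ..., merging upper- and lowercase forms.
METACLASSES = [chr(ord("A") + i) + chr(ord("a") + i) for i in range(26)]

def char_to_metaclass(ch: str) -> int:
    """Map either case of a letter to its metaclass index (e.g. 'A' and 'a' -> 0)."""
    return ord(ch.lower()) - ord("a")

# Random placeholder weights with the dimensions stated in the text
# (108-d input -> 100 hidden units -> 26 outputs); not the trained SNN.
rng = np.random.default_rng(0)
W1 = rng.standard_normal((100, 108)) * 0.1  # input -> hidden
b1 = np.zeros(100)
W2 = rng.standard_normal((26, 100)) * 0.1   # hidden -> output
b2 = np.zeros(26)

def forward(x: np.ndarray) -> np.ndarray:
    """One forward pass: sigmoid hidden layer, softmax over the 26 metaclasses."""
    h = 1.0 / (1.0 + np.exp(-(W1 @ x + b1)))
    z = W2 @ h + b2
    e = np.exp(z - z.max())
    return e / e.sum()

x = rng.standard_normal(108)  # stand-in for one 108-d character feature vector
p = forward(x)                # probability over the 26 metaclasses
```

The mapping makes explicit that “A” and “a” fall into the same output class, which is the point of the metaclass design.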