1. Introduction
Film radiography is a traditional imaging technique for the nondestructive examination of industrial equipment components, used to locate cavities, inclusions, lack of fusion, and other flaws that may form during manufacturing or service [2]. Radiographic testing with film is expensive and time-consuming, owing to the exposure time and the development of the film. Several attempts have been made to speed up and automate different stages of the radiographic inspection cycle. Digital radioscopy, which couples X-ray detectors to an image acquisition and processing system, permits real-time inspection. However, because film offers higher resolution than digital radiographs, film remains the reference for all imaging systems, especially when very small defects must be detected. To process a film by computer, it must first be digitized with a scanner. Digital image processing methods are then applied, in both techniques, to assist the human operator in interpreting the visual data. This makes the inspection system more reliable, but in return it requires high-level image processing methods that can substitute for the expert's knowledge.
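As a minimal sketch of the kind of processing applied to a digitized radiograph, the example below performs global histogram equalization with NumPy on a synthetic low-contrast 8-bit image; the data and the equalization choice are illustrative assumptions, not the specific methods of this work.

```python
import numpy as np

def equalize_histogram(image: np.ndarray) -> np.ndarray:
    """Global histogram equalization for an 8-bit grayscale image."""
    hist = np.bincount(image.ravel(), minlength=256)
    cdf = hist.cumsum()
    cdf_min = cdf[cdf > 0][0]  # first occupied gray level
    # Map each gray level through the normalized cumulative histogram.
    lut = np.clip(np.round((cdf - cdf_min) / (cdf[-1] - cdf_min) * 255),
                  0, 255).astype(np.uint8)
    return lut[image]

# Synthetic low-contrast "radiograph": values clustered in a narrow band,
# with a slightly darker disc standing in for a small defect.
rng = np.random.default_rng(0)
img = rng.integers(100, 140, size=(64, 64)).astype(np.uint8)
yy, xx = np.ogrid[:64, :64]
img[(yy - 32) ** 2 + (xx - 32) ** 2 < 25] -= 20  # darken the "defect"

enhanced = equalize_histogram(img)
print(img.min(), img.max())            # narrow original gray-level range
print(enhanced.min(), enhanced.max())  # stretched toward the full 0..255 range
```

Equalization spreads the narrow gray-level band of the digitized film over the full dynamic range, which makes subtle intensity differences such as the simulated defect easier for an operator to see.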