This paper describes a scheme for determining the relative location and orientation of a set of smart camera nodes and
sensor modules. The scheme is well suited for implementation on wireless sensor networks since its communication and computational requirements are minimal. Self-localization is a basic capability on which higher-level applications can be built. For example, the scheme could be used to survey the locations of other sensor motes, enabling a range of location-based sensor analyses such as sniper detection, chemical plume detection, and target tracking. Further, the ability to automatically localize a set of smart cameras deployed in an ad hoc manner allows us to
apply a number of multi-camera 3D analysis techniques to recover aspects of the 3D geometry of a scene from the available
imagery. Ultimately, we envision constructing accurate 3D models of extended environments from images acquired by a network of inexpensive smart camera systems.