We use a passive articulated arm to track a calibrated video camera mounted on its end-effector. In real time, we superimpose the live video with a synchronized graphical view of CT-derived segmented object(s) of interest within a phantom skull (augmented reality, AR), and we display the trajectory of the end-effector (translated to the camera's focal point) on orthogonal image-data scans and 3D models (virtual reality, VR). Augmented reality is a natural extension for the surgeon because it performs the 2D-to-3D transformation of the image data and projects the result directly onto the surgeon's view of the patient. However, there are distinct advantages to also having a VR (image-guided surgery) view of the tool's trajectory. AR and VR visualization each have advantages and disadvantages depending on the stage of the surgery, and surgeons should have the option to select between them. In this paper, we provide the software design and network-communication details of a multi-user, on-demand, near-real-time simultaneous AR/VR system for surgical guidance.
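The core geometric step behind the AR overlay can be sketched as follows. This is a minimal illustration, not the paper's implementation: all transform names, calibration values, and the pinhole-camera model are assumptions. It chains the arm's reported end-effector pose with a fixed hand-eye calibration and a CT-to-world registration, then projects a CT-space point into live-video pixel coordinates.

```python
import numpy as np

def make_pose(R, t):
    """Build a 4x4 homogeneous transform from rotation R and translation t."""
    T = np.eye(4)
    T[:3, :3] = R
    T[:3, 3] = t
    return T

# Assumed (illustrative) calibration data:
K = np.array([[800.0,   0.0, 320.0],   # camera intrinsics: focal lengths
              [  0.0, 800.0, 240.0],   # and principal point, in pixels
              [  0.0,   0.0,   1.0]])
T_base_ee  = make_pose(np.eye(3), np.array([0.1, 0.0, 0.5]))   # arm kinematics: base -> end-effector
T_ee_cam   = make_pose(np.eye(3), np.array([0.0, 0.0, 0.05]))  # hand-eye: end-effector -> camera
T_base_ct  = make_pose(np.eye(3), np.array([0.0, 0.0, 1.0]))   # registration: base -> CT volume

def project_ct_point(p_ct):
    """Map a CT-space 3D point into image pixel coordinates (u, v)."""
    # Camera pose in the arm's base frame, then inverted to map base -> camera.
    T_cam_base = np.linalg.inv(T_base_ee @ T_ee_cam)
    p_cam = (T_cam_base @ T_base_ct @ np.append(p_ct, 1.0))[:3]
    uv = K @ p_cam
    return uv[:2] / uv[2]   # perspective divide

print(project_ct_point(np.array([0.0, 0.0, 0.0])))
```

Overlaying the segmented objects then amounts to projecting their surface vertices this way each frame, using the pose the arm reports for the current video frame.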