I'm currently writing a feeder for the vJoy driver (VS2013/vc120 on Win7 Pro SP1 x64), and I noticed a strange behaviour which also shows up in the basic C++ example provided with the SDK. All the axis variables are UINT16, so they range between 0 and 65535. On the other hand, when I read the data back from the device driver, it comes in as an INT16, so it ranges between -32768 and 32767. So far all of this is as expected: input and output have the same resolution.
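To make the range relationship concrete, here is a minimal sketch of the mapping I'd expect between the two representations, assuming the only difference is a constant offset of 32768 (i.e. the same 16-bit resolution, just unsigned on the feeder side and signed on the read side). `toSignedAxis` is just an illustrative helper of my own, not part of the vJoy SDK:

```cpp
#include <cstdint>
#include <cstdio>

// Maps a feeder-side unsigned axis value (0..65535) to the signed range
// (-32768..32767) seen when reading the device back, assuming a plain
// offset of 32768 between the two representations.
static int16_t toSignedAxis(uint16_t feederValue)
{
    return static_cast<int16_t>(static_cast<int32_t>(feederValue) - 32768);
}

int main()
{
    std::printf("%d\n", toSignedAxis(0));      // prints -32768 (axis minimum)
    std::printf("%d\n", toSignedAxis(32768));  // prints 0      (axis centre)
    std::printf("%d\n", toSignedAxis(65535));  // prints 32767  (axis maximum)
    return 0;
}
```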