What you are looking at:
+Image-based tracking using a red object as the target.
+Tracking data is translated into camera rotations.
+Based on Marco Rapino's BlenderTrack Python script.
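To give a rough idea of the color-tracking step, here is a minimal, dependency-free sketch: find the centroid of "red enough" pixels in a frame and map its offset from the image center to yaw/pitch angles. This is my own illustration, not Marco Rapino's BlenderTrack code; the red threshold and field-of-view value are assumptions.

```python
# Hedged sketch of colour-based tracking. Frames are assumed to be nested
# lists of (r, g, b) tuples; the threshold and FOV are illustrative guesses.
import math

def find_red_centroid(frame):
    """Return the (x, y) centroid of 'red' pixels, or None if none found.

    A pixel counts as red when its R channel clearly dominates G and B.
    """
    xs, ys = [], []
    for y, row in enumerate(frame):
        for x, (r, g, b) in enumerate(row):
            if r > 120 and r > 2 * g and r > 2 * b:
                xs.append(x)
                ys.append(y)
    if not xs:
        return None
    return sum(xs) / len(xs), sum(ys) / len(ys)

def centroid_to_rotation(cx, cy, width, height, fov_deg=60.0):
    """Map the centroid's offset from the image centre to yaw/pitch (radians)."""
    yaw = (cx / width - 0.5) * math.radians(fov_deg)
    pitch = (0.5 - cy / height) * math.radians(fov_deg)
    return yaw, pitch
```

The returned yaw/pitch pair would then drive the camera's orientation each frame.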
-A cheap web camera's auto white balance screws things up after a while, since it keeps shifting colors to render the red target as neutral gray.
-Slow: I successfully ported everything into a single Blender script to run inside Blender, but then decided to run the tracking script in a separate process to get 'multi-threading' :D This way one CPU core does all the motion estimation while another handles the BGE; the two processes communicate over a simple socket connection.
-My environment setup is far from perfect; a more controlled lighting setup would yield better tracking results.
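The tracker-to-BGE handoff mentioned above could be sketched like this: the tracking process pushes one rotation update per line over a local TCP socket, and the game-engine side parses it. The message format (comma-separated yaw,pitch) and port are my assumptions, not the project's actual protocol.

```python
# Hedged sketch of the tracker <-> BGE handoff over a local TCP socket.
# The line-based "yaw,pitch" format and the port number are assumptions.
import socket

HOST, PORT = "127.0.0.1", 5555  # assumed local endpoint

def tracker_send(yaw, pitch, host=HOST, port=PORT):
    """Tracker process: push one rotation update as a text line."""
    with socket.create_connection((host, port)) as s:
        s.sendall(f"{yaw:.6f},{pitch:.6f}\n".encode())

def bge_receive_once(server_sock):
    """Game-engine side: accept one connection and parse one update."""
    conn, _ = server_sock.accept()
    with conn:
        line = conn.makefile().readline()
    yaw, pitch = (float(v) for v in line.strip().split(","))
    return yaw, pitch
```

Running the tracker in its own process keeps the per-frame motion estimation off the thread that ticks the game engine, at the cost of a little parsing and latency on each update.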