One of the possibilities of 3D applications that intrigues me most is Augmented Reality (AR). From camera properties such as the focal length, we can determine the projection matrix of the built-in camera of an iOS device.
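For the curious, here is roughly how such a matrix is built from the intrinsics. This is a minimal sketch (the function and parameter names are illustrative, not the app's actual code), assuming the focal lengths and principal point are given in pixels:

```swift
import simd

/// Build an OpenGL-style perspective projection matrix from pinhole
/// camera intrinsics: focal lengths (fx, fy) and principal point
/// (cx, cy) in pixels, for an image of the given size.
func projectionMatrix(fx: Float, fy: Float,
                      cx: Float, cy: Float,
                      width: Float, height: Float,
                      near: Float = 0.01, far: Float = 100.0) -> simd_float4x4 {
    // simd matrices are column-major, so we specify columns.
    let c0 = SIMD4<Float>(2 * fx / width, 0, 0, 0)
    let c1 = SIMD4<Float>(0, 2 * fy / height, 0, 0)
    let c2 = SIMD4<Float>(1 - 2 * cx / width,
                          2 * cy / height - 1,
                          -(far + near) / (far - near),
                          -1)
    let c3 = SIMD4<Float>(0, 0, -2 * far * near / (far - near), 0)
    return simd_float4x4(columns: (c0, c1, c2, c3))
}
```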
Given the projection matrix and pattern recognition on frames of the video stream, it is possible to calculate the camera's position in real-world coordinates.
This in turn lets us calculate where an object would appear to the camera if it were placed in world space.
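Once the projection matrix and the camera pose are known, that last step is just a couple of matrix multiplications. Another sketch, reusing the projection matrix above; the view matrix here is assumed to be the inverse of the estimated camera transform, and the names are again my own:

```swift
import simd
import CoreGraphics

/// Project a world-space point to 2D screen coordinates, given the
/// estimated camera pose (view matrix) and the projection matrix.
func projectToScreen(worldPoint: SIMD3<Float>,
                     view: simd_float4x4,
                     projection: simd_float4x4,
                     width: Float, height: Float) -> CGPoint? {
    let p = SIMD4<Float>(worldPoint.x, worldPoint.y, worldPoint.z, 1)
    let clip = projection * view * p
    guard clip.w > 0 else { return nil }   // point is behind the camera
    // Perspective divide to normalized device coordinates [-1, 1].
    let ndc = SIMD3<Float>(clip.x, clip.y, clip.z) / clip.w
    // Map NDC to pixels; flip y because screen space grows downward.
    let x = (ndc.x + 1) * 0.5 * width
    let y = (1 - ndc.y) * 0.5 * height
    return CGPoint(x: CGFloat(x), y: CGFloat(y))
}
```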
I'm currently testing this feature and fine-tuning the parameters in an iOS app I'm working on.
The Vesta planetoid, iPad stereo view (anaglyph) screen recording (STL model by NASA/JPL), as viewed in 3D Model View.
The model consists of 952,003 vertices and 1,894,998 faces.
I converted one of the NASA/JPL STL models for viewing in the 3D Model View app (I simply combined the two STL halves into one OBJ file).
In fact, the 3D Model View app can display the individual STL files; it just shows one at a time.
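The merge itself is straightforward. Here is a sketch of the idea for binary STL input (illustrative names, not the exact code I used; vertices are not deduplicated, which is good enough for a one-off conversion):

```swift
import Foundation

// Read one little-endian value out of a Data buffer
// (iOS devices are little-endian, matching the STL format).
private func read<T>(_ data: Data, at offset: Int, as type: T.Type) -> T {
    data.subdata(in: offset..<(offset + MemoryLayout<T>.size))
        .withUnsafeBytes { $0.load(as: T.self) }
}

// Parse a binary STL file: an 80-byte header, a UInt32 triangle count,
// then 50 bytes per triangle (normal, three vertices, attribute word).
func readBinarySTL(_ url: URL) throws -> [(Float, Float, Float)] {
    let data = try Data(contentsOf: url)
    let triangleCount = Int(read(data, at: 80, as: UInt32.self))
    var vertices: [(Float, Float, Float)] = []
    vertices.reserveCapacity(triangleCount * 3)
    for t in 0..<triangleCount {
        let triangle = 84 + t * 50
        for v in 0..<3 {
            let base = triangle + 12 + v * 12   // skip the 12-byte normal
            vertices.append((read(data, at: base, as: Float.self),
                             read(data, at: base + 4, as: Float.self),
                             read(data, at: base + 8, as: Float.self)))
        }
    }
    return vertices
}

// Concatenate the triangles of all input files into one OBJ.
func mergeSTLsToOBJ(_ inputs: [URL], to output: URL) throws {
    var lines: [String] = []
    for url in inputs {
        for (x, y, z) in try readBinarySTL(url) {
            lines.append("v \(x) \(y) \(z)")
        }
    }
    // Every consecutive vertex triple is one face; OBJ indices are 1-based.
    let faceCount = lines.count / 3
    for f in 0..<faceCount {
        lines.append("f \(3 * f + 1) \(3 * f + 2) \(3 * f + 3)")
    }
    try lines.joined(separator: "\n")
        .write(to: output, atomically: true, encoding: .utf8)
}
```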
More info at 3dmodelview.com and blog.codewerk.nl.