For a school project on Augmented Reality we created a concept to encourage communication at bus stops. You often have to wait around fifteen minutes for a bus, and when several people are waiting together the atmosphere tends to be awkward. We wanted to create a better atmosphere for conversation by drawing people into play.
When you walk into the bus stop, an abstract "animal" comes over to play with you; there are several animals, each with its own characteristics. Once an animal has drawn you into playing with it, it leads you toward someone else at the bus stop. The two people's animals then play together, and you have something to talk about with that person.
This demo was built with openFrameworks; it was my first oF project. If anyone wants the code, I can post it online somewhere.
We built an installation that let us project onto the floor with a regular projector (i.imgur.com/5MUI2l.jpg); the mirror setup also enlarged the projected surface. But because we used a wide-angle mirror, the image was distorted. We corrected this with ofCvCameraCalibration (openframeworks.cc/forum/viewtopic.php?p=17129#p17129). I rewrote the XML function in the demo app so it saves the calibration data to an XML file; our own app then reads that file on startup, so we don't have to recalibrate every time we launch it.
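To give an idea of the save/load step: the real app used the calibration demo's XML code, but the round trip can be sketched as plain C++ below. The file format and the function names `saveCalibration`/`loadCalibration` are my own illustration here, not the actual project code.

```cpp
#include <fstream>
#include <string>
#include <vector>

// Sketch: persist calibration coefficients to a tiny XML file so the
// app can skip the interactive calibration step on later startups.
void saveCalibration(const std::string& path, const std::vector<double>& coeffs) {
    std::ofstream out(path);
    out << "<calibration>\n";
    for (double c : coeffs) out << "  <coeff>" << c << "</coeff>\n";
    out << "</calibration>\n";
}

// Read the coefficients back by scanning for <coeff> elements.
std::vector<double> loadCalibration(const std::string& path) {
    std::vector<double> coeffs;
    std::ifstream in(path);
    std::string line;
    while (std::getline(in, line)) {
        auto open  = line.find("<coeff>");
        auto close = line.find("</coeff>");
        if (open != std::string::npos && close != std::string::npos) {
            open += 7; // skip past "<coeff>"
            coeffs.push_back(std::stod(line.substr(open, close - open)));
        }
    }
    return coeffs;
}
```

In the actual installation the coefficients come from the OpenCV calibration, so the file only needs to be written once, during setup.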
I then use warpIntoMe to crop the webcam input to just the projected area. The installation tracks blobs with persistent IDs, using the experimental OpenCV library found here (openframeworks.cc/forum/viewtopic.php?f=10&t=393). For every blob of the right size, a particle is created with the ofxRuiPhysics2d particle system; I rewrote its draw function to play a movie instead of just drawing a circle.
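The per-frame blob-to-particle logic can be sketched roughly like this. This is a standalone illustration, not the project's code: the `Blob`, `Particle`, and `updateParticles` names are mine, and in the real app the blobs come from the experimental OpenCV tracker and the particles from ofxRuiPhysics2d.

```cpp
#include <vector>

struct Blob     { int id; float area; float x, y; };  // tracker output, id is persistent
struct Particle { int blobId; float x, y; };          // one animal/particle per tracked person

// Keep only blobs whose area falls in a plausible person-sized range,
// then make sure each surviving blob ID owns exactly one particle.
std::vector<Particle> updateParticles(const std::vector<Blob>& blobs,
                                      std::vector<Particle> particles,
                                      float minArea, float maxArea) {
    for (const Blob& b : blobs) {
        if (b.area < minArea || b.area > maxArea) continue;  // wrong size: ignore
        bool exists = false;
        for (const Particle& p : particles) {
            if (p.blobId == b.id) { exists = true; break; }  // already spawned
        }
        if (!exists) particles.push_back({b.id, b.x, b.y});  // new person: spawn particle
    }
    return particles;
}
```

Because the tracker's IDs are persistent across frames, a particle survives as long as its person stays in the bus stop, which is what lets each animal stick with one visitor.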
That's basically it.