This is my Music Computing task 2 project, for Tim Blackwell at Goldsmiths, University of London.
The sketch uses OpenCV's blob detection to detect objects placed in front of the camera. For each detected object, the program computes its area and circumference. These two measurements then determine the pitch, velocity, and duration of the notes played by the synthesizer built into the program.
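The idea of deriving note parameters from blob measurements can be sketched as below. The ranges and the linear mapping are illustrative assumptions, not the exact values used in the sketch:

```java
// Hypothetical mapping from blob measurements to MIDI note parameters.
// The input ranges (area up to 20000 px, circumference up to 1200 px)
// and the output ranges are assumptions for illustration.
public class BlobToNote {
    // Linearly map a value from one range to another (like Processing's map()).
    static float map(float v, float inLo, float inHi, float outLo, float outHi) {
        return outLo + (outHi - outLo) * (v - inLo) / (inHi - inLo);
    }

    static float clamp(float v, float lo, float hi) {
        return Math.max(lo, Math.min(hi, v));
    }

    // Map blob area (pixels) to a MIDI pitch in [36, 84].
    static int pitchFromArea(float area) {
        return Math.round(clamp(map(area, 0, 20000, 36, 84), 36, 84));
    }

    // Map blob circumference (pixels) to a MIDI velocity in [40, 127].
    static int velocityFromCircumference(float circ) {
        return Math.round(clamp(map(circ, 0, 1200, 40, 127), 40, 127));
    }

    public static void main(String[] args) {
        // A mid-sized blob yields a mid-range note.
        System.out.println(pitchFromArea(10000));
        System.out.println(velocityFromCircumference(600));
    }
}
```

Larger objects produce higher notes under this mapping; clamping keeps out-of-range blobs from producing invalid MIDI values.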
Once the notes have been played, their pitch, velocity, and duration data is sent to the parametric equation inside the Geometrics class. The Geometrics class converts the data and outputs Lissajous curves, which can be seen in the top-left corner of the screen.
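A Lissajous curve is traced by a pair of sine functions with different frequencies and a phase offset. The sketch below shows the standard parametric form; the actual equation and parameter choices inside the Geometrics class may differ:

```java
// Standard Lissajous parametric form:
//   x = A * sin(a*t + delta),  y = B * sin(b*t)
// In the project, the note data would drive parameters such as the
// frequency ratio a:b or the phase delta (an assumption here).
public class Lissajous {
    static double[] point(double A, double B, double a, double b,
                          double delta, double t) {
        return new double[] { A * Math.sin(a * t + delta),
                              B * Math.sin(b * t) };
    }

    public static void main(String[] args) {
        // Trace a few points of a 3:2 curve with a 90-degree phase offset.
        for (int i = 0; i < 8; i++) {
            double t = i * Math.PI / 4;
            double[] p = point(100, 100, 3, 2, Math.PI / 2, t);
            System.out.printf("%.1f %.1f%n", p[0], p[1]);
        }
    }
}
```

The ratio a:b controls the number of lobes in the figure, and delta rotates/skews it, which is what makes these curves a natural visualization for pairs of pitched signals.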
The goal of the project was to explore how different objects sounded, and then how those sounds could be visualized as Lissajous curves.
I can switch between two different visualizations of the curves to give further options for visualization. I can also change the threshold applied to the camera's input image so that detection becomes selective and only the object is seen by the computer's vision.
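The thresholding step amounts to binarizing the grayscale camera image: pixels brighter than the threshold become white, everything else black, so only the object survives for blob detection. A minimal sketch of that idea (the project itself applies the threshold via OpenCV on the live camera frame):

```java
import java.util.Arrays;

// Binary threshold over an array of grayscale pixel values (0-255).
// This mirrors what a basic OpenCV binary threshold does per pixel.
public class Threshold {
    static int[] binarize(int[] gray, int threshold) {
        int[] out = new int[gray.length];
        for (int i = 0; i < gray.length; i++) {
            out[i] = gray[i] > threshold ? 255 : 0;
        }
        return out;
    }

    public static void main(String[] args) {
        int[] pixels = { 12, 80, 130, 200, 255 };
        // Raising or lowering the threshold makes detection more or
        // less selective, as described above.
        System.out.println(Arrays.toString(binarize(pixels, 100)));
    }
}
```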