This clip is variations on the same theme. Essentially what is happening here is that the quicktime movie of the fish in the background is being used as the displacement map on a plane being drawn in different modes: as points, as a line strip, as quads, and as polygons. The quicktime is also mapped on to the plane. Compose fx are used on the scene output, and the color difference you see between the 3L object and the background is due to the transfer mode, as well as the alpha choices for the object.
What is important to note here is that the DISTORTION SLIDE parameter is set to 30% in order to slow down the motion of the displacement map. This results in the "flowing" motion of the displacement of the plane as the fish are moving.
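3L's internals aren't public, but as a rough sketch, the displacement-plus-slide behavior could be modeled like this: the stored displacement map eases toward each new video frame, and the 30% DISTORTION SLIDE value sets how far it moves per tick. Lower values give the slower, "flowing" motion described above. All function and parameter names here are hypothetical.

```python
import numpy as np

def displace_plane(grid_z, frame_luma, prev_map, slide=0.30, amount=1.0):
    """Displace a plane's vertex heights using a video frame's luminance.

    slide: fraction of the remaining distance the stored displacement map
    moves toward the new frame each tick (a guess at what DISTORTION SLIDE
    controls). Lower values = slower, smoother "flowing" motion.
    """
    # ease the stored map toward the new frame's luminance
    new_map = prev_map + slide * (frame_luma - prev_map)
    # displaced vertex heights, plus the map to carry into the next tick
    return grid_z + amount * new_map, new_map
```

With slide at 1.0 the plane would snap to every new frame; at 0.30 each frame only pulls the map 30% of the remaining distance, which smears the fast fish motion into the flowing look.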
So what you are seeing here is not a 2d core image fx, but displacement and texturing applied to a 3d object, a plane, positioned above the background quicktime. So the displacement effect follows the motion of the fish.
Music is "Mute As a Fish" from the Austrian electronic artist Lichtenberg. These clips were created and rendered in realtime using the built in dvr in thrill.artificialeyes.tv at 320x240 pixels in the motion jpeg a codec at lossless settings. The uploaded clip is an mp4.
Playing around earlier this evening making the Flotsam response vimeo.com/1353820, i got interested in organic water movements in 3L, and this is the result. It uses the same process as that video, but instead maps a movie texture on to the planes both for texturing and to generate the displacement map that moves the spheres as if they were floating on water. This is a view from underwater, looking up at the surface.
The video contains 3 variations on the same look, with just small lighting and perspective changes as i was dialing it in.
The scene output is again mapped on to the spheres at 50% with 50% white, to get the organic texturing in the 2nd and 3rd parts of the video.
Additionally, i used the sinefold fx on the scene output in DIFFERENCE MODE to generate extra bubbly edges in the last third of the video.
The quicktime movie used as a texture and displacement map was ripples from the surface of a swimming pool. This gave me much smoother displacement than in the Flotsam Response video.
This clip was generated and recorded in realtime using the built in dvr in 3L at 320x240 resolution in the motion jpeg a codec at lossless settings. The clip that was uploaded is an mpeg4 at 2000 bitrate settings.
Audio is "Movement in A, Study 33" by Ashley Wales.
For more information about the realtime 3d generative vj tool 3L, go to thrill.artificialeyes.tv
I saw a video earlier today from vimeo.com/user365510 More Soon, called "Flotsam", which is a RealFlow simulation.
As usual, i started wondering about how i could duplicate this in 3L, but with audio reactivity. This is the first test result, which took about 15 minutes to set up and tweak the settings on before recording it in realtime using the built in dvr.
The original clip was generated and recorded in realtime in the motion jpeg a codec at 320x240 resolution, with lossless settings. The uploaded clip was an mpeg4 at 2000 bitrate.
First, i created a circle and a sphere, and then used the circle as a template with x/y axis data to map multiples of the sphere on with the MULTIPLE MODULE. Then, the circle is rotated on its X axis to put it in "surface perspective", to roughly match the perspective of the Flotsam clip. The circle is set with the top texture to black, and the bottom texture to SCENE POST, which means that the scene output is mapped on to a black opaque circle. Since the circle is at an angle in relation to the camera viewpoint, this gives a distorted view of the scene, producing the undulating "reflection" of the bobbing spheres, and the "ripples".
Then, to get the surface "undulating", i used the DISTORTION MODULE on the circle with a random distortion value on the low frequency with slider automation. The distortion amount slider is set to audio automation on the mid frequency.
So what ends up happening here is that the audio frequencies are driving the distortion of the circle plane. The circle plane is serving two purposes:
1. it is the path data for the multiple object module. So when the surface distorts, it moves the position of each sphere if there is distortion happening at the x/y axis of each sphere... making them "bob".
2. the circle is also serving as a surface to map the scene view back on to, creating the ripples and reflections. So if the surface is distorting, the texture is distorting with it.
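The two roles of the circle plane can be sketched in a few lines: positions sampled around a circle serve as the path data for the sphere copies, and an audio-driven distortion amount, applied to a surface height field, bobs each sphere at its own x/y. The function names, and the idea of `surface_height` as a sampled field, are hypothetical stand-ins for 3L's MULTIPLE and DISTORTION modules.

```python
import math

def circle_points(n, radius=1.0):
    """n x/y positions around a circle: the path data the MULTIPLE MODULE
    would read to place the sphere copies (names are hypothetical)."""
    step = 2 * math.pi / n
    return [(radius * math.cos(i * step), radius * math.sin(i * step))
            for i in range(n)]

def bob_spheres(points, amount, surface_height):
    """Offset each sphere vertically by the surface distortion sampled at
    its x/y. `amount` stands in for the audio-mid-driven distortion slider,
    `surface_height(x, y)` for the low-frequency random distortion field."""
    return [(x, y, amount * surface_height(x, y)) for (x, y) in points]
```

When the audio mid band is quiet, `amount` sits near zero and the spheres rest flat; as it rises, each sphere lifts by the local surface height, which is the "bobbing" described in point 1.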
The alpha settings for the circle are S alpha and D color. This allows the spheres to show through the surface of the texture, as if you are seeing the bottoms of the spheres under the surface of the water.
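"S alpha and D color" reads like a pair of GL-style blend factors (source scaled by source alpha, destination scaled by destination color), which would explain the see-through surface. That mapping is an assumption about 3L's naming, not a documented formula; a minimal sketch under that assumption:

```python
def blend_s_alpha_d_color(src_rgb, src_a, dst_rgb):
    """One possible reading of the 'S alpha / D color' setting, as GL-style
    blend factors: out = src * src_alpha + dst * dst_color (componentwise).
    Assumed interpretation; 3L's actual blend equation isn't documented here."""
    return tuple(s * src_a + d * d for s, d in zip(src_rgb, dst_rgb))
```

With a low source alpha on the circle's texture, the destination (the spheres already drawn behind it) keeps contributing to the final color, so their bottoms remain visible "under the water".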
I then made a third object, a plane, and rotated it on its x axis to match the rotation position of the circle plane, but placed it BELOW the circle plane with the object position y slider. So that puts the plane below the surface of the "water". I then mapped the scene output to this plane as well, reduced the alpha value to 25%, and added some blue ambient color using the ambient slider for the plane in the object texture module. The alpha settings for the plane are S alpha and D color, black on the top texture, scene output on the bottom texture. This is the very subtle larger blue ripples you see mostly in the upper part of the screen at certain times in the video. This plane is also using the DISTORTION MODULE, with the SCENE OUTPUT as the distortion map, and the audio mid frequency driving the distortion amount. This gives a layered effect to the "ripples".
The sphere is white with default alpha settings.
The highlighting on the scene is done by just one light, positioned above the surface of the "water" and behind the viewer's head, with a combination of 20% diffuse, 10% ambient and 40% specular lighting. Because of the different alpha settings and different textures on each object, they each react differently to the light to give the overall scene its appearance. Minor tweaks or additional lights could change the scene appearance dramatically.
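Those three percentages combine like the terms of a classic Phong light. As a sketch using the weights quoted above, where the shininess exponent is an assumption (3L's actual lighting model isn't documented here):

```python
def light_intensity(n_dot_l, r_dot_v, shininess=16.0,
                    ambient=0.10, diffuse=0.20, specular=0.40):
    """Phong-style intensity for one light, with the weights from the text
    (10% ambient, 20% diffuse, 40% specular). n_dot_l is the surface normal
    dotted with the light direction, r_dot_v the reflection dotted with the
    view direction; shininess is a hypothetical value."""
    d = max(n_dot_l, 0.0)                 # diffuse falls off with angle to the light
    s = max(r_dot_v, 0.0) ** shininess    # specular spikes near mirror reflection
    return ambient + diffuse * d + specular * s
```

The low ambient floor keeps unlit areas dark while the large specular weight gives the bright highlights on the sphere tops; since each object's texture and alpha modulate this differently, the same single light reads differently per object.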
more information on 3L, the realtime generative 3d vj tool at thrill.artificialeyes.tv
These videos were all created using the same 3d model of a head, but with different lighting, alpha channel, and texture mapping settings. Viewed together as a group, they show a SMALL range of the huge variation that can be achieved with 3L by changing just a few variables of a narrow range of parameters.
All of the examples were performed and recorded live using the built-in dvr of 3L, with audio analysis driving the different parameters, and a midi controller for camera motion control. These are all unedited "live jams".