1. anatoleya.co.uk

    I really like infrared black and white photography. For those unfamiliar with it - infrared light reflects differently from visible light, which means that [with your cam in b/w mode] green becomes white and blue becomes black/dark grey. So, if the weather's right [bright sunshine and clear blue skies] you get white grass and trees with dark water and dark skies.

    I was pleased to discover my Olympus EPL1 is sensitive to infrared light, so I made this video out of a sequence of photographs using the continuous shot mode and wheeling the camera around on my bike.

    METHOD/EQUIPMENT: I used an Olympus EPL1 camera with a Panasonic Lumix pancake lens and an infrared filter, attached it to my bike and wheeled it around in continuous shot mode. The EPL1 doesn't take a release cable, so I used a makeshift device that physically holds the shutter button down in continuous shooting mode. Here's a photo: flic.kr/p/9F5pJ7

    CAMERA SETTINGS: As the EPL1 isn't as sensitive to infrared light as I would've liked, I had to increase the ISO to 1600 for all the motion shots, with the shutter slowed down as much as possible without blurring the images. I found 1/50th and 1/60th sec to be OK. For the stationary scenes I lowered the ISO to between 400 and 800 with the shutter between 1/10th and 1/30th sec. I kept the aperture at f/1.7.
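    For reference, at a fixed aperture total exposure scales with ISO × shutter time, so the trade-off between the motion and stationary settings can be sanity-checked in a few lines (a minimal sketch; the comparison values are illustrative, not from the shoot):

```python
import math

def exposure_stops(iso_a, shutter_a, iso_b, shutter_b):
    """Relative exposure of setting A vs setting B, in stops,
    at the same aperture. Positive means A gathers more light."""
    return math.log2((iso_a * shutter_a) / (iso_b * shutter_b))

# ISO 1600 vs ISO 400 at the same 1/50 s shutter: two stops brighter.
print(exposure_stops(1600, 1/50, 400, 1/50))    # 2.0

# ISO 400 needs roughly 1/12.5 s to match ISO 1600 at 1/50 s,
# which the 1/10th-1/30th sec stationary range brackets.
print(exposure_stops(1600, 1/50, 400, 1/12.5))  # 0.0
```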

    SOFTWARE: I turned the image sequence into a video using Avidemux [a free Linux program], then I edited the video using OpenShot Video Editor [another free Linux program].
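    Avidemux is a GUI application; as a command-line sketch of the same sequence-to-video step, ffmpeg can assemble numbered stills into a video (ffmpeg is an assumed alternative, not the tool used here, and the filename pattern and frame rate below are hypothetical):

```python
# Builds (but does not run) an ffmpeg command line for turning a numbered
# image sequence into an H.264 video. Pattern and frame rate are
# hypothetical -- adjust them to your own sequence.
frame_rate = 24
pattern = "P%07d.JPG"               # hypothetical camera filename pattern
cmd = [
    "ffmpeg",
    "-framerate", str(frame_rate),  # playback rate of the stills
    "-i", pattern,                  # numbered input frames
    "-c:v", "libx264",              # H.264 encoding
    "-pix_fmt", "yuv420p",          # widely compatible pixel format
    "timelapse.mp4",
]
print(" ".join(cmd))
```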

    LOCATIONS/ROUTE: Video begins at Primrose Hill, then around Regents Park, then to the Artemis statue in Hyde Park using a fisheye lens, then a stationary scene in Regents Park, another in Hyde Park overlooking the Serpentine, then finishing with a view of London from the Greenwich Observatory.

    I know the infrared effect isn't as strong as I would've liked, but there was a very thin layer of cloud in the sky. I waited and waited for the perfect weather... but it never came... so I made the video anyway.

    MUSIC: I made a piano piece to go with the video - plugged my electric piano into my computer and recorded myself. I added a delay bit at the end for more resonance.

    On SoundCloud:

    # vimeo.com/23055792 Uploaded 11.6K Plays 13 Comments
  2. 'Aquarius Dreams'
    Official music video

    Music by Cornelia

    Video by Martyn Thomas

    Martyn used painstaking stop motion techniques with each frame of film sketched by hand, then individually photographed against a backlight.

    ★ Follow: twitter.com/iamcornelia

    ★ iTunes: bit.ly/nvmxO4

    Aquarius Dreams is out now on Camp Mozart with remixes by Scratcha DVA, Kid Specific and Will Ward from Circle Traps.

    "Cornelia is full of surprises. One of the most inventive and forward looking vocalists out there. Aquarius Dreams is perfectly pitched leftfield pop, love the DVA mix too. Can't wait for the album!"
    Jamie Woon


    # vimeo.com/26348317 Uploaded 694 Plays 2 Comments
  3. Project by Daniel Franke & Cedric Kiefer

    produced by:


    Music: Machinefabriek "Kreukeltape"

    Text: Sandra Moskova

    The basic idea of the project is to create a moving sculpture from the
    recorded motion data of a real person. For our work we asked a dancer to
    interpret a musical piece (Kreukeltape by Machinefabriek) as closely as
    possible through the movements of her body. She was recorded by three
    depth cameras (Kinects), and the intersection of their images was later
    combined into a three-dimensional volume (a 3D point cloud), so we were
    able to use the collected data throughout the rest of the process. The
    three-dimensional image allowed us completely free handling of the
    digital camera, without limitations of perspective. The camera also
    reacts to the sound, supporting the performer's physical interpretation
    of the musical piece. She moves through a noise field, where a simple
    change of the random seed can consistently create new versions of the
    video, each offering a different composition of the recorded
    performance. The multi-dimensionality of the sound sculpture is already
    contained in every movement of the dancer, as the camera footage allows
    any imaginable perspective.
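    The depth-image-to-point-cloud step can be sketched as a standard pinhole back-projection (a minimal sketch, assuming illustrative Kinect-like intrinsics; merging the three cameras would additionally require their relative poses, which are not covered here):

```python
import numpy as np

def depth_to_point_cloud(depth, fx, fy, cx, cy):
    """Back-project a depth image (metres) into a 3D point cloud.

    Each pixel (u, v) with depth z maps to
    x = (u - cx) * z / fx,  y = (v - cy) * z / fy.
    """
    h, w = depth.shape
    u, v = np.meshgrid(np.arange(w), np.arange(h))
    x = (u - cx) * depth / fx
    y = (v - cy) * depth / fy
    points = np.stack([x, y, depth], axis=-1).reshape(-1, 3)
    return points[points[:, 2] > 0]  # drop pixels with no depth reading

# Illustrative Kinect-like resolution and intrinsics (hypothetical values).
depth = np.full((480, 640), 2.0)  # a flat wall two metres away
cloud = depth_to_point_cloud(depth, fx=580.0, fy=580.0, cx=320.0, cy=240.0)
print(cloud.shape)  # (307200, 3)
```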

    The body – constant and indefinite at the same time – “bursts” the space
    with its mere physicality, creating a first distinction between the self
    and its environment. Only the body's movements create a reference to the
    otherwise invisible space, much as the dots bounce on the ground to give
    it a physical dimension. Thus, the sound-dance constellation in the
    video does not merely simulate a purely virtual space. The complex
    dynamics of the body's movements are also strongly self-referential.
    With the complex, quasi-static, inconsistent forms the body “paints”, a
    new reality space emerges whose simulated aesthetic goes far beyond
    numerical codes.

    As in painting, a single point still appears very abstract, but the more
    points are connected to each other, the more complex and concrete the
    image seems. The more perfect and complex the “alternative worlds” we
    project (Vilém Flusser), and the closer together their point elements,
    the more tangible they become. A digital body consisting of 22,000
    points thus seems so real that it comes to life again.

    nominated for the MuVi Award:

    see the video in full quality:

    HQ Stills

    # vimeo.com/38840688 Uploaded 690K Plays 199 Comments
  4. 'Noise Ink' is a pseudo-2D fluid/ink animation system which responds to body movement via a Kinect camera. A combination of optical flow and Perlin noise controls the visuals.
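    As a rough illustration of the noise half of that combination, the sketch below uses 2D value noise (a simpler stand-in for true Perlin gradient noise, not Noise Ink's actual code) to steer a particle through a smoothly varying vector field; the scales and step size are hypothetical:

```python
import numpy as np

rng = np.random.default_rng(7)
lattice = rng.random((16, 16))   # random values at integer lattice points

def fade(t):
    return t * t * (3 - 2 * t)   # smoothstep, so the field varies smoothly

def value_noise(x, y):
    """2D value noise in [0, 1]: bilinear interpolation of lattice values
    with a smoothstep fade (a stand-in for Perlin gradient noise)."""
    xi, yi = int(np.floor(x)), int(np.floor(y))
    tx, ty = fade(x - xi), fade(y - yi)
    x0, y0, x1, y1 = xi % 16, yi % 16, (xi + 1) % 16, (yi + 1) % 16
    top = lattice[y0, x0] * (1 - tx) + lattice[y0, x1] * tx
    bot = lattice[y1, x0] * (1 - tx) + lattice[y1, x1] * tx
    return top * (1 - ty) + bot * ty

# Steer a particle: the noise value picks a heading, the way ink streaks
# are bent in noise-driven systems (hypothetical scale and step size).
pos = np.array([1.0, 1.0])
for _ in range(100):
    angle = value_noise(pos[0] * 0.1, pos[1] * 0.1) * 2 * np.pi
    pos += 0.05 * np.array([np.cos(angle), np.sin(angle)])
```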

    It was originally created for the 2011 Auckland Arts Festival: vimeo.com/21620801

    Project page with more information: behance.net/gallery/Noise-Ink-body-reactive-installation/1244325

    Processing source code and binaries available on my GitHub: github.com/trentbrooks/Noise-Ink

    1) Download the source code or binaries.
    2) If you're downloading the source, you will also need to download the OpenKinect libraries: github.com/shiffman/libfreenect/tree/master/wrappers/java/processing. Copy the 'wrappers/java/distribution/openkinect' folder into your Processing libraries.
    3) Connect your Kinect camera via USB.
    4) Launch the NoiseInk app or run from Processing. The Kinect depth image should appear.
    5) Adjust the minimum and maximum depths with the keys 'a,z,s,x' until you get a clear white silhouette of your body with everything else black.
    6) That's it - press 'space' to begin. Pressing 'space' will toggle the Kinect menu on or off.
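    Step 5's depth windowing amounts to a per-pixel threshold: keep pixels whose depth falls between the minimum and maximum, discard the rest. A minimal sketch of the same idea (hypothetical millimetre values, not Noise Ink's actual code):

```python
import numpy as np

def silhouette(depth_mm, near, far):
    """White where depth falls inside the [near, far] window, black
    elsewhere. Zero means 'no reading' on Kinect-style sensors and is
    always excluded."""
    inside = (depth_mm >= near) & (depth_mm <= far) & (depth_mm > 0)
    return np.where(inside, 255, 0).astype(np.uint8)

depth = np.array([[0, 900, 1500],
                  [2500, 1200, 800]])  # hypothetical mm readings
print(silhouette(depth, near=1000, far=2000))
# rows: [0, 0, 255] and [0, 255, 0]
```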

    - Daniel Shiffman for the openkinect libraries (github.com/shiffman/libfreenect/tree/master/wrappers/java/processing) and nature of code tutorials ( shiffman.net/teaching/nature/).
    - Generative Gestaltung (generative-gestaltung.de/) for perlin noise articles.
    - Patricio Gonzalez Vivo ( patriciogonzalezvivo.com ) & Hidetoshi Shimodaira (shimo@is.titech.ac.jp) for Optical Flow example (openprocessing.org/visuals/?visualID=10435).
    - Memotv (memo.tv/msafluid_for_processing) for inspiration.

    # vimeo.com/21691884 Uploaded 9,786 Plays 10 Comments

pb rupturas

brenda fucuta
