1. A short introduction to the themes and processes used to create 'Fake Fish Distribution', an album in 1000 variations by Icarus (Ollie Bown and Sam Britton).

    # vimeo.com/38404732
  2. Karmalize Productions 2011
    Shot & Edited by Alex Gaylon
    Assistant Director - Caroline Creaghead
    Music - Ryan Dann
    Special Thanks to Scott Draves, Founder of the Electric Sheep
    created for The Filmshop Presents...Eye Candy

    Electric Sheep is a collaborative abstract artwork founded by Scott Draves. It's run by thousands of people all over the world, and can be installed on any ordinary PC or Mac. When these computers "sleep", the Electric Sheep comes on and the computers communicate with each other by the internet to share the work of creating morphing abstract animations known as "sheep".

    Anyone watching one of these computers may vote for their favorite animations using the keyboard. The more popular sheep live longer and reproduce according to a genetic algorithm with mutation and cross-over. Hence the flock evolves to please its global audience. You can also design your own sheep and submit them to the gene pool.

    The result is a collective "android dream", blending man and machine to create an artificial lifeform.
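
    The vote-driven evolution described above can be sketched as a toy genetic algorithm. Everything here - the genome representation, the selection scheme, the parameters - is a hypothetical illustration of the general technique, not the actual Electric Sheep implementation:

```python
import random

# Toy sketch of vote-driven evolution. A "genome" is a list of floats
# standing in for a sheep's rendering parameters; fitness is the number
# of audience votes it received. All names and values are hypothetical.

def evolve(population, votes, mutation_rate=0.1):
    """Produce the next generation from vote counts (fitness-proportional)."""
    def pick():
        # Roulette-wheel selection: more votes, more offspring.
        total = sum(votes)
        r = random.uniform(0, total)
        acc = 0.0
        for genome, v in zip(population, votes):
            acc += v
            if acc >= r:
                return genome
        return population[-1]

    next_gen = []
    for _ in range(len(population)):
        a, b = pick(), pick()
        # Single-point cross-over between two popular parents.
        cut = random.randrange(1, len(a))
        child = a[:cut] + b[cut:]
        # Occasional mutation perturbs one gene.
        if random.random() < mutation_rate:
            i = random.randrange(len(child))
            child[i] += random.gauss(0, 0.5)
        next_gen.append(child)
    return next_gen

population = [[random.random() for _ in range(8)] for _ in range(10)]
votes = [random.randint(0, 20) for _ in population]
population = evolve(population, votes)
```

    Unpopular sheep simply fail to be selected as parents, so the flock drifts toward what its audience votes for.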

    # vimeo.com/27688359
  3. IMC Colloquium Series "Modeling Electronic Dance Music: Intelligent Generation using Human Transcription and Analysis"

    Arne Eigenfeldt

    School for the Contemporary Arts

    Date: Mar 09, 2012


    Artist Philip Galanter proposes that generative art "refers to any art practice where the artist uses a system, such as a set of natural language rules, a computer program, a machine, or other procedural invention, which is set into motion with some degree of autonomy contributing to or resulting in a completed work of art." Placed within the paradigm of electroacoustic music, generative music is, therefore, music that uses a computer to autonomously generate the resulting music, electroacoustic means - such as a synthesizer or sampler - to realize its results, or both. One important aspect of generative music is that it is based within composition, rather than improvisation; in fact, I have proposed the concept of real-time composition to further discriminate its compositional basis from the improvisational elements of live electroacoustic music.

    When creating a generative system, rules are required to limit the possible choices; in most cases, these rules are used to generate new compositions in the style of the composer. One difficulty with generative systems is validating the success of the system - in other words, determining whether the system has interpreted the rules correctly, or whether the rules themselves accurately model the desired style. In such a system, it is really only the creator of the system who can make this judgement: listeners can reject the musical result, but the system's creator can argue that they are making aesthetic judgements of the music, rather than of the system. However, if the aim of the system is to create music consistent within a given genre, it is possible to judge the success of the system - both artistically and practically - by the relationship of its output to the original corpus.

    The Generative Electronica Research Project, part of ongoing research into musical metacreation - the potential of endowing machines with creative behavior - is pursuing the potential of creating software that generates electronic dance music in specific styles. We have selected 100 complete musical examples in the genres of Breakbeat, House, Drum & Bass, and Dubstep, and are using a combination of machine and human analysis of these works to derive rulesets, which, in turn, are used to generate new music consistent within the genres. Unlike the work of David Cope, who used a set corpus of existing music by composers such as Bach, Mozart, Beethoven, and Joplin to create new compositions through recombinance - stitching together music from given examples - we are using generative methods - including probabilistic methods and genetic algorithms - to create new music.
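
    One of the probabilistic methods mentioned above can be illustrated with a first-order Markov chain trained on a symbolic corpus. The corpus, the symbols, and the model here are hypothetical stand-ins for illustration, not GERP's actual transcriptions or rulesets:

```python
import random
from collections import defaultdict

# A hypothetical corpus of drum-pattern symbols: kick, hi-hat, snare.
corpus = ["K", "H", "S", "H", "K", "K", "H", "S", "H", "K"]

# Count each transition observed in the corpus.
transitions = defaultdict(lambda: defaultdict(int))
for a, b in zip(corpus, corpus[1:]):
    transitions[a][b] += 1

def generate(start, length):
    """Walk the chain, sampling each next symbol in proportion to its
    observed transition count - new material in the style of the corpus."""
    out = [start]
    for _ in range(length - 1):
        nxt = transitions[out[-1]]
        symbols, weights = zip(*nxt.items())
        out.append(random.choices(symbols, weights=weights)[0])
    return out

pattern = generate("K", 16)
```

    The generated pattern is statistically consistent with the corpus without being a copy of it, which is the sense in which such output can be judged against the original material.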

    This presentation will discuss how our methods differ from other generative music systems, and other music information retrieval (MIR) programs, and present musical examples of our ongoing research.


    Arne Eigenfeldt is a composer of live electroacoustic music, and a researcher into intelligent real-time music systems. His music has been performed around the world, and his collaborations range from Persian Tar masters to contemporary dance companies to musical robots. His research has been presented at conferences such as ICMC, NIME, SEAMUS, ISMIR, EMS, and SMC. He is an associate professor of music and technology at Simon Fraser University, Canada, and is the co-director of the MetaCreation research group (metacreation.net), which aims to endow computers with creative behaviour.

    # vimeo.com/39426292
  4. A Semiconductor work by Ruth Jarman and Joe Gerhardt.

    Audio Data courtesy of CARISMA, operated by the University of Alberta, funded by the Canadian Space Agency. Special Thanks to Andy Kale.

    Made for the exhibition Invisible Fields at Arts Santa Monica in Barcelona, Spain.

    20 Hz observes a geo-magnetic storm occurring in the Earth's upper atmosphere. Working with data collected from the CARISMA radio array and interpreted as audio, we hear tweeting and rumbles caused by incoming solar wind, captured at the frequency of 20 Hertz. Generated directly by the sound, tangible and sculptural forms emerge suggestive of scientific visualisations. As different frequencies interact both visually and aurally, complex patterns emerge to create interference phenomena that probe the limits of our perception.

    05.00 minutes. / HD / 2011
    HD single channel and HD 3D single channel.
    20 Hz is co-commissioned by Arts Santa Monica + Lighthouse. Supported by the British Council.


    # vimeo.com/30668685
  5. Documentation of the interactive ecosystem "Eden" by Jon McCormack.
    Eden is an interactive, self-generating, artificial ecosystem. A cellular world is populated by collections of evolving virtual creatures. Creatures move about the environment, making and listening to sounds, foraging for food, encountering predators and possibly mating with each other. Over time, creatures evolve to fit their landscape.

    Eden has four seasons per year, and each year lasts 600 eden days; one eden year passes in about fifteen minutes of real time.

    Sensors in the space track the movement of people. The longer someone stays, the more food is produced for the virtual creatures around the area where that person is standing. If the work produces sounds that are interesting, people will most likely stay longer. So over time, the creatures learn that by making interesting sequences of sound, they can attract and keep visitors around, increasing their food supply and, hence, their chances of survival. All the sound you hear is generated by the artwork.
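
    The feedback loop described above - interesting sounds keep visitors nearby, and nearby visitors produce food - can be sketched as a toy simulation. All names, scales, and the "interest" measure are hypothetical; Eden's actual model is far richer:

```python
import random

# Toy sketch of Eden's visitor-food feedback loop. Creatures whose sounds
# keep a visitor nearby receive extra food; the rest slowly starve.
# Every name and number here is a hypothetical stand-in.

class Creature:
    def __init__(self):
        self.energy = 10.0
        self.interest = random.random()  # how engaging its sounds are

def simulate_day(creatures, visitor_present):
    for c in creatures:
        # A lingering visitor causes extra food to appear near interesting
        # sound-makers, rewarding engaging behaviour.
        food = 1.0 + (2.0 * c.interest if visitor_present else 0.0)
        c.energy += food - 1.5  # daily metabolic cost
    # Creatures that run out of energy die; the rest survive to evolve.
    return [c for c in creatures if c.energy > 0]

flock = [Creature() for _ in range(20)]
for day in range(600):  # one eden year
    flock = simulate_day(flock, visitor_present=True)
```

    Over many eden years, selection on this food supply is what drives the creatures toward sound-making that holds an audience.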

    For more information see: jonmccormack.info/~jonmc/sa/artworks/eden/

    # vimeo.com/11032248


alice ecila

Inspiring works, research and practice in generative creativity (making systems which make things)
