Performed at: International Conference of Live Coding 2017, CMMAS, Morelia, Mexico, December 2017.
CYOF analyses the performer's previous and current live coding performances to predict, in realtime, the future of the ongoing performance, and to determine the improvisational originality of the current performance. Music Information Retrieval and text analysis techniques are used to analyse an archive of code and audio files from the performer's previous performances.
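The piece does not specify how future code is predicted from the archive. One minimal, hypothetical sketch of such a predictor is an n-gram model over tokens of archived code, from which the most likely and least likely continuations of the code currently being typed can be read off (the `order` parameter and whitespace tokenisation are illustrative assumptions, not the piece's actual method):

```python
from collections import Counter, defaultdict

def train_ngram(code_archive, order=2):
    """Count which token follows each n-gram across the archived code files.

    code_archive: list of strings, each the text of one archived code file.
    Tokenisation here is plain whitespace splitting -- an assumption, not
    necessarily what CYOF does.
    """
    model = defaultdict(Counter)
    for source in code_archive:
        tokens = source.split()
        for i in range(len(tokens) - order):
            key = tuple(tokens[i:i + order])
            model[key][tokens[i + order]] += 1
    return model

def predict(model, context, order=2):
    """Return the (most likely, least likely) next token after `context`,
    or (None, None) if the context was never seen in the archive."""
    key = tuple(context.split()[-order:])
    counts = model.get(key)
    if not counts:
        return None, None
    ranked = counts.most_common()
    return ranked[0][0], ranked[-1][0]

# Toy archive of SuperCollider-style one-liners (illustrative only).
archive = [
    r"Ndef(\a, { SinOsc.ar(440) }).play",
    r"Ndef(\a, { Saw.ar(110) }).play",
]
model = train_ngram(archive)
likely, unlikely = predict(model, r"Ndef(\a, {")
```

In the performance, the `likely` and `unlikely` continuations would correspond to the orange and blue future-code displays described below.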
The live coding performance (using SuperCollider's JITLib to programme synthesis live) is augmented by a visualisation showing a representation of the past, present and potential future of the current performance. This aims to gently encourage the performer towards more innovative improvisation by using archive material to determine the originality of the current performance relative to the performer's own archive. Realtime audio feature data relating to the past, present and potential future is mapped to greyscale, with features on the y axis and time on the x axis. The visualisation also shows the performer's current code as it is being typed, alongside the most likely (in orange) and least likely (in blue) possible future code.
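The feature-to-greyscale mapping can be sketched as follows: each audio feature becomes one image row (the y axis), each analysis frame one column (the x axis), and each feature is normalised independently to the 0-255 greyscale range. This is a minimal illustration of the mapping described above, not CYOF's actual implementation; the normalisation scheme is an assumption:

```python
def to_greyscale(frames):
    """Map per-frame feature vectors to greyscale pixel rows.

    frames: list of frames over time, each a list of feature values
            (e.g. [rms, spectral_centroid]).
    Returns rows[feature][time], each value an int in 0..255, with each
    feature min-max normalised independently (an assumed scheme).
    """
    n_features = len(frames[0])
    rows = []
    for f in range(n_features):              # one image row per feature (y axis)
        values = [frame[f] for frame in frames]
        lo, hi = min(values), max(values)
        span = (hi - lo) or 1.0              # avoid division by zero for flat features
        rows.append([round(255 * (v - lo) / span) for v in values])
    return rows

# Three frames of two hypothetical features: [rms, spectral centroid].
frames = [[0.1, 440.0], [0.5, 880.0], [0.3, 660.0]]
grid = to_greyscale(frames)
```

Concatenating such grids for the archive (past), the live analysis (present) and the predicted continuation (potential future) along the time axis would yield the single greyscale strip described above.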
The piece is part of an ongoing project to develop visualisations that show augmented information relating to the live coding performance.