A ‘Real Time Image Conductor’ Or A Kind Of Cinema? : Towards Live Visual Effects
In this article I describe a new project that investigated methods for incorporating filmic visual effects (VFX) into new artworks and performed environments. VFX are the computer-generated processes used in the film industry for building, manipulating and compositing photo-realistic live-action and animated elements. VFX artists create dramatic, believable sequences that would be impossible to achieve using in-camera techniques.

Traditionally, moving-image visuals in a performative / gallery / club context have been experienced primarily as playback media, in which material is fixed in time and plays from beginning to end. Real-time visuals, on the other hand, require the intervention of a performer or user to ‘cut up’ or re-order images live. In the case of the VJ or live filmmaker, he or she chooses video clips in real time, selects effects and determines how images and effects are composited. Since 2005 a number of (traditional) filmmakers have moved away from structural narrative cinema and towards ‘live cinema’: remixing their films for audiences as a live performed experience. This raises interesting possibilities for extending the genre within a performative, art-based approach.

The ‘live cinema’ experience has generally been limited to pre-shot or captured visuals, which are processed or remixed. As yet, few filmmakers or VJs have attempted to incorporate ‘live’ visual effects as part of this cinematic experience. It is the tension between remixing (cutting up) images and creating images that include live visual effects that I identify here as a key area for debate. Using the early live cinema works of Peter Greenaway and Mike Figgis as a starting point, I investigate how ‘live’ this cinema really is, or could ever hope to be.