Fish-Bird: Background

Fish-Bird Circle B – Movement C is an interactive autokinetic artwork that investigates the dialogical possibilities between two robots, in the form of wheelchairs, that communicate with each other and with their audience through the modalities of movement and written text. Assisted by miniature thermal printers, the chairs write intimate letters on the floor, impersonating two characters (Fish and Bird) who fall in love but cannot be together due to ‘technical difficulties’. Spectators entering the installation space disturb the intimacy of the two objects, yet create a strong potential (or need) for other dialogues to exist. The visitor can see the traces of previous conversations on the floor, and may become aware of the disturbance that s/he has caused.

 

Fish-Bird installed at Artspace, Sydney, 2006.

Developing at Artspace, Sydney, in 2004.

Dialogue occurs kinetically through the wheelchairs’ ‘perception’ of the body language of the audience, and as the audience reacts to the ‘body language’ of the wheelchairs. A common initial reaction of Fish and Bird to an unexpected disturbance is to correspond on trivial subjects, such as the weather… As the dialogue develops, the wheelchairs may become more ‘comfortable’ with their observers, and start to reveal intimacies on the floor again.

The research project and its associated media art installation are the result of an extensive collaboration between a new media artist and robotics researchers, and confront continuing issues and concerns regarding dialogue between humans and machines. The collaboration was initiated by the media artist Mari Velonaki, who sought to realise an ambitious project, Fish-Bird, originally envisaged eight years earlier. At the time of its conception, the project could not be realised due to many ‘technical difficulties’! In seeking to realise the artwork, Velonaki initiated dialogue with the Australian Centre for Field Robotics (ACFR), the second largest field robotics research group in the world. From these discussions a group of interested researchers was formed: Drs Rye, Scheding and Williams form the scientific core of the Fish-Bird research group.

Debugging hardware, Artspace 2004.

 

First prototype of Fish-Bird installed at Ars Electronica, Linz, 2004.

Work towards the realisation of Fish-Bird began shortly after this. A spiral development model was used throughout the three-year project, whereby experimentation with and testing of early prototypes informed research and development in later stages. The physical wheelchairs were constructed in 2004 and operated with a relatively primitive motion control system driven by ‘blob tracking’ using two laser scanners. At this stage the wheelchair behaviour was purely reactive, and text was selected through the same reactive system. The first prototype of Fish-Bird was demonstrated at Ars Electronica in 2004 as part of the Unnatural Selection – Australian Media Art exhibition.
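The purely reactive control of the first prototype can be illustrated with a minimal sketch: laser ‘blob tracking’ yields visitor positions, and the chair responds directly to them with no memory or planning. All names and thresholds below are illustrative assumptions, not the project’s actual code.

```python
import math

def blob_centroid(points):
    """Centroid of detected blob points (visitor positions), in metres."""
    n = len(points)
    return (sum(p[0] for p in points) / n, sum(p[1] for p in points) / n)

def reactive_command(chair_pos, blob_points, comfort_radius=1.5):
    """Return a (vx, vy) velocity: retreat if visitors come too close,
    otherwise stay still. Purely reactive: no memory, no planning."""
    if not blob_points:
        return (0.0, 0.0)
    cx, cy = blob_centroid(blob_points)
    dx, dy = chair_pos[0] - cx, chair_pos[1] - cy
    dist = math.hypot(dx, dy)
    if dist >= comfort_radius or dist == 0.0:
        return (0.0, 0.0)
    # Move directly away from the blob, faster the closer it comes.
    speed = 0.3 * (comfort_radius - dist)
    return (speed * dx / dist, speed * dy / dist)
```

A visitor standing one metre away in front of the chair, for example, produces a small retreating velocity; a visitor outside the comfort radius produces none.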

The second stage of the project added detailed motion tracking using overhead cameras, with Kalman filters fusing the information from the cameras and laser scanners to provide optimal estimates of the movement of the wheelchairs and participants. A scripting language was devised to give each robot complex, individualised movement, the capability to perform detailed ‘choreographed’ sequences, and the ability to select text from a wide variety of categories.
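The sensor-fusion step can be sketched with a minimal one-dimensional Kalman filter that fuses a camera and a laser measurement of a wheelchair’s position. The noise variances and motion model are invented for illustration; the installation’s filters track full planar motion.

```python
# Minimal 1-D Kalman filter fusing two position sensors (e.g. an
# overhead camera and a laser scanner). Variances are illustrative.

def kalman_update(x, p, z, r):
    """Fuse estimate (mean x, variance p) with measurement z (variance r)."""
    k = p / (p + r)                  # Kalman gain
    return x + k * (z - x), (1 - k) * p

def predict(x, p, q=0.01):
    """Constant-position motion model: variance grows by process noise q."""
    return x, p + q

# Track a wheelchair near 2.0 m from noisy paired readings.
x, p = 0.0, 1.0                      # initial estimate and variance
for cam_z, laser_z in [(2.1, 1.95), (1.9, 2.05), (2.0, 2.02)]:
    x, p = predict(x, p)
    x, p = kalman_update(x, p, cam_z, r=0.04)    # camera: less precise
    x, p = kalman_update(x, p, laser_z, r=0.01)  # laser: more precise
```

After a few measurement pairs the estimate settles near the true position, with the more precise laser readings weighted more heavily than the camera’s.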

In 2006 the final stage of the project was completed. This stage added seven behavioural patterns, one for each day of the week, together with artificial ‘emotional’ states that describe how each robot ‘feels’ about itself, about the other robot, and about the participants in the installation space. These emotional states shape how each robot moves in the space and the category from which its text is selected. The capability of including text chosen from internet news sites, depending on the geographic location of the installation, was added, as was the possibility of composing text in real time. The software system attempts to identify the behaviour of each participant in the installation space from their movement, and adjusts each robot’s ‘emotional’ state on the basis of the identified patterns. The system uses state-of-the-art tracking, filtering and identification techniques that are the subject of two PhD research topics within the project.
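The coupling between the daily behavioural pattern, an emotional state, and text selection might be sketched as follows. The pattern names, the single ‘comfort’ value, and the category mapping are all illustrative assumptions, not the project’s actual design.

```python
# Illustrative sketch: a day-of-week behavioural pattern and an
# 'emotional' comfort level jointly select a text category.

DAILY_PATTERNS = {0: 'restless', 1: 'shy', 2: 'playful', 3: 'melancholic',
                  4: 'curious', 5: 'bold', 6: 'withdrawn'}

def text_category(weekday, comfort):
    """Pick a text category from the day's pattern (weekday 0-6) and a
    comfort level in [0, 1] summarising how the robot 'feels' about
    its observers."""
    pattern = DAILY_PATTERNS[weekday]
    if comfort > 0.7:
        return 'intimate'       # reveal intimacies on the floor again
    if comfort > 0.3:
        return 'small_talk'     # trivial subjects, such as the weather
    return 'withdrawn' if pattern in ('shy', 'withdrawn') else 'terse'
```

A ‘shy’ robot faced with an unsettling audience stays withdrawn, while the same audience on a ‘playful’ day elicits terse remarks rather than silence.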

Next: Conceptual Foundation