I am interested in the human reaction to physical motion, particularly the characteristics of motion that make something appear elegant as it moves. In addition, I am fascinated by how the way a machine moves can encourage empathy with it. I am convinced that a moving, tangible object has a different impact on people from a 2D image displayed on a screen.
I propose a robot made of two stepper motors mounted so that one rotates in a horizontal plane and one in a vertical plane. They move a wand-like appendage in space, much like an orchestra conductor's baton. An RFID reader mounted on the window glass reads people's Oystercards and generates a pattern of motion from the tag number, producing a gesture that is unique to (and repeatable for) that card. It is an elegant, sweeping kind of motion, changing the path and speed according to the tag number. When a tag hasn't been read for a while, the robot will drift around, giving the impression of "looking" for people. I would not want it ever to be completely static, as that is one cue we use to distinguish machine motion from natural motion (animals are rarely 100% still).
Gesture generation algorithm
Mifare cards (including Oystercards) store a tag number as 4 bytes (allowing 4,294,967,296 different tag numbers). The algorithm splits the bytes into nibbles (4 bits), giving an array of eight values, each ranging from 0 to 15. Sixteen (arbitrary) points in space, arranged around the periphery of the robot's workspace, are assigned to these values. When a card is read, the robot starts moving towards the point representing the first nibble in its tag number. Before it can get there, the point associated with the next nibble starts attracting the robot. Again, before it gets there, the following nibble attracts it. This continues until all eight points are processed. The motion is smoothed so that the robot moves in sweeping curves. Thus, each card will generate a pattern unique to that card, but one that is the same every time the card is read. The gesture generated “belongs” to that card user.
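The nibble-splitting and moving-attractor scheme described above can be sketched in a few lines of Python. This is an illustrative sketch only: the circular point layout, the step count per target and the smoothing fraction are my assumptions, not the installation's actual values.

```python
import math

def nibbles(tag: int) -> list[int]:
    """Split a 4-byte Mifare tag number into eight 4-bit values (0-15)."""
    return [(tag >> shift) & 0xF for shift in range(28, -1, -4)]

# Sixteen arbitrary target points arranged around the periphery
# of the workspace (here, a unit circle).
POINTS = [(math.cos(2 * math.pi * i / 16), math.sin(2 * math.pi * i / 16))
          for i in range(16)]

def gesture(tag: int, steps_per_target: int = 20, pull: float = 0.15):
    """Return a smoothed path: each nibble's point attracts the robot,
    but the next target takes over before the robot ever arrives."""
    x, y = 0.0, 0.0            # start at the centre of the workspace
    path = [(x, y)]
    for n in nibbles(tag):
        tx, ty = POINTS[n]
        for _ in range(steps_per_target):
            # Move a fixed fraction of the remaining distance each step,
            # so direction changes blend into sweeping curves.
            x += pull * (tx - x)
            y += pull * (ty - y)
            path.append((x, y))
    return path
```

Because the path depends only on the tag number, the same card always reproduces the same gesture, while any two different tags diverge as soon as their nibble sequences differ.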
I have written a simulation program that generates gestures from Oystercard tag numbers. When it runs, it is clear that the robot is “trying” to get to a destination, but is always prevented from doing so. This triggers an empathic response in an observer, who “feels sorry” for it. An association is encouraged between the use of Oystercards and being thwarted in one's attempts to travel to a destination.
The base of the robot will be made sufficiently heavy to prevent it toppling over.
For safety reasons, the display area would have to be enclosed to prevent access to the arm while it is moving. A safety interlock switch would be mounted on the access door to disable power when it is opened. Only authorised people would be allowed access to the machine. It is designed to be operated continuously with no maintenance required.
July 2007 - present
Self-employed technology consultant and educator.
Visiting Electronics Tutor on the Design Products course at the Royal College of Art.
Visiting tutor of Physical Computing at the Computing Dept of Goldsmiths.
Visiting scholar at the Lansdown Centre for Electronic Art at Middlesex University.
2002 - 2007
Imperial College London: Post-doctoral Research Associate in the Mechatronics in Medicine Laboratory, Mechanical Engineering Department. Working on mechatronic systems for surgical training and intervention.
1996 - 2000
Imperial College London: PhD in Mechanical Engineering
1994 - 1995
Research Scientist at the National Physical Laboratory
1992 - 1994
University of Kent: MSc by research in Electronic Engineering
1992 - 1994
University of Kent: BSc(Hons) in Computer Systems Engineering
A while ago, I filmed 25 people individually in a recording studio. I played them each note in the classical human vocal range, and asked them to sing it for four seconds. This was surprisingly
difficult, even for more experienced singers. Few people could sing in tune, or cover anything like the vocal range I requested. But the attempts were valiant, and the results were varied and unique.
I then edited them into a series of individual audiovisual samples. With the programming expertise of artist/code genius Evan
Raskob, I developed a custom computer program that allows these audiovisual samples to be played back in the manner of a vintage Fairlight synthesiser.
The result is an instrument that allows this amateur choir to be played like a piano, with four octaves range, twenty notes of polyphony and pressure-sensitivity. The instrument has been field-tested by Chad Lelong, a professional jazz pianist.
Uniquely, it also allows the singers to be seen as they perform. A dynamic visual composition is built up as the instrument is played. Their faces are projected onto a series of identical
hanging shapes, to form a ghostly disembodied choir. These hanging shapes also contain the speakers that play back the audio samples, creating a richly directional soundstage.
As people play the instrument, the results are recorded as MIDI files and added to a database. When the instrument is not being played, it becomes a ‘player piano’, playing selections from its database until someone hits a note on the keyboard.
In this way, a long musical composition is built up, consisting of the musical contributions of those people who decide to play the instrument.
This project looks at the possibilities for audiovisual composition when visual elements are generated at the same time as music, so that the playing style and structure of the music take into account how it appears. During testing of Pitch Control, Chad, our pianist, instinctively altered his playing style so that he produced a visual composition and rhythm that he found more aesthetically
pleasing. This is the result I was hoping for. Pitch Control, as an installation, invites the public to explore these audiovisual
possibilities for themselves.
Marcus Lyall is a director and artist, who has worked extensively with video in live music settings.
He has designed and directed video content for concert tours, working with groups including The Chemical Brothers, U2, The Rolling Stones and Bon Jovi. This work has included animation, live action film and interactive design.
His video installation 'Slow Service' was shown at galleries including the ICA and FACT in the UK, the Seoul Museum of Modern Art and the Australian Centre for the Moving Image.
Marcus is currently developing a number of interactive art projects that explore the relationships between performance, music and the moving image.
He also directs TV commercials and music videos.
The exhibition is open:
Monday to Friday 10:00 - 17:00
Saturday 23 May 12:00 - 17:00
Saturday 30 May 12:00 - 15:00
Please Note: the Exhibition finishes on Saturday 30 May at 15:00
In 2008, the Takeaway Festival team received funding from the Arts Council to support a commissions programme for
artists using RFID (Radio Frequency Identification) or whose work redefines the accepted meaning of the term “musical instrument”. This programme attracted nearly 30 submissions from the UK, the USA, Greece, Sweden, Norway, Germany and the Netherlands.
The Exhibition includes both the commissioned artists and some of the other artists who submitted proposals.
Marcus Lyall (UK): Pitch Control
Take a seat, limber up your fingers and play away as keyboard notes are replaced by recorded singing whilst the heads of 30 different singers are projected in the room to form a virtual choir.
Alex Zivanovic (UK): RFID gesture-generating robot
Swipe a commonly used RFID card and the robot will produce a graceful performance unique to your card.
Yoon Chung Han (USA): Jellyfish musical instrument
Create your own sound composition over four octaves with Jellyfish, the interactive sound installation.
Ryan Jordan (UK): Sensory Response Systems
Sensory Response Systems is an exploration of audiovisual performance using an array of sensors responsive to physical movements. It also looks at reshaping and replicating the body through the use of fabric, textiles and technology.
Ger Ger, with Jakob Kort (Germany): SOUND NOMADS
The constant search for noises, sounds and rhythms is at the heart of SOUND NOMADS’ approach to creating ephemeral, interactive, sensor-based playgrounds.
Neil Mendoza, Anthony Goh, Simeon Rose (UK): RFID art
Swipe your transport card or other RFID-based object and you will be invited to recreate a famous piece of art. The unique nature of the RFID tag will assign an area of the artwork which will flash up on screen. Use your hands to draw your version and feed into the collaborative work.
Bart Koppe (Netherlands): Mixing Cities
Mixing Cities brings together, in real time, the sounds of several cities in an audiovisual installation. By choosing and switching between the cities, you can make your own journey and get a different experience of distances and space.
Martin Howse (UK/Germany): Local Resonance Amplifier
Reacting to changes in electromagnetic emissions and signals, the Local Resonance Amplifier acts as a parasitic device revealing the hidden interactions between communications technology, power lines, biological phenomena and geological properties.
This workshop will demonstrate the power and ease of use of Textpattern, a free and open-source content management system that is highly flexible and versatile, making it suitable for all kinds of websites, including weblogs. The demonstration will take you step-by-step through the installation, customisation and more advanced features of this web application, enabling you to effortlessly create dynamic websites that are easily updatable, standards-compliant, RSS-enabled and even multi-lingual. No prior HTML/CSS skills are necessary, but a basic understanding of web-design/publishing concepts would help you follow the workshop.
Olivier Ruellet is a London-based media artist and educator whose practice encompasses a broad range of disciplines: drawing, video art and animation, generative art and interactive installations. Alongside his artistic activity, he has worked commercially for the past six years as a web-design consultant and has taught web design for the last four years at Thames Valley University.
Take the next step towards creating that hit app as we look at iPhone-specific functionality, location-based services, tilt mechanisms and networking, as an extension of the Pong app from Workshop Day 1. The workshop will include a review of what is currently out there and discussion of how existing apps work and the capabilities they use.
This workshop is for people with some experience of the Arduino system.
It is NOT suitable for complete beginners. It will show you how you can
use RFID (radio frequency identification) technology (contactless
smartcards like the Oystercard) with the Arduino system. It will also
show you how to get the Arduino to communicate with software such as
Processing running on a laptop. You must bring your own laptop (Windows,
Mac or Linux) and attend at least the morning session.
Alex is a freelance technology consultant, educator and artist,
specialising in the field of Physical Computing (sensing and controlling
the physical world with computers). He is a visiting tutor at the Royal
College of Art and Goldsmiths and a visiting scholar at the Lansdown
Centre for Electronic Arts at Middlesex. Previously, he was a researcher
at Imperial College London, developing mechatronic systems for medical
use, including medical robotics and virtual reality training systems for surgery.
Artists and makers have new consumer and open-source devices to play with (e.g. Arduino + Wii Nunchuk, Processing + Wiimote). The culture of sharing has enabled ideas to spread rapidly. As the tools get easier and cheaper, what do we want to build today, collaborate on tomorrow, set free in the future? Transitlab.org is a place where I blog about hybridity, art/science and collaboration. I will talk about light-responsive devices, my current project on open biology and how I am opening these projects up.
Dr Brian Degger is an artist/researcher and biotechnologist. His work with Blast Theory on I Like Frank in Adelaide formed the basis of a paper on how artists access cutting edge technology. He is a maker, showing the Arduino-based LightResponsiveDevices at the recent UK Maker Faire in Newcastle. Currently he is a digital fellow at Digital City where he is investigating new opportunities for artists to interact with the life sciences.
Open learning communities, inspired by the free culture movement, offer alternatives to the bricks-and-mortar model of universities as knowledge factories. I will be presenting a sample of these emerging hybrid-flexible online learning models that are being developed with the help of social technologies within academia.
Paula Roush is an artist-educator-researcher. She is a lecturer in digital photography at London South Bank University, where she teaches courses on artists' publications and self-publishing practices, performativity and surveillance space. She also teaches the theory module for the MA in Art and Media Practice at the University of Westminster. She is the founder of msdm [msdm.org.uk], a platform with a fluctuating body of collaborators exploring mobile strategies of display & mediation. With a focus on performative installation, msdm has developed a transdisciplinary practice that encompasses online technologies and site-specific approaches to participatory live art.
Through his most recent project, Liminal: A Question of Position, a large-scale digital media project concerned with interactions between the city, new media, technologies and cultural diversity, the talk will explore the different kinds of collaborations that take place when building immersive environments in which stories can happen, determined by people's agency and intervention. It will look at collaborations with other artists and institutions that set out to find different ways of engaging audiences in the process of authorship through the unique opportunities afforded by interactive digital media, thereby contesting the notion of artistic specificity.
Gary Stewart has been Head of Multimedia at Iniva, the Institute of International Visual Arts based in the new gallery space Rivington Place, London since 1995 where he curates and implements Iniva’s digital programme – encompassing installations, exhibitions, public and online projects. Working with electronic media as an artist, designer, producer and curator over the last twenty years he has been involved in pioneering initiatives and projects which interrogate the relationship between culture, technology and creativity.
In the space of three years, 2005 - 2007, the Sony Corporation has gone
from an approach that was tantamount to kicking their audience in the
teeth to instead essentially handing them some of their most valued
assets to use as their own. While this hasn't been a fundamental
company-wide transformation, there have been some very real instances
of Sony, one of the largest corporations in the world, realising that
they are much better off collaborating with their audience than
spying on them. This is a quick snapshot of what they did and the
impact it has had.
Leo has worked in digital media since 1996 in New York, Sydney and
London. He has worked agency-side for brands including Sony CE,
Levi’s, MTV and Diageo. He has also worked client-side establishing
the digital arm of Australian publishing company IPMG, which was sold
in 2001 to News Corporation Ltd. He is a member of the Internet
Advertising Bureau’s (IAB) Social Media Council and sits on the board
of the National Gallery Company. www.linkedin.com/in/leoryan