- Takeaway Festival 2006
- Takeaway Festival 2007
- Takeaway Festival 2008
- Takeaway Festival 2009
- Mini TKW
The concept of "Web 2.0" began with a 2004 conference brainstorming session between publishers O'Reilly and MediaLive International. What started as an annoying industry buzzword has taken on real meaning, now generally accepted as referring to user-controlled, user-generated websites. Blogging software, photo-sharing services and social networking spaces are three of the biggest 'Web 2.0' types of website. The ideas behind the Web 2.0 shift are so powerful that they have changed the business model of the web; many claim it finally realises World Wide Web visionary Tim Berners-Lee's idea of a participatory Internet. But what does it mean for you? What can it do for you? And how do you do it? Find out more about the different ways to participate in the internet, including using a hosted blog, how to use the web's social spaces, how sites like Flickr and Delicious work and, for those with web space, how to install your own blogging software.
James Smith is an artist, lecturer and experienced web application developer. He has taught Technical Studies at Ravensbourne College of Art and Communication for the MA in Network Media and led workshops in information architecture, scripting languages, interface design, computer game design and content management systems. He has created four CD-ROM based educational games, developed the first web-based research community site at King's College, created an early "vodcasting" prototype for documenting the Technics festival in 2000 and created the 'Know Your Rights' interactive kiosk in the Millennium Dome. He is currently Technical Director of the Launchlab project at Space Studios in North London (http://www.launchlab.org.uk). For fun he hosts an internet radio programme called 'Monkey Hear, Monkey Do' on http://www.radionoodles.org.uk
A workshop about making music with a Game Boy. After hearing a little about how the software used to make Game Boy music works, participants will get the opportunity to play around with it themselves. If you have a laptop and headphones, or a Game Boy with a Little Sound Dj (LSDj) cartridge and headphones, please bring them to the workshop.
On the laptop we can install a Game Boy emulator, so you can keep practising even after finishing the workshop. For people who would like to learn how to make hardware that syncs to LSDj, there will be the opportunity to play around with some test boards with ICs.
P.S. Gijs will also bring some new hardware he has made to the workshop, just to show what is possible.
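For anyone curious about the sync-hardware side mentioned above, a common approach (e.g. in Arduinoboy-style projects) is to drive LSDj from a MIDI clock, which runs at 24 pulses per quarter note. As a minimal illustrative sketch (not the workshop's own code), the timing a sync box has to produce works out like this:

```python
# Illustrative sketch: hardware that syncs LSDj to a tempo typically
# emits regular clock pulses. MIDI clock runs at 24 pulses per quarter
# note (ppqn), so the interval between pulses follows from the BPM.
def pulse_interval_seconds(bpm, ppqn=24):
    """Seconds between clock pulses for a given tempo."""
    return 60.0 / (bpm * ppqn)

# At 120 BPM a sync box must pulse roughly every 20.8 milliseconds.
print(round(pulse_interval_seconds(120) * 1000, 1))
```

The same arithmetic applies whatever microcontroller the test boards use; only the pin-toggling code differs.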
Gijs Gieskes is an industrial designer from the Netherlands, mainly creating instruments for use with video and audio performances.
Processing is an open-source programming language and development platform, and can be downloaded for free at the Processing website. By simplifying the syntax and compilation process of the programming language Java whilst simultaneously extending its visual capabilities, Processing makes computer programming for graphics more accessible to people with little previous experience, especially those who are "visually minded". There is a large and lively community using the programme, including students, artists, musicians, architects, designers and researchers.
David Muth is a London based musician, programmer and artist. Having grown up in Salzburg, Austria, he relocated to the UK to study at Middlesex University, where he received an MA in Digital Arts.
His work has been shown on numerous occasions internationally, with venues and events including the Musée d'Art Contemporain in Montreal, the Kiasma Museum of Modern Art in Helsinki and ISEA2006 in San Jose. David also lectures at the Royal College of Art and Ravensbourne College of Design and Communication, and is part of Kaffe Matthews' interdisciplinary research initiative "Music for Bodies".
In this workshop we'll introduce the Arduino philosophy and explore the basics of physical computing while learning how to build simple interactive objects. Arduino is an open-source physical computing platform based on a simple I/O board, and a development environment for writing Arduino software. It can be used to develop interactive objects, taking inputs from a variety of switches or sensors and controlling a variety of lights, motors and other outputs. Arduino projects can be stand-alone, or they can communicate with software running on your computer (e.g. Flash, Processing, Max/MSP).
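To give a flavour of the "communicate with software running on your computer" part: a board typically streams readings over a serial port, and host software maps them to an output. Below is a hypothetical host-side sketch (message format and names invented for illustration; a real setup would read from a serial port, e.g. with pyserial — here the stream is simulated so the example is self-contained):

```python
# Hypothetical host-side companion to a microcontroller board.
# Assumed message format: "sensor:<0-1023>" per line (an Arduino's
# 10-bit analogue reading). Simulated here instead of a real port.
simulated_serial_lines = ["sensor:0", "sensor:512", "sensor:1023"]

def to_brightness(line):
    """Map a 10-bit analogue reading (0-1023) to an 8-bit PWM value (0-255)."""
    value = int(line.split(":")[1])
    return value * 255 // 1023

for line in simulated_serial_lines:
    print(to_brightness(line))
```

The board-side sketch would do the mirror image: read a sensor pin, print the value, and set an LED's brightness from whatever the host sends back.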
Massimo Banzi is the co-founder of the Arduino project and has worked on many interaction design projects for clients including Prada, Artemide, Persol, Whirlpool, the V&A Museum and Adidas.
I'm proposing to show participants how easy it is to create podcasts using open source tools, and to distribute them on the web using open source blogging platforms. Ideally, participants should have some audio or video material already digitised on their laptops, as I presume there won't be time to take them through capturing any material. Perhaps other demonstrators will be able to show participants how open source tools such as Audacity can be used to record audio.
I'll be showing people how to use ffmpeg to transcode video for podcasting, and how Audacity can be used to prepare audio recordings for podcasting.
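As background to the "distribute them on the web" step: under the hood, a podcast is just an RSS 2.0 feed whose items carry an enclosure tag pointing at the media file, which is what blogging platforms generate for you. A minimal sketch of such a feed, built by hand (all titles and URLs below are placeholders, not from the workshop):

```python
import xml.etree.ElementTree as ET

# Minimal podcast feed: an RSS 2.0 document whose <item> carries an
# <enclosure> pointing at the audio file. All URLs are placeholders.
rss = ET.Element("rss", version="2.0")
channel = ET.SubElement(rss, "channel")
ET.SubElement(channel, "title").text = "Example Podcast"
ET.SubElement(channel, "link").text = "http://example.org/"

item = ET.SubElement(channel, "item")
ET.SubElement(item, "title").text = "Episode 1"
ET.SubElement(item, "enclosure", url="http://example.org/ep1.mp3",
              length="1234567", type="audio/mpeg")

print(ET.tostring(rss, encoding="unicode"))
```

Podcast clients subscribe to the feed URL and download whatever each enclosure points at, which is why the transcoding step (getting the media into a widely playable format) matters.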
Since graduating from Ravensbourne College with a BA (Hons) in Product and Furniture Design in 1998, Adam has worked in a number of HE institutions ranging from Central Saint Martins to the Kent Institute of Art & Design, and at present Ravensbourne College of Design and Communication. He has also worked as a sessional tutor at the Royal Melbourne Institute of Technology (RMIT) and Queensland University of Technology (QUT). Outside of education, Adam contributes energetically to the Apple Distinguished Educator Program, through which he is involved in digital media technologies.
When the gradually rotating paintbrush strokes the apple's dehydrating, rubbery skin, a copper switch is activated, expelling intriguing and unnerving real-life, pre-recorded human 'slurps'.
Equipment Used: Apple, motor, paintbrush, speakers, copper filings, copper wire, glue, metal, wood.
Dimensions: 400mm x 300mm x 200mm.
Weblink for kinetic documentation: Touching Apples
2004 - Present. Sculpture BA (Hons) Edinburgh College of Art.
2006 International Exchange: Arizona State University. Arizona, USA.
2002 - 2003. Art Foundation. Brighton City College: Merit.
2001 A-levels. Art. A. English Literature. B. Biology. B.
2006 Albertina’s Café. Edinburgh. Money Splash.
2007 Royal Scottish Academy. Edinburgh. Annual Student Exhibition.
2007 Outrageous Art. Edinburgh. Left to Right.
2007 Artist Run Space. Edinburgh. To Let.
2006 Artist Run Initiative. Scotland. Punch!
2006 Total Kunst. Edinburgh. Dazzlement.
2006 Contact Gallery. Edinburgh. Cavity.
2006 Edinburgh Central Library. Artist Book Exhibition.
2006 The Ice House. Phoenix, Arizona. USA. Models and Mechanisms.
2006 The Ice House. USA. Off the Wall.
2006 Wet Paint Gallery. Tempe, Arizona. Small and Beautiful.
2005 The Sugar Cube. Scotland. Sugar Cube.
2004 Edinburgh College of Art. Andrew Grant Drawing Exhibition.
2004 Edinburgh College of Art. Chaos at the Fringes of Spring.
2002 Brighton and Hove City College. England. Foundation Exhibition.
Commissions and Awards
2006 Commission for MA Sound Designer Matina Staikoudi. A Woman’s World.
2006 Private Outdoor Commission. Brighton, East Sussex. The Feather Man.
2006 Temporary Outdoor Sculpture Commission. Arizona, USA. Binnin Around.
2005 Awarded The Helen A. Rose Bequest For Sculpture. Edinburgh College of Art.
2002 Private Commission for Phylidda and Glenn Earl of Tidebrook Manor. Reflections.
2000 Awarded ‘Artist of the Year’. St. Leonards Mayfield. East Sussex.
NooSpeak aims to enhance Web search with human-like interaction, in order to provide users with a more natural and entertaining experience. It merges an existing widely used service (Google Search) with artificial intelligence technology (a pattern knowledge-base written in AIML and its parser) to create a chatting interface to the search engine.
The project starts from the assumption that search is highly computer-centred, forcing people to formulate their questions in an unnatural form, and so produces a poor and frustrating experience. The goal of the project is therefore to build an interaction system that accommodates users by adopting their natural language, yet preserves the efficiency of the search engine in finding information on the Web. The first prototype of such a system uses a chatterbot to pre-process human inputs, passing the chatterbot's responses to the search engine and extracting from its results one to be delivered to the user. Search results are then presented inside an interface that borrows from instant messaging software conventions, so that the user has a constant overview of her chatting/searching history. The chatterbot's role is a diagnostic one, helping users to clarify and refine their search, possibly correcting spelling mistakes.
In order to ease the user’s task, the prototype offers real-time suggestions based on popular searches relevant to what the user is typing.
It also displays two weighted lists (chat clouds) of semantically related words and synonyms, one gathering data from all users’ inputs and the other built upon the corpus of Google responses. The chat clouds’ function is to enrich the user’s vocabulary, aiding her to find new solutions and build relationships between topics.
A further stage of the project aims to allow users to customize the service: by creating an account, they would be able to actively teach the chatterbot about their preferences, and to see the system improving its performances by analysing their individual data.
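The pre-processing and chat-cloud ideas described above can be sketched in miniature. NooSpeak itself uses an AIML knowledge base and parser; the toy patterns and word counts below are invented purely for illustration:

```python
import re
from collections import Counter

# Toy stand-ins for an AIML pattern base: conversational question shapes
# rewritten into search-engine-friendly queries (patterns are illustrative).
REWRITES = [
    (re.compile(r"^who is (.+?)\??$", re.I), r"\1 biography"),
    (re.compile(r"^where is (.+?)\??$", re.I), r"\1 location map"),
]

def preprocess(utterance):
    """Rewrite a conversational question into a search query, or pass it through."""
    for pattern, template in REWRITES:
        if pattern.match(utterance):
            return pattern.sub(template, utterance)
    return utterance  # no pattern matched: send the input unchanged

def chat_cloud(texts, top=3):
    """Weighted list of the most frequent words across a corpus of inputs."""
    words = re.findall(r"[a-z]+", " ".join(texts).lower())
    return Counter(words).most_common(top)

print(preprocess("Who is Tim Berners-Lee?"))
```

A real AIML base adds recursion, wildcards and stored context on top of this basic shape, and the second cloud would count words from the search responses rather than the user inputs.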
An Italian designer who graduated in Communication Design from the Politecnico di Milano in 2005, Matteo is currently in the second year of the MA Communication Design (Digital Media specialisation) at Central Saint Martins College of Art and Design. He is also working as a Flash developer at Milo Creative Ltd, a new media company specialising in educational games. NooSpeak is his Master's thesis project.
An audio narrative featuring the voices of a location. This may be followed with a map.
I am interested in using the environment as a platform for narrative, whereby the user may project audio and visual experiences in a simple and lightly entertaining way.
The perception of a “functioning” human being is controlled by the senses – all of them at once. Separating these senses with technical tools allows one to focus on each sense separately. If we split the senses by technical device, we split the body – transforming sensory information into data, projected onto a screen and made accessible in a new way.
A human is transformed into a tool, blindfolded in front of an empty canvas. Another human tries to use this tool to paint. The human tool has a camera attached to his head, so the controller is limited to the information given through this device. The human tool receives instructions from the controller via headphones. With this limited amount of perception and communication, the controller tries to create a painting, and the tool reacts according to each command.
Further explanation and a video that shows how it works can be found on
Lukas Birk was born in Austria in 1982.
Journalistic education in Austria.
Former journalist for Austrian and German radio stations and newspapers.
Currently studying Digital Arts and Photography at Thames Valley University (3rd year). Works as a photographer (Time Out Guides, Austrian travel guides…).
Photographic exploration of northern India, Nepal, Burma and Cambodia in 2003, with an exhibition in Bregenz, Austria, in 2004.
Stuck agility at a single space – searching for a secret through photography 2005 - 2006
TimeSpace – animation 2006
Ygen – animation 2006
The sea is waiting – 8mm short film 2006
Fragments of Beijing – 8mm short film 2007
Human Tool - interactive Painting installation 2006 -2007
Kafkanistan – tourism in Afghanistan (45-minute documentary film and book, both as yet unpublished) 2005 – in progress
0044 7891 822 971
Interactivity has become ambient. With the scaling up of networks and the scaling down of the apparatus for transmission and reception, individual people are no longer isolated. The various communication devices we always carry are continuously emitting and receiving information. This continuous data flow is invisible and, to most people, unknown. Today’s hand-held devices can be seen as extensions of the human body, allowing ubiquitous, inescapable network interconnectivity.
The ‘Sonification of You’ aims to make this data flow ‘visible’ to the people carrying the active devices. Our equipment will passively scan the radio spectrum frequencies used by mobile phones, Bluetooth, WiFi networks and other mobile devices within a given space. This data is then represented by assigned audio sounds that indicate the activity, distance and strength of the signals.
Drawing on methods for monitoring large computer networks, the result is a background ‘sound’ for a room that is representational of the people, and their devices, present.
The invisible becomes audible, and therefore visible, allowing individuals to become aware of their constant connectivity.
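One simple way to turn a scanned signal into a sound, sketched here purely as illustration (this is not the piece's actual code, and the ranges are assumed): map each device's received signal strength onto an audible frequency, so nearer or stronger devices sound higher.

```python
# Illustrative sonification mapping: RSSI (received signal strength, in
# dBm, where values nearer 0 are stronger) linearly mapped onto pitch.
# The dBm and Hz ranges below are assumptions chosen for the example.
def rssi_to_frequency(rssi_dbm, weakest=-100, strongest=-30,
                      low_hz=110.0, high_hz=880.0):
    """Map an RSSI reading onto a frequency range, clamping out-of-range values."""
    clamped = max(weakest, min(strongest, rssi_dbm))
    fraction = (clamped - weakest) / (strongest - weakest)
    return low_hz + fraction * (high_hz - low_hz)

print(rssi_to_frequency(-30))   # strongest signal -> highest pitch, 880.0 Hz
print(rssi_to_frequency(-100))  # weakest signal -> lowest pitch, 110.0 Hz
```

Other signal properties (activity, protocol) can be mapped the same way onto volume, timbre or rhythm, layering each detected device into the room's background sound.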
December 2006, Live for NOMUSIC Festival X, 1-2am GMT, 13 December.
November 2006, Live at Trampoline: Platform for new media art, Broadway Cinema, Broad Street, Nottingham, UK.
September 2006, Live at Paraflows, annual convention for digital arts and cultures, Vienna, Austria, 9-16 September 2006 - documentation
March 2006, Sonification of You was active for the duration of FRAMED, a Slade Centre for Electronic Media event, as part of Node.London. As part of the FRAMED Broadcasts, extracts recorded at FRAMED were played on Resonance 104.4fm: 23, 24 and 25 March 2006 between 1-7am, and 24 and 30 March 2006 between 7-8pm.
Made possible with the kind support of UCL Information Systems' Remote Support Team.