INTRODUCTION TO BODY MAPPING FOR SOUND GENERATION
Max number of participants: 15 people
Sign up through this form: https://docs.google.com/forms/d/e/1FAIpQLSc71oHDxS-N4meyuN-9AAAr9uwogWp0el6PacMSgdhUSRi-vQ/viewform
PLEASE BRING YOUR OWN COMPUTER AND HEADPHONES
This workshop will give an overview of some of the most common tools for tracking body movement for sound generation and sound control. We will focus on two main tools for motion tracking: the software EyesWeb (http://www.infomus.org/eyesweb/) and Eyecon (http://eyecon.frieder-weiss.de/). Both programs are Windows-only.
Motion tracking with cameras: both a standard 2D camera and a Kinect
The following sensors will be considered: smartphones (bring yours!) and SenseStage (https://www.sensestage.eu/).
The data provided by these sensors will be analyzed with different tools and will be made available to all participants during the tests. Different strategies for using this data to produce and control sound will be explored through a few simple examples in the main sound generation programs (SuperCollider, Pd, Max/MSP, Ableton Live with Max for Live), all of which are multiplatform (OSX, Windows, Linux).
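To give a flavor of what such a mapping strategy can look like, here is a minimal sketch in Python (not one of the workshop programs; the function name, parameter defaults and frequency range are illustrative assumptions, not part of the workshop material). It maps the magnitude of a smartphone accelerometer reading onto a pitch range, the kind of simple sensor-to-sound mapping the examples will start from:

```python
import math

def accel_to_freq(x, y, z, f_min=110.0, f_max=880.0, a_max=2.0):
    """Map an accelerometer reading (in g) to a frequency in Hz.

    The vector magnitude is clipped to [0, a_max] and scaled
    linearly onto [f_min, f_max]. All defaults are illustrative:
    a real patch would tune the range by ear.
    """
    mag = math.sqrt(x * x + y * y + z * z)
    norm = min(mag, a_max) / a_max  # normalized 0.0 .. 1.0
    return f_min + norm * (f_max - f_min)

# A phone at rest reads roughly 1 g (gravity only),
# which lands in the middle of the pitch range:
print(accel_to_freq(0.0, 0.0, 1.0))  # 495.0
```

In practice the sensor values would arrive over the network (e.g. via OSC from a smartphone app) and the resulting frequency would drive a synth parameter in SuperCollider, Pd, or Max/MSP; the mapping function itself stays this simple.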
Participants are invited to bring one (or more) concrete situations where they would like to use this technology. We will take some time for problem solving and suggest possible solutions.
The participants must bring:
a laptop running OSX, Linux or Windows
headphones or earphones
motivation to learn and use body movement to generate computer music
Also very welcome (though not required):
One of these programs, already installed and working on your laptop: SuperCollider, Pd, Max/MSP, or Ableton Live with Max for Live.
An iOS or Android smartphone
Cameras or accelerometers that can be used during the workshop
Concrete working situations where you want to use this technology
ABOUT THE WORKSHOP HOLDER
Marcello Lussana (1979) has for years combined his interests in music, philosophy and technology. The focal point of his work is the interaction between music and human movement, where body and computer are connected through a complex understanding of body perception and dedicated interfaces. After completing his studies at the University of Foreign Languages in Bergamo, he obtained a Master in Technology and Communication in Torino in 2006; in 2010 he merged these humanist, technological and musical interests by attending the Master "Sound Studies" at the University of Arts in Berlin (UdK).
Since 2012 he has been the music director of the Motioncomposer project (http://motioncomposer.com), through which he has been involved in the European project Metabody (http://metabody.eu). He has lived in Berlin since 2008, producing computer music for audio-visual performances, dance, theater and live electronics. In July 2016 he became a PhD candidate at Humboldt University Berlin, working on interactive music and consciousness under professors Jin-Hyun Kim (Humboldt University Berlin) and Alberto de Campo (University of Arts Berlin).