Over the last decades the scientific field of artificial intelligence has undergone a major change of direction. The general cognitive paradigm of constructing an artificial humanoid thinking machine became increasingly problematic: it turned out that creating an intelligent robot capable of communication is not just a matter of computational power. In direct response, the 1990s saw the rise of a promising and remarkable research domain within artificial intelligence called affective computing. In this field, founded by Prof. Rosalind Picard at MIT, researchers study innovative emotional human-machine interactions. From these efforts a new breed of humanoid robots, including Kismet (first generation) and Nexi (second generation), was developed at MIT.
The groundbreaking vision that affective computing offers has also had a profound influence on the field of computational and cognitive musicology. Specifically, it has renewed interest in the connections between music and emotions. Since the tentative studies of K. Hevner in the 1930s, this connection could only be thought of in terms of classical psychology, but affective computing opens up a whole new perspective.
This point of view lies at the foundation of the EMO-Synth. In implementing the system, numerous new principles and ideas have also come to the surface. Two seminal ideas in this context are the following:
- The EMO-Synth can be seen as a brand-new virtual tool with which artists can extend and complement their own creative process. By using the system, users can come to understand their own personal creativity on a new level.
- By using the system, a totally new virtual platform arises in which the boundaries between artist, audience and the generated artifacts are redefined. To what extent are these artifacts a creation of the artist or of the audience? To what extent can we still speak of artistic input when the EMO-Synth truly participates in the creative process?
Of course these last two statements start from a certain perspective on the artistic process. It is my personal point of view that art communicates through human emotions. Every work of art implies a certain kind of expression that can be understood intuitively rather than rationally. It is this intuitive and emotionally loaded component that the EMO-Synth tries to uncover and simulate. The realization of the EMO-Synth project also implied a radical break with the old notion of static art. This is not a new idea, but only through recent developments on both the methodological and the technical level has the implementation of systems like the EMO-Synth become possible. Recent artificial intelligence techniques, together with techniques stemming from affective computing, give whole new insight into both the aesthetic and the emotional content of art objects.
In developing the EMO-Synth, my personal goal is for the project to help redefine and reshape the discussion about the place of automatically generated art in the general artistic discourse.
Producer & Founder of the EMO-Synth
- Video from TEDxTalk at TEDxFlanders online
- EMO-Synth at TEDxFlanders on March 26th in the Stadsschouwburg, Antwerp (BE)
- Workshop "Biofeedback in Artistic Performance Context" at the Neue Musikschule Berlin on 04/02/2014
- EMO-Synth residency, events and performances in Berlin at Liebig 12 gallery from 08/06 till 14/06
- New pictures from EMO-Synth performances in Zagreb (Croatia, 2013) now online.
- EMO-Synth on the 20th of April at Nuit Blanche in Sint-Niklaas
- EMO-Synth on 02/03 at Museum Night Fever 2013 at the Musical Instruments Museum in Brussels from 19:30 till 24:00
- EMO-Synth on 27/02/2013 on national Belgian radio station Klara
- EMO-Synth performances during "Nuit Blanche" at Sint-Niklaas for WARP/Stichting Jan Buytaert on 21/04/2013
- EMO-Synth performances during Museum Night Fever 2013 at the Musical Instruments Museum in Brussels on 02/03/2013
Stay informed on our latest news!
Montagecinema Revived by the EMO-Synth was supported by