Tod Machover of MIT’s Media Lab invented the musical technology behind Guitar Hero, and here he talks about what’s coming next. Listen for some brand-new ways to interface with music — to play it, compose it, enjoy it. Machover then introduces Dan Ellsey, a composer with cerebral palsy who uses some new tools to write and perform his own music. Onstage, Ellsey conducts his “My Eagle Song” in a soaring performance that underscores music’s power to move you and give you chills. (Recorded March 2008 in Monterey, California. Duration: 20:35.)
Watch Tod Machover and Dan Ellsey’s talk and performance on TED.com, where you can download it, rate it, comment on it and find other talks and performances.
The first idea I’d like to suggest is that we all love music a great deal; it means a lot to us. But music is even more powerful if you don’t just listen to it, but make it yourself. So, that’s my first idea. And we all know about the Mozart effect, the idea that’s been around for the last five to ten years: that just by listening to music, or by playing music to your baby in utero, it’ll raise your IQ 10, 20, 30 points.
Great idea, but it doesn’t work at all. You can’t just listen to music, you have to make it somehow. And I’d add to that that it’s not just making it: everybody, each of us, everybody in the world has the power to create and be part of music in a very dynamic way, and that’s one of the main parts of my work. So, at the MIT Media Lab for quite a while now, we’ve been engaged in a field called active music. What are all the possible ways that we can think of to get everybody in the middle of a musical experience — not just listening, but making music?
And we started by making instruments for some of the greatest performers – we call these “hyperinstruments” – for Yo-Yo Ma, Peter Gabriel, Prince, orchestras, rock bands – instruments where all kinds of sensors are built right into the instrument, so the instrument knows how it’s being played. And just by changing the interpretation and feeling, I can change my cello into a voice, or a whole orchestra into something that nobody has ever heard before.
When we started making these, I started thinking, why can’t we make wonderful instruments like that for everybody, people who aren’t fantastic Yo-Yo Mas or Princes? So, we’ve made a whole series of instruments. One of the largest collections is called the Brain Opera. It’s a whole orchestra of about 100 instruments, all designed for anybody to play using natural skill. So, you can play a video game, drive through a piece of music, use your body gesture to control huge masses of sound, touch a special surface to make melodies, use your voice to make a whole aura. And when we made the Brain Opera, we invited the public to come in, to try these instruments and then collaborate with us to help make each performance of the Brain Opera. We toured that for a long time. It is now permanently in Vienna, where we built a museum around it.
And that led to something that you probably do know. Guitar Hero came out of our lab, and my two teenage daughters and most of the students at the MIT Media Lab are proof that if you make the right kind of interface, people are really interested in being in the middle of a piece of music and playing it over and over and over again.
So, the model works, but it’s only the tip of the iceberg, because my second idea is that it’s not enough just to make music in something like Guitar Hero. Music is very fun, but it’s also transformative. It’s very, very important. Music can change your life, more than almost anything. It can change the way you communicate with others, it can change your body, it can change your mind.
So, we’re trying to go to the next step of how you build on top of something like Guitar Hero. We’re very involved in education. We have a long-term project called Toy Symphony, where we make all kinds of instruments that are also addictive, but for little kids, so the kids will fall in love with making music, want to spend their time doing it, and then will demand to know how it works, how to make more, how to create. So, we make squeezy instruments, like these “Music Shapers” that measure the electricity in your fingers; “Beat Bugs” that let you tap in rhythms — they gather your rhythm, and like hot potato, you send your rhythm to your friends, who then have to imitate or respond to what you’re doing; and a software package called Hyperscore, which lets anybody use lines and color to make quite sophisticated music. It’s extremely easy to use, but once you use it, you can go quite deep into music of any style. And then, by pressing a button, it turns into music notation, so that live musicians can play your pieces.
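To make the lines-and-color idea concrete, here is a purely hypothetical toy sketch — not Hyperscore’s actual code, and the function and scale here are my own invention — of how a drawn line’s height could pick pitches from a scale while its horizontal position sets the timing of the notes:

```python
# Hypothetical sketch (not Hyperscore's real implementation): map a freehand
# line, sampled as (x, y) points, onto a sequence of timed MIDI-style notes.

C_MAJOR = [60, 62, 64, 65, 67, 69, 71, 72]  # C-major scale as MIDI note numbers

def line_to_notes(points, beats_per_unit=0.25):
    """Convert (x, y) samples of a drawn line into (midi_note, start_beat) pairs.

    x controls when the note sounds; y (0.0-1.0, bottom to top) picks a
    pitch from the scale, so raising the line raises the melody.
    """
    notes = []
    for x, y in points:
        y = min(max(y, 0.0), 1.0)                     # clamp into the canvas
        idx = min(int(y * len(C_MAJOR)), len(C_MAJOR) - 1)
        notes.append((C_MAJOR[idx], round(x * beats_per_unit, 3)))
    return notes

line = [(0, 0.1), (1, 0.4), (2, 0.8), (3, 0.5)]       # a rising-then-falling stroke
print(line_to_notes(line))  # [(60, 0.0), (65, 0.25), (71, 0.5), (67, 0.75)]
```

The “pressing a button” step that turns a piece into notation would then just be a rendering of pairs like these into staff notation.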
We’ve had really very powerful effects with kids around the world, and now people of all ages, using Hyperscore. So, we’ve gotten more and more interested in using these kinds of creative activities in a much broader context, for all kinds of people who don’t usually have the opportunity to make music.
So, one of the growing fields that we’re working on at the Media Lab right now is music, mind and health. A lot of you have probably seen Oliver Sacks’s wonderful new book called Musicophilia. It’s on sale in the bookstore. It’s a great book. If you haven’t seen it, it’s worth reading. He’s a pianist himself, and he details his whole career of looking at and observing the incredibly powerful effects that music has had on people’s lives in unusual situations.
So, we know, for instance, that music is always the last thing that people with advanced Alzheimer’s can still respond to. Maybe many of you have noticed this with loved ones: you can find somebody who can’t recognize their face in the mirror, or can’t recognize anyone in their family, but you can still find a shard of music that will make that person jump out of their chair and start singing. And with that, you can bring back parts of people’s memories and personalities. Music is the best way to restore speech to people who have lost it to strokes, and movement to people with Parkinson’s disease. It’s very powerful for depression, schizophrenia, many, many things.
So, we’re working on understanding those underlying principles and then building activities which will let music really improve people’s health. And we do this in many ways. We work with many different hospitals. One of them is right near Boston, called Tewksbury Hospital. It’s a long-term state hospital, where several years ago we started working with Hyperscore and patients with physical and mental disabilities. This has become a central part of the treatment at Tewksbury Hospital, so everybody there clamors to work on musical activities. It’s the activity that seems to accelerate people’s treatment the most, and it also brings the entire hospital together as a community. I wanted to show you a quick video of some of this work before I go on.
They’re manipulating these musical rhythms. That’s the real experience: not only do they learn how to play and listen to rhythms, but they train their musical memory and play music in a group, get their hands on music to shape it themselves, to change it, to experiment with it, to make their own music. So Hyperscore lets you start from scratch very quickly. Everybody can experience music in a profound way; we just have to make different tools.
The third idea I want to share with you is that music, paradoxically, I think even more than words, is one of the very best ways of showing who we really are. I love giving talks, although, strangely, I feel more nervous giving talks than playing music. If I were here playing cello, or playing a synth, or sharing my music with you, I’d be able to show things about myself that I can’t tell you in words, more personal things, perhaps deeper things.
I think that’s true for many of us, and I want to give you two examples of how music is one of the most powerful interfaces we have, from ourselves to the outside world. The first is a really crazy project that we’re building right now, called Death and the Powers. It’s a big opera, one of the larger opera projects going on in the world right now. And it’s about a man — rich, successful, powerful — who wants to live forever. So he figures out a way to download himself into his environment, actually into a series of books. The main singer disappears at the beginning of the opera, and the entire stage becomes the main character. It becomes his legacy.
And the opera is about what we can share, what we can pass on to others, to the people we love, and what we can’t. Every object in the opera comes alive and is a gigantic musical instrument, like the chandelier, which takes up the whole stage. It looks like a chandelier, but it’s actually a robotic musical instrument. So, as you can see in this prototype, it has gigantic piano strings; each string is controlled by a little robotic element, and there are little bows that stroke the strings, propellers that tickle the strings, acoustic signals that vibrate the strings.
We also have an army of robots on stage. These robots are kind of the intermediaries between the main character, Simon Powers, and his family. There are a whole series of them, kind of like a Greek chorus. They observe the action. We designed these square robots that we’re testing right now at MIT, called “operabots.” These operabots follow my music. They follow the characters. They’re smart enough, we hope, not to bump into each other. They go off on their own. And they can also, when you snap, line up exactly the way you’d like them to. Even though they’re cubes, they actually have a lot of personality.
The largest set piece in the opera is called “The System.” It’s a series of books. Every single book is robotic, so they all move, they all make sound, and when you put them all together, they turn into these walls, which have the gesture and personality of Simon Powers, who’s disappeared; the whole physical environment becomes this person. This is how he’s chosen to represent himself. The books also have high-packed LEDs on their spines, so it’s all display. And here’s the great baritone James Maddalena as he enters The System. This is a sneak preview.
The premiere is in Monaco in September 2009, if there’s any chance you can make it. Another idea with this project: here’s this guy building his legacy through this very unusual form, through music and through the environment. But we’re also making this available online and in public spaces, as a way for each of us to use music and images from our lives to make our own legacy, or to make a legacy of someone we love. So, instead of being grand opera, this opera will turn into what we’re thinking of as personal opera.
And if you’re going to make a personal opera, what about a personal instrument? Everything I’ve shown you so far — whether it’s a hypercello for Yo-Yo Ma or a squeezy toy for a child — the instruments stay the same and are valuable for a certain class of person: a virtuoso, a child. But what if I could make an instrument that could be adapted to the way I personally behave, to the way my hands work, to what I do very skillfully, or perhaps to what I don’t do so skillfully? I think that this is the future of interfaces, the future of music, the future of instruments.
And I’d like to invite two very special people onstage, so that I can give you an example of what personal instruments might be like. So, can you give a hand to Adam Boulanger, PhD student from the MIT Media Lab, and Dan Ellsey. Thanks to TED and to Bombardier Flexjet, Dan is here with us today, all the way from Tewksbury. He’s a resident at Tewksbury Hospital, and this is by far the farthest he’s strayed from it, I can tell you that, because he’s motivated to meet with you today and show you his own music. So, first of all, Dan, do you want to say hi to everyone and tell everyone who you are?
Dan: Hello. My name is Dan Ellsey. I am 34 years old and I have cerebral palsy. I have always loved music and I’m excited to be able to conduct my own music with this new software.
Machover: And we’re really excited to have you here, Dan, really. So we met Dan about three years ago, three and a half years ago, when we started working at Tewksbury. Everybody we met there was fantastic, made fantastic music. Dan had never made music before, and it turned out that he was really fantastic at it. He’s a born composer. He’s very shy, too.
So, it turned out he’s a fantastic composer, and over the last few years he has been a constant collaborator of ours. He has made many, many pieces. He has made many CDs. Actually, he is quite well known in the Boston area, and he mentors people at the hospital, and children locally, in how to make their own music. And I’ll let Adam tell you more. So, Adam is a PhD student at MIT, an expert in music technology and medicine, and Adam and Dan have become close collaborators. What Adam’s been working on in this last period is not only how to have Dan easily make his own pieces, but how he can perform his pieces using this kind of personal instrument. So, do you want to say a little bit about how you guys work?
Adam: Yes. So, Tod and I entered into a discussion following the Tewksbury work, and it was really about how Dan is an expressive person, an intelligent and creative person. It’s in his face, it’s in his breathing, it’s in his eyes. How come he can’t perform one of his pieces of music? That’s our responsibility, and it doesn’t make sense.
So we started developing a technology that would allow him, with nuance, with precision, with control, and despite his physical disability, to be able to do that — to be able to perform his piece of music. So, the process and the technology: basically, first we needed an engineering solution. So, you know, we have a FireWire camera that looks at an infrared pointer. We went with the type of gesture metaphor that he was already used to with his speaking controller. And this was actually the least interesting part of the work, you know, the design process: we needed an input, we needed continuous tracking, and in the software, we look at the types of shapes he’s making.
But then the really interesting aspect of the work followed the engineering part, where, basically, we were coding over Dan’s shoulder at the hospital extensively to figure out, you know, how does Dan move? What’s useful to him as an expressive motion? You know, what’s his metaphor for performance? What types of things does he find important to control and convey in a piece of music? So all the parameter fitting — really, the technology was stretched at that point to fit just Dan. And, you know, I think this is a perspective shift. It’s not just that our technologies provide access that allows us to create pieces of creative work. What about expression? What about that moment when an artist delivers a piece of work? You know, do our technologies allow us to express? Do they provide the structure for us to do that? And, you know, that’s a personal relationship to expression that is lacking in the technological sphere.
So, you know, with Dan we needed a new design process, a new engineering process, to sort of discover his movements and his path to expression, to allow him to perform. And that’s what we’ll do today.
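The camera-and-pointer pipeline Adam describes — continuous tracking, then looking at the shapes a performer makes — can be sketched in miniature. This is only an illustrative toy, not the Media Lab’s actual software; the smoothing constant and the four-way direction classifier are my own assumptions:

```python
# Hypothetical sketch (not the actual Media Lab software): smooth a stream of
# tracked infrared-pointer positions and reduce each gesture to a coarse
# direction, the kind of feature a mapping layer could tie to musical control.

def smooth(points, alpha=0.5):
    """Exponential smoothing to steady a jittery tracked point."""
    if not points:
        return []
    out = [points[0]]
    for x, y in points[1:]:
        px, py = out[-1]
        out.append((alpha * x + (1 - alpha) * px,
                    alpha * y + (1 - alpha) * py))
    return out

def gesture_direction(points):
    """Classify the net movement of a gesture as one of four directions."""
    (x0, y0), (x1, y1) = points[0], points[-1]
    dx, dy = x1 - x0, y1 - y0
    if abs(dx) >= abs(dy):
        return "right" if dx >= 0 else "left"
    return "up" if dy > 0 else "down"

stroke = smooth([(0, 0), (2, 0.5), (4, 0.4), (6, 1.0)])
print(gesture_direction(stroke))  # prints "right"
```

The “parameter fitting” Adam mentions would then amount to tuning values like `alpha` and the classification thresholds around one specific performer’s range of motion.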
Machover: So let’s do it. So, Dan, do you want to tell everyone what you’re going to play now?
Dan: This is “My Eagle Song.”
Machover: So, Dan is going to play a piece of his called “My Eagle Song.” In fact, this is the score of Dan’s piece, completely composed by Dan in Hyperscore. He can use his infrared tracker to go directly into Hyperscore, and he’s incredibly fast at it, faster than I am.
Dan: Yes I am.
Machover: He’s really modest, too. So, he can go into Hyperscore. You start out by making melodies and rhythms, and he can place those exactly where he wants. Each one gets a color. He goes back into the composition window, draws the lines, and places everything the way he wants to. Looking at the Hyperscore, you can see it also; you can see where the sections are, something might continue for a while, change, get really crazy and then end up with a big bang at the end.
So, that’s the way he made his piece, and, as Adam says, we then figured out the best way to have him perform his piece. He’s going to be looked at by this camera, which will analyze his movements and let Dan bring out all the different aspects of his movements that he wants to. And you’re also going to notice a visual on the screen. We asked one of our students to look at what the camera is measuring. But instead of making it very literal, showing you exactly what the camera is tracing, we turned it into a graphic that shows you the basic movement and the way it’s being analyzed.
I think it gives an understanding of how we’re picking out movement from what Dan’s doing, but I think it will also show you, if you look at that movement, that when Dan makes music, his motions are very purposeful, very precise, very disciplined, and also very beautiful. So, in hearing this piece, as I mentioned before, the most important thing is that the music’s great, and it’ll show you who Dan is. So, are we ready, Adam?
Machover: OK, now Dan will play his piece, “My Eagle Song,” for you.