Dr. JoAnn Kuchera-Morin works on the AlloSphere, one of the largest scientific and artistic instruments in the world. Based at UC Santa Barbara, the AlloSphere maps complex data in time and space. Dr. Kuchera-Morin, a composer, demoed the AlloSphere at TED2009 in February, showing five films of scientific data mapped visually and sonically into compelling art. Last week I talked with Dr. Kuchera-Morin about the AlloSphere — what it does, how it works, who uses it, and how you turn raw data into sound. From the interview:
So what’s going on at the AlloSphere? What do you hope to use it for?
Different people from different facets understand the nature of the instrument, and understand what we’re trying to do to bring art, science and engineering together. Some come from the practical level of research, with implications for industry and development. But we’re also working with mathematical concepts quantifying things that are almost spiritual. We’re advancing the nature of who and what we are and the nature of the universe.
We’re all just out there mapping, looking for constructs.
We’re looking for patterns. We’re looking for beauty. The way that we appreciate beauty deals with the nature of complexity, uniqueness, subtle changes over time that catch you by surprise. It’s something we look at as artists, and our scientists are looking for this too.
I was talking to a Nobel Prize-winning physicist on campus. And he said, “Why do I need this? My work is data. It’s numbers.” And I said, “Have you ever been working on a problem on your computer screen, you’ve been really stuck, and then one of your colleagues walks through your door, and from three and a half feet away says, ‘That data doesn’t look right’?” That’s the value of looking for patterns in data. Patterns you can’t see when you’re up close.
We intuitively know these things. Cultures have been weaving these patterns, plowing these patterns, etching these patterns. This is what we are.
Tell me more about mapping data sonically.
Sonification is really important. We hear vibrations from 20 to 20,000 Hz, and we see up in the terahertz range. But it’s all data, and you can take data and convert it into another form. What happens is, I take visual data, drop it 27 octaves and do the proper transformations so I can hear it. Think of a mathematical equation that can be seen and heard. If you take any math data and sonify and visualize it, you’ll be able to see and hear patterns — you’ll see balance, continuity, contrast, surprise, all the things that catch people’s attention.
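The octave-dropping idea she describes is simple arithmetic: each octave down halves a frequency, so transposing by n octaves divides it by 2^n. A minimal sketch of that mapping (a hypothetical illustration, not the AlloSphere’s actual pipeline — the exact octave count depends on the starting frequency):

```python
# Sketch of octave-based sonification: "dropping" a high frequency
# by octaves until it lands in the human audible range.
# Hypothetical illustration only, not the AlloSphere's real mapping code.

AUDIBLE_LOW, AUDIBLE_HIGH = 20.0, 20_000.0  # human hearing range, in Hz


def drop_octaves(freq_hz: float, octaves: int) -> float:
    """Each octave down halves the frequency."""
    return freq_hz / (2 ** octaves)


def octaves_to_audible(freq_hz: float) -> int:
    """Smallest number of octave drops that brings freq_hz at or
    below the audible ceiling."""
    n = 0
    while drop_octaves(freq_hz, n) > AUDIBLE_HIGH:
        n += 1
    return n


# Green light is roughly 5.6e14 Hz; find how far down it must be transposed.
green = 5.6e14
n = octaves_to_audible(green)
print(n, drop_octaves(green, n))  # about 35 octaves lands it near 16 kHz
```

The same divide-by-powers-of-two transform works on any frequency-domain data, which is why visual and mathematical data can be carried into the audible band this way.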
Audio and visual data come with different mindsets. When I started building media systems back in the ’80s, it was from an audio point of view. In audio, you always have to be in real time — your data is mathematically moving from one point to another point. When we digitize a sound, we’re taking a snapshot of that sound and freezing it over time. Aspects of that constraint have plagued us in the sonic world.
On the other hand, animation is about taking still shots and speeding them up over time; they’re coming from a different place. I’m convinced everything has to be built from the ground up with the transitions in place, visually and sonically.
What’s it like inside the AlloSphere? Where do you sit?
You don’t sit. You’re actually standing on the bridge. The bridge is designed so you’re looking at a horizon, and your eye horizon is geared for you to be standing. We’re expecting people not to be passive viewers.
The whole instrument should be a multi-user interactive space. Researchers will be in there with gloves on, and they can reach out and grab data, pull data to them. Fifteen researchers can be in there interacting and doing multiple things with the data at the same time. We’re going to have to think about how to track people as they move around.
The instrument will always be used as a production instrument. Artists work this way, and so do scientists — they want to see, to observe. This could be a very large microscope for them to look into.
Something about your talk that struck me is the teamwork between artists and scientists, and the fact that artists seem to be equal partners in this work.
Between artists and scientists, the connection is so tight, a lot tighter than most people think.
All of art is mathematics. I’m a composer, and when I was writing chord progressions, there was a discipline that removed me from the math — I wasn’t aware I was using linear algebra principles. When you move up and learn more, you start learning all the mathematical constructs and moving into nontraditional mathematics. These are things that composers in the mid-’40s were doing, with these large-scale orchestral works. We’re pushing ourselves with our vocabulary, and our music is getting more and more complex. All of a sudden, that leads us to instruments like a computer to manipulate complex music. The content has been driving the technology.
Similarly, some of my mathematician colleagues are working with 6-dimensional figures. What happens when your math starts to get so complex that you can’t draw it by hand anymore? Scientists have such tremendously rich math data that the instruments they use now can’t actually see it. You get measurements from it, but can you take those math coordinates that describe it and map it visually and sonically?
There are scientists now who have lost the ability to perceive their data. Now they might have the ability to perceive this data again through computers that have portals that let them see and hear their data, not just see a string of numbers.
And this work is described as “art,” not just as a really supercharged presentation tool.
For me, why what I’m doing is art is, I’m pushing science into domains where people don’t know how to visualize and sonify that data.
In the demo, what you saw was not scientists coming in and saying, “Here’s what I think this should look like.” They came in with data and said, “This is what I think I’m going to see, but I’m not sure.” And us saying, “Well, how about if we tried it this way instead?”
And sometimes there would be resistance. We’d say, “Give us the leap of faith to try it this way, and let us see if we can see something new.” I mean, how do atoms look, how do electrons look? There are always these various ways that we can look at or hear data, and you might see something by looking in one way or another way.
I was talking to one of my scientist colleagues — I came into his lab waving a research paper and I said, “Hey, I’m trying to understand this. Help me!”
And he said, “You’re not supposed to understand this. I barely understand this.”
And I said, “You don’t understand: I’m an artist, I’m ignorant, I have no fear, and I have all the time in the world. Let me take your data and let me play.”
We are having some of the most exciting conversations on the AlloSphere bridge! It’s scary to the scientists, because when we make these big leaps of faith, we can make mistakes. You have to learn to distance yourself from your data and become a third-party observer, especially when it’s so complex and so small — how can you wrap your mind around it? The math is so complex with one electron spin, how can we understand any more than that? But we’ll build on what we have.
These are things that spark the artists as well as the scientists. We are the Beethovens of the 21st century. We’re mapping terrain in space and time.
What’s next for the AlloSphere — when will it be ready to go? And can we come see it in person?
We’re revving up. We’re engineering the instrument — it’s about 4 to 6 months away from lights-on. We’ve got slices of it lit up right now, and have had various configurations of the whole sphere lit up. But to match the two hemispheres together is an engineering feat of magic.
Not to say it’s not lit right now. There’s research going on right now. After TED, we had half of a hemisphere lit up, 360-degree audio, 24 channels of sound.
Come see it. After TED we had a group drive up the coast to see the AlloSphere, and different groups from the TED community have come up from LA — both industry partners interested in potential use, and artists and scientists interested in using it. When they see it, they say, “Words and video can’t describe the real experience.”
More: For a celebration of boundary-breaking science research, read “In search of the black swans,” Physicsworld April 2009
More: If you’re around Santa Barbara next week, explore CREATE and hear JoAnn Kuchera-Morin’s work at the Primavera Festival