Exclusive interview with TED Prize-winner Jill Tarter of SETI

Astronomer Jill Tarter is director of the Center for SETI Research at the SETI Institute. She was awarded the TED Prize in 2009, and at the TED Conference she wished that the TED community would “empower Earthlings everywhere to become active participants in the ultimate search for cosmic company.” (Her talk on why the search for alien intelligence matters is now online.)

Yesterday the TED Blog interviewed Tarter over the phone about her TED Prize wish. She talked about some of the challenges and practicalities of SETI research, her new plans to help bring the world into the search for cosmic company, and a few new ideas about extraterrestrial intelligence that intrigue her. It’s a fascinating look at the pragmatic thinking that goes into this “stellar” project. Read the complete interview, below.

You’ve said the Allen Telescope Array can augment the search for signals by orders of magnitude. Expand on that.

Right away, the Allen Telescope Array is almost a factor-of-1,000 improvement over what we could do before.

In the past, we’ve used other people’s telescopes for the real-time targeted searching. And if we’re lucky we get maybe five percent of their time. In theory, if we get this all working correctly, we can be on the air almost 100 percent of the time. We can do our SETI targeted searches at the same time that radio astronomers are doing their traditional astronomical surveys. So that’s a factor of 20.

We have the opportunity to build more back-end processing hardware and look at multiple stars at the same time. In the past, using the single-dish telescopes that we’ve used, you could only look in one direction in the sky. Now, with an array built out of a lot of small telescopes, you look at a huge patch of the sky. You go from looking through a soda straw at a teeny piece of the sky to looking through a wide-angle camera. And in that piece of the sky there are many stars that I would like to look at.

I can actually build my equipment at the back end of the telescope such that it takes the data from all of the separate antennas and adds the signal together with different time delays and different phase shifts — it’s as if I were picking out up to eight individual pixels in this large field of view. I can look at up to eight different stars at the same time. So there was the factor of 20, and then there’s this other factor of eight.
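
To make the “eight pixels” idea concrete, here is a minimal sketch of delay-and-sum beamforming under a narrow-band approximation: the voltages from every antenna are phase-rotated for a chosen direction and summed, and repeating that for eight directions gives eight simultaneous beams. The function and variable names are illustrative assumptions, not the Allen Telescope Array’s actual beamformer.

```python
import numpy as np

C = 299_792_458.0  # speed of light, m/s

def synthesize_beam(voltages, antenna_xyz, direction, freq_hz):
    """Coherently sum antenna voltages toward one direction (one "pixel").

    voltages    : (n_ant, n_samp) complex baseband samples, one row per dish
    antenna_xyz : (n_ant, 3) antenna positions in meters
    direction   : unit vector pointing at the target star
    freq_hz     : observing frequency in Hz
    """
    # Geometric delay of each antenna relative to the array origin
    delays = antenna_xyz @ direction / C                 # seconds, shape (n_ant,)
    # Narrow-band approximation: compensate each delay with a phase rotation
    weights = np.exp(-2j * np.pi * freq_hz * delays)     # shape (n_ant,)
    return (weights[:, None] * voltages).sum(axis=0)

# Eight "pixels" are just eight different steering directions applied to
# the same recorded voltages (hypothetical arrays shown for illustration):
# beams = [synthesize_beam(voltages, antenna_xyz, d, freq_hz) for d in directions]
```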

And then I have another factor of almost 10 in terms of building more compute power to increase the amount of spectrum that I can examine at any moment. So I can build more signal processing equipment. Hopefully, for the first time, we can do it with commodity servers in real time, because until now we’ve built our own signal processing equipment. It looks like the industry has now made your standard cluster fast enough to do the signal processing. So all it means is that, by buying more of that, we can expand the bandwidth.

That’s what I can do in real time. Altogether, that’s at least a factor of 1,000.
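
The arithmetic behind that factor of 1,000 is simply the three gains above multiplied together. A toy calculation using the round numbers from the interview:

```python
# Rough combined speedup, multiplying the three factors described above
# (illustrative round numbers only).
time_on_sky = 20   # ~100% observing time instead of ~5% on borrowed telescopes
beams       = 8    # up to eight simultaneous synthesized beams ("pixels")
bandwidth   = 10   # roughly 10x more spectrum searched at any moment

print(time_on_sky * beams * bandwidth)  # 1600, i.e. "at least a factor of 1,000"
```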

What else is going to help augment the search?

I think about the other resources TED can potentially bring to the process — that is, getting the rest of the world involved, and not just doing the search with the equipment that we currently have in real time in Hat Creek.

There’s the idea of being able to record data, and having signal processing experts and open source developers around the world help us to build new algorithms. Because when I say that if I put more compute power at the back end of this thing I can do more searching, I’m talking about the limited class of signals that we now look for in real time. And these are basically signals that are compressed in frequency. They’re narrow-band signals. They’re artifacts. They’re relatively easy to find. They’re certainly quite distinct from astrophysical signals. But there might be other classes of information-bearing signals: information-rich, encoded signals that would also propagate well through the interstellar medium.

We have not done much exploration of this class of signals because we haven’t had the compute power to go looking for them. But if we could start out by recording data and having people develop algorithms for this class of signals in higher dimensions for us, then we could take the best algorithms, see if we can get them made efficient enough to run in real time, and put those on the telescope as well. And now you open up a whole universe of looking for something completely different — something we weren’t sensitive to before. So I’m eager to do that, though ultimately we don’t know how this development process will work, or whether the algorithms will ever be quite efficient enough to run in real time.

Hopefully, then, we’ve gotten smart people around the world and we’ve got them thinking about this search for more complex signals. But what about the folks that don’t have that technological know-how? Can we get them involved too, if they’re passionate and eager to participate? We thought, “Well, the eye is just a fabulous pattern-detection machine. A lot of years of evolution to make that work well.” And so perhaps what we could do is involve people in using their eyeballs to find these complex signals.

Crowd-sourcing.

Right. If we can push the technology and get a big enough data pipe in and out of the observatory, maybe that could actually impact the real-time observing, so that people could see a pattern, compare it with known patterns of interfering signals that we’ve seen before, and impact the next observation that gets made. Say, “No algorithm found this yet, but I think there’s something there, and I think you ought to go back and follow up on it in the next observing pass.” And we don’t know if that will work yet, because there are a lot of unknowns about how much data you can get in and out of this remote site. It’s a long way up I-5. Perhaps that’s too ambitious.

Perhaps the piece we’ll be able to do is sort through data offline and say, “Gee, there’s a pattern here. Gee, there’s a pattern there,” and act kind of like a human TiVo. People could go through a lot of data and then build a new data set that is signal-rich, so that the developers wouldn’t be working on just the run-of-the-mill, raw, recorded data, but would have a set of recorded data that was chock full of complex patterns to try their algorithms on.

So, say I’m looking at this data, as one of your crowd-sourced eyeballs. What does the data look like? Something like static on a television screen?

If you’re working in the frequency and time dimensions, looking for the kinds of signals that we’re now searching for, it would look like a snowy TV screen. In two dimensions, frequency is usually displayed horizontally and time vertically. And then the kinds of signals that we look for now are straight lines through that two-dimensional slice. Narrow straight lines. And sometimes the lines are interrupted because they’re actually pulsed. They’re narrow-band pulses. And again, we’ve honed our skills to be pretty good, and our algorithms now can find that kind of artifact when your eye can no longer see it beneath the noise.
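
For readers who want to picture that snowy screen, here is a small, self-contained sketch that builds a toy waterfall plot: noise power on a frequency-time grid plus one weak, slowly drifting narrow-band tone. All numbers are made up for illustration.

```python
import numpy as np

rng = np.random.default_rng(0)

n_time, n_freq = 256, 1024                            # rows = time, columns = frequency
waterfall = rng.normal(size=(n_time, n_freq)) ** 2    # noise power: the "snow"

# Inject a weak narrow-band tone whose frequency drifts slowly (a Doppler-like
# drift), so it traces a slanted straight line through the plot.
start_chan, drift = 300, 0.5                          # channel, channels per time step
for t in range(n_time):
    waterfall[t, int(start_chan + drift * t)] += 3.0  # a few sigma above the noise

# Displayed with frequency horizontal and time vertical, the injected tone is
# the narrow straight line that the real-time algorithms are tuned to find.
```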

If you take the complex signals, the kinds of things that actually contain a lot of information, like what we generate today with our telecom, those algorithms can find that kind of signal if it’s really strong, because enough bits and pieces of the signal in the frequency and time dimensions will put extra energy into some straight line somewhere and pull it up out of the noise.

But usually the information and the energy in the signal is spread over other dimensions of chirp, repetition space, dispersion — all kinds of different ways you can encode signals. These are the higher dimensions I was talking about. And they become much more efficient for data transfer. When we use them on Earth, we have a particular encoding scheme, and the transmitter and the receiver both know what that scheme is, and you can build optimum detectors for any kind of signal. So the different schemes used for spacecraft communication or for cell phone technology are all different ways of putting information into the spectrum most effectively. And they wouldn’t look the same. When you look at frequency and time, you might see absolutely nothing. But if you look in other dimensions, you might actually see a pattern.
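
One way to picture searching an extra dimension is a drift-rate (chirp) search: sum the waterfall’s power along many candidate drift rates, and energy that is smeared across channels in the plain frequency-time view piles up only when the right rate is tried. This is a generic, textbook-style sketch of what such an algorithm could look like, not the SETI Institute’s actual pipeline.

```python
import numpy as np

def drift_rate_search(waterfall, drift_rates):
    """Sum power along candidate drift rates (in channels per time step).

    A signal smeared across frequency in the plain frequency-time view adds
    up only when we follow its true drift rate, producing a peak in this
    extra (drift-rate) dimension.
    """
    n_time, n_freq = waterfall.shape
    scores = np.zeros((len(drift_rates), n_freq))
    for i, rate in enumerate(drift_rates):
        for t in range(n_time):
            shift = int(round(rate * t))
            scores[i] += np.roll(waterfall[t], -shift)  # de-drift this time step
    return scores

# Using the toy waterfall from the previous sketch:
# scores = drift_rate_search(waterfall, np.linspace(-1.0, 1.0, 41))
# rate_index, start_channel = np.unravel_index(scores.argmax(), scores.shape)
```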

Are we talking about incidental signals or intentional, broadcast signals?

It’s always easier to find something that’s broadcast. If it’s incidental, there’s only enough power in it to serve the audience for which it’s intended. Our television transmitters leak out from the Earth. And actually, there’s a sphere surrounding the Earth from the earliest television signals, maybe 70 years ago, that’s going out one light year per year. But it’s really weak. Because we weren’t transmitting to anybody except to somebody in the next county. We weren’t trying to transmit to somebody at the next star. So the power involved is very small.
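
The weakness of that expanding shell of television leakage is just the inverse-square law at work: flux drops as one over distance squared. A rough back-of-the-envelope comparison, assuming an illustrative ~1 MW effective transmitter power:

```python
import math

LY_M = 9.461e15                        # meters in one light year

def flux(eirp_watts, distance_m):
    """Inverse-square dilution of an isotropic-equivalent transmitter."""
    return eirp_watts / (4 * math.pi * distance_m ** 2)

# An assumed ~1 MW effective TV transmitter:
print(flux(1e6, 1e5))                  # "the next county", ~100 km: ~8e-6 W/m^2
print(flux(1e6, 4.2 * LY_M))           # the nearest star, ~4.2 ly: ~5e-29 W/m^2
```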

You can imagine that there has got to be an economics of something for any technological civilization, some conserved resource, that would probably mean you don’t put any more power or energy into a transmission than is necessary. The chances are that what we are going to detect is going to be intentional, and it’s going to be attention-getting. But again, our definition of what is attention-getting — versus an advanced technology’s definition of what is attention-getting — may not yet be the same. We’re a very young technology. Very primitive.

We look for signals that are compressed in frequency simply because astrophysics can’t do that. And if you find such a thing, you’ve either found someone else’s technology, or a brand-new field of astrophysics. But now that we’ve had more years of technology development, it might be that other kinds of signals make sense.

Is there a star, or a region in the sky, that intrigues you?

We know that life and a technological civilization arose and evolved around one particular type of star — a sort of medium-weight star. It’s not really massive, and it’s not a tiny dwarf either. This kind of star burns stably for billions of years. We’re about halfway through the life of our star.

We don’t want to look at stars that are much more massive than the Sun because they use up their nuclear fuel in hundreds of millions of years and we don’t think you could get the evolution of a technological society that quickly. That’s a guess based on our example of one, and how long evolution as a natural process seems to take.

So the idea of looking for stars that are much lower mass than the Sun wasn’t in vogue for a long time because the idea was that those stars would be so faint that any habitable planet would have to be very, very close to the star and it would get tidally locked so that one face of the planet would continually face the star. You’d have a near side and a far side, or a hot side and a cold side. And we thought that it would be impossible to maintain any atmosphere in such a configuration.

But that was on the basis of early calculations, and we’ve now done more sophisticated calculations, and it looks like with a modest amount of greenhouse gas, you could retain an atmosphere and you could have circulation winds which would distribute the heat from the sub-solar point around to the backside. And you might end up with habitable parts of the planet at the terminator, halfway around the planet from the star.

And so, these M dwarfs that live forever — there’s no dwarf star like this that’s ever been born in the galaxy that has yet died. Their lives are tens of billions of years in length. They might, after a few billion years, become a reasonable place to host life. This is stuff we’re just beginning to study. Let’s just say that M dwarf stars are back on the table, whereas they were off for a while.

So, generally speaking, you’re searching for stars within certain parameters …

The kinds of stars that we’re interested in looking at are roughly the solar mass and smaller. And we’ll pick stars that are at least a couple of billion years old, to give evolution time to have produced a technology.

We’ll also pick stars for which companions — nearby stars — aren’t awkwardly placed, because stars that are too close to one another would tend to disrupt the planets in orbit around one of the stars. There are a couple of stable configurations. You can have two stars very close to one another and planets circling around the pair of stars. Or you can have two stars that are widely enough separated that the planets can be circling one and not be disrupted by the second. Or you can have stars that are essentially isolated, like our own sun. We’ll use that kind of thing as a criterion.

We’ll use what astronomers call the metallicity. For an astronomer, anything heavier than helium is a metal. If it’s known, we’ll use the ratio of the amount of iron to hydrogen in the star as an indicator of whether there’s enough stuff there to make rocky planets. Earlier generations of stars in the galaxy could well have had planets. But really, there was only hydrogen and helium to work with, so they’d all be gas giants and not small, rocky planets. It took several generations of star formation in the galaxy to build up the retinue of metals: carbon, nitrogen, oxygen, sulfur, phosphorus — stuff that we’re made out of — and the minerals that form a rocky planet that life as we know it enjoys.

We’ll also look at all of the stars that we know to have planetary systems because those are stars that are somewhat special and we know something more about them.

So we’re using a lot of biases here to pick out stars. At the moment we’ve got about a quarter of a million stars that roughly satisfy the criteria.
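
As a rough illustration of how biases like these could be coded up as a catalog filter, here is a toy selection function. The field names and numeric cutoffs are placeholders invented for this sketch, not the project’s actual criteria.

```python
from dataclasses import dataclass
from typing import Optional

@dataclass
class Star:
    mass_solar: float                   # stellar mass in solar masses
    age_gyr: float                      # age in billions of years
    companion_sep_au: Optional[float]   # separation of a binary companion, if any
    fe_h: Optional[float]               # [Fe/H] metallicity, if measured
    has_known_planets: bool = False

def is_target(star: Star) -> bool:
    """Toy version of the selection criteria described above (thresholds invented)."""
    if star.mass_solar > 1.0:           # roughly solar mass and smaller
        return False
    if star.age_gyr < 2.0:              # give evolution a couple of billion years
        return False
    if star.companion_sep_au is not None:
        # Keep only the stable configurations: a very close pair (planets can
        # circle both stars) or a very wide pair (planets circle one star).
        if not (star.companion_sep_au < 0.5 or star.companion_sep_au > 100.0):
            return False
    if star.fe_h is not None and star.fe_h < -1.0:
        return False                    # too metal-poor to make rocky planets
    return True

# Stars with known planetary systems get special treatment regardless:
# targets = [s for s in catalog if s.has_known_planets or is_target(s)]
```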

And outside of those biases?

We will also look in another way. We’ll say, “We’ve made a lot of assumptions. Why don’t we give up most of those assumptions, except the fact that a technological civilization is going to be associated with a stellar host?” And then the idea is to just look in the directions where there are huge numbers of stars, without making these assumptions and biases or targeting individual stars. Just survey regions of the galaxy where the star density is a maximum. That’s generally toward the center of the galaxy and along the plane of the Milky Way galaxy toward the center.

We’ve picked out 20 square degrees on the sky that contain probably 10 billion stars, and we will survey those. And while we’re surveying that region of the sky, the astronomers will piggyback on our observing, and we’ll be piggybacking on theirs. Most of those stars are very far away. The galactic center is 26,000 light years away. At such a large distance, any transmitter coming from there would have to be extremely strong.
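
Why “extremely strong”? To deliver the same flux at the telescope, the required transmitter power grows with the square of the distance. A quick illustrative comparison against an assumed targeted-search star about 100 light years away:

```python
# Required transmitter power scales as distance squared for the same
# received flux (illustrative round numbers only).
d_target_ly = 100       # an assumed nearby targeted-search star
d_center_ly = 26_000    # distance to the galactic center

print((d_center_ly / d_target_ly) ** 2)  # ~68,000x more power needed
```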

But we don’t know that there aren’t these cosmic miracles.

Watch Jill Tarter’s TEDTalk on SETI (TED Prize winner!) >>