What can you learn from your muscles? A lot, according to Alexander Grey, the chief technology officer of Somaxis, who has created sensors that measure muscle workload. In a talk given at TED@New York — one of 14 events that were part of the TED2013 Talent Search — Grey demonstrates how people can use these sensors to stave off fatigue while running, recognize stress injuries before they happen, and even play the guitar — without ever picking one up.
In your bio, it says that you’ve created these muscle sensors to be used “personally and interpersonally.” What are some future uses for the muscle sensors?
One of the things I’m looking at is this idea of live, non-local racing. With athletes, you could stream your fatigue data and heart rate data and go for a run with someone in Tokyo, for example. That [data] would go to your iPhone, and they’ve got one too, so you’re all in the network. You can literally connect people globally to have them share their biometrics with each other.
So that would be one application, but there are a lot of others. One of the features is the MIDI controller. You can actually make it so that by moving your body, the sensor creates music or controls lights. So, you could have a new type of performance art that integrates body movement with sounds.
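Grey doesn’t describe how the MIDI mapping works internally, but the core idea — turning a muscle-activation level into a musical control signal — can be sketched simply. This is a hypothetical illustration, not Somaxis code: the `amplitude_to_midi_note` function and the note range are assumptions.

```python
def amplitude_to_midi_note(amplitude, low_note=48, high_note=72):
    """Map a normalized muscle-activation amplitude (0.0-1.0) onto a
    MIDI note number, clamping out-of-range sensor readings.

    MIDI note 48 is C3 and 72 is C5, so a relaxed muscle would play a
    low note and a fully flexed one a high note.
    """
    amplitude = max(0.0, min(1.0, amplitude))
    span = high_note - low_note
    return low_note + round(amplitude * span)

# Example: three sensor readings from relaxed to fully flexed.
readings = [0.0, 0.5, 1.0]
notes = [amplitude_to_midi_note(a) for a in readings]  # [48, 60, 72]
```

In a real controller, the note (or a continuous-controller value) would be sent to a synthesizer or lighting rig; the mapping function is the part that turns body movement into performance.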
One of the things I’m trying to coordinate right now is making a platform for group experimentation. So, the idea would be that anyone could create an experiment like: what works better for me before a run — bananas or pasta? People could sign up for that, and they could go on a pre-set distance run with whatever it is they’re eating beforehand, and then all the data aggregates. Then someone could say: how about barefoot running versus regular running with shoes, what’s going to make a bigger difference in my calf fatigue over time? Someone creates an experiment, a bunch of people join, so all of a sudden with all these kinds of sensors in different places, people can discover other things on their own — it’s like a platform for discovery.
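The group-experimentation platform Grey describes is, at its core, an aggregation problem: many participants run the same protocol under different conditions (bananas vs. pasta, barefoot vs. shoes), and the platform pools their results. A minimal sketch of that aggregation step, with invented data and function names, might look like this:

```python
from statistics import mean

def compare_conditions(results):
    """Pool per-runner fatigue scores by experimental condition and
    return the mean score for each condition.

    `results` is a list of (condition, fatigue_score) pairs, one per
    participant run.
    """
    by_condition = {}
    for condition, fatigue in results:
        by_condition.setdefault(condition, []).append(fatigue)
    return {cond: mean(scores) for cond, scores in by_condition.items()}

# Example: a toy "bananas vs. pasta" pre-run meal experiment.
runs = [("bananas", 3.0), ("pasta", 5.0), ("bananas", 5.0), ("pasta", 4.0)]
summary = compare_conditions(runs)  # {'bananas': 4.0, 'pasta': 4.5}
```

A real platform would need controls for run distance, sensor placement, and participant variability, but the "anyone creates an experiment, everyone's data aggregates" model reduces to this kind of grouping.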
Do you have a background in medicine — either Eastern or Western?
I studied electrical engineering and computer science for two years, and then I changed majors and got my degree in molecular biology. Then I started working and abandoned biology, and went back to tech. I was doing failure analysis for microchips, and then I started this thing up, which is a mixture of bio and tech. I also have a degree in acupressure, and actually it was very useful to understand the body, the different types of bodies, and the different types of muscles. It’s a lot of hands-on stuff — a lot of massage is integrated there — so you just get a sense of the different kinds of tissues, the way that people work, and the dimensions that people have.
We heard you say something about robots earlier. Tell us more!
If you perhaps have a prosthetic limb right now, they have a lot of systems for translating the data. They have muscle-driven prosthetics, but there are tons of wires and they’ve got to go inside and all this stuff — it’s pretty invasive. So, if you have some kind of thing you could just put on, and it allows you to control your prosthetic, I think that would be a great improvement to what we’ve already seen.
How can these sensors help the everyday person feel better — let’s say, someone who is not particularly active, but feels body pain and fatigue often?
Repetitive strain is something that’s a big problem in the workplace. I came up with a way to do predictive screening: you run people through a 2-minute test, typing and mousing, and an algorithm crunches their numbers and looks at how they’re using those muscles while they do it. It then determines the risk factor for developing a repetitive strain injury in the future, even if you have no symptoms right now. That was actually my original business plan, but I was told the market wasn’t big enough, so I had to start in sports and fitness first.
It’s a hardware platform. There are some great ideas that people have run by me that I’m really excited about, but our eventual role will be to provide the platform. We’ll release an API and let people do whatever they want: make that workplace monitor, make the yoga monitor, make the meditation aid.
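The API Grey mentions was not yet released at the time of this interview, so any code against it is speculative. Purely as an illustration of the "workplace monitor" idea, here is a hypothetical sketch: `get_reading` is a stand-in that fakes sensor data, and `strain_alert` is an invented rule for flagging sustained muscle overuse.

```python
import random

def get_reading():
    """Stand-in for a real API call; returns a fake muscle-workload
    value between 0.0 (relaxed) and 1.0 (maximal effort)."""
    return random.random()

def strain_alert(readings, threshold=0.8, window=5):
    """Return True when the last `window` readings all exceed
    `threshold` — a crude signal of sustained overuse that a workplace
    monitor might use to prompt a break."""
    recent = readings[-window:]
    return len(recent) == window and all(r > threshold for r in recent)

# Example usage: collect readings over time and check for sustained strain.
history = [get_reading() for _ in range(20)]
if strain_alert(history):
    print("Sustained high muscle load — take a break.")
```

The point of an open API is exactly this: the hardware vendor supplies `get_reading`-style data access, and third parties decide what rules, thresholds, and applications to build on top of it.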
This is why the big trend is self-monitoring — you know, what can I do on my own? Even if you have healthcare, the irony is that the doctor sees you for such a short period of time that if you could bring them all this other data and say, “Hey, look at all the stuff I’ve collected — look at the trends,” it’d be a way to expand the doctor’s insight. It may not be the same grade as medical equipment, but the information it gives is very good, and I think that at that price point the value to people is pretty solid.
Watch out for more Q&As from the TED@NY event throughout this week. Head to TalentSearch.TED.com to watch and rate these talks, as well as those from the 13 other stops along the TED2013 Talent Search tour.