Friday, November 14, 2008

A sense of mobile sensor research

Third of four parts on my mobile phone university road tour: Part 1 (teaching mobile phones), Part 2 (university research), Part 4 (Georgia Tech).

One of the biggest and longest-established research projects on mobile devices is the Center for Embedded Network Sensing at UCLA. The center is headed by Prof. Deborah Estrin of the CS department. Estrin’s mom and dad were also CS professors there, and her big sister Judy is a well-known Silicon Valley technologist, entrepreneur and author.

In August 2002, Prof. Estrin and her colleagues won a 10-year, $40 million NSF grant that created the center. Most of the core technology seems to come from UCLA, but the remote sensing work is conducted through a partnership of UCLA, UCR, UCM, USC and Caltech.

“Sensing” is a big topic. The unifying theme is what we social scientists call “data mining”: compiling so much data that analysis can be done by sifting through the gigabytes (terabytes?) of observations to deduce patterns. Much (although not all) of this research is on topics without immediate commercial application, on the assumption that academia should pursue the gaps in knowledge that won’t be filled by private industry.

The sensor community is more than just Deborah Estrin. It has its own conference, the annual ACM-sponsored SenSys, which met last week. In addition to the UCLA crowd, my travels also took me to SenSys participants Andrew Campbell of Dartmouth and Sam Madden of MIT.

Some of the sensing work (at UCLA and elsewhere) appears to be rural (or even seaborne) observation of the natural environment. These observations are done by devices that sit in the field, whether a single large, expensive device or dozens (or hundreds) of inexpensive ones. For the latter, the Berkeley TinyOS and motes are a popular choice.

But mobile phones make great sensing devices for more populated areas, in two different ways. First, individuals go places (like Westwood at rush hour) that you want to observe. Second, you may have a particular interest in the actions, travels, activities or concerns of the individual who’s carrying the phone. (Estrin and Campbell co-chaired the UrbanSense workshop before SenSys ’08 that focused just on this area of research.)

Mobile phones now make this sort of research possible. They have data communications capabilities (50 years ago, we called this “telemetry”). They have sensors and computers to process the sensor data. Best of all, on a college campus lots of teen and twenty-something volunteers will carry them everywhere. (If you’re lucky, they’ll even buy the phone with parental rather than research project money.)

The sensors we heard about were:

  • location: this might be via cell towers, GPS, WiFi hotspots, or other sources of context. A big concern is that some phones/carriers make it difficult to access these services, either for business model reasons (carriers want a revshare) or privacy reasons (so you can’t plant spyware on your girlfriend’s phone and monitor her location 24/7). A practical consideration is the power requirement, particularly for GPS.
  • motion: after location, by far the most popular sensor was the accelerometer, normally put in a phone to rotate the display (or the picture taken by the camera). Researchers couldn’t get enough of them, and wished there were more, as they used various tricks to try to infer phone (or wearer) position, motion and orientation (a sketch of one such inference appears after this list). I suspect many would have drooled over the compass in the Nokia 5140, but this seems to be a very small niche so far.
  • camera: there was a lot of interest in image capture, but my sense (!) was that it’s held back because it still takes a human to point the camera in the right direction. (Last week there was an entire workshop on image sensing at SenSys ’08.)
  • microphone: this can either be used to capture conversations by the participant, or (as the Dartmouth team tried) to measure the ambient noise as a way of inferring activity.
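
To make the accelerometer inference concrete, here is a minimal sketch of the kind of trick described above, with synthetic readings standing in for a real phone API; the thresholds are illustrative guesses, not values from any of the projects discussed here.

```python
import math

# Synthetic 3-axis accelerometer samples (x, y, z) in m/s^2, as a
# 2008-era phone might report them; real code would read a platform
# API instead of this hard-coded window.
WINDOW = [(0.1, 0.2, 9.8), (0.0, 0.1, 9.7), (0.2, 0.1, 9.9)]

def magnitude(sample):
    x, y, z = sample
    return math.sqrt(x * x + y * y + z * z)

def classify_motion(window):
    """Crude inference: variance of the acceleration magnitude
    separates still, walking and running. Thresholds are guesses,
    not values from any actual research system."""
    mags = [magnitude(s) for s in window]
    mean = sum(mags) / len(mags)
    var = sum((m - mean) ** 2 for m in mags) / len(mags)
    if var < 0.5:
        return "stationary"
    if var < 4.0:
        return "walking"
    return "running"

print(classify_motion(WINDOW))  # -> stationary
```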

For mobile phone research, the UCLA CENS team is focused on “participatory sensing”: learning what is going on around the phone’s owner. Their organizing concept is a data acquisition “campaign,” and they created a platform (called “Campaignr”) to manage such solicitations. Some of the projects include monitoring bicycle routes and encouraging high school students to agonize over their carbon footprint with PEIR.
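
The post doesn’t describe Campaignr’s actual schema, but a participatory-sensing campaign presumably has to specify at least what to sample, how often, and where to send the data. A hypothetical sketch, with every field name invented for illustration:

```python
# Hypothetical sketch of a participatory-sensing "campaign"; the
# field names are invented for illustration, not Campaignr's schema.
bike_route_campaign = {
    "name": "bicycle-route-survey",
    "sensors": ["gps", "accelerometer"],
    "sample_interval_s": 30,      # trades resolution against battery life
    "upload_url": "https://example.edu/campaigns/bike",  # placeholder
    "consent_required": True,     # participants opt in explicitly
}

def should_sample(campaign, seconds_since_last):
    """Fire a reading once the campaign's sampling interval elapses."""
    return seconds_since_last >= campaign["sample_interval_s"]

print(should_sample(bike_route_campaign, 45))  # -> True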

At Dartmouth, people not only carry the sensors; the sensing itself is “people-centric.” Their current application is CenceMe. As last week’s conference paper explains:

CenceMe … combines the inference of the presence of individuals using off-the-shelf, sensor-enabled mobile phones with sharing of this information through social networking applications such as Facebook and MySpace. … We present the design and tradeoffs of split-level classification, whereby personal sensing presence (e.g., walking, in conversation, at the gym) is derived from classifiers.

In essence, mobile phones can create mobile sensor networks capable of sensing information that is important to people, namely, where are people and what are they doing?

The classification scheme tells social network friends what I’m doing, and also classifies my personality traits (nerdy, party animal, cultured, healthy, or greeny). The CenceMe app has other intriguing ideas, like taking a picture at a random time. Overall, it seems to bring new meaning to the phrase “reach out and touch someone.”
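
The paper’s split-level idea can be sketched roughly like this: cheap primitive classifiers run on the handset, and a server combines the primitives into a shareable “presence.” The labels and thresholds below are made up for illustration and are not CenceMe’s actual classifiers.

```python
# Sketch of "split-level" classification: the phone derives cheap
# primitives locally, and a backend combines them into a sensing
# presence. Labels and thresholds are illustrative, not CenceMe's.

def phone_side(accel_variance, audio_rms):
    """Runs on the handset: raw samples -> primitive labels."""
    motion = "moving" if accel_variance > 1.0 else "still"
    sound = "talking" if audio_rms > 0.3 else "quiet"
    return {"motion": motion, "sound": sound}

def backend_side(primitives):
    """Runs on the server: primitives -> a shareable presence label."""
    if primitives["motion"] == "moving" and primitives["sound"] == "talking":
        return "in conversation on the go"
    if primitives["motion"] == "moving":
        return "out walking"
    if primitives["sound"] == "talking":
        return "in conversation"
    return "idle"

presence = backend_side(phone_side(accel_variance=2.3, audio_rms=0.5))
print(presence)  # -> in conversation on the go
```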

Sam Madden of MIT is an old friend, whom I met when he was a 15-year-old Mac programmer before he became a hotshot. His research focuses less on the people and what they’re doing, and more on gathering the data and maintaining the network connections. For example, rewritten Wi-Fi drivers in CarTel attempt to acquire a hotspot in milliseconds, not seconds, while Macque attempts to gather relevant data in an energy-efficient fashion.
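
In the same spirit as CarTel’s opportunistic networking (though this is a sketch of the idea, not CarTel’s actual code), a delay-tolerant uploader can be summarized in a few lines: readings queue locally and flush only during the brief windows when a hotspot happens to be associated.

```python
import collections

# Sketch of delay-tolerant, opportunistic upload: readings queue
# locally and flush only while a hotspot is (briefly) associated.
# The association test and send() are stand-ins, not real drivers.
pending = collections.deque()

def record(reading):
    pending.append(reading)      # always cheap: just queue locally

def hotspot_associated():
    return True                  # placeholder for a fast Wi-Fi scan/join

def send(reading):
    print("uploaded:", reading)  # placeholder for the real uplink

def flush():
    """Drain as much as possible during a short connectivity window."""
    while pending and hotspot_associated():
        send(pending.popleft())

record({"lat": 34.07, "lon": -118.44, "speed_mps": 9.2})
record({"lat": 34.08, "lon": -118.45, "speed_mps": 8.7})
flush()
```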

All three schools were using Nokia (S60) phones donated by the Nokia Research Center, mainly the Nokia N95. UCLA had also used a Samsung Windows Mobile phone, Dartmouth was trying the iPhone, while Madden’s group at MIT (one of several using mobile phones there) was explicitly trying to make its WaveScript data acquisition language support as many platforms as possible.

However, Nokia played a much more fundamental (and unfortunately uncommon) role in facilitating this work. All three researchers were invited by Henry Tirri of Nokia Research to a February 2005 workshop in Helsinki, to mark the launch of SensorPlanet. This sort of research leadership by an industrial firm is rare: Microsoft and Google are doing it, IBM used to do it (but has largely pulled back), and Apple never did.
