Commentary

Making Media at MIT

September 1, 2005
Media Fabrics: Content and Context - A Storytelling Bonanza

The four-sided table sits low to the ground and sports a plain glass top. It looks like any other coffee table, but it is anything but typical. Dubbed the "interaction table," it uses an LCD display and interaction technology to enable people to share photos, stories, games, and more. It is the creation of MIT graduate student Ali Mazalek, who has worked closely with Professor Glorianna Davenport, principal research scientist at the MIT Media Lab. The project is part of the Media Fabrics unit of the Lab helmed by Davenport, an accomplished documentary filmmaker, video editor, and sculptor known for creating random-access video editing techniques.

Media Fabrics has nothing to do with actual cloth and everything to do with weaving a story via digital media and cinema. Davenport's research (and that of her graduate students) explores the issues related to collaborative creation of digital media experiences where storytelling is split among fellow creators. This work goes well beyond simple notions of "consumer-generated media," to the overall democratization of digital media. Davenport's definition of Media Fabrics can be distilled thus: "How creators, editors, and audience can blend into one, weaving and navigating paths within the media fabric." She talks about creating "networks of storytelling," non-linear storytelling, and a "safe environment for expression." Don't let the unusual name trip you up; Media Fabrics' projects have plenty of relevance to today's consumer-generated mediaverse.


For example, take Aisling Kelliher's "Montage" project. Working with Davenport, Kelliher is building an online software tool for creating multimedia narratives in real time. The media-rich stories can be published online as part of a collaborative site with flexible privacy controls that determine who can see, manipulate, and comment on individual media and stories. "We are looking at how we create various gradations or levels of privacy," Davenport says, explaining that with blogs, there are often slivers of information that creators don't want to share with everyone.

Dubbed "Confectionary," the software Kelliher is building shows what can be done with Weblogs when you add media. "You have secrets in this media. [Confectionary] is a Media Lab idea that could inspire more people to tell harder stories, more secretive stories, or have montaged pieces of stories," Davenport explains.

Paul Nemirovsky's work on "Emonic Environments" takes into account the process of improvisation, something he is passionate about as an improvisational jazz pianist. Nemirovsky's work explores "how we can create different pieces of media [music, video, film] without having to plan in advance," he says. "Certain characteristics of an improvisational process are common across art and media, making a film, making music... Paul is trying to analyze what makes improvisational interaction in real-time, and how do we make software that gives us that," Davenport explains.

Perhaps Hyun-Yeul Lee has the most unusual of all the Media Fabrics projects: She is exploring the notion that objects - such as a chair or a table - have a history. "Objects can acquire history through their transactions," she says. For example, what story does JFK's rocking chair tell, or that greeting card you sent your grandmother in 1981? You never know what the recipient thought or felt about the card. "Objects have a life of their own. They witness a very rich history over time. Objects' lives are continuous," Lee says.

Walking past a furniture design shop, Lee adds, "I imagine the chairs are screaming at me. They're locked up in the store and no one's buying them. What is the life of an object? What kind of life does it have?" Good question.

Intuitive Intelligence Links the Physical and Virtual Worlds

Imagine meeting someone for the first time and having that person know everything you share in common in your personal and professional lives.

If both of you were wearing special wristbands embedded with RFID (radio frequency identification) tags, chances are good that by simply shaking hands, such an exchange of information could take place.

Sound far-fetched? Not at all, according to Pattie Maes, an associate professor in the MIT Media Lab who specializes in Ambient Intelligence. "If we both have the wristbands with RFID tags, we could shake hands and an earpiece will tell me where you work, that you're into dogs, or that you're a mom." But what if there are things you don't want someone else to know? The wristband would allow you to decide what kind of information to make public. In essence, the wristband is like a wearable computer. So what exactly is the field of Ambient Intelligence? It envisions a world where intelligence and intuitive interfaces are embedded in everyday objects. The interfaces recognize and respond to the presence and behavior of individuals in a personalized and relevant way.
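The privacy control Maes describes - you decide which facts your wristband will share - can be sketched in a few lines. The following is an illustrative sketch, not Media Lab code; the class names, fields, and handshake function are all invented for the example.

```python
# Two hypothetical RFID wristbands exchanging only the profile fields
# each wearer has marked as public.

PUBLIC = "public"
PRIVATE = "private"

class Wristband:
    def __init__(self, owner):
        self.owner = owner
        self.profile = {}  # field name -> (value, visibility)

    def set_field(self, field, value, visibility=PRIVATE):
        self.profile[field] = (value, visibility)

    def public_profile(self):
        # Only fields explicitly marked public are ever transmitted.
        return {f: v for f, (v, vis) in self.profile.items() if vis == PUBLIC}

def handshake(a, b):
    """Simulate the tap: each band reads only the other's public fields."""
    return a.public_profile(), b.public_profile()

alice = Wristband("Alice")
alice.set_field("employer", "MIT Media Lab", PUBLIC)
alice.set_field("interest", "dogs", PUBLIC)
alice.set_field("home_address", "...", PRIVATE)

bob = Wristband("Bob")
bob.set_field("employer", "Acme Corp", PUBLIC)

what_bob_learns, what_alice_learns = handshake(alice, bob)
```

Defaulting new fields to private mirrors the point Maes makes: disclosure is opt-in, not opt-out.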

Maes' interest in finding ways to link the physical and virtual worlds grew out of her research on collaborative filtering technology, the software now ubiquitously deployed by Amazon.com, Netflix, and other e-commerce ventures that recommends items to consumers based on their previous purchases and the subjects they've shown an interest in.

"Collaborative filtering technology helps you narrow down the whole space of choices out there by looking at what other people similar to you have shown an interest in. It's about patterns that form and correlations," Maes explains.

In 1993, Maes and her students devised the software that keeps track of consumers' interests and previous behaviors, and also facilitates ratings. Using the technology, she and her students created a Web-based system for making recommendations on books, movies, and music. Eventually, they founded a company called Firefly. Barnes & Noble and Launch, now owned by Yahoo!, became customers. Microsoft purchased Firefly in 1998; the company, founded by Maes, three students, and two Harvard Business School professors, had 80 people at the time. Firefly's technology was gradually integrated into Microsoft's Passport.

In the beginning, "we tried to convince [corporate] sponsors to do something with the technology," says Maes. "It was so new and different, they couldn't figure out what to do with it. ... For most companies, it was still way too early."

Currently, Maes and her students are developing sensors that will detect a person's behavior in the physical world. For example, what if just by holding a magazine, you could enable your cell phone to become a source of information about the magazine? "My cell phone would give me options for a keyword search, to find out others' opinions about the magazine, to find out whether the magazine's content is of interest to me, what other kinds of people have read it, and so forth," she explains. Much of this work depends on barcodes being replaced with RFID tags on books, music, DVDs, food, pharmaceuticals, and other packaged goods.

So, for example, "you could buy a book, and leave a message to yourself on page 5, storing that information with the book itself. You could do a keyword search in the book and even retrieve messages in it," Maes says. "The goal for me for the next 10 years is to really integrate the physical and virtual worlds more closely."

Common Sense Machines

Push Singh is trying to give computers a little common sense, the kind of common sense human beings use when they're, well, behaving rationally.

Singh, a postdoctoral associate at the MIT Media Lab, has focused for the better part of a dozen years on trying to make computers think like people do, all for the purpose of building a machine that can think. He says that by giving machines some common sense, they will be able to understand people better and help us in solving our problems, or at least in simplifying them.

He and his associates built a Commonsense Computing Web site that invites people to participate in creating common-sense facts - simple things like "When I'm hungry, I like to eat." The OpenMind system, a participatory and collaborative database of common-sense facts, was launched in 2000 and has grown to include nearly 1 million facts or assertions generated by nearly 16,000 people. Visitors register to use the system and are presented with facts that they can describe and explain.

"The mind is a very complicated machine," Singh says. "We're trying to find out how it works, and we're tying to build machines that think about the same kinds of things that people think about."

For example, Singh says, "If you type 'gift for my baby brother,' in Yahoo!, it gives you matches...words. It just finds things with the words 'gift for my baby brother.'" But, he adds, the computer makes no inferences about your relationship with your brother, or whether a gift is a good or bad thing. Singh is building an engine that can reason and intuit about such search queries.

Singh says there is plenty of interest in such software. "Companies want to be able to capture the implicit knowledge out there. ... It's the methodology of collecting implicit knowledge and common-sense knowledge of things."

"The system should be able to reason about privacy and understand the consequences of somebody else knowing a particular fact about you," Singh explains, adding, "The system should be able to take a few facts about an individual and know which of these should be made available to family and friends, versus professionals, companies, and work contacts." The more detailed information you offer, the better inferences the system will make.

"Whether people realize this or not, there is an enormous amount of information that Google and the other search engines have about you. They say they don't have personally identifiable information about people, but I'm not convinced that's true," Singh asserts. "We want a machine that can think about the world like a person can, maybe even better than that. It will help people make decisions about the ways they spend their time, introduce them to other people, and it would help you understand yourself better. And, it would be a better model of you because it would be based on everything you did."

Sound far-fetched? Don't underestimate the machine, or Singh.
