This is a video prototype of an iPhone media player that uses physical objects to control media playback. It uses Radio Frequency Identification (RFID) to trigger various iPhone interactions when the phone is in range of a wireless tag embedded inside a physical object.
RFID is becoming more common in mobile phones (under the term Near Field Communication or NFC) from manufacturers such as “Nokia”:http://www.nearfield.org/2008/05/thoughts-on-nokias-nfc-developments. By looking at “Apple’s patents”:http://watchingapple.com/2007/05/connecting-iphone-to-your-wireless-home/ we know that the technology is being considered for the iPhone. The “iPhone SDK 3.0”:http://developer.apple.com/iphone/program/accessories/ allows iPhone software to access external hardware accessories, so third-party RFID or NFC readers are also possible.
So what kinds of applications would emerge if an iPhone had an NFC reader? Here we have prototyped a simple media player, which triggers the playback of content on the touch of a tag, and created a set of augmented objects that have relationships to different kinds of audiovisual content.
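The core interaction can be sketched very simply: a tag read event looks up a media item by tag ID and triggers playback. This is a minimal illustration in Python, not the prototype’s actual code; the tag IDs, file names and the `player.play` interface are all hypothetical stand-ins.

```python
# Minimal sketch of tag-triggered media playback.
# Tag IDs and media names below are hypothetical examples.

# Each RFID/NFC tag ID maps to a piece of audiovisual content.
TAG_TO_MEDIA = {
    "04:A2:19:B1": "moomin_episode_12.m4v",  # Moomin figure
    "04:7F:03:C8": "home_videos_2008.m4v",   # wooden house
}

def on_tag_read(tag_id, player):
    """Called when the reader reports a tag in range of the phone."""
    media = TAG_TO_MEDIA.get(tag_id)
    if media is None:
        return None        # unknown object: do nothing
    player.play(media)     # hand the content off to the media player
    return media
```

The point of the sketch is how shallow the interaction is: one physical gesture resolves directly to one piece of content, with no menu navigation in between.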
h3. A lens for media
Compared to other mobile handsets the iPhone is a particularly media-friendly device, with a large, bright screen and high quality audiovisual playback. What if this screen could act as a ‘lens’ to content that resides in the world?
In a screen-based interface content may be buried many levels deep inside an information architecture. But in a physical RFID-driven interface a simple gesture can offer quick and direct access to content. Physical objects afford tangible manipulation that screens cannot, and this is great for playful products. Our “Bowl prototype”:http://www.nearfield.org/2007/12/bowl-token-based-media-for-children showed a natural blending of media consumption and playful activity in children, where media viewing became a less passive, more active experience.
“Durrell Bishop”:http://www.designinginteractions.com/interviews/DurrellBishop has discussed these ideas in a more general way: what if objects were “augmented with new properties”:http://www.flickr.com/photos/timo/3295486725/ that could be perceived through an iPhone lens?
h3. Media objects
In this video demo, the objects have been chosen to physically or visually represent the content. Some relationships are obvious, such as the Moomin figure leading to a favourite episode of a Moomin animation. Less obvious relationships, such as the wooden house leading to home videos, were chosen because they just somehow felt right. In fact the exact relationship may be of secondary importance: over time the behaviour of the physical and digital objects becomes known and transparent through exploration and repetition.
Some of the objects felt particularly satisfying. The “Make Podcast object”:http://blog.makezine.com/archive/weekend_projects/ for instance — where touching the ‘geek’ plays the latest ‘Weekend project’ — shows how an object can be used for exploring a dynamic stream of content.
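An object bound to a stream differs from one bound to a fixed file: the tag resolves to whatever is newest at the moment of touch. A minimal sketch of that resolution step, with hypothetical feed entries standing in for a real podcast feed:

```python
# Sketch: a tag bound to a dynamic feed resolves to its newest item
# at touch time. Entries are hypothetical (published_date, url) pairs.

def latest_episode(feed_entries):
    """Return the URL of the most recently published entry."""
    newest = max(feed_entries, key=lambda entry: entry[0])
    return newest[1]

weekend_projects = [
    ("2009-02-14", "http://example.com/weekend-project-41"),
    ("2009-02-21", "http://example.com/weekend-project-42"),
]
```

So the same physical gesture on the same object plays different content each week, without the object itself ever changing.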
h3. Going further
This video prototype is basic, and intended to open up discussion and new exploration around the experience of media selection through physical objects. At the moment the interaction is a simple trigger, but what if the phone doesn’t just react as _output_, but also acts as _input_ to physical objects? How do we programme and manage our sets of media and applications in these objects?
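The input direction might look like the phone writing a content reference onto a writable tag, so an object can be re-programmed in place. The sketch below is purely illustrative: `InMemoryTag` is a hypothetical stand-in for a writable NFC tag, which in practice would store something like an NDEF record holding a URL or content identifier.

```python
# Sketch: re-programming an object by writing a content reference
# to its tag. InMemoryTag is a hypothetical stand-in for real NFC
# hardware; the stored payload would normally be an NDEF record.

class InMemoryTag:
    def __init__(self):
        self.payload = None

def program_tag(tag, content_url):
    """Phone acting as *input*: store a new media reference on the tag."""
    tag.payload = content_url
    return tag

def read_tag(tag):
    """Phone acting as *output*: resolve the stored reference for playback."""
    return tag.payload
```

With writable tags the mapping between object and media is no longer fixed at manufacture; the owner can curate it, which is exactly the management question raised above.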
Overall this points towards opportunities around the distribution of media through physical objects: it is an example of broader ideas around an ‘internet of things’ or ‘spimes’ applied to the world of media. What opportunities would the distribution of RFID-embedded products open up in terms of media, gaming, services and marketing? What does this mean for the future of products?