For all its good points, the graphical user interface (GUI) we're so familiar with today is a poor substitute for the way humans actually interact with their environment. Sure, drag-and-drop is a fairly intuitive action, but it does little to replicate the more primitive, adaptive ways we sort objects and work with our hands - which is exactly the problem tangible user interfaces (TUIs) are seeking to solve.
MIT Media Lab graduate students David Merrill and Jeevan Kalanithi and associate professor Pattie Maes have tackled the issue with Siftables, which they describe in their research as "a novel platform that applies technology and methodology from wireless sensor networks to tangible user interfaces in order to yield new possibilities for human-computer interaction. Siftables are compact devices with sensing, graphical display and wireless communication. They can be physically manipulated as a group to interact with digital information and media."
Translation: Picture a mini iPhone, about 2 inches by 2 inches. You have a dozen of them sitting on a table in front of you, each showing a different picture, graphic, block of text or application. Shuffle the Siftables around to create new arrangements, ideas or combinations of ideas. It's essentially your computer screen, fractured into a dozen pieces, each piece able to communicate with the others while also understanding where it sits relative to them. The Siftables react to each other and to their changing surroundings - like good little nomads.
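To make that idea concrete, here's a minimal sketch in Python of how a group of tiles might track their neighbors and update when rearranged. This is not the actual Siftables firmware or API - the `Tile` class, grid coordinates and `find_neighbors` helper are all hypothetical stand-ins for the short-range sensing the real devices use:

```python
from dataclasses import dataclass, field

# Hypothetical sketch: each "tile" knows its position on the table and
# can discover which other tiles sit immediately beside it, roughly the
# way Siftables sense their neighbors and react as a group.

@dataclass
class Tile:
    name: str   # what this tile currently displays (picture, word, app)
    x: int      # grid position on the table
    y: int
    neighbors: list = field(default_factory=list)

def find_neighbors(tiles):
    """Recompute adjacency after the tiles have been shuffled around.

    Two tiles count as neighbors when they sit exactly one grid cell
    apart horizontally or vertically (a stand-in for the real devices'
    proximity sensing).
    """
    for tile in tiles:
        tile.neighbors = [
            other.name for other in tiles
            if other is not tile
            and abs(other.x - tile.x) + abs(other.y - tile.y) == 1
        ]

# A few word tiles laid out in a row on the table.
tiles = [Tile("good", 0, 0), Tile("little", 1, 0), Tile("nomads", 2, 0)]
find_neighbors(tiles)
for t in tiles:
    print(f"{t.name!r} sees {t.neighbors}")

# Slide the last tile above the first; the adjacency updates accordingly,
# just as each Siftable reacts to its changing surroundings.
tiles[2].x, tiles[2].y = 0, 1
find_neighbors(tiles)
for t in tiles:
    print(f"{t.name!r} sees {t.neighbors}")
```

In the real devices this neighbor discovery happens continuously over wireless and proximity sensing rather than on a simulated grid, but the principle is the same: each tile independently works out who its neighbors are, and the group's behavior emerges from those local relationships.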