Interactivity & Installation

Thursday, March 31, 2011

Touch-Free Gesture Control for mobile devices

Using Kinect-style gesture recognition for mobile devices. (Concept)



Thursday, March 24, 2011

Flip Phone by interaction designer Kristian Ulrich Larsen

What is being creative? from Kristian Ulrich Larsen on Vimeo.



For more info:
http://idkul.com/

Tuesday, March 8, 2011

SixthSense: an open-source wearable gestural interface.

ABOUT
'SixthSense' is a wearable gestural interface that augments the physical world around us with digital information and lets us use natural hand gestures to interact with that information.

We've evolved over millions of years to sense the world around us. When we encounter something, someone or some place, we use our five natural senses to perceive information about it; that information helps us make decisions and choose the right actions to take. But arguably the most useful information that can help us make the right decision is not naturally perceivable with our five senses, namely the data, information and knowledge that mankind has accumulated about everything, which is increasingly available online. Although the miniaturization of computing devices allows us to carry computers in our pockets, keeping us continually connected to the digital world, there is no link between our digital devices and our interactions with the physical world. Information is traditionally confined to paper or, digitally, to a screen. SixthSense bridges this gap, bringing intangible, digital information out into the tangible world and allowing us to interact with it via natural hand gestures. ‘SixthSense’ frees information from its confines by seamlessly integrating it with reality, thus making the entire world your computer.

The SixthSense prototype consists of a pocket projector, a mirror and a camera, coupled in a pendant-like mobile wearable device. Both the projector and the camera are connected to the mobile computing device in the user’s pocket. The projector projects visual information, enabling surfaces, walls and physical objects around us to be used as interfaces, while the camera recognizes and tracks the user’s hand gestures and physical objects using computer-vision techniques. The software processes the video stream captured by the camera and tracks the locations of the colored markers (visual tracking fiducials) on the tips of the user’s fingers using simple computer-vision techniques. The movements and arrangements of these fiducials are interpreted as gestures that act as interaction instructions for the projected application interfaces. The maximum number of tracked fingers is constrained only by the number of unique fiducials, so SixthSense also supports multi-touch and multi-user interaction.
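As a rough illustration of the marker-tracking step described above, here is a minimal Python/OpenCV sketch. The HSV color ranges, marker names and camera index are illustrative assumptions on my part, not values taken from the SixthSense source:

import cv2
import numpy as np

# Illustrative HSV ranges for two colored finger caps (fiducials);
# a real system would calibrate these to the actual markers.
MARKER_RANGES = {
    "index_red":  ((0, 120, 120), (10, 255, 255)),
    "thumb_blue": ((100, 120, 120), (130, 255, 255)),
}

def track_fiducials(frame):
    # Return the (x, y) centroid of each colored marker visible in the frame.
    hsv = cv2.cvtColor(frame, cv2.COLOR_BGR2HSV)
    positions = {}
    for name, (lo, hi) in MARKER_RANGES.items():
        mask = cv2.inRange(hsv, np.array(lo), np.array(hi))
        m = cv2.moments(mask)
        if m["m00"] > 0:  # marker found in this frame
            positions[name] = (m["m10"] / m["m00"], m["m01"] / m["m00"])
    return positions

cap = cv2.VideoCapture(0)  # stand-in for the pendant-mounted camera
while True:
    ok, frame = cap.read()
    if not ok:
        break
    print(track_fiducials(frame))  # these positions feed the gesture interpreter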

The SixthSense prototype implements several applications that demonstrate the usefulness, viability and flexibility of the system. The map application lets the user navigate a map displayed on a nearby surface using hand gestures similar to those supported by multi-touch systems, zooming in, zooming out or panning with intuitive hand movements. The drawing application lets the user draw on any surface by tracking the movements of the user’s index fingertip. SixthSense also recognizes the user’s freehand gestures (postures). For example, the system implements a gestural camera that takes photos of the scene the user is looking at by detecting the ‘framing’ gesture. The user can then stop at any surface or wall and flick through the photos he or she has taken. SixthSense also lets the user draw icons or symbols in the air with the index finger and recognizes those symbols as interaction instructions: drawing a magnifying-glass symbol takes the user to the map application, while drawing an ‘@’ symbol lets the user check email. The system also augments physical objects the user is interacting with by projecting additional information directly onto them. For example, a newspaper can show live video news, or dynamic information can be provided on a regular piece of paper. Drawing a circle on the user’s wrist projects an analog watch.
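A sketch of how such tracked fiducials might be turned into one of the gestures mentioned above, the map application's pinch zoom. The sensitivity constant and the marker names (carried over from the previous sketch) are assumptions, not details from the SixthSense source:

import math

def pinch_zoom(prev_positions, positions, zoom_level, sensitivity=0.005):
    # Map the change in distance between thumb and index markers to a zoom factor.
    try:
        d_prev = math.dist(prev_positions["thumb_blue"], prev_positions["index_red"])
        d_now = math.dist(positions["thumb_blue"], positions["index_red"])
    except KeyError:
        return zoom_level  # a marker left the camera's view; keep the current zoom
    # Fingers spreading apart -> zoom in; pinching together -> zoom out.
    return zoom_level * (1.0 + sensitivity * (d_now - d_prev))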

The current prototype system costs approximately $350 to build. Instructions on how to make your own prototype device can be found here (coming soon).



Source & more info:
http://www.pranavmistry.com/projects/sixthsense/

Skinput: a projected interface

What is “Skinput”?
Posted On 09 Mar 2010 By Izanagi.

Do you remember SixthSense? (if not, check out this extensive review)

Well, now here is Skinput!


Chris Harrison, from Carnegie Mellon University, has developed Skinput, a way in which your skin can become a touch-screen device or your fingers the buttons of an MP3 controller. Harrison says that “as electronics get smaller and smaller they have become more adaptable to being worn on our bodies, but the monitor and keypad/keyboard still have to be big enough for us to operate the equipment.” This limits just how small our devices can get. However, with the clever acoustics and impact-sensing software developed by Harrison and his team, we can use our skin in the same manner as a keypad. Add a small pico projector attached to an armband (or elsewhere), and your wrist becomes a touch screen!


Chris has apparently also teamed up with Microsoft's Dan Morris and Desney Tan on this project. Does that mean this will be a Microsoft offering at some point? Who knows? But that’s a big downside if you ask me! At least SixthSense technology by Pranav Mistry is slated to be open source! And although they have been doing better lately with projects such as Windows Phone 7 Series, Project Natal, and the mythical Courier tablet, Microsoft still has a hurdle to overcome in not being considered an “innovator” by the younger generation (who matter most when it comes to adopting something like Skinput!).

Here’s the gist of how it works: the user wears an armband containing a very small projector that projects a menu or keypad onto the person’s hand or forearm. The armband also contains an acoustic (sound) sensor. Why? Because when you tap different parts of your body, each spot makes a unique sound based on the area’s bone density, soft tissue, joints and other factors.

The software in Skinput is able to analyze the sound frequencies picked up by the acoustic sensor and then determine which button the user has just tapped.
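Skinput’s actual pipeline uses an array of vibration sensors and a trained machine-learning classifier, but the core idea can be sketched in a few lines of Python: reduce each tap’s waveform to coarse frequency-band energies, then match against per-location templates recorded during a short calibration session. Everything here (band count, location labels, nearest-neighbor matching) is a simplifying assumption, not Harrison’s implementation:

import numpy as np

def spectral_features(tap_waveform, n_bands=16):
    # Average spectral energy in a few frequency bands; different tap
    # locations excite different resonances, so these bands separate them.
    spectrum = np.abs(np.fft.rfft(tap_waveform))
    bands = np.array_split(spectrum, n_bands)
    feats = np.array([band.mean() for band in bands])
    return feats / (np.linalg.norm(feats) + 1e-9)  # normalize away loudness

def classify_tap(tap_waveform, templates):
    # Nearest-neighbor match against calibration templates, e.g.
    # {"forearm_1": feature_vector, "palm": feature_vector, ...}
    feats = spectral_features(tap_waveform)
    return min(templates, key=lambda label: np.linalg.norm(feats - templates[label]))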

Sounds pretty wild, huh? Well apparently there’s quite a bit of acoustic distinction when it comes to WHERE you’re tapping on your body — even within a small range. So when would it come in handy? Always. “This approach provides an always-available, naturally-portable, and on-body interactive surface,” Chris Harrison wrote on his YouTube page. More information can be found on Chris’ [project website] as well.

VERDICT: Overall, I still think SixthSense makes more “sense”! (no pun intended) I feel as if you have more freedom with the ability to project ANYWHERE as opposed to just “on your body”. It almost feels as if Skinput should be incorporated INTO SixthSense somehow instead of being a standalone technology. Maybe that will be the case eventually. Also, the fact that Microsoft seems to have their hands on it makes me a bit wary. Either way, both technologies are still in development and are great indicators of the direction computing is heading over the next decade! Minority Report, here we come!!!



Original link:
http://techknowbabel.com/2010/03/09/what-is-skinput/

Another report:
NEC Develops Technology to Control Devices with the Tap of an Arm