Sixth Sense Technology
Abstract
Sixth Sense is a wearable gestural
interface that augments the physical world around us with digital information
and lets us use natural hand gestures to interact with that information. The
neck-worn projector and camera combination was first proposed by MIT Media Lab
student Steve Mann, and the concept was further developed by Pranav Mistry while
he was also a student at the MIT Media Lab. Sixth Sense bridges the gap by bringing
intangible, digital information out into the tangible world, and allowing us to
interact with that information via natural hand gestures. Sixth Sense comprises a pocket
projector, a mirror and a camera. The hardware components are coupled together in
a pendant-like mobile wearable device.
INTRODUCTION
This technology is a revolutionary
way to interface the physical world with digital information. Modern
interfaces such as the widely used touch screen ease operation and save time.
Sixth Sense is a wearable gestural interface that augments the physical world
around us with digital information and lets us use natural hand gestures to
interact with that information. However, bottlenecks of this method, chiefly
the position of the camera used to capture gestures, limit the accuracy of
gesture recognition and of the projected output, which motivates the use of
voice commands instead of hand gestures. Since the camera position is a major
constraint on image capture and on projection efficiency and accuracy, the
actions we regularly perform in our daily life are converted to commands and
trained into a speech-recognition IC. They are stored as a database in the
integrated circuit, and the corresponding action is performed when the user's
speech is recognized.
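A minimal sketch of this command lookup is given below, in Python; the phrases and handler functions are hypothetical placeholders rather than the actual vocabulary stored in the speech IC.

# Minimal sketch of the speech-command lookup: a recognizer returns a
# phrase, and the phrase is mapped to a stored action. The command names
# and handlers below are illustrative placeholders.

def take_photo():
    print("Capturing image...")

def show_map():
    print("Projecting map...")

def show_time():
    print("Projecting clock...")

# "Database" of trained commands and their corresponding actions.
COMMANDS = {
    "take photo": take_photo,
    "open map": show_map,
    "show time": show_time,
}

def handle_speech(recognized_phrase):
    """Run the action associated with a recognized phrase, if any."""
    action = COMMANDS.get(recognized_phrase.strip().lower())
    if action is not None:
        action()
    else:
        print("Unrecognized command:", recognized_phrase)

handle_speech("Take Photo")   # prints "Capturing image..."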
It is a hi-tech device that seamlessly integrates
analog information with our everyday physical world. A voice command is turned
into an operation within a fraction of a second, and the result is projected
onto a surface. It is a portable device and eases the operations we regularly
perform. Basically, the Sixth Sense concept relies on hand gestures: the
fingertips carry colored markers, and the gestures performed are captured by
the camera and passed to the mobile device, where the corresponding action is
carried out and projected onto a surface through the projector. Software
algorithms and computer vision techniques are used to trigger the action on the
mobile device for the gesture captured by the camera. This gesture-based
technology is used for a variety of applications such as performing basic
actions, locating points on a map, watching video in a newspaper and dialing a
number on the palm of the hand. A slight modification of this method leads to
the use of commands, that is, carrying analog information into the real world:
the analog data is converted into digital form and performed as an action,
since hand gestures cannot be used at all times.
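To make the marker-tracking step concrete, the following sketch locates the colored fingertip markers in a single camera frame using OpenCV; the HSV ranges are illustrative assumptions and would need tuning for the actual tape colors and lighting.

import cv2
import numpy as np

# Example HSV ranges for the fingertip markers; the real thresholds would
# have to be tuned for the actual tape colors and lighting conditions.
MARKER_RANGES = {
    "red":    ((0, 120, 70),   (10, 255, 255)),
    "yellow": ((20, 100, 100), (35, 255, 255)),
    "green":  ((40, 70, 70),   (80, 255, 255)),
    "blue":   ((100, 150, 50), (130, 255, 255)),
}

def find_markers(frame):
    """Return a {color: (x, y)} dict of visible fingertip marker centroids."""
    hsv = cv2.cvtColor(frame, cv2.COLOR_BGR2HSV)
    positions = {}
    for color, (lo, hi) in MARKER_RANGES.items():
        mask = cv2.inRange(hsv, np.array(lo), np.array(hi))
        moments = cv2.moments(mask)
        if moments["m00"] > 0:  # marker visible in this frame
            positions[color] = (int(moments["m10"] / moments["m00"]),
                                int(moments["m01"] / moments["m00"]))
    return positions

cap = cv2.VideoCapture(0)   # an ordinary webcam standing in for the pendant camera
ok, frame = cap.read()
if ok:
    print(find_markers(frame))
cap.release()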
This is how the wearable device is fitted to
the human body, with colored markers placed on the fingertips. In our
approach, we use voice commands to perform the same operations. Many advanced
speech-recognition integrated circuits have evolved, which enhance the
operation with more advanced features.
To ensure accurate gesture recognition and an intuitive interface, a number of constraints are applied. A region in front of the projection screen is defined as the active zone, and gestures are ignored if they are performed outside this area. Gestures are also defined by a set start posture, an end posture and the dynamic motion between the start and end postures. Perhaps the use of gestures is most powerful when combined with other input modalities, especially voice. Allowing combined voice and gestural input has several tangible advantages. The first is purely practical: ease of expression. Ease corresponds to the efficiency with which commands can be remembered, and expressiveness to the size of the command vocabulary.
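The two constraints above, the active zone and the start and end postures, can be sketched as a simple filter over a tracked gesture; the coordinates and posture labels below are assumptions for illustration, not values from the actual prototype.

# An active zone in front of the projection surface, given as illustrative
# pixel bounds (x_min, y_min, x_max, y_max).
ACTIVE_ZONE = (100, 50, 540, 430)

def in_active_zone(point):
    x, y = point
    x_min, y_min, x_max, y_max = ACTIVE_ZONE
    return x_min <= x <= x_max and y_min <= y <= y_max

def recognize(samples, start_posture, end_posture):
    """Accept a gesture only if every sample lies inside the active zone and
    the trajectory begins and ends in the required postures."""
    if not samples:
        return False
    if not all(in_active_zone(point) for point, _ in samples):
        return False  # performed outside the active zone, so ignored
    first_posture = samples[0][1]
    last_posture = samples[-1][1]
    return first_posture == start_posture and last_posture == end_posture

# Each sample pairs a fingertip position with a coarse posture label.
track = [((200, 100), "pinch_open"),
         ((300, 150), "moving"),
         ((400, 200), "pinch_closed")]
print(recognize(track, "pinch_open", "pinch_closed"))  # True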
DESIGN AND WORKING
The Sixth Sense device consists of a pocket projector, a mirror and a camera held in a head-mounted, handheld or pendant-like wearable unit. Both the projector and the camera are connected to a mobile computing device in the user's pocket. The projector projects visual information, allowing surfaces, walls and physical objects around us to be used as interfaces, while the camera detects and tracks the user's hand gestures and physical objects using computer-vision-based techniques. The software processes the video stream captured by the camera and tracks the locations of the colored markers at the tips of the user's fingers. The movements and arrangements of the markers are interpreted into gestures that act as interaction instructions for the projected application interfaces. The components used in Sixth Sense Technology are:
1. Camera.
2. Mobile Component.
3. Projector.
4. Mirror.
5. Colored Marker.
Fig: Block diagram of Sixth Sense Technology.
1. Camera.
The camera captures the image of the object in view and tracks the user's hand
gestures. It identifies the visuals framed by the user's hand gestures, and the
smartphone receives the data from the camera for processing. The camera is, in
effect, a digital eye connecting the device to the outside world in digital
format.
2. Mobile Component.
The Sixth Sense device incorporates a web-enabled smartphone that processes the
data sent by the camera. The smartphone searches the web and interprets the
hand gestures with the help of the colored markers placed at the fingertips;
the basic processing and computations are handled on this device.
3. Projector.
The data interpreted by the smartphone can
be projected onto any surface. The projector projects the visual information,
enabling surfaces and physical objects to be used as interfaces. The projector
itself runs on batteries with about three hours of battery life. A tiny LED
projector displays the data sent from the smartphone on any surface in view:
an object, a wall or a person. The downward-facing projector projects the image
onto a mirror.
4. Mirror.
The use of the mirror is essential, as the projector hangs from the neck
pointing downwards. The mirror reflects the image onto the desired surface.
In this way the digital image is finally freed from its confines and placed in
the physical world.
5. Colored Marker.
Color markers are placed at the tips
of the user's fingers. Marking the user's fingers with red, yellow, green and
blue colored tape helps the webcam recognize the hand gestures. The movements
and arrangements of these markers are interpreted into gestures that act as
interaction instructions for the projected application interfaces.
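As an illustration of how the marker movements could be interpreted, the sketch below classifies a simple zoom gesture from the changing distance between two fingertip markers; the marker names and the pixel threshold are assumptions for demonstration only.

import math

def distance(p, q):
    return math.hypot(p[0] - q[0], p[1] - q[1])

def interpret(prev_markers, curr_markers, threshold=15):
    """Classify a two-marker arrangement change as a zoom-in or zoom-out."""
    try:
        before = distance(prev_markers["blue"], prev_markers["yellow"])
        after = distance(curr_markers["blue"], curr_markers["yellow"])
    except KeyError:
        return None  # both markers must be visible in both frames
    if after - before > threshold:
        return "zoom_in"   # fingers moved apart
    if before - after > threshold:
        return "zoom_out"  # fingers moved together
    return None

prev = {"blue": (200, 200), "yellow": (220, 200)}
curr = {"blue": (180, 200), "yellow": (260, 200)}
print(interpret(prev, curr))  # prints "zoom_in"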
APPLICATIONS
The Sixth Sense prototype implements
several applications that demonstrate the usefulness, viability and flexibility
of the system.
The Sixth Sense device has a huge number
of applications. The following are a few of the applications of Sixth Sense
Technology.
1. Make a call.
2. Call up a map.
3. Check the time.
4. Create multimedia reading experience.
5. Drawing application.
6. Zooming features.
7. Get product information.
8. Get book information.
9. Get flight updates.
10. Feed information on people.
11. Take pictures.
12. Check the email.
Fig: Some Examples of Sixth Sense Technology.
Advantages
1. Sixth Sense is a user friendly interface which
integrates digital information into the physical world and its objects, making
the entire world your computer.
2. Sixth Sense does not change human habits but causes
computer and other machines to adapt to human needs.
3. It uses hand gestures to interact with digital
information.
4. Supports multi-touch and multi-user interaction.
5. Data can be accessed directly from the machine in real time.
6. It is open source and cost effective, and we can mind-map ideas anywhere.
7. It is a gesture-controlled wearable computing device that feeds us relevant
information and turns any surface into an interactive display.
8. It is portable and easy to carry, as it can be worn around the neck.
9. The device could be used by anyone, even without basic knowledge of a
keyboard or mouse.
10. There is no need to carry a camera anymore. When going on a holiday, it
will be easy from now onwards to capture photos using mere fingers.
Future Scope
1. To get rid of color markers.
2. To incorporate camera and projector inside mobile
computing device.
3. When the pendant-style wearable device is placed on a table, it should allow
the table to be used as a multi-touch user interface.
4. Applying this technology in various areas of interest such as gaming,
education systems, etc.
5. To have 3D gesture tracking.
6. To make Sixth Sense work as a fifth sense for the disabled.
CONCLUSION
1. Sixth Sense recognizes the objects around us, displaying information
automatically and letting us access it in any way we need.
2. The Sixth Sense prototype implements several applications that demonstrate
the usefulness, viability and flexibility of the system.
3. It has the potential to become the ultimate "transparent" user interface
for accessing information about everything around us.
4. It allows us to interact with this information via natural hand gestures.