The Manual Input Workstation (2004–2006, Golan Levin and Zachary Lieberman) presents a series of audiovisual vignettes that probe the expressive possibilities of hand gestures and finger movements. Interactions take place through a combination of custom interactive software, an analog overhead projector, and a digital computer video projector. The analog and digital projectors are aligned so that their projections overlap, producing an unusual quality of hybridized, dynamic light. During use, visitors' hand gestures are interpreted by a computer vision system as they pass across the glass top of the overhead projector. In response, the software generates synthetic graphics and sounds that are tightly coupled to the forms and movements of the visitors' actions. These synthetic responses are co-projected over the organic, analog shadows, resulting in an almost magical form of augmented-reality shadow play.

Research
You should include three forms of research:
• Research on other projects related to yours. Make sure the projects relate to your concept and not just the technology.
• Research on theories and philosophies related to your project. The closer the relationship to your core concept, the better. Please reference these properly.
• Research/experiments on your own experience in the system.

The Camera Mouse: Visual Tracking of Body Features to Provide Computer Access for People With Severe Disabilities (http://ieeexplore.ieee.org/stamp/stamp.jsp?tp=&arnumber=1021581)
– The system tracks the computer user’s movements with a video camera and translates them into the movements of the mouse pointer on the screen. Body features such as the tip of the user’s nose or finger can be tracked.
– Assistive technology devices have been developed to help people with severe disabilities use their voluntary movements to control computers.
– Chen et al. developed a system that contains an infrared transmitter, mounted onto the user's eyeglasses, a set of infrared receiving modules that substitute the keys of a computer keyboard, and a tongue-touch panel to activate the infrared beam. Helmets, electrodes, goggles, and mouthsticks are uncomfortable to wear or use.
– Most important, some users, in particular young children, dislike being touched on the face and vehemently object to any devices attached to their heads.
– Corneal reflection systems have the disadvantages that they need careful calibration, require the user to keep his or her head almost completely still, and are expensive.
– Given people’s experiences with currently available assistive technology devices, our goal has been to develop a nonintrusive, comfortable, reliable, and inexpensive communication device that is easily adaptable to serve the special needs of quadriplegic people and is especially suitable for children.
– The CameraMouse system currently involves two computers that are linked together—a “vision computer” and a “user computer.”
– The vision computer receives and displays a live video of the user sitting in front of the user computer. The video is taken by a camera that is mounted above or below the monitor of the user computer.
– The user computer runs a special driver program in the background that takes the signals received from the vision computer, scales them to x, y coordinates in the current screen resolution, and then substitutes them for the coordinates of the cursor.
– To test other body features, not just facial features, the thumb was selected. Although it was tracked successfully, as shown in sequence E in Fig. 7, it has two main flaws as a tracking point.
First, the camera has difficulty focusing on it. As can be seen in sequence E, the thumb takes up such a small portion of the screen that the camera's autofocus mechanism focuses on the objects in the background rather than on the thumb.
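The scaling step described above — translating a tracked feature's camera-frame position into cursor coordinates at the current screen resolution — can be sketched as follows. The frame and screen dimensions, the smoothing step, and all function names are illustrative assumptions, not details from the paper.

```python
# Sketch of mapping a tracked feature position to cursor coordinates.

def feature_to_cursor(fx, fy, cam_w, cam_h, screen_w, screen_h):
    """Scale a feature position (fx, fy), in camera-frame pixels,
    to coordinates in the current screen resolution."""
    return (fx * screen_w / cam_w, fy * screen_h / cam_h)

def smooth(prev, new, alpha=0.5):
    """Exponential smoothing to damp tracking jitter before the
    driver substitutes the result for the cursor position."""
    return (prev[0] + alpha * (new[0] - prev[0]),
            prev[1] + alpha * (new[1] - prev[1]))

# A feature at the centre of a 320x240 camera frame maps to the
# centre of a 1280x960 screen.
cursor = feature_to_cursor(160, 120, 320, 240, 1280, 960)
print(cursor)  # (640.0, 480.0)
```

In the actual system this substitution is performed continuously by the driver program on the user computer; the smoothing factor here stands in for whatever filtering a real tracker would apply.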
– Users learn to identify with their virtual embodiment through interaction with it, provided that they can perceive straightforward, consistent relations between their actions on the virtual embodiment and the results of those actions for the embodiment and the CVE [S].
– Some of the more important issues for virtual embodiment design include:
Location; in shared spaces, users need to know their own position and location in relation to other users and objects.
Identity; users have to recognise who someone is from the embodiment, and be able to differentiate between agents and other users.
Activity; embodiments should help convey a sense of on-going activity.
Availability and degree of presence; it is useful to convey information about availability for interaction.
Gesture and facial expression; gesture and facial expression play an important role in communication.
History of activity; embodiments can support historical awareness of who has been present and what activities they performed.
Manipulating one's view of other people; being able to control the view of other people's bodies in order to reduce machine load, and to create subjective views.
Representations across multiple media; this has to be considered because embodiment extends not only into the graphical, but also into the audio and textual media.
Autonomous and distributed body parts; people can be in several places in one CVE, or in several CVEs at the same time.
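The design issues listed above suggest what state a virtual embodiment might need to carry and expose. A minimal sketch of such a structure, with field names of my own choosing rather than anything from the cited source:

```python
from dataclasses import dataclass, field

@dataclass
class Embodiment:
    """State a CVE avatar might expose to address the design issues above."""
    user_id: str          # identity: who this embodiment represents
    is_agent: bool        # identity: distinguish agents from human users
    position: tuple       # location in the shared space
    activity: str = "idle"   # conveys a sense of on-going activity
    available: bool = True   # availability / degree of presence
    gesture: str = ""        # current gesture or facial expression
    history: list = field(default_factory=list)  # history of activity

    def act(self, activity):
        """Record a new activity, keeping the old one for historical awareness."""
        self.history.append(self.activity)
        self.activity = activity

avatar = Embodiment(user_id="alice", is_agent=False, position=(0.0, 0.0, 0.0))
avatar.act("waving")
print(avatar.activity, avatar.history)  # waving ['idle']
```

Representations across multiple media and distributed body parts would need more than a single record like this; the sketch only covers the per-avatar issues.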
– Edward Sapir famously remarked that blowing out a candle produces a gesture and a sound that are identical to the gesture and sound made when pronouncing the (German) consonant W.
– Theorists of motor behavior also must confront this issue and ask on what grounds motor behavior can convey meaning.
– EBL = emotional body language
– Apple Multitouch Language: Apple's multitouch patent filing at the US Patent & Trademark Office (http://www.funkyspacemonkey.com/apple-patent-library-multitouch-language).
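A multitouch gesture language maps touch-point trajectories onto commands. As a toy illustration of the idea only — the classification rule, threshold, and names below are my own assumptions, not anything from Apple's patent:

```python
import math

def classify_two_finger(start, end, threshold=10.0):
    """Classify a two-finger gesture from the start and end positions
    of the two touch points: 'pinch-in' if the fingers moved together,
    'pinch-out' if they moved apart, otherwise 'none'."""
    d0 = math.dist(start[0], start[1])  # initial finger separation
    d1 = math.dist(end[0], end[1])      # final finger separation
    if d0 - d1 > threshold:
        return "pinch-in"
    if d1 - d0 > threshold:
        return "pinch-out"
    return "none"

# Two fingers moving apart: a pinch-out (zoom) gesture.
print(classify_two_finger(
    start=[(100, 100), (120, 100)],  # 20 px apart
    end=[(80, 100), (140, 100)],     # 60 px apart
))  # pinch-out
```

A real gesture language would classify whole trajectories (timing, angle, finger count) rather than just endpoint distances, but the principle — geometric features of touch traces mapped to discrete vocabulary items — is the same.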