From Surf Wiki (app.surf) — the open knowledge base
Computer processing of body language
A computer is conventionally operated by a person who controls it directly, generating actions with a mouse or keyboard. Recent innovations, however, may allow a computer not only to detect body language but also to respond to it. Devices are being developed that could let a computer understand and react to an individual's hand gestures, specific movements, or facial expressions.
Research on computers and body language uses mathematical models to teach computers to interpret human movements, hand gestures, and even facial expressions. This differs from the way people usually communicate with computers, namely through physical input such as a mouse click or a keystroke.
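To make the idea of "teaching computers with mathematics" concrete, here is a minimal toy sketch of gesture interpretation: a hand gesture is represented as a few 2-D keypoint coordinates, and an observation is matched to the nearest stored template. The gesture names and coordinates are invented for illustration; real systems use far richer models than this nearest-neighbour rule.

```python
import math

# Template gestures: each is a list of normalized (x, y) keypoints.
# These labels and coordinates are hypothetical, for illustration only.
TEMPLATES = {
    "wave":  [(0.0, 0.0), (0.5, 1.0), (1.0, 0.0)],
    "point": [(0.0, 0.0), (0.5, 0.5), (1.0, 1.0)],
}

def distance(a, b):
    """Sum of Euclidean distances between corresponding keypoints."""
    return sum(math.dist(p, q) for p, q in zip(a, b))

def classify(observed):
    """Return the template label whose keypoints are closest to `observed`."""
    return min(TEMPLATES, key=lambda label: distance(TEMPLATES[label], observed))

# A noisy observation that should still be recognized as a "wave".
sample = [(0.05, -0.02), (0.48, 0.97), (0.98, 0.05)]
print(classify(sample))  # → wave
```

The same pattern scales up in practice: a camera and a pose-estimation model supply the keypoints, and a trained classifier replaces the hand-written templates.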
MIAUCE and Chaabane Djeraba
This research is being carried out by a group of European researchers and scientists, notably through a project called MIAUCE (Multimodal Interactions Analysis and Exploration of Users within a Controlled Environment), which aims to make this kind of advance in computer technology a reality. Chaabane Djeraba, the project coordinator, stated, "The motivation of the project is to put humans in the loop of interaction between the computer and their environment."
Researchers are working to apply these technologies to the daily needs of businesses and public spaces such as shopping malls and airports. The MIAUCE project coordinator stated, "We would like to have a form of ambient intelligence where computers are completely hidden…this means a multimodal interface so people can interact with their environment. The computer sees their behavior and then extracts information useful for the user." The group has developed several real-world prototypes of computer systems that use body language as a means of communication and control.
References
- Moursund, David. Brief Introduction to Educational Implications of Artificial Intelligence. Oregon: Dave Moursund, 2006. Print.
- Braffort, Annelies. Gesture-based Communication in Human-computer Interaction: International Gesture Workshop, GW '99, Gif-sur-Yvette, France, March 17–19, 1999 : Proceedings. Berlin: Springer, 1999. Print.
- Fried, Ina. "Gates: Natal to Bring Gesture Recognition to Windows Too." CNET News, 14 July 2009. Web. 18 Nov. 2010.
- Hansen, Evan. "Building a Better Computer Mouse." CNET News. CNET, 2 Oct. 2002. Web. 20 Nov. 2010.
- "How Computers Can Read Body Language." EUROPA - European Commission, 22 Oct. 2010. Web. 22 Nov. 2010.
- Yang, Ming-Hsuan, and Narendra Ahuja. Face Detection and Gesture Recognition for Human-computer Interaction. Boston: Kluwer Academic, 2001. Print.
This article was imported from Wikipedia and is available under the Creative Commons Attribution-ShareAlike 4.0 License. Content has been adapted to SurfDoc format. Original contributors can be found on the article history page.