
Your Home

This is the interface you would have access to as you wander about your home. You would interact with your home by carrying on a conversation with it and by making gestures.

Speech Interface Has Made Progress - Needs to Be Expanded to Your Home

Today's personal computers and mobile devices have built-in speech recognition software, which lets you interact with technology by voice. Imagine if this were expanded so you could talk with a computer system as you move about your home.
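One way a home system could act on recognized speech is to map the recognizer's text output to a command. Below is a minimal sketch, assuming a speech recognizer has already produced a transcript; the intent names and keyword lists are illustrative inventions, not drawn from any real home-automation product.

```python
from typing import Optional

# Illustrative intents: each maps to keywords that must all appear
# in the recognized transcript. These are invented examples.
INTENTS = {
    "lights_on":  ("turn on", "lights"),
    "lights_off": ("turn off", "lights"),
    "make_lunch": ("time for", "lunch"),
}

def match_intent(transcript: str) -> Optional[str]:
    """Return the first intent whose keywords all appear in the transcript."""
    text = transcript.lower()
    for intent, keywords in INTENTS.items():
        if all(keyword in text for keyword in keywords):
            return intent
    return None
```

Keyword matching like this is brittle compared with real intent classifiers, but it shows the basic step between "speech became text" and "the home did something."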

The personal computer market leader for desktop speech recognition software is Dragon NaturallySpeaking. It is designed to run on Microsoft Windows, but has also been shown to run under Linux using software emulation. It is reviewed at Wikipedia:NaturallySpeaking, and a broader listing appears at Wikipedia:List of speech recognition software.

NaturallySpeaking has to be trained for approximately 10 minutes to recognize the user's voice. Initially, accuracy rates of 80-85% are reasonable to expect. Nuance Communications claims that an expert NaturallySpeaking user can reach 98-99% recognition accuracy, but such near-perfect figures have never been independently substantiated, and the program itself carefully avoids reporting recognition rates, so the 98-99% figures are unlikely to be true. A recognition rate of about 90% is what can be expected in a reasonable time. That means missing 1 word in 10, which would make a speech interface tedious, and the rate would degrade further in noisy environments.
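The accuracy figures above are usually stated as one minus the word error rate (WER), the word-level edit distance between what was said and what was recognized, divided by the number of words spoken. A short sketch of that calculation, with invented example sentences:

```python
def wer(reference: list, hypothesis: list) -> float:
    """Word error rate: word-level Levenshtein distance / reference length."""
    n, m = len(reference), len(hypothesis)
    # d[i][j] = edit distance between the first i reference words
    # and the first j hypothesis words
    d = [[0] * (m + 1) for _ in range(n + 1)]
    for i in range(n + 1):
        d[i][0] = i
    for j in range(m + 1):
        d[0][j] = j
    for i in range(1, n + 1):
        for j in range(1, m + 1):
            cost = 0 if reference[i - 1] == hypothesis[j - 1] else 1
            d[i][j] = min(d[i - 1][j] + 1,         # deletion
                          d[i][j - 1] + 1,         # insertion
                          d[i - 1][j - 1] + cost)  # substitution
    return d[n][m] / n

# One misheard word in a ten-word command gives a WER of 0.1,
# i.e. the 90% accuracy discussed above.
spoken = "set the microwave to three minutes on high power now".split()
heard  = "set the microwave to three minutes on high tower now".split()
print(wer(spoken, heard))
```

At that rate, roughly every tenth word of a conversation with your home would be wrong, which is why the text calls the interface tedious.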

So there is still a way to go, but the technology is making rapid progress.


Gesture Recognition - 3D Body Tracking

Recent advances in consumer-level 3D-tracking technology have made it possible to know the precise location of an individual. This representation is detailed enough to construct a skeletal model of how the person is standing and their exact location within the 3D field of the sensor. There is even sufficient detail to determine precisely where their hands are in 3D space. This technology is highlighted by 3D body tracking for gaming with the Xbox Kinect, which tracks people in front of the television so they can interact with Xbox game software using natural body movements. What if this technology could be expanded throughout your entire home?
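The skeletal data a Kinect-style sensor reports can be thought of as a set of named joints, each a 3D point. A minimal sketch of that idea, with a helper that asks whether a tracked hand is within reach of a known object; the joint names and the reach threshold are illustrative assumptions, not the Kinect SDK's actual API.

```python
import math
from dataclasses import dataclass

@dataclass
class Joint:
    """One tracked point of a skeletal model, in metres from the sensor."""
    name: str
    x: float
    y: float
    z: float

def distance(a: Joint, b: Joint) -> float:
    """Euclidean distance between two tracked points."""
    return math.dist((a.x, a.y, a.z), (b.x, b.y, b.z))

def hand_near(hand: Joint, target: Joint, reach: float = 0.3) -> bool:
    """True if the tracked hand is within `reach` metres of the target."""
    return distance(hand, target) <= reach
```

With per-frame joint positions like these, the home software could tell not just where you are, but which cabinet handle or control your hand is approaching.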

The future of gesture interfaces has been depicted in movies such as "Minority Report" and the "Iron Man" series. Intel has been developing a gesture interface that is integrated into some personal computers; it uses an Intel RealSense camera, described on the Intel RealSense website. No doubt more technologies are on the way.

For such 3D tracking of a person to be really useful, the software needs a map of the items in the environment around them. Although most individuals construct a mental map of their homes, augmented reality can assist by providing a means of dynamically interacting with that world. It would also decrease the mental load required to navigate and interact with items, freeing attention for the task at hand. A personal computer would build a 3D model of the home by labeling the items detected by the camera based on their position and spatial dimensions. This would be greatly aided by imaging technology that determines precisely what an item is, along with all relevant data including its spatial dimensions. The software could even associate relevant information with each object's position; for example, instructions for how an item is used, or the locations of its buttons and control switches, could be coded.
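The 3D home map described above could be sketched as a registry that stores each recognized item with its position, dimensions, and associated notes, so the software can answer "where is X" and "how is X used". The item records and coordinate convention here are invented for illustration, not the output of any real vision system.

```python
from dataclasses import dataclass

@dataclass
class HomeItem:
    name: str
    room: str
    position: tuple    # (x, y, z) in metres, in room coordinates
    dimensions: tuple  # (width, height, depth) in metres
    notes: str = ""    # e.g. usage instructions, control locations

class HomeMap:
    """A lookup table of recognized items and where they live."""

    def __init__(self):
        self._items = {}

    def add(self, item: HomeItem):
        self._items[item.name] = item

    def locate(self, name: str):
        """Return (room, position) for a named item, or None if unknown."""
        item = self._items.get(name)
        return (item.room, item.position) if item else None
```

A real system would update this map continuously from camera input; the point of the sketch is that once items carry position and metadata, both voice queries and gesture targeting become simple lookups.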


Example of Augmented Kitchen - Making Lunch

As an example of how this technology will work, consider the following scenario: You walk into the kitchen and say: "Time for Lunch".
The kitchen responds, "Understood - Time for lunch, Please specify."
You and the kitchen computer system have a brief discussion of the options, based on the computer's knowledge of the food items available and your preferences, and with a final "yes" you select a particular soup. The kitchen is equipped with 3D-tracking technology and can direct you with voice prompts. First you are directed to the particular cabinet containing the soup; you are told the soup can is located on the bottom shelf, to the right side. With precise 3D tracking of your hand, the "kitchen" (the software/hardware interface) guides you to the specified microwaveable container. You select the container and go to the kitchen counter next to the stove. The container is quickly scanned with a high-resolution camera to confirm its identity (the particular flavor of soup). The "kitchen" knows the cooking directions and tells you what they are. You unseal the plastic top. The "kitchen" knows the time and power settings for the soup and sets the microwave accordingly. You then think that crackers would be a good addition to the soup.
You speak the command, "Where are the crackers?"
The "kitchen" responds, "Do you want crackers with your soup?" and you confirm with a "yes."
The "kitchen" then tells you which cabinet the crackers are in.
You notice the crackers are running low - with a quick voice command you have the "kitchen" add crackers to the shopping list.
While you were getting the crackers and putting the box away, the "kitchen" announces that your soup is ready, and you retrieve it.
Meanwhile the "kitchen" has logged the meal for nutrition tracking and added the crackers to a future shopping list. Even with this simple lunch, there is much an augmented-reality kitchen can offer in terms of help. As this story of fixing yourself a simple lunch shows, augmented reality will be the way information is obtained and actions are carried out as you walk around your home.
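The bookkeeping side of the scenario above can be sketched as a small command handler: the "kitchen" answers location queries, logs what was eaten, and flags low-stock items for the shopping list. The inventory contents and the low-stock threshold are invented for illustration.

```python
class Kitchen:
    """Toy model of the scenario's kitchen bookkeeping."""

    def __init__(self, inventory):
        # inventory: item name -> (cabinet description, quantity on hand)
        self.inventory = inventory
        self.shopping_list = []
        self.meal_log = []

    def where_is(self, item):
        """Answer a 'where are the X?' query."""
        entry = self.inventory.get(item)
        return entry[0] if entry else "unknown"

    def take(self, item):
        """Take one unit, log it, and flag the item when stock runs low."""
        cabinet, qty = self.inventory[item]
        qty -= 1
        self.inventory[item] = (cabinet, qty)
        if qty <= 1 and item not in self.shopping_list:
            self.shopping_list.append(item)
        self.meal_log.append(item)
```

The real value in the scenario comes from the sensors and dialogue, but this shows how little extra machinery the logging and restocking steps actually require once the home knows what you picked up.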


Cythor: bringing the ultimate interface into your life

Cythor is a Trademark of Cythor © 2022