Prototype
The current system is implemented as a handheld mini-projector retrofitted with sensors (a proximity sensor, an IR camera, a 3D e-compass, a 3D accelerometer, and tilt and gyro sensors) that track the position and orientation of the device. Since computer vision is used both for positioning and for recognizing visual tags and images, substantial computational power is required, exceeding the capability of the fastest mobile phone available at the time. Therefore, an Ultra-Mobile PC (UMPC, over 1.33 GHz with a built-in GPU) was used to handle the vision processing.
Human Factors Considerations
When designing a mixed reality device for human use, it is crucial to analyze the human factors associated with information projection.
Projection Direction from User
The device should be aware of the direction in which it is pointing, since the projected output varies with that direction. Pointing direction is the most fundamental piece of contextual information for deciding how guiding information should be presented so that the user is steered the right way. An electronic compass can detect this direction, which in turn determines what information is displayed.
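A minimal sketch of this mapping is shown below; the function names, the 30- and 150-degree arrow thresholds, and the waypoint-bearing input are illustrative assumptions, not details taken from the prototype.

```python
def relative_turn(heading_deg: float, target_bearing_deg: float) -> float:
    """Signed angle from the device's compass heading to the target
    bearing, normalized to (-180, 180]. Positive means turn right."""
    delta = (target_bearing_deg - heading_deg) % 360.0
    return delta - 360.0 if delta > 180.0 else delta

def guidance_arrow(heading_deg: float, target_bearing_deg: float) -> str:
    """Map the relative turn angle to a coarse arrow to project.
    The 30/150-degree bands are assumed thresholds for illustration."""
    turn = relative_turn(heading_deg, target_bearing_deg)
    if abs(turn) <= 30.0:
        return "straight"
    if abs(turn) >= 150.0:
        return "turn around"
    return "right" if turn > 0 else "left"

# Example: device points at 80 deg (roughly east); next waypoint bears 170 deg.
print(guidance_arrow(80.0, 170.0))  # -> "right"
```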
Projection Distance from User
Our observed typical projection distance from the user was approximately 1.7 meters. When approaching the end of a straight path (or an intersection), users may want to project farther ahead to investigate anticipated turns. It is therefore important to ask how, and what, information should be presented depending on projection distance. Detecting the projection distance requires both proximity and tilt sensors, not only to measure the distance itself but also to distinguish projection surfaces (i.e., wall or floor). The tilt sensors support the proximity sensor by measuring the projection angle from the user's hand.
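The geometry behind the wall/floor distinction can be sketched as follows: with the device held at height h and tilted down by angle θ, a beam hitting a flat floor travels roughly h / sin θ, so a proximity reading far short of that expected range suggests a wall. The device height, tilt value, and tolerance below are assumed values for illustration, not measurements from the study.

```python
import math

def expected_floor_range(device_height_m: float, tilt_down_deg: float) -> float:
    """Distance a downward-tilted beam travels before reaching the floor,
    assuming a flat floor and the device held at the given height."""
    return device_height_m / math.sin(math.radians(tilt_down_deg))

def classify_surface(measured_range_m: float,
                     device_height_m: float = 1.0,   # assumed hand height
                     tilt_down_deg: float = 30.0,    # from the tilt sensor
                     tolerance: float = 0.8) -> str:
    """Label the projection surface by comparing the proximity reading
    with the range expected for a floor hit: if the beam stops well
    short of the floor, something vertical (a wall) is in the way."""
    if tilt_down_deg <= 0.0:
        return "wall"  # a level or upward-pointing beam cannot reach the floor
    floor_range = expected_floor_range(device_height_m, tilt_down_deg)
    return "floor" if measured_range_m >= tolerance * floor_range else "wall"

# Example: device ~1 m high, tilted 30 deg down -> floor hit expected at ~2 m.
print(classify_surface(measured_range_m=1.9))  # -> "floor"
print(classify_surface(measured_range_m=0.6))  # -> "wall"
```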