Georgia Tech Interactive Media Technology Center, Steelcase

Interaction design, Javascript, UI design, Usability testing

The Magic Window as a concept has existed in several applications for a while now, starting with Brian Davidson and Jeff Wilson. The concept is essentially a live video stream controlled via gestures such as hand movement, leaning your body, and face tracking. Its current form is a web application, built to take advantage of WebRTC and other technical affordances of web browsers. It uses a Kinect to track users' bodies in motion and a high-resolution fisheye video camera to capture everything in front of the display. Together they form a powerful tool that uses the latest technology to investigate how people would interact remotely given certain affordances.

Affordances are facets of a design that seem to ask for a user's interaction without explicit instruction, the way the contours of a computer mouse or the shape of a door handle suggest how to grasp them. How people notice and use those affordances is fundamental to interaction design. For Magic Window, that interaction is currently being tested in the context of teleconferencing, in partnership with the office furniture company Steelcase.

The affordances we are currently testing are those one would expect of a normal window. For example, to see what's further to the left on the other side of the window, one would lean or move to the right to see around the corner. Likewise, one would move closer to the window to see more detail on the other side. Leaning side to side changes the perspective of the video feed. Those interactions are simulated via the Kinect's face tracking abilities (i.e. leaning) and a perspective algorithm applied to the fisheye camera's video feed.
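As a rough illustration of how such head-coupled perspective might be wired up in the browser, here is a minimal JavaScript sketch. It assumes head coordinates arrive from an external tracker service (the WebSocket endpoint and `{x, y, z}` message shape are hypothetical, not from the project) and applies a simple CSS 3D transform to the remote video element; the project's actual perspective algorithm operating on the fisheye feed would be more involved.

```javascript
// Minimal sketch: head-coupled perspective on a remote <video> element.
// Assumption (not from the project): head position arrives as JSON
// {x, y, z} in meters over a WebSocket from a Kinect-side service.

const video = document.querySelector('#remote-video');

// Map lateral/vertical head motion to a rotation of the "window",
// so leaning one way reveals more of the opposite side of the scene.
// Gains and limits here are arbitrary, illustrative values.
function applyPerspective(head) {
  const maxAngle = 25; // degrees
  const yaw   = Math.max(-maxAngle, Math.min(maxAngle, -head.x * 40));
  const pitch = Math.max(-maxAngle, Math.min(maxAngle,  head.y * 40));
  // Moving closer (smaller z) scales the feed up, like nearing a window.
  const scale = Math.min(1.6, Math.max(1.0, 1.5 / head.z));
  video.style.transform =
    `perspective(800px) rotateY(${yaw}deg) rotateX(${pitch}deg) scale(${scale})`;
}

const tracker = new WebSocket('ws://localhost:8181'); // hypothetical endpoint
tracker.onmessage = (event) => {
  const head = JSON.parse(event.data); // expected shape: {x, y, z}
  applyPerspective(head);
};
```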
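The WebRTC side mentioned above can be sketched just as briefly. The snippet below shows only the standard capture-and-offer flow; the signaling channel that carries the offer to the remote peer is omitted and would be specific to the application.

```javascript
// Minimal sketch of the capture side: grab the camera with getUserMedia
// and hand its tracks to an RTCPeerConnection. Exchanging the offer/answer
// and ICE candidates (signaling) is app-specific and omitted here.
async function startCapture() {
  const stream = await navigator.mediaDevices.getUserMedia({
    video: { width: 1920, height: 1080 }, // resolution is illustrative
    audio: true,
  });
  const pc = new RTCPeerConnection();
  stream.getTracks().forEach((track) => pc.addTrack(track, stream));
  const offer = await pc.createOffer();
  await pc.setLocalDescription(offer);
  // ...send the offer to the remote peer via the app's signaling channel
  return pc;
}
```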