Thursday, 19.4.2012, 11:15
Room 337, Taub Building for Computer Science
With the recent development of auto-multiscopic 3D displays, which provide a 3D experience without the use of glasses, and the availability of
inexpensive hybrid depth/color cameras such as the Kinect, which provide real-time geometric and texture information,
we are a step closer to realizing unencumbered 3D teleconferencing systems. However, many challenges remain to be solved.
In this talk, I will present two teleconferencing software solutions based on the Kinect.
The first is FreeCam, an acquisition system that enables real-time novel-view synthesis, allowing
the viewer to roam a scene using a virtual camera.
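As background, novel-view synthesis from a depth/color camera rests on a simple geometric idea: unproject each depth pixel to a 3D point, move it into the virtual camera's frame, and project it back to pixels. The sketch below is a minimal illustration of that idea, not FreeCam's actual pipeline; the intrinsics and pose values are assumed for the example.

```python
# Minimal sketch of depth-based novel-view synthesis: unproject a depth
# pixel to a 3D point, transform it into a virtual camera's frame, and
# project it back to pixels. Intrinsics are illustrative, not a real
# calibration of any particular device.
import numpy as np

def unproject(u, v, depth, fx, fy, cx, cy):
    # Back-project pixel (u, v) with metric depth into camera space.
    return np.array([(u - cx) * depth / fx, (v - cy) * depth / fy, depth])

def project(p, fx, fy, cx, cy):
    # Pinhole projection of a camera-space point back to pixel coordinates.
    x, y, z = p
    return np.array([fx * x / z + cx, fy * y / z + cy])

def render_point(u, v, depth, K, R, t):
    # Reproject one depth pixel into a virtual camera with pose (R, t).
    fx, fy, cx, cy = K
    p = R @ unproject(u, v, depth, fx, fy, cx, cy) + t
    return project(p, fx, fy, cx, cy)

# A virtual camera translated 10 cm along x sees the point shifted in u.
K = (525.0, 525.0, 320.0, 240.0)  # assumed Kinect-like intrinsics
uv = render_point(320, 240, 2.0, K, np.eye(3), np.array([0.1, 0.0, 0.0]))
```

A full system would do this densely for every depth pixel and fill the resulting holes with color information, but the per-point geometry is exactly this reprojection.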
The second solution addresses a critical problem in video-mediated teleconferencing: the lack of eye contact caused by the disparity
between the locations of the subject and the camera. While this problem has been partially solved in high-end video conferencing systems
that employ expensive custom-made hardware, it has not been convincingly solved for consumer-level systems. I will present a
real-time gaze correction system that requires only a single hybrid depth/color sensor such as the Kinect.
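To give a rough sense of scale for the eye-contact problem, here is a back-of-the-envelope illustration (mine, not taken from the talk): a camera mounted at a vertical offset from the point on screen the subject is looking at induces an angular gaze error of atan(offset / distance), which for typical desktop setups is large enough to be clearly noticeable.

```python
# Back-of-the-envelope illustration (not from the talk): angular gaze error
# introduced by a camera mounted at a vertical offset from the on-screen
# point the subject is actually looking at.
import math

def gaze_error_deg(camera_offset_m, subject_distance_m):
    # Angle between the subject's gaze direction and the camera axis.
    return math.degrees(math.atan2(camera_offset_m, subject_distance_m))

# Camera 10 cm above the screen, subject 60 cm away: roughly a 9.5° error.
err = gaze_error_deg(0.10, 0.60)
```

A gaze correction system effectively has to undo this angular offset, which is why depth information about the face is so useful: it allows the image to be re-rendered as if seen from a virtual camera behind the screen.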
This system will be released shortly as a Skype plugin.
Both projects are collaborations between the Center for Graphics and Geometric Computing (CGGC) at the Technion and the
Computer Graphics Lab at ETH Zurich, and are part of the larger "Being There" Telepresence project administered by the
Institute of Media Innovation at Nanyang Technological University in Singapore.