Telepresence or tele-immersion technologies allow people to attend a shared meeting without being physically present in the same location. Commercial telepresence solutions on the market today have significant drawbacks: they are very expensive, and they confine users to the area covered by stationary cameras. In this research project, we aim to design and implement a mobile tele-immersion platform that addresses these issues by using robots with embedded cameras. In our system, users can move around freely because the robots autonomously adjust their locations. We provide a geometric definition of what it means to get a good view of the user, and present control algorithms to maintain such a view.

== System Design ==

[[Image:SysFlow.png|500px|thumb|center|Flow of data and control in our system]]

Our mobile video-conferencing system consists of multiple iRobot Create robots (differential drive), each carrying an Asus Eee PC connected over a serial interface. These inexpensive netbooks are powerful enough to run a full-featured Linux operating system, and each has a built-in 1.3-megapixel camera. The laptops control the robots using the iRobot Open Interface (OI) specification. Communication between the robots and a central workstation uses an ad-hoc wireless network.
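
For concreteness, the sketch below shows how a netbook might issue a single OI drive command over the serial link, using the Start, Full, and Drive opcodes from the Open Interface specification. It is a minimal illustration rather than code from our system: the serial port path is a placeholder, and it assumes the pyserial package is available.

<pre>
# Minimal sketch: commanding one Create over the iRobot Open Interface.
# The port path is a placeholder; assumes the pyserial package.
import struct
import serial

OI_START, OI_FULL, OI_DRIVE = 128, 132, 137   # opcodes from the OI spec

ser = serial.Serial('/dev/ttyUSB0', baudrate=57600, timeout=1)

def drive(velocity_mm_s, radius_mm):
    """Drive opcode: signed 16-bit velocity (mm/s) and turn radius (mm), big-endian."""
    ser.write(struct.pack('>Bhh', OI_DRIVE, velocity_mm_s, radius_mm))

ser.write(bytes([OI_START, OI_FULL]))  # start the OI and switch to Full mode
drive(200, 0x7FFF)                     # ~0.2 m/s; 0x7FFF is the 'straight' special radius
</pre>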

The flowchart above shows the flow of data and control executed at every time step of our system: (1) estimate the user's state from vision using markers, (2) combine the per-robot estimates and classify the user's motion as either 'Rotating' or 'Linear' (translating), (3) compute optimal robot trajectories as a function of the classified motion pattern, and (4) send control commands over the wireless network for each robot to execute.
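
The skeleton below makes the four steps concrete in Python. It is a hedged sketch under simplifying assumptions: the state representation, the fusion rule (a plain average), the angular-speed threshold for the 'Rotating'/'Linear' decision, and the planner are all illustrative stand-ins, not the estimator or the optimal-trajectory computation used in our system.

<pre>
# Hedged sketch of one time step (1)-(4); the threshold, fusion rule,
# and planner are illustrative stand-ins, not the real system.
OMEGA_THRESHOLD = 0.3   # rad/s; assumed cutoff between 'Rotating' and 'Linear'

def fuse(estimates):
    """Average per-robot user estimates of (x, y, heading, v, omega)."""
    n = len(estimates)
    return tuple(sum(e[i] for e in estimates) / n for i in range(5))

def classify_motion(state):
    """(2) Classify the user's motion from the fused angular speed."""
    return 'Rotating' if abs(state[4]) > OMEGA_THRESHOLD else 'Linear'

def plan_velocity(motion, state):
    """(3) Toy stand-in for trajectory optimization: one (v, omega) command."""
    if motion == 'Rotating':
        return (0.0, state[4])   # turn in place, matching the user's turn rate
    return (0.2, 0.0)            # translate at 0.2 m/s

def step(per_robot_estimates, num_robots):
    """One control cycle: (1) fuse marker-based estimates, (2) classify,
    (3) plan per-robot commands, (4) return them for wireless dispatch."""
    state = fuse(per_robot_estimates)
    motion = classify_motion(state)
    return motion, [plan_velocity(motion, state) for _ in range(num_robots)]

# Example: two robots both observe the user turning at 0.5 rad/s.
est = [(0.0, 0.0, 0.10, 0.05, 0.5), (0.0, 0.1, 0.12, 0.04, 0.5)]
print(step(est, num_robots=2))   # -> ('Rotating', [(0.0, 0.5), (0.0, 0.5)])
</pre>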

== Human Motion Patterns and Optimal Robot Trajectories ==

== Simulations and Experiments ==