MeBIT - Merging TV Broadcast and Information Technology

(Towards Interactive TV)

This page describes a collaboration between SICS and MediaMakeriet, together with SVT. The collaboration was divided into two projects that took place during autumn 2000 and autumn 2001, and each project had both a commercial and a research aspect. The collaboration was initiated by MediaMakeriet, who wanted to develop an easy-to-use, low-cost virtual studio system. The target was smaller studios and channels with limited resources and limited studio space, which might not be able to afford the existing systems. With a virtual studio system they can "construct" a new studio with a single mouse click, increasing the number of ways they can present themselves and their programmes. The research aspect of the projects centred on exploring the new ways of involving viewers in a programme that virtual sets make possible. By connecting the computers that render the virtual part of the set to the Internet, a TV programme can more or less be turned into a shared environment in which the viewers take a more active part.

With the help of this system, it will be possible to produce TV programmes where the traditional studio is replaced with a real-time, computer-generated three-dimensional picture. There are many advantages to this approach: radically reduced time to construct and tear down a studio set, easier and less time-consuming lighting of the set, lower costs for storing studio sets, and so on. Furthermore, some sets are much cheaper and easier to model in a computer than to construct physically. Once finalised, the system will allow the whole appearance of a TV programme to be changed simply by selecting a new set of files in a computer. This switch of appearance takes a matter of seconds, as opposed to building several studios to be used in parallel, which is what traditional TV production would require to achieve the same effect.

In a normal setup the system uses three relative rotary encoders (e.g. from Leine & Linde). Two are mounted on the camera stand and measure pan and tilt. The third sensor is mounted on a special holder near the camera lens and measures the rotation of the lens. The sensor values are sent to a proprietary MediaMakeriet box that turns up to six high-speed relative sensor values into an easily readable absolute form, accessible over RS-232. At the other end of the RS-232 cable is a computer that uses the sensor values to recreate the optical model needed to render the 3D scene as if it were seen by the camera, given its position, orientation, zoom, etc. The computer-generated picture is then mixed with the image coming from the camera using chroma-keying technology to produce the final picture that is broadcast.
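As a minimal sketch of the first step in this pipeline, the code below turns absolute encoder readings into the pan/tilt/zoom state a renderer would need. The encoder resolution, the zero offsets and the function names are assumptions for illustration; the real values depend on the Leine & Linde sensors and the MediaMakeriet box.

```python
# Hypothetical encoder resolution -- the real figure depends on the sensors.
COUNTS_PER_REV = 10000  # encoder counts per full revolution (assumed)

def counts_to_degrees(counts, zero_offset=0):
    """Translate an absolute encoder reading into an angle in degrees."""
    return ((counts - zero_offset) % COUNTS_PER_REV) * 360.0 / COUNTS_PER_REV

def camera_state(pan_counts, tilt_counts, zoom_counts, offsets=(0, 0, 0)):
    """Build the pan/tilt/zoom state the renderer needs from raw readings."""
    return {
        "pan_deg": counts_to_degrees(pan_counts, offsets[0]),
        "tilt_deg": counts_to_degrees(tilt_counts, offsets[1]),
        # The raw zoom value is mapped to a field of view by a separate,
        # per-lens calibration curve (see the calibration section).
        "zoom_raw": zoom_counts - offsets[2],
    }

state = camera_state(2500, 1250, 4000)
print(state["pan_deg"])   # 2500/10000 of a turn -> 90.0 degrees
```

The zero offsets stand in for the sensor endpoints found during the per-session calibration described below; everything else in the optical model (position, orientation, field of view) builds on these three angles.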

One of the main problems in the projects was to find a quick and easy way of calibrating the sensors and determining internal parameters such as lens distortion and the camera's field of view across the zoom interval. Finding these values, and using or compensating for them, is a common problem in computer vision research. Our approach was to find an acceptable solution in two stages. The first, and most difficult, stage is to find the non-linear relation between lens rotation (and hence zoom sensor measurements) and field of view. This is done once per lens, and the resulting curve is stored in a file. During system start-up the data in the file corresponding to the active lens is used to construct a spline, making it possible to translate lens rotation into camera field of view. The second calibration stage has to be performed each time the system is started, but is on the other hand much simpler. This stage calibrates the external parameters of the setup, such as the pan, tilt and zoom sensor endpoints, as well as the camera's orientation (and possibly position) in the studio. In this stage the end-user is guided through the process by a small number of dialogue windows.
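The per-lens curve can be sketched as a lookup table mapped through interpolation. The table values below are invented for illustration, and plain piecewise-linear interpolation is used instead of a spline to keep the sketch self-contained; the real system builds a spline from measured per-lens data.

```python
import bisect

# Hypothetical per-lens calibration table: (zoom sensor value, field of view
# in degrees). In the real system this curve is measured per lens and stored
# in a file, then turned into a spline at start-up.
ZOOM_TABLE = [(0, 55.0), (2000, 38.0), (4000, 24.0), (6000, 14.0), (8000, 8.0)]

def zoom_to_fov(zoom_value):
    """Map a zoom sensor reading to a field of view, clamping at the ends."""
    xs = [z for z, _ in ZOOM_TABLE]
    ys = [f for _, f in ZOOM_TABLE]
    if zoom_value <= xs[0]:
        return ys[0]
    if zoom_value >= xs[-1]:
        return ys[-1]
    i = bisect.bisect_right(xs, zoom_value)
    # Linear blend between the two surrounding calibration points.
    x0, x1 = xs[i - 1], xs[i]
    y0, y1 = ys[i - 1], ys[i]
    t = (zoom_value - x0) / (x1 - x0)
    return y0 + t * (y1 - y0)

print(zoom_to_fov(3000))  # halfway between 38.0 and 24.0 -> 31.0
```

A spline, as used in the actual system, would give a smooth first derivative across the table points, which matters when the rendered field of view must track a continuously turning zoom ring without visible stepping.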

In the project we also developed a simple SMS centre (SMS-C) that made it possible to test how a limited form of user interaction, in the form of sending SMS messages, might be used in a virtual-studio TV programme. Using predefined SMS messages we could alter 3D objects in the virtual set (their position, size and rotation). We also experimented with a more interactive way of involving the viewers (or, in this case, participants). By downloading a browser, viewers could enter the 3D virtual set environment, represented as 3D objects in some form, and take an active part in the TV show.

The following images are snapshots from a test at SVT, showing the presenter moving around a virtual weather map. The presentation ends with an animated virtual object indicating a heightened risk of forest fires.

For further information, contact Anders Wallberg.