WO1994007327A1 - Method and apparatus for on-screen camera control in video-conference equipment - Google Patents


Info

Publication number
WO1994007327A1
Authority
WO
WIPO (PCT)
Prior art keywords
camera
cursor
video image
video
generating
Application number
PCT/US1993/007948
Other languages
French (fr)
Inventor
Margaret M. Marasovich
Gordon D. Ford
Michael G. Duncan
Pamela P. Saegert
Original Assignee
Rolm Company
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Application filed by Rolm Company filed Critical Rolm Company
Publication of WO1994007327A1 publication Critical patent/WO1994007327A1/en

Classifications

    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N7/00 Television systems
    • H04N7/14 Systems for two-way working
    • H04N7/15 Conference systems
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N23/00 Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N23/60 Control of cameras or camera modules
    • H04N23/66 Remote control of cameras or camera parts, e.g. by remote control devices

Definitions

  • This invention relates to video conferencing systems.
  • Videophones and video conferencing systems are becoming increasingly popular. Through the use of a video conferencing system, conferees at a variety of locations can have meetings and pass both video and audio information over the public telephone lines.
  • Typically, a video conferencing system will include one or more cameras, microphones, speakers and displays disposed at each conference location. By transmitting control information over the telephony lines, conferees at any location can control the cameras, microphones, speakers and displays both at their own location and at the locations of the other parties. For example, by operation of a control panel, a conferee at location "A" can pan the camera at location "B" and then zoom in on a particular person or object.
  • An example of a prior art video conferencing system is the PictureTel System 4000 (manufactured by PictureTel Corporation of Danvers, Massachusetts).
  • In video conferencing systems such as the System 4000, conference control is provided by way of a control box which usually rests on a table.
  • Camera control is performed by repeatedly pressing a button on the control box, representing a direction (up, down, right, left), or by physically moving the camera to point in the desired direction.
  • Another common method for controlling the position of a camera in video conference equipment is to manually move the camera so that it points in the desired direction. This requires sitting within arm's reach of the camera or getting up during a conference to adjust the camera. Further, it takes considerable attention to physically turn the camera and may, at least momentarily, obscure the image during adjustment.
  • This invention provides an intuitive and natural means for controlling a camera being used as part of video-conference equipment.
  • A video conference participant controls the camera by using a pointing device, such as a mouse, to position a cursor on the video display screen.
  • In one embodiment, the user can control the panning and tilting of the camera by positioning the cursor on one of four arrowheads located on the four edges of the video display screen.
  • The arrowheads are outlines, transparent inside the lines, overlaid onto the video conference image.
  • When a user moves the cursor into a predefined active area surrounding the arrowheads, the entire arrowhead changes appearance (e.g. to a bright color) to indicate that it is active.
  • Once the cursor is in the active area, the user can press a button on the pointing device to move the camera in the desired direction. If the user holds down the button, the camera continues to move in the chosen direction. If the user clicks the button, the camera moves in short increments in the chosen direction.
  • This clicking method allows the user to make small adjustments in the camera's position, whereas holding down the button on the pointing device enables the user to make larger changes in camera position quickly and efficiently.
  • Using this technique of camera control, a participant in a video conference can point the cursor at the top arrow to move the camera up, the bottom arrow to move the camera down, and so on. This action feels much the same as, for instance, pointing the lens of a video camera at an object to be video-taped.
  • FIG. 1 shows the video portion of a video conferencing system according to an embodiment of the present invention
  • FIG. 2 is an illustration of a display screen having a graphics overlay with directional control arrows according to an embodiment of the present invention
  • FIG. 3 is an illustration of the display screen of FIG. 2 after a directional arrow has been selected
  • FIG. 4 is a flowchart illustrating the operation of the processor of FIG. 1 in performing the video graphics overlay functions
  • FIG. 5 is a flowchart illustrating the operation of a re-center and zoom operation of the processor of FIG. 1 according to an embodiment of the present invention
  • FIG. 6 is a flowchart illustrating the operation of re-center and zoom operations of the processor of FIG. 1 according to an alternative embodiment of the present invention.
  • This invention provides the means to adjust a camera with minimal distraction to the participants.
  • By using a mouse, a participant can easily move the cursor towards the appropriate arrow. Once the cursor enters the active area for an arrow, the arrowhead lights up to show it is active.
  • As soon as the cursor is in an active area, the user can either cause the camera to move continuously by holding down the mouse button or adjust the camera position in small increments by pressing and releasing the mouse button until the camera reaches the desired spot. Either method can be done without taking one's eyes off the screen.
  • Advantageously, the action of pointing the cursor in the direction one wishes the camera to move is a natural and intuitive means of controlling the camera.
  • Each conference station 100, 101 includes a conventional television camera 104 (104A in conference station 101) and its associated control and positioning motors 106.
  • Each station also includes a conventional processor 108 having a pointing device 110 such as a mouse.
  • The processor 108 is connected to a video graphics interface 112 which is, in turn, connected to a video mixer 114.
  • The video graphics interface 112 converts digitally encoded graphics display data generated by the processor 108 into analog video signals.
  • The processor 108 and the video graphics interface can be embodied, for example, as an IBM compatible 80X86 based computer with a Video Graphics Adaptor (VGA) card.
  • Alternatively, the video conferencing system of FIG. 1 could, for example, be embodied by modifying the processor firmware of an existing PictureTel System 4000 as described herein, and by providing the System 4000 processor with the pointing device hardware and handlers.
  • Each station also includes a conventional video coder/decoder (Video CODEC) 116 of a type having an internal video data buffer.
  • The Video CODEC 116 is connected to the communications network by way of a digital telephony communications link 118.
  • The Video CODEC 116 sends and receives encoded digital video information from the digital telephony communications network 102 and converts the information into analog video signals.
  • A video mixer 114 is connected to both the Video CODEC 116 and the Video Graphics Interface 112.
  • The video mixer receives the analog video signals from the Video CODEC 116 and mixes them with the analog video signals from the video graphics interface 112.
  • Thus, the video mixer generates a combined video image comprising the conference image from the Video CODEC 116 overlaid with the graphics video generated by the video graphics interface 112. This combined video image is displayed on a conventional video display 115.
  • The Video CODEC 116 is also connected to the processor 108 and the television camera 104 and its positioning motors 106.
  • The Video CODEC converts digital camera control information received by way of the telephony network 102 or the processor 108 into analog camera positioning signals. These signals are sent to the camera positioning motors 106 which control the position of the camera (e.g. pan and tilt) and the camera positive and negative zoom functions.
  • The Video CODEC 116 also receives television signals from the camera 104 and converts them into digital video signals for transmission over the communications network. Either processor 108 can control the camera within its own video conferencing station and, through the Communication Network 102, it can also control the far end (remote) camera.
  • For example, the processor 108 in the first video conferencing station 100 can control its own camera 104 as well as the camera 104A in the second video conferencing station 101.
  • The processor 108 controls which video conferencing station 100, 101 is selected as a source/destination for the Video CODEC 116 by sending the Video CODEC 116 appropriate routing control information.
  • The user determines which conference station's image is to be displayed (and informs the processor 108) by way of a selection on a menu bar.
  • The Video CODEC 116 sends the camera control information generated by the processor 108 to the conferencing station whose image is currently being displayed.
  • The appearance of the screen of the video display according to an embodiment of the present invention is illustrated in FIGs. 2 and 3.
  • The processor 108 generates camera control arrows 202A-D which are overlaid on a video conference image 204 (received from the Video CODEC 116).
  • The dotted lines surrounding the arrowheads 202A-D do not appear on the screen. They are included here to indicate the approximate position and size of the active areas.
  • FIG. 3 illustrates the change in a camera control arrowhead once the cursor enters the active area around the arrowhead.
  • When a user moves the mouse, the cursor (e.g. cross-hairs) is correspondingly moved on the screen.
  • When the processor detects that the cursor has been moved within the active area around an arrowhead, the processor highlights that arrowhead and turns off all of the other arrows.
  • When the processor detects depression of a button on the pointing device, it generates camera positioning control signals which it sends to the Video CODEC 116. Depending on whether the system is set up for remote and/or local camera control, these control signals are sent either to the communications network (where they are used to control the far end camera position motors at the target remote conference station) or to the local camera positioning motors.
  • FIG. 4 is a flow chart of the graphic overlay control software for the processor of FIG. 1.
  • In step 402, the processor determines the cursor position on the Video Display 115. This position is controlled by a user by way of the mouse 110. As is conventional, the processor constantly monitors the mouse and displays the cursor at the appropriate position on the video display screen.
  • In steps 404 through 410, the processor determines whether the cursor is within one of the active regions for the displayed directional arrows and, if so, determines in which arrow's region the cursor is displayed.
  • In steps 412-418, if the cursor is in the region of one of the arrows, the processor highlights that arrow and turns off highlighting on all of the others.
  • In steps 420-426, the processor determines if the pointing device button is being held down (a single button mouse will be assumed here, although the processor could just as readily look for depression of a particular button on a multi-button mouse). If so, in steps 428-434, the camera is moved one step in the direction of the arrow. If the pointing device button is not being pressed, the processor returns to step 402. If, in steps 404-410, it is determined that the cursor is not within one of the active regions, in step 436, the highlighting on all arrows is turned off.
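The polling loop of steps 402-436 can be sketched as follows. This is an illustrative reconstruction, not the patent's firmware: the active-region coordinates, the `highlight` and `move_camera_one_step` callbacks, and the cursor source are all hypothetical stand-ins.

```python
# Illustrative sketch of the FIG. 4 overlay control loop (steps 402-436).
# The arrow regions, callbacks and coordinates below are assumptions, not
# taken from the patent.

ARROWS = {
    "up":    {"region": (140, 0, 180, 30),    "step": (0, 1)},
    "down":  {"region": (140, 210, 180, 240), "step": (0, -1)},
    "left":  {"region": (0, 105, 30, 135),    "step": (-1, 0)},
    "right": {"region": (290, 105, 320, 135), "step": (1, 0)},
}

def arrow_under_cursor(x, y):
    """Steps 404-410: find which arrow's active region contains the cursor."""
    for name, arrow in ARROWS.items():
        x0, y0, x1, y1 = arrow["region"]
        if x0 <= x <= x1 and y0 <= y <= y1:
            return name
    return None

def service_overlay(cursor, button_down, highlight, move_camera_one_step):
    """One pass through the FIG. 4 loop.

    cursor: (x, y) cursor position (step 402)
    button_down: pointing-device button state (steps 420-426)
    highlight: callback given the arrow name to highlight, or None (steps 412-418, 436)
    move_camera_one_step: callback given a (pan, tilt) step (steps 428-434)
    """
    name = arrow_under_cursor(*cursor)
    highlight(name)  # one arrow highlighted, the rest (or all) turned off
    if name is not None and button_down:
        move_camera_one_step(ARROWS[name]["step"])
```

Calling `service_overlay` repeatedly while the button is held reproduces the continuous motion described above; a single click yields one incremental step.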
  • This method of camera control can also be implemented using a keyboard with directional arrows to position a cursor on the displayed arrowheads or with other pointing devices such as a trackball instead of a mouse.
  • Further, a touch-screen monitor can be used to display the video image. In that case, a user would touch the screen in the active area surrounding the arrowhead in order to move the camera.
  • The camera control arrowhead display can be implemented in several ways. For example, a pure hardware implementation can be used where dedicated hardware in the video circuitry places the camera control arrows at fixed places on the video display. Another approach is to provide the processor with software which writes data directly into the screen buffer of the Video CODEC in order to display the arrowheads on the video screen. A third approach is to provide the processor with a video graphics overlay which impresses the camera control arrows over the standard video display by way of the video mixer.
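As an illustration of the overlay approaches above, the mixer's combining of the graphics plane with the conference video can be modeled as keying: wherever the graphics plane is non-transparent, its pixel replaces the video pixel. The transparency key value and the tiny frames are assumptions for illustration only.

```python
# Sketch of graphics-over-video keying as performed by the video mixer.
# TRANSPARENT and the frame representation (lists of pixel values) are
# illustrative assumptions, not the patent's hardware.

TRANSPARENT = 0  # hypothetical key value meaning "show the video underneath"

def mix(video_frame, graphics_frame):
    """Return the combined frame: graphics keyed over video, row by row."""
    return [
        [g if g != TRANSPARENT else v for v, g in zip(vrow, grow)]
        for vrow, grow in zip(video_frame, graphics_frame)
    ]
```

An arrowhead drawn as non-transparent outline pixels in the graphics plane thus appears over the conference image, while its transparent interior lets the video show through, matching the outline arrowheads described above.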
  • A similar method can be used to control the zoom function.
  • One such method is to use a three button pointing device.
  • One button on the pointing device controls camera pan and tilt in accordance with the selected direction arrow (as previously described).
  • A second button on the pointing device causes the camera to zoom in.
  • The third button on the pointing device causes the camera to zoom out.
  • The zoom-in and zoom-out buttons can be used to cause the camera to zoom in or out around a specific point selected by the pointing device.
  • Depression of a first button on a three button pointing device causes the processor to generate pan and tilt control data that will cause the active camera (the camera whose image is being displayed) to re-center on a selected point (the point on which the cursor is located).
  • Depression of the second or third button on the pointing device activates, respectively, a combined re-center and zoom-in or re-center and zoom-out operation.
  • The user positions the cursor (by using the pointing device) in the center of the desired image and then presses the appropriate button (zoom in or zoom out).
  • The operation of the re-center and zoom control software in the processor 108 of FIG. 1, according to the above-described embodiment, is illustrated in FIG. 5.
  • In step 502, the processor determines if any of the three pointing device buttons is being depressed. If none of the buttons is being depressed, the processor returns to step 502. If any one of the buttons is being depressed, in step 504, the processor 108 determines the X,Y coordinates of the cursor position. Next, in step 506, the processor calculates the difference in position (delta X and delta Y) between the cursor position and the center of the displayed conference image. Then, in step 508, the processor uses the delta X and delta Y values to generate the appropriate pan and tilt control information to cause the image to be re-centered around the cursor location and provides this control information to the Video CODEC 116.
  • The pan and tilt control information can be determined by the use of a look-up table wherein each entry in the table corresponds to an appropriate number and direction of pan and tilt steps for a given delta X and delta Y.
  • The specific table values can be precoded into the look-up table if the camera/motor types at each station are known, or can be exchanged by each station providing the other with its camera/motor control parameters over the communication link.
  • In steps 510 and 512, the processor determines whether one of the zoom-in or zoom-out buttons is being depressed. If not, the processor returns to step 502. If in step 510 it is determined that the zoom-in button is being depressed, in step 514 the processor generates control signals for one zoom-in step (stepping control of the zoom camera motors is assumed here), provides them to the Video CODEC 116 and returns to step 502. Similarly, if in step 512 it is determined that the zoom-out button is being depressed, in step 516 the processor generates control signals for one zoom-out step, provides them to the Video CODEC 116 and returns to step 502. Just as with the highlighted arrow pan/tilt control (previously described), if the zoom-in or zoom-out button is held down continuously, the processor will, correspondingly, continuously generate zoom-in or zoom-out control step signals until the button is released.
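The FIG. 5 flow (steps 502-516) can be sketched as a single handler. The linear steps-per-pixel conversion below is a hypothetical stand-in for the pan/tilt look-up table described in the text, and the screen size and button names are assumptions.

```python
# Sketch of the FIG. 5 re-center-and-zoom handler (steps 502-516).
# SCREEN_W/SCREEN_H, STEPS_PER_PIXEL and the button names are illustrative
# assumptions; STEPS_PER_PIXEL stands in for the look-up table.

SCREEN_W, SCREEN_H = 320, 240
STEPS_PER_PIXEL = 0.1  # assumed motor steps per pixel of cursor offset

def on_button(button, cursor):
    """Return (pan_steps, tilt_steps, zoom_steps) for one button press.

    button: "recenter", "zoom_in" or "zoom_out"; cursor: (x, y).
    Every press first re-centers on the cursor (steps 504-508); the zoom
    buttons then add one zoom step in or out (steps 510-516).
    """
    dx = cursor[0] - SCREEN_W // 2           # delta X (step 506)
    dy = cursor[1] - SCREEN_H // 2           # delta Y (step 506)
    pan = round(dx * STEPS_PER_PIXEL)        # step 508: pan control info
    tilt = round(dy * STEPS_PER_PIXEL)       # step 508: tilt control info
    zoom = {"recenter": 0, "zoom_in": 1, "zoom_out": -1}[button]
    return pan, tilt, zoom
```

In the real system the (delta X, delta Y) pair would index the look-up table of pan/tilt step counts rather than a fixed scale factor, since the mapping depends on the camera/motor types and the current zoom.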
  • Alternatively, the zoom-in and zoom-out control can be de-coupled from the centering controls.
  • This embodiment is illustrated in FIG. 6.
  • In step 502, the processor determines if any of the three pointing device buttons is being depressed. If none of the buttons is being depressed, the processor returns to step 502. If any one of the buttons is being depressed, in step 602 the processor determines if the "re-center" button (e.g. the far left button) has been depressed.
  • If so, the processor determines the X,Y coordinates of the cursor position in step 504, calculates the delta X, delta Y values relative to the center of the displayed image in step 506 and then, in step 508, generates appropriate pan and tilt control signals to re-center the image. If the re-center button was not depressed, the processor performs step 510, in which it determines whether the zoom-in button (e.g. the center button on the pointing device) is being depressed. If so, in step 514 the processor generates control signals for one zoom-in step (stepping control of the zoom camera motors is assumed here), provides them to the Video CODEC 116 and returns to step 502.
  • Otherwise, in step 516, the processor generates control signals for one zoom-out step, provides them to the Video CODEC 116 and returns to step 502. It is noted that the processor can determine that the zoom-out button was depressed since the pointing device is known to be a three button device and the other two buttons were eliminated as choices in steps 602 and 510.
  • In this embodiment, the user can re-center the image around a selected point, zoom in, or zoom out by depressing a single button on the pointing device.
  • The functions of re-center, zoom-in and zoom-out are each accomplished independently by a separate button.
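The decoupled FIG. 6 variant reduces to a three-way dispatch on the button (steps 602, 510, 512). The button names ("left"/"middle"/"right"), the default screen center and the returned command tuples below are illustrative assumptions.

```python
# Sketch of the FIG. 6 dispatch: re-center is tied to one button only, and
# the zoom buttons step the zoom motor without touching pan/tilt.
# Button names and command tuples are assumed, not from the patent.

def dispatch(button, cursor, center=(160, 120)):
    """Map a button press to one independent command (steps 602, 510, 512)."""
    if button == "left":                  # step 602: re-center button
        dx = cursor[0] - center[0]        # delta X (step 506)
        dy = cursor[1] - center[1]        # delta Y (step 506)
        return ("recenter", dx, dy)       # steps 504-508
    if button == "middle":                # step 510: zoom-in button
        return ("zoom", 1)                # step 514: one zoom-in step
    return ("zoom", -1)                   # steps 512/516: zoom-out, by elimination
```

The final branch mirrors the note above: with a three button device, once the re-center and zoom-in buttons are ruled out, the remaining press must be zoom-out.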
  • When these methods are used in conjunction with the control arrows, the processor first determines if the cursor is within one of the active regions in or around an arrow. If so, the method of FIG. 4 is performed. If not, the method of either FIG. 5 or FIG. 6 is performed (whichever has been implemented).
  • In another embodiment, the user can use the pointing device to draw a selection border (preferably rectangular) around the desired image.
  • The processor software then changes the camera position so as to re-center the image (around the center of the selected area) and changes the zoom factor to cause the selected image to fill the display screen.
  • The processor calculates the ratio (d1/d2) of the diagonal (d1) of the full displayed image over the diagonal (d2) of the selected image.
  • Control software can be used to ensure that the relative dimensions of the selected area are of the same proportion as the displayed image (alternatively, the processor can just re-center and use the diagonal data as is). Both the zoom and the pan and tilt control information can be determined by the use of a look-up table.
  • Each entry in the table corresponds to an appropriate number of zoom steps for a given d1/d2 ratio.
  • The specific table values can be precoded into the look-up table if the zoom motor types at each station are known, or can be exchanged by each station providing the other with its zoom motor control parameters over the communication link. As described with respect to the embodiments of FIG. 5 and FIG. 6, this method of performing a re-center and zoom operation can be used in conjunction with the control arrows of FIGs. 2-4.
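The selection-border operation can be sketched as below: re-center on the middle of the drawn rectangle, compute the diagonal ratio d1/d2, and translate it into zoom steps. The small threshold table stands in for the zoom look-up table described above; its values, the screen size and the rectangle format are all assumptions.

```python
import math

# Sketch of the selection-border re-center and zoom. ZOOM_TABLE is an
# illustrative stand-in for the d1/d2 look-up table in the text; its
# thresholds and step counts are assumed, as are the screen dimensions.

SCREEN_W, SCREEN_H = 320, 240
ZOOM_TABLE = [(1.5, 1), (2.0, 2), (3.0, 3), (4.0, 4)]  # (min d1/d2 ratio, zoom-in steps)

def recenter_and_zoom(rect):
    """rect = (x0, y0, x1, y1), the border drawn with the pointing device.

    Returns (dx, dy, zoom_steps): pixel offsets to re-center on, and the
    number of zoom-in steps for the selection to fill the screen.
    """
    x0, y0, x1, y1 = rect
    # Re-center on the middle of the selected area.
    dx = (x0 + x1) // 2 - SCREEN_W // 2
    dy = (y0 + y1) // 2 - SCREEN_H // 2
    # d1/d2: full-image diagonal over selection diagonal.
    d1 = math.hypot(SCREEN_W, SCREEN_H)
    d2 = math.hypot(x1 - x0, y1 - y0)
    ratio = d1 / d2
    steps = 0
    for threshold, n in ZOOM_TABLE:  # last threshold not exceeding the ratio wins
        if ratio >= threshold:
            steps = n
    return dx, dy, steps
```

As with the pan/tilt table, a deployed system would precode or exchange the ratio-to-steps entries per zoom motor type rather than use fixed thresholds.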
  • The processor may be provided with additional Graphic User Interface (GUI) software which enables a user to control various conference features (e.g. far/near camera select, volume control ...) by way of menu bars displayed along the periphery of the video display screen.
  • These menus can be implemented such that they can be hidden or displayed under user control. When the menus are displayed, the video image is compressed into the slightly smaller space remaining on the display screen.

Abstract

A video conferencing system in which a participant may change the camera position or zoom by using a pointing device, such as a mouse, to position a cursor on the video display screen. In one embodiment, the user can control the panning and tilting of the camera by positioning the cursor on one of four arrowheads located on the four edges of the video display screen. The arrowheads are outlines, transparent inside the lines, overlaid onto the video conference image. When a user moves the cursor into a predefined active area surrounding the arrowheads, the entire arrowhead changes appearance (e.g. to a bright color) to indicate that it is active. Once the cursor is in the active area, the user can press a button on the pointing device to move the camera in the desired direction. If the user holds down the button, the camera continues to move in the chosen direction. If the user clicks the button, the camera moves in short increments in the chosen direction. This clicking method allows the user to make small adjustments in the camera's position, whereas holding down the button on the pointing device enables the user to make larger changes in camera position quickly and efficiently. Using this technique of camera control, a participant in a video conference can point the cursor at the top arrow to move the camera up, the bottom arrow to move the camera down, and so on.

Description

METHOD AND APPARATUS FOR ON-SCREEN CAMERA CONTROL IN VIDEO-CONFERENCE EQUIPMENT
I. Background of the Invention
a. Field of the Invention
This invention relates to video conferencing systems.
b. Related Art
Videophones and video conferencing systems are becoming increasingly popular. Through the use of a video conferencing system, conferees at a variety of locations can have meetings and pass both video and audio information over the public telephone lines. Typically, a video conferencing system will include one or more cameras, microphones, speakers and displays disposed at each conference location. By transmitting control information over the telephony lines, conferees at any location can control the cameras, microphones, speakers and displays both at their own location and at the locations of the other parties. For example, by operation of a control panel, a conferee at location "A" can pan the camera at location "B" and then zoom in on a particular person or object. An example of a prior art video conferencing system is the PictureTel System 4000 (manufactured by PictureTel Corporation of Danvers, Massachusetts) . In video conferencing systems such as the System 4000, conference control is provided by way of a control box which usually rests on a table. Camera control is performed by repeatedly pressing a button on the control box, representing a direction (up, down, right, left) , or by physically moving the camera to point in the desired direction.
Although the above-described system provides a functional solution to camera control, using a button on a box located on a table in front of the participant requires the participant to take his eyes off the image on the screen to locate the button, and then to repeatedly press the button in order to move the camera in the desired direction. If the participant moves the camera too far, say, to the left, he may again need to look down at the button box to find the button for moving the camera to the right. These small adjustments can require repeated actions distracting to both the person adjusting the camera as well as to the participants at the remote end of the conference.
Another common method for controlling the position of a camera in video conference equipment is to manually move the camera so that it points in the desired direction. This requires sitting within arm's reach of the camera or getting up during a conference to adjust the camera. Further, it takes considerable attention to physically turn the camera and may, at least momentarily, obscure the image during adjustment.
II. Summary of the Invention
This invention provides an intuitive and natural means for controlling a camera being used as part of video-conference equipment. A video conference participant controls the camera by using a pointing device, such as a mouse, to position a cursor on the video display screen.
In one embodiment, the user can control the panning and tilting of the camera by positioning the cursor on one of four arrowheads located on the four edges of the video display screen. The arrowheads are outlines, transparent inside the lines, overlaid onto the video conference image. When a user moves the cursor into a predefined active area surrounding the arrowheads, the entire arrowhead changes appearance (e.g. to a bright color) to indicate that it is active. Once the cursor is in the active area, the user can press a button on the pointing device to move the camera in the desired direction. If the user holds down the button, the camera continues to move in the chosen direction. If the user clicks the button, the camera moves in short increments in the chosen direction. This clicking method allows the user to make small adjustments in the camera's position, whereas holding down the button on the pointing device enables the user to make larger changes in camera position quickly and efficiently. Using this technique of camera control, a participant in a video conference can point the cursor at the top arrow to move the camera up, the bottom arrow to move the camera down, and so on. This action feels much the same as, for instance, pointing the lens of a video camera at an object to be video-taped.
III. Brief Description of the Drawings
FIG. 1 shows the video portion of a video conferencing system according to an embodiment of the present invention;
FIG. 2 is an illustration of a display screen having a graphics overlay with directional control arrows according to an embodiment of the present invention;
FIG. 3 is an illustration of the display screen of FIG. 2 after a directional arrow has been selected;
FIG. 4 is a flowchart illustrating the operation of the processor of FIG. 1 in performing the video graphics overlay functions;
FIG. 5 is a flowchart illustrating the operation of a re-center and zoom operation of the processor of FIG. 1 according to an embodiment of the present invention;
FIG. 6 is a flowchart illustrating the operation of re-center and zoom operations of the processor of FIG. 1 according to an alternative embodiment of the present invention.
Like reference numerals appearing in more than one figure represent like elements.
IV. Detailed Description of the Preferred Embodiment
This invention provides the means to adjust a camera with minimal distraction to the participants. By using a mouse, a participant can easily move the cursor towards the appropriate arrow. Once the cursor enters the active area for an arrow, the arrowhead lights up to show it is active. As soon as the cursor is in an active area, the user can either cause the camera to move continuously by holding down the mouse button or he can adjust the camera position in small increments by pressing and releasing the mouse button until the camera reaches the desired spot. Either method can be done without taking one's eyes off the screen. Advantageously, the action of pointing the cursor in the direction one wishes the camera to move is a natural and intuitive means of controlling the camera. FIG. 1 shows the video portion of a video conferencing system according to an embodiment of the present invention. Two conference stations 100, 101 are illustrated, although it should be understood that more stations can be connected into the video conference by way of the digital telephony communication network 102. Each conference station 100, 101 includes a conventional television camera 104 (104A in conference station 101) and its associated control and positioning motors 106. Each station also includes a conventional processor 108 having a pointing device 110 such as a mouse. The processor 108 is connected to a video graphics interface 112 which is, in turn, connected to a video mixer 114. The video graphics interface 112 converts digitally encoded graphics display data generated by the processor 108 into analog video signals. The processor 108 and the video graphics interface can be embodied, for example, as an IBM compatible 80X86 based computer with a Video Graphics Adaptor (VGA) card. Alternatively, the video conferencing system of FIG.
1 could be, for example, embodied by modifying the processor firmware of an existing PictureTel System 4000 as described herein, and by providing the System 4000 processor with the pointing device hardware and handlers.
Each station also includes a conventional video coder/decoder (Video CODEC) 116 of a type having an internal video data buffer. The Video CODEC 116 is connected to the communications network by way of a digital telephony communications link 118. The Video CODEC 116 sends and receives encoded digital video information from the digital telephony communications network 102 and converts the information into analog video signals. A video mixer 114 is connected to both the Video Codec 116 and the Video Graphics Interface 112. The video mixer receives the analog video signals from Video CODEC 116 and mixes them with the analog video signals from the video graphics interface 112. Thus, the video mixer generates a combined video image comprising the conference image from the Video CODEC 116 overlaid with the graphics video generated by the video graphics interface 112. This combined video image is displayed on a conventional video display 115.
The Video CODEC 116 is also connected to the processor 108 and the television camera 104 and its positioning motors 106. The Video CODEC converts digital camera control information received by way of the telephony network 102 or the processor 108 into analog camera positioning signals. These signals are sent to the camera positioning motors 106 which control the position of the camera (e.g. pan and tilt) and the camera positive and negative zoom functions. The Video CODEC 116 also receives television signals from the camera 104 and converts them into digital video signals for transmission over the communications network. Either processor 108 can control the camera within its own video conferencing station and, through the Communication Network 102, it can also control the far end (remote) camera. For example, the processor 108 in the first video conferencing station 100 can control its own camera 104 as well as the camera 104A in the second video conferencing station 101. The processor 108 controls which video conferencing station 100, 101 is selected as a source/destination for the Video CODEC 116 by sending the Video CODEC 116 appropriate routing control information. The user determines which conference station's image is to be displayed (and informs the processor 108) by way of a selection on a menu bar. The Video CODEC 116 sends the camera control information generated by the processor 108 to the conferencing station whose image is currently being displayed.
The appearance of the screen of the video display according to an embodiment of the present invention is illustrated in FIGs. 2 and 3. The processor 108 generates camera control arrows 202A-D which are overlaid on a video conference image 204 (received from the Video CODEC 116). The dotted lines surrounding the arrowheads 202A-D do not appear on the screen. They are included here to indicate the approximate position and size of the active areas.
FIG. 3 illustrates the change in a camera control arrowhead once the cursor enters the active area around the arrowhead. When a user moves the mouse, the cursor (e.g. cross-hairs) is correspondingly moved on the screen. When the processor detects that the cursor has been moved within the active area around an arrowhead, the processor highlights that arrowhead and turns off all of the other arrows. When the processor detects depression of a button on the pointing device, it generates camera positioning control signals which it sends to the Video CODEC 116. Depending on whether the system is set up for remote and/or local camera control, these control signals are sent either to the communications network (where they are used to control the far end camera position motors at the target remote conference station) or to the local camera positioning motors.
FIG. 4 is a flow chart of the graphic overlay control software for the processor of FIG. 1. In step 402 the processor determines the cursor position on the video display 115. This position is controlled by a user by way of the mouse 110. As is conventional, the processor constantly monitors the mouse and displays the cursor at the appropriate position on the video display screen. In steps 404 through 410, the processor determines whether the cursor is within one of the active regions for the displayed directional arrows and, if so, in which arrow's region the cursor lies. Next, in steps 412-418, if the cursor is in the region of one of the arrows, the processor highlights that arrow and turns off highlighting on all of the others. Next, in steps 420-426, the processor determines whether the pointing device button is being held down (a single-button mouse is assumed here, although the processor could just as readily look for depression of a particular button on a multi-button mouse). If so, in steps 428-434, the camera is moved one step in the direction of the arrow. If the pointing device button is not being pressed, the processor returns to step 402. If, in steps 404-410, it is determined that the cursor is not within one of the active regions, then in step 436 the highlighting on all arrows is turned off.

This method of camera control can also be implemented using a keyboard with directional arrows to position a cursor on the displayed arrowheads, or with other pointing devices such as a trackball instead of a mouse. Further, a touch-screen monitor can be used to display the video image. In that case, a user would touch the screen in the active area surrounding the arrowhead in order to move the camera.
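The polling loop of FIG. 4 can be sketched in pseudocode form. This is an illustrative sketch, not the patented implementation: the `Arrow` class, the 40-pixel active-region size, and the callback signatures are assumptions for the example.

```python
# Sketch of one pass of the FIG. 4 loop: hit-test the cursor against each
# arrow's active region, update highlighting, and issue one camera step
# while the pointing-device button is held. All names are illustrative.
from dataclasses import dataclass

@dataclass
class Arrow:
    name: str          # pan/tilt direction, e.g. "up"
    x: int             # top-left corner of the active region
    y: int
    w: int = 40        # active-region size in pixels (assumed)
    h: int = 40

    def contains(self, cx, cy):
        return self.x <= cx < self.x + self.w and self.y <= cy < self.y + self.h

def poll_once(arrows, cursor, button_down, step_camera, set_highlight):
    """One pass of the FIG. 4 loop (steps 402-436)."""
    cx, cy = cursor                                              # step 402
    hit = next((a for a in arrows if a.contains(cx, cy)), None)  # steps 404-410
    for a in arrows:                                             # steps 412-418 / 436
        set_highlight(a, a is hit)
    if hit and button_down:                                      # steps 420-434
        step_camera(hit.name)                                    # one motor step
```

Calling `poll_once` repeatedly while the button is held reproduces the continuous stepping behaviour described for the arrows.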
The camera control arrowhead display can be implemented in several ways. For example, a pure hardware implementation can be used where dedicated hardware in the video circuitry places the camera control arrows at fixed places on the video display. Another approach is to provide the processor with software which writes data directly into the screen buffer of the Video CODEC in order to display the arrowheads on the video screen. A third approach is to provide the processor with a video graphics overlay which impresses the camera control arrows over the standard video display by way of the video mixer.
A similar method can be used to control the zoom function. One such method is to use a three-button pointing device. In this embodiment one button on the pointing device controls camera pan and tilt in accordance with the selected direction arrow (as previously described), a second button on the pointing device causes the camera to zoom in and the third button on the pointing device causes the camera to zoom out. In an alternative embodiment, the zoom-in and zoom-out buttons can be used to cause the camera to zoom in or out around a specific point selected by the pointing device. In this embodiment, depression of a first button on a three-button pointing device causes the processor to generate pan and tilt control data that will cause the active camera (the camera whose image is being displayed) to re-center on a selected point (the point on which the cursor is located). Depression of the second or third button on the pointing device activates, respectively, a combined re-center and zoom-in or re-center and zoom-out operation. In order to use the re-center and zoom function the user positions the cursor (by using the pointing device) in the center of the desired image and then presses the appropriate button (zoom in or zoom out). The operation of the re-center and zoom control software in the processor 108 of FIG. 1, according to the above-described embodiment, is illustrated in FIG. 5. In step 502 the processor determines if any of the three pointing device buttons is being depressed. If none of the buttons is being depressed the processor returns to step 502. If any one of the buttons is being depressed, in step 504, the processor 108 determines the X,Y coordinates of the cursor position. Next, in step 506 the processor calculates the difference in position (delta X and delta Y) between the cursor position and the center of the displayed conference image.
Then, in step 508 the processor uses the delta X and delta Y values to generate the appropriate pan and tilt control information to cause the image to be re-centered around the cursor location, and provides this control information to the Video CODEC 116. The pan and tilt control information can be determined by the use of a look-up table wherein each entry in the table corresponds to an appropriate number and direction of pan and tilt steps for a given delta X and delta Y. The specific table values can be precoded into the look-up table if the camera/motor types at each station are known, or can be exchanged by each station providing the other with its camera/motor control parameters over the communication link.
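The look-up step can be illustrated as follows. This is a hedged sketch under stated assumptions: the screen dimensions, the 64-pixel bin size, and the step counts in the table are invented for the example; the patent leaves the table values to the camera/motor parameters exchanged between stations.

```python
# Illustrative delta-to-steps look-up (steps 504-508): delta X / delta Y
# (cursor minus screen centre) are quantized into bins, and a hypothetical
# table maps each bin to a number of pan/tilt motor steps; the sign of the
# delta gives the direction. Values are assumptions, not from the patent.
SCREEN_W, SCREEN_H = 640, 480
BIN = 64  # pixels per table bin (assumed)

# bin index -> motor steps; illustrative, symmetric about the centre
STEPS = {0: 0, 1: 2, 2: 4, 3: 6, 4: 8, 5: 10}

def pan_tilt_steps(cursor_x, cursor_y):
    dx = cursor_x - SCREEN_W // 2   # step 506: delta X
    dy = cursor_y - SCREEN_H // 2   # step 506: delta Y
    pan = STEPS[min(abs(dx) // BIN, 5)]
    tilt = STEPS[min(abs(dy) // BIN, 5)]
    # sign convention (assumed): positive = pan right / tilt down
    return (pan if dx >= 0 else -pan, tilt if dy >= 0 else -tilt)
```

A cursor at the screen centre yields (0, 0) steps; a cursor at the far left edge yields the maximum leftward pan count.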
In steps 510, 512 the processor determines whether one of the zoom-in or zoom-out buttons is being depressed. If not, the processor returns to step 502. If in step 510 it is determined that the zoom-in button is being depressed, in step 514 the processor generates control signals for one zoom-in step (stepping control of the zoom camera motors is assumed here), provides them to the Video CODEC 116 and returns to step 502. Similarly, if in step 512 it is determined that the zoom-out button is being depressed, in step 516 the processor generates control signals for one zoom-out step, provides them to the Video CODEC 116 and returns to step 502. Just as with the highlighted arrow pan/tilt control (previously described), if the zoom-in or zoom-out button is held down continuously, the processor will, correspondingly, continuously generate zoom-in or zoom-out control step signals until the button is released.
As an alternative to the above-described embodiment, the zoom-in and zoom-out control can be de-coupled from the centering controls. This embodiment is illustrated in FIG. 6. As with the embodiment of FIG. 5, in step 502 the processor determines if any of the three pointing device buttons is being depressed. If none of the buttons is being depressed the processor returns to step 502. If any one of the buttons is being depressed, in step 602 the processor determines if the "re-center" button (e.g. the far left button) has been depressed. If yes, the processor 108 determines the X,Y coordinates of the cursor position in step 504, calculates the delta X, delta Y values relative to the center of the displayed image in step 506 and then, in step 508, generates appropriate pan and tilt control signals to re-center the image. If the re-center button was not depressed, the processor performs step 510, in which it determines whether the zoom-in button (e.g. the center button on the pointing device) is being depressed. If in step 510 it is determined that the zoom-in button is being depressed, in step 514 the processor generates control signals for one zoom-in step (stepping control of the zoom camera motors is assumed here), provides them to the Video CODEC 116 and returns to step 502. If the zoom-in button is not being depressed, in step 516 the processor generates control signals for one zoom-out step, provides them to the Video CODEC 116 and returns to step 502. It is noted that the processor can determine that the zoom-out button was depressed, since the pointing device is known to be a three-button device and the other two buttons were eliminated as choices in steps 602 and 510.
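The elimination logic of FIG. 6 can be sketched as a simple dispatch. The button names ("left", "center") and the action tuples returned are assumptions for illustration; the point is that the zoom-out case needs no explicit test once the other two buttons are ruled out.

```python
# Sketch of the FIG. 6 button dispatch: with a known three-button device,
# zoom-out is inferred by elimination after the re-center (left) and
# zoom-in (center) buttons are ruled out. Names are illustrative.
def dispatch(button, cursor, center):
    """Return the action for one detected button press (steps 602, 510)."""
    if button == "left":                       # step 602: re-center button?
        dx = cursor[0] - center[0]             # steps 504-506: delta X, delta Y
        dy = cursor[1] - center[1]
        return ("recenter", dx, dy)            # step 508: pan/tilt to re-center
    if button == "center":                     # step 510: zoom-in button?
        return ("zoom_in", 1)                  # step 514: one zoom-in step
    return ("zoom_out", 1)                     # step 516: by elimination
```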
It will be appreciated that in the embodiment of FIG. 5, the user can re-center the image around a selected point and either zoom in, zoom out or take no further action, by depressing a single button on the pointing device. In the embodiment of FIG. 6, the functions of re-center, zoom-in and zoom-out are each accomplished independently by a separate button. Either of the embodiments of FIG. 5 or 6 can be used in conjunction with the directional arrows of FIGs. 2-4. In such an implementation, the processor first determines if the cursor is within one of the active regions in or around an arrow. If so, the method of FIG. 4 is performed. If not, the method of either FIG. 5 or FIG. 6 is performed (whichever has been implemented).
In another alternative embodiment, the user can use the pointing device to draw a selection border (preferably rectangular) around the desired image. The processor software then changes the camera position so as to re-center the image (around the center of the selected area) and changes the zoom factor to cause the selected image to fill the display screen. In order to determine the appropriate zoom factor, the processor calculates the ratio (d1/d2) of the diagonal (d1) of the full displayed image over the diagonal (d2) of the selected image. Control software can be used to ensure that the relative dimensions of the selected area are of the same proportion as the displayed image (alternatively, the processor can just re-center and use the diagonal data as is). Both the zoom and the pan and tilt control information can be determined by the use of a look-up table. The use of a look-up table for re-centering has been previously described. In a similar manner, for the zoom factor, each entry in the table corresponds to an appropriate number of zoom steps for a given d1/d2 ratio. The specific table values can be precoded into the look-up table if the zoom motor types at each station are known, or can be exchanged by each station providing the other with its zoom motor control parameters over the communication link. As described with respect to the embodiments of FIG. 5 and FIG. 6, this method of performing a re-center and zoom operation can be used in conjunction with the control arrows of FIGs. 2-4.
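The diagonal-ratio computation for this rubber-band zoom can be sketched as follows. This is a minimal sketch under stated assumptions: the ratio-bin-to-zoom-step table values are invented, since the patent says they depend on the zoom motor parameters exchanged between stations.

```python
# Illustrative computation for the selection-border zoom: the zoom amount
# follows from the ratio d1/d2 of the full-image diagonal to the selected
# rectangle's diagonal, quantized into a hypothetical zoom-step table.
import math

def zoom_steps_for_selection(screen_wh, sel_wh, table=None):
    """Zoom-in steps needed to make the selected area fill the screen."""
    table = table or {1: 0, 2: 4, 3: 7, 4: 9}   # ratio bin -> steps (assumed)
    d1 = math.hypot(*screen_wh)                  # diagonal of the full image
    d2 = math.hypot(*sel_wh)                     # diagonal of the selection
    ratio = d1 / d2
    bin_ = min(int(ratio), max(table))           # clamp to the table range
    return table[max(bin_, 1)]
```

For example, selecting the centre quarter of the screen halves both diagonals, giving a ratio of 2 and the corresponding table entry of zoom-in steps.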
As is conventional, the processor may be provided with additional Graphic User Interface (GUI) software which enables a user to control various conference features (e.g. far/near camera select, volume control ...) by way of menu bars displayed along the periphery of the video display screen. These menus can be implemented such that they can be hidden or displayed under user control. When the menus are displayed, the video image is compressed into the slightly smaller space remaining on the display screen.
Now that the invention has been described by way of the preferred embodiment, various enhancements and improvements which do not depart from the scope and spirit of the invention will become apparent to those of skill in the art. Thus it should be understood that the preferred embodiment has been provided by way of example and not by way of limitation. The scope of the invention is defined by the appended claims.

Claims

CLAIMS :
1. A camera control mechanism for a video conferencing system, comprising: first means for receiving video signals from a communications network and generating a video image therefrom; second means for generating a graphic overlay and for superimposing the graphic overlay on the video image; third means for manipulating a cursor on the video image; fourth means, coupled to the first means, the second means and the third means, for determining a position of the cursor on the video image and for generating control signals to move a camera providing a source of the video image in a direction indicated by a position of the cursor on the video image.
2. The apparatus of Claim 1 wherein the third means comprises a pointing device having a control button and wherein the processing means generates the camera control information responsive to the position of the cursor at the time of depression of the control button by the user.
3. The apparatus of Claim 2 wherein the pointing device is a mouse.
4. The apparatus of Claim 2 wherein the graphics overlay comprises directional arrows and wherein the second means displays the video image overlaid with the directional arrows.
5. The apparatus of Claim 4 wherein the fourth means comprises means for determining whether the position of the cursor is in proximity to one of the directional arrows and for generating directional motor control corresponding thereto when the control button is depressed and it is determined that the cursor is in the proximity of the one of the directional arrows.
6. The apparatus of Claim 5 wherein the fourth means comprises means for highlighting the one of the directional arrows.
7. A video conferencing system, comprising: a video encoder/decoder, the video encoder/decoder including means for interfacing with a communication network; a camera connected to supply video signals to the video encoder/decoder; a camera positioning motor connected to the camera and connected to receive camera positioning information from the video encoder/decoder; processing means for generating digitally encoded video display data and for generating digital motor control information, the processing means being connected to provide the digital motor control information to the encoder/decoder; a video graphics interface connected to receive the encoded video display data from the processor; a video mixer, connected to receive graphic video signals from the video graphics interface and live image video signals from the video encoder/decoder; a video display, coupled to the video mixer, for displaying the live video image overlaid with the graphic video; and, a pointing device coupled to the processor; the pointing device comprising a push button switch and signal means for indicating when the push button switch has been depressed by a user; wherein, the processing means further comprises position detection and display means, for causing a cursor to be displayed on the video display at a position responsive to user control of the pointing device, and for generating different motor control signals dependent on a calculated position of the cursor on the live video image when the push button switch is depressed by the user.
8. The apparatus of Claim 7 wherein the graphics image comprises directional arrows and wherein the video mixer displays the live image overlaid with the directional arrows.
9. The apparatus of Claim 8 wherein the processing means comprises means for determining whether the position of the cursor is in proximity to one of the directional arrows and for generating directional motor control corresponding thereto when the push button switch is depressed and it is determined that the cursor is in the proximity of the one of the directional arrows.
10. The apparatus of Claim 9 wherein the processing means comprises means for highlighting the one of the directional arrows.
11. The apparatus of Claim 7 wherein the processing means further comprises means for causing a selected camera to zoom in on a point of the live video image subject responsive to user manipulation of the cursor on the live video image.
12. A method of controlling a camera in a video conferencing system, comprising the steps of: receiving video signals from a communications network and generating a live video image therefrom; manipulating a cursor on the live video image; positioning a cursor on the live video image using a pointing device having at least one control switch; determining when the control switch has been actuated and, in response, determining a present position of the cursor on the live video image; in further response to actuation of the control switch, generating camera control signals to reposition a camera providing a source of the live video image so as to re-center the live image around the present position of the cursor.
13. The method of Claim 12 comprising the further step of: in further response to actuation of the control switch, generating camera control signals to cause the camera providing the source of the live video image to perform a zoom-in step on the live video image.
14. The method of Claim 12 comprising the further step of: in further response to actuation of the control switch, generating camera control signals to cause the camera providing the source of the live video image to perform a zoom-out step on the live video image.
15. The method of Claim 12 comprising the further steps of: generating a graphic overlay comprising a plurality of directional arrows; superimposing the graphic overlay on the live video image; defining an active region around each of the directional arrows; responsive to the switch being actuated, determining if the cursor is in one of the active regions; if the cursor is in one of the active regions, generating camera control signals to move the camera providing the source of the live video image in a direction of a directional arrow within the one of the active regions, instead of re-centering the live video image around the present position of the cursor.
16. A method of controlling a camera in a video conferencing system, comprising the steps of: receiving video signals from a communications network and generating a live video image therefrom; generating a graphic overlay comprising a plurality of directional arrows; superimposing the graphic overlay on the live video image; defining an active region around each of the directional arrows; manipulating a cursor on the live video image; positioning a cursor on the live video image using a pointing device having at least a first control switch; determining when the control switch has been actuated and, in response, determining a present position of the cursor on the live video image; responsive to the control switch being actuated, determining if the present position of the cursor is in one of the active regions; if the present position of the cursor is in one of the active regions, generating camera control signals to move the camera providing the source of the live video image in a direction of a directional arrow within the one of the active regions.
17. The method of Claim 16, comprising the further steps of: determining when a second control switch on the pointing device has been actuated; and, responsive to the second control switch being actuated generating camera control signals to cause the camera providing the source of the live video image to change a zoom factor of the live video image.
18. A video conferencing system, comprising: means for receiving video signals from a communications network and generating a live video image therefrom; means for generating a graphic overlay comprising a plurality of directional arrows; a video display; means, coupled to the video display, the means for receiving and the means for generating, for superimposing the graphic overlay on the live video image; means for defining an active region around each of the directional arrows; a pointing device for manipulating a cursor on the live video image, the pointing device having at least a first control switch; means, coupled to the pointing device, for determining when the control switch has been actuated and, in response, determining a present position of the cursor on the live video image; means, coupled to the pointing device, the means for defining, and the communications network, for determining if the present position of the cursor is in one of the active regions responsive to the control switch being actuated, and if the present position of the cursor is in one of the active regions, generating camera control signals to move the camera providing the source of the live video image in a direction of a directional arrow within the one of the active regions.
PCT/US1993/007948 1992-09-21 1993-08-24 Method and apparatus for on-screen camera control in video-conference equipment WO1994007327A1 (en)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
US947,866 1986-12-30
US94786692A 1992-09-21 1992-09-21

Publications (1)

Publication Number Publication Date
WO1994007327A1 true WO1994007327A1 (en) 1994-03-31

Family

ID=25486913

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/US1993/007948 WO1994007327A1 (en) 1992-09-21 1993-08-24 Method and apparatus for on-screen camera control in video-conference equipment

Country Status (1)

Country Link
WO (1) WO1994007327A1 (en)

Cited By (25)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US5515099A (en) * 1993-10-20 1996-05-07 Video Conferencing Systems, Inc. Video conferencing system controlled by menu and pointer
EP0690616A3 (en) * 1994-06-27 1996-11-06 Matsushita Electric Ind Co Ltd Remote-control method for camera and remote-control device therefor
DE19531213A1 (en) * 1995-08-24 1997-02-27 Siemens Ag Interactive video camera control method
US6675386B1 (en) 1996-09-04 2004-01-06 Discovery Communications, Inc. Apparatus for video access and control over computer network, including image correction
US20070171273A1 (en) * 2006-01-26 2007-07-26 Polycom, Inc. System and Method for Controlling Videoconference with Touch Screen Interface
EP1465413A3 (en) * 1995-03-20 2010-01-06 Canon Kabushiki Kaisha Camera control system
US7716349B1 (en) 1992-12-09 2010-05-11 Discovery Communications, Inc. Electronic book library/bookstore system
US7835989B1 (en) 1992-12-09 2010-11-16 Discovery Communications, Inc. Electronic book alternative delivery systems
US7849393B1 (en) 1992-12-09 2010-12-07 Discovery Communications, Inc. Electronic book connection to world watch live
US7861166B1 (en) 1993-12-02 2010-12-28 Discovery Patent Holding, Llc Resizing document pages to fit available hardware screens
US7865567B1 (en) 1993-12-02 2011-01-04 Discovery Patent Holdings, Llc Virtual on-demand electronic book
US7865405B2 (en) 1992-12-09 2011-01-04 Discovery Patent Holdings, Llc Electronic book having electronic commerce features
US20110234746A1 (en) * 2006-01-26 2011-09-29 Polycom, Inc. Controlling videoconference with touch screen interface
US8073695B1 (en) 1992-12-09 2011-12-06 Adrea, LLC Electronic book with voice emulation features
US8095949B1 (en) 1993-12-02 2012-01-10 Adrea, LLC Electronic book with restricted access features
CN102685440A (en) * 2011-03-07 2012-09-19 株式会社理光 Automated selection and switching of displayed information
US8451314B1 (en) * 2009-11-20 2013-05-28 Cerner Innovation, Inc. Bi-directional communication system
US8578410B2 (en) 2001-08-03 2013-11-05 Comcast Ip Holdings, I, Llc Video and digital multimedia aggregator content coding and formatting
US8621521B2 (en) 2001-08-03 2013-12-31 Comcast Ip Holdings I, Llc Video and digital multimedia aggregator
US8804321B2 (en) 2012-05-25 2014-08-12 Steelcase, Inc. Work and videoconference assembly
US9053640B1 (en) 1993-12-02 2015-06-09 Adrea, LLC Interactive electronic book
US9053455B2 (en) 2011-03-07 2015-06-09 Ricoh Company, Ltd. Providing position information in a collaborative environment
US9086798B2 (en) 2011-03-07 2015-07-21 Ricoh Company, Ltd. Associating information on a whiteboard with a user
US9286294B2 (en) 1992-12-09 2016-03-15 Comcast Ip Holdings I, Llc Video and digital multimedia aggregator content suggestion engine
US9813641B2 (en) 2000-06-19 2017-11-07 Comcast Ip Holdings I, Llc Method and apparatus for targeting of interactive virtual objects

Citations (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
DE3823219C1 (en) * 1988-07-08 1989-05-18 Telenorma Telefonbau Und Normalzeit Gmbh, 6000 Frankfurt, De


Non-Patent Citations (6)

* Cited by examiner, † Cited by third party
Title
JOHANSEN CHR., SCHONFELDER H.: "DIE UEBERTRAGUNG VON TELESEMINAREN UEBER DEN "OLYMPUS"-SATELLITEN.", FKT FERNSEH UND KINOTECHNIK., FACHVERLAG SCHIELE & SCHON GMBH., BERLIN., DE, vol. 45., no. 03., 1 January 1991 (1991-01-01), DE, pages 136 - 141., XP000224242, ISSN: 1430-9947 *
NEURAL NETWORKS: ACADEMIC/INDUSTRIAL/NASA/DEFENSE vol. 1721, June 1992, AUBURN, AL, USA pages 511 - 520 K.N.MAGEE 'A SIMPLE FUZZY LOGIC REAL-TIME CAMERA TRACKING SYSTEM' *
PATENT ABSTRACTS OF JAPAN vol. 13, no. 354 (E-802)8 August 1989 *
PATENT ABSTRACTS OF JAPAN vol. 16, no. 279 (E-1220)22 June 1992 *
PATENT ABSTRACTS OF JAPAN vol. 17, no. 148 (E-1338)24 March 1993 *
TANIGAWA H., ET AL.: "PERSONAL MULTIMEDIA-MULTIPOINT TELECONFERENCE SYSTEM.", NETWORKING IN THE NINETIES. BAL HARBOUR, APR. 7 - 11, 1991., NEW YORK, IEEE., US, vol. 03., 7 April 1991 (1991-04-07), US, pages 1127 - 1134., XP000223440, ISBN: 978-0-87942-694-1, DOI: 10.1109/INFCOM.1991.147629 *

Cited By (37)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US7865405B2 (en) 1992-12-09 2011-01-04 Discovery Patent Holdings, Llc Electronic book having electronic commerce features
US7849393B1 (en) 1992-12-09 2010-12-07 Discovery Communications, Inc. Electronic book connection to world watch live
US8073695B1 (en) 1992-12-09 2011-12-06 Adrea, LLC Electronic book with voice emulation features
US7835989B1 (en) 1992-12-09 2010-11-16 Discovery Communications, Inc. Electronic book alternative delivery systems
US9286294B2 (en) 1992-12-09 2016-03-15 Comcast Ip Holdings I, Llc Video and digital multimedia aggregator content suggestion engine
US7716349B1 (en) 1992-12-09 2010-05-11 Discovery Communications, Inc. Electronic book library/bookstore system
US5515099A (en) * 1993-10-20 1996-05-07 Video Conferencing Systems, Inc. Video conferencing system controlled by menu and pointer
US7861166B1 (en) 1993-12-02 2010-12-28 Discovery Patent Holding, Llc Resizing document pages to fit available hardware screens
US7865567B1 (en) 1993-12-02 2011-01-04 Discovery Patent Holdings, Llc Virtual on-demand electronic book
US9053640B1 (en) 1993-12-02 2015-06-09 Adrea, LLC Interactive electronic book
US8095949B1 (en) 1993-12-02 2012-01-10 Adrea, LLC Electronic book with restricted access features
EP0690616A3 (en) * 1994-06-27 1996-11-06 Matsushita Electric Ind Co Ltd Remote-control method for camera and remote-control device therefor
US5835140A (en) * 1994-06-27 1998-11-10 Matsushita Electric Industrial Co., Ltd. Remote-control method and apparatus for rotating image device
EP1465413A3 (en) * 1995-03-20 2010-01-06 Canon Kabushiki Kaisha Camera control system
DE19531213A1 (en) * 1995-08-24 1997-02-27 Siemens Ag Interactive video camera control method
US6675386B1 (en) 1996-09-04 2004-01-06 Discovery Communications, Inc. Apparatus for video access and control over computer network, including image correction
US8548813B2 (en) 1999-06-25 2013-10-01 Adrea, LLC Electronic book with voice emulation features
US9099097B2 (en) 1999-06-25 2015-08-04 Adrea, LLC Electronic book with voice emulation features
US9813641B2 (en) 2000-06-19 2017-11-07 Comcast Ip Holdings I, Llc Method and apparatus for targeting of interactive virtual objects
US10140433B2 (en) 2001-08-03 2018-11-27 Comcast Ip Holdings I, Llc Video and digital multimedia aggregator
US8578410B2 (en) 2001-08-03 2013-11-05 Comcast Ip Holdings, I, Llc Video and digital multimedia aggregator content coding and formatting
US8621521B2 (en) 2001-08-03 2013-12-31 Comcast Ip Holdings I, Llc Video and digital multimedia aggregator
US10349096B2 (en) 2001-08-03 2019-07-09 Comcast Ip Holdings I, Llc Video and digital multimedia aggregator content coding and formatting
US8872879B2 (en) 2006-01-26 2014-10-28 Polycom, Inc. System and method for controlling videoconference with touch screen interface
US20110234746A1 (en) * 2006-01-26 2011-09-29 Polycom, Inc. Controlling videoconference with touch screen interface
US8593502B2 (en) * 2006-01-26 2013-11-26 Polycom, Inc. Controlling videoconference with touch screen interface
US20070171273A1 (en) * 2006-01-26 2007-07-26 Polycom, Inc. System and Method for Controlling Videoconference with Touch Screen Interface
US8451314B1 (en) * 2009-11-20 2013-05-28 Cerner Innovation, Inc. Bi-directional communication system
CN102685440B (en) * 2011-03-07 2015-09-02 株式会社理光 The automatic selection of display information and switching
US9086798B2 (en) 2011-03-07 2015-07-21 Ricoh Company, Ltd. Associating information on a whiteboard with a user
US9716858B2 (en) 2011-03-07 2017-07-25 Ricoh Company, Ltd. Automated selection and switching of displayed information
CN102685440A (en) * 2011-03-07 2012-09-19 株式会社理光 Automated selection and switching of displayed information
US9053455B2 (en) 2011-03-07 2015-06-09 Ricoh Company, Ltd. Providing position information in a collaborative environment
US8804321B2 (en) 2012-05-25 2014-08-12 Steelcase, Inc. Work and videoconference assembly
US10786074B2 (en) 2012-05-25 2020-09-29 Steelcase Inc. Work and videoconference assembly
US11185158B1 (en) 2012-05-25 2021-11-30 Steelcase Inc. Work and videoconference assembly
US11612240B1 (en) 2012-05-25 2023-03-28 Steelcase Inc. Work and videoconference assembly

Similar Documents

Publication Publication Date Title
WO1994007327A1 (en) Method and apparatus for on-screen camera control in video-conference equipment
EP2489182B1 (en) Device and method for camera control
US6346962B1 (en) Control of video conferencing system with pointing device
US5657246A (en) Method and apparatus for a video conference user interface
US6433796B1 (en) Apparatus and method for displaying both an image and control information related to the image
US5936610A (en) Control device for image input apparatus
EP2446619B1 (en) Method and device for modifying a composite video signal layout
JP3862315B2 (en) Image display apparatus and control method thereof
US8300078B2 (en) Computer-processor based interface for telepresence system, method and computer program product
US20050024485A1 (en) Graphical user interface for system status alert on videoconference terminal
JP3335017B2 (en) Camera device control device
JPH07135594A (en) Image pickup controller
JP3036088B2 (en) Sound signal output method for displaying multiple image windows
JP2981408B2 (en) Method and apparatus for controlling high-speed introduction of a target object in a camera image
JPH06311510A (en) Conference supporting system for remote location
JP3530579B2 (en) Camera control device and window display method of camera control device
JPH05316504A (en) Seat selection device in video communication
JPH099231A (en) Video conference system and camera control system
JP2000152204A (en) Conference display method and conference display system
JP3121418B2 (en) Pointing system
JP3826130B2 (en) Camera control device and window control method of camera control device
JPH02193448A (en) Television communication device
JPH11122594A (en) Electronic conference system
JPH04323990A (en) Designation method of portrait camera image pickup position in image pickup system
JPH0869271A (en) Image display device

Legal Events

Date Code Title Description
AK Designated states

Kind code of ref document: A1

Designated state(s): CA FI JP

AL Designated countries for regional patents

Kind code of ref document: A1

Designated state(s): AT BE CH DE DK ES FR GB GR IE IT LU MC NL PT SE

121 Ep: the epo has been informed by wipo that ep was designated in this application
DFPE Request for preliminary examination filed prior to expiration of 19th month from priority date (pct application filed before 20040101)
122 Ep: pct application non-entry in european phase
NENP Non-entry into the national phase

Ref country code: CA