WO1990002370A1 - Remote operated vehicle control - Google Patents

Remote operated vehicle control

Info

Publication number
WO1990002370A1
WO1990002370A1 PCT/GB1989/001018 GB8901018W WO9002370A1 WO 1990002370 A1 WO1990002370 A1 WO 1990002370A1 GB 8901018 W GB8901018 W GB 8901018W WO 9002370 A1 WO9002370 A1 WO 9002370A1
Authority
WO
WIPO (PCT)
Prior art keywords
vehicle
video
control system
information
picture
Prior art date
Application number
PCT/GB1989/001018
Other languages
French (fr)
Inventor
Rodney John Blissett
Christopher George Harris
Debra Charnley
Original Assignee
Plessey Overseas Limited
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Plessey Overseas Limited filed Critical Plessey Overseas Limited
Publication of WO1990002370A1 publication Critical patent/WO1990002370A1/en

Classifications

    • GPHYSICS
    • G05CONTROLLING; REGULATING
    • G05DSYSTEMS FOR CONTROLLING OR REGULATING NON-ELECTRIC VARIABLES
    • G05D1/00Control of position, course or altitude of land, water, air, or space vehicles, e.g. automatic pilot
    • G05D1/0011Control of position, course or altitude of land, water, air, or space vehicles, e.g. automatic pilot associated with a remote control arrangement
    • G05D1/0038Control of position, course or altitude of land, water, air, or space vehicles, e.g. automatic pilot associated with a remote control arrangement by providing the operator with simple or augmented images from one or more cameras located onboard the vehicle, e.g. tele-operation
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N7/00Television systems
    • H04N7/002Special television systems not provided for by H04N7/007 - H04N7/18
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N19/00Methods or arrangements for coding, decoding, compressing or decompressing digital video signals
    • H04N19/20Methods or arrangements for coding, decoding, compressing or decompressing digital video signals using video object coding

Abstract

A remote operated vehicle control system comprises a remotely controlled vehicle (1) carrying a television camera (4), the camera (4) and vehicle (1) being connected by a radio link (6) to a separate base station (2) having a radio receiver (3), in which the video picture for transmission is divided into separate data sets one part of which exhibits short timescale variations and another part of which exhibits long timescale variations, the two data sets being transmitted at differing video frame rates, and means at the receiver for combining the two transmissions to give a composite picture on a video display of said receiver (3). This system can allow good control of the vehicle without demanding a high bandwidth radio link.

Description

REMOTE OPERATED VEHICLE CONTROL
This invention relates to remote operated vehicle control. It relates particularly to means for controlling a remote operated vehicle where the connecting link carries information for a picture display of the ground surface and obstacles ahead of the vehicle.
One type of remote operated vehicle makes use of a cable connection between the vehicle and the remotely located operator. Such a cable can carry a wide bandwidth of information for the operator, including video signals, for example from a television camera mounted on the vehicle. However, the necessarily limited length of the cable will restrict the range of the vehicle; it will also restrict manoeuvrability and be susceptible to damage or breakage. An alternative approach would be to use a high bandwidth radio link, but this can also restrict the range of operations and the type of country to be driven through. In addition, it may not be suitable for an application where more than one remotely controlled vehicle is to be operated together. There do exist techniques for video data compression which might be helpful, but these do not have adequate compression ratios to enable low bandwidth radio links to be used in order to allow the necessary sensory feedback information to be transmitted.
The present invention was devised to provide a method for remote vehicle control where the need for a dedicated high bandwidth radio link can be avoided.
According to the invention, there is provided a remote operated vehicle control system comprising a remotely controlled vehicle carrying a television camera, the camera and vehicle being connected by a radio link to a separate base station having a radio receiver, in which the video picture for transmission is divided into separate data sets one part of which exhibits short timescale variations and another part of which exhibits long timescale variations, the two data sets being transmitted at differing video frame rates, and means at the receiver for combining the two transmissions to give a composite picture on a video display of said receiver.
The said video frame rates may include a rate of thirty hertz with a second slower rate.
Preferably, the picture information for transmission is processed to select generally static features which are used to generate a synthesized scene in the video display. The selected picture information may be processed such that only edge and corner information is used for said slower frame rate transmission. The received slower frame rate information may be converted to video frame rate for projection into the image plane of the video display. By way of example, a particular embodiment of the invention will now be described with reference to the accompanying drawings, in which:
Figure 1 shows the remote operated vehicle in a typical scene with a base station which accommodates a control console for a vehicle operator,
Figure 2 is a block diagram of the main system control units, and,
Figure 3 shows the different data transformations that are necessary to provide the required picture display. As depicted in Figure 1, a remote operated vehicle 1 is shown travelling along a country road and the vehicle is under the control of an operator located at a base station 2. The operator will be positioned at an operator console 3 and this makes use of a video display showing the scene in front of the vehicle. A television camera 4 located on the vehicle roof transmits a picture of the ground surface ahead of the vehicle 1 to a receiver at the console 3. The radio link 6 is a two-way one so that control signals from the console 3 can be used to guide the vehicle. In addition to the picture information, signals from sensors on the vehicle 1 can be transmitted to the console 3 so that information on the braking effect, steering movements etc. can be applied to the video display.
As already mentioned, to transmit a full video picture to the base station would take up a large video bandwidth. In some applications this might be acceptable, but in the event that more than one vehicle might be required to operate simultaneously, it would be preferable to reduce this bandwidth. In the present invention, this is effected by transmitting different parts of the video information at different data rates. The underlying vehicle motion information, which changes comparatively rapidly, is transmitted at the usual video frame rate. The underlying 3D scene structure changes slowly and is transmitted at a slow frame rate. This means that it might take several video frames for newly inferred structural information to be transmitted.
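The two-rate split described above can be sketched as a simple per-frame schedule. This is only an illustration: the patent specifies no particular multiplexing scheme, and the thirty hertz and two hertz figures are the example rates given in the text.

```python
def schedule_transmissions(num_frames, video_hz=30, scene_hz=2):
    """Return, per video frame, which data sets go over the radio link."""
    period = video_hz // scene_hz      # scene data every 15th frame
    sent = []
    for frame in range(num_frames):
        payload = ["motion"]           # vehicle motion: sent every frame
        if frame % period == 0:
            payload.append("scene_tokens")  # 3D structure: 2 Hz only
        sent.append(payload)
    return sent

# In one second of 30 Hz video, motion data goes out thirty times but the
# slowly varying scene tokens only twice.
frames = schedule_transmissions(30)
```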
To infer the structural information of the observed scene, each image is broken down by an information processing stage into 'image tokens'. These tokens are representative of the positions of corner and edge features present in a given video picture frame, and some description of the identification of these features is given in our copending patent application No. 881123 entitled 'Digital Data Processing'.
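The actual feature extractor is defined in the copending application; as a stand-in of the same vintage, the well-known Harris/Plessey corner measure (det(M) - k·trace(M)², computed from the locally averaged gradient structure tensor M) illustrates how corner tokens can be located:

```python
import numpy as np

def box3(a):
    # 3x3 box filter via zero padding, standing in for a Gaussian window
    p = np.pad(a, 1)
    h, w = a.shape
    return sum(p[i:i + h, j:j + w] for i in range(3) for j in range(3)) / 9.0

def corner_response(img, k=0.04):
    """Harris/Plessey-style corner response of an intensity image."""
    iy, ix = np.gradient(img.astype(float))       # row and column gradients
    sxx, syy, sxy = box3(ix * ix), box3(iy * iy), box3(ix * iy)
    det = sxx * syy - sxy * sxy
    trace = sxx + syy
    return det - k * trace * trace

# A bright square on a dark background: the response is positive at its
# corners, negative along its edges, and zero over flat regions.
img = np.zeros((32, 32))
img[8:24, 8:24] = 1.0
r = corner_response(img)
```

Thresholding the positive peaks of this response map would yield the corner token positions; edge tokens correspond to the strongly negative ridges.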
An appropriate application of a 'Structure-from-Motion' algorithm as discussed in the paper 'Towards robot mobility through passive monocular vision' Blissett, R.J., Charnley, D. and Harris, C.G., Proceedings International Symposium on Teleoperation and Control, pp. 123-132, July 1988, is then made. The equivalent 3D locations of the corresponding scene features can then be estimated relative to the current position and attitude of the vehicle. The viewed scene is next decomposed into 3D tokens. This information can then be transmitted to the base station via a low bandwidth radio link.
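The cited paper defines the actual Structure-from-Motion algorithm; as a toy illustration of the triangulation step only, consider a camera that translates forward by a known distance d between two frames (hypothetical normalized image coordinates, rotation omitted):

```python
def triangulate_forward(u1, v1, u2, v2, d):
    """Recover a 3D point (x, y, z) from its normalized image coordinates
    in two frames, the camera having moved forward by d along its optical
    axis between them.  From u1 = x/z and u2 = x/(z - d) it follows that
    z = u2*d / (u2 - u1).  Fails for points on the translation axis,
    where u1 == u2 and no parallax is observed."""
    z = u2 * d / (u2 - u1)
    return (u1 * z, v1 * z, z)

# A point at (1, 0.5, 10) seen before and after moving 2 units forward
# projects to (0.1, 0.05) and then (0.125, 0.0625); triangulation
# recovers the original point.
point = triangulate_forward(0.1, 0.05, 0.125, 0.0625, 2.0)
```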
At the base station, the processing circuitry will be able to synthesise a skeletal representation of the scene information based on the received token data. A moving (video-rate) representation of the viewed scene can be generated based on a lower frequency update (for example, at two hertz), provided that the synthesized scene is reprojected onto the image plane at the normal video frame rate (thirty hertz). In order to accomplish this it is required that the vehicle motion parameters are transmitted back to the base station at video rate. This arrangement ensures that a real time video display is available at the base station faithfully reproducing the short timescale variations due to the vehicle dynamics and recording at a lower rate the general terrain and evolution of objects in the field of view of the television camera. The dynamically updated scene may then be viewed by the operator and used to provide the sensory feedback information to enable the vehicle to be driven. Whilst the need to compress the data results in a partial loss of information in that the actual gray-levels recorded by the television camera are not reconstituted, the reconstruction of 3D information provides valuable additional quantitative data regarding the nature of the terrain ahead of the vehicle and the presence of obstacles. This method offers the advantages of eliminating the need for cumbersome and restrictive umbilicals and does not require dedicated high bandwidth radio links. Hence, the operability of the teleoperated vehicle is enhanced accordingly.
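The synthesis step just described, reprojecting the slowly updated 3D tokens at full video rate using the latest motion data, can be sketched for a translating pinhole camera. The 512-pixel image size matches the description, but the focal length is an assumed value and camera rotation is omitted for brevity:

```python
import numpy as np

def reproject(tokens_3d, cam_position, focal=500.0, centre=(256.0, 256.0)):
    """Project world-frame 3D tokens into the image plane of a pinhole
    camera at cam_position (camera looks along +z; rotation omitted)."""
    pts = np.asarray(tokens_3d, float) - np.asarray(cam_position, float)
    u = focal * pts[:, 0] / pts[:, 2] + centre[0]
    v = focal * pts[:, 1] / pts[:, 2] + centre[1]
    return np.stack([u, v], axis=1)

# Tokens arrive at only 2 Hz, but the pose arrives every frame, so the
# display can be redrawn at 30 Hz from the same token set:
tokens = [[0.0, 0.0, 10.0], [1.0, 0.0, 10.0]]
frame0 = reproject(tokens, [0.0, 0.0, 0.0])   # on-axis token -> image centre
frame1 = reproject(tokens, [0.0, 0.0, 5.0])   # after driving 5 units forward
```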
Some benefits of the system of the application will be apparent from the following comparison between the data rate requirements of different video transmission formats:
Video Format                                        Data Rate (kbits/s)
Full resolution digital                             60000
Full resolution compressed (10:1)                   6000
Low resolution compressed (10:1)                    1500
Full resolution edge image                          about 1500
Two hertz update of 240 scene tokens plus
thirty hertz update of 15 vehicle parameters        about 16
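The token scheme's rate can be sanity-checked with back-of-the-envelope arithmetic. The per-token and per-parameter bit widths below are assumptions (the patent does not give an encoding); with these, the result lands in the same few-tens-of-kilobits range as the roughly sixteen kilobits per second quoted for the link, three orders of magnitude below full-resolution digital video:

```python
def link_rate_bits_per_s(tokens, token_hz, bits_per_token,
                         params, param_hz, bits_per_param):
    """Total link rate: slow scene-token updates plus fast motion updates."""
    return (tokens * token_hz * bits_per_token
            + params * param_hz * bits_per_param)

# 240 scene tokens at 2 Hz (24 bits each, assumed) plus 15 vehicle
# parameters at 30 Hz (16 bits each, assumed):
rate = link_rate_bits_per_s(240, 2, 24, 15, 30, 16)  # 18720 bit/s
# versus 60,000 kbit/s for full-resolution digital video.
```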
The main system control units required to carry out this process are depicted in Figure 2. This Figure shows the television camera 4 (which uses, for preference, a non-interlaced line arrangement) which delivers analogue video data by cable to a front-end video processor 7. The processor 7 digitises each captured frame to 512 by 512 eight-bit pixels and this data is held in a frame store. The front-end processor is designed to decompose each digitised frame into a list of extracted image tokens and associated attributes. These tokens may be based on localised corners and/or edges. The design of the processor 7 is based on the VME-bus architecture and it follows the construction disclosed in the aforementioned patent application.
The data emanating from the front-end processor 7 is at a much reduced data rate (about 1.8 megabits per second). This data stream is passed into a 3D geometry module 8 which may additionally accept data from auxiliary motion and attitude sensors on board the vehicle 1. The 3D geometry module 8 matches image tokens from frame to frame, computes or utilises the vehicle motion parameters and, through triangulation, estimates the 3D locations of the corresponding scene tokens. These estimates are continually refined as more data is accepted from subsequent frames. The outputs from the module 8 are the refined motion parameter estimates together with a list of the relevant 3D locations of scene tokens that are currently visible to the television camera 4. These outputs are transmitted to the base station 2 by means of a transmitter 9 and the low bandwidth radio link 6. The motion data is sent every video frame (at thirty hertz frame rate) and the 3D data is sent at a lower rate (for example, two hertz). A processor for the 3D geometry module 8 employs a parallel processing architecture. The rate of data transmission through the link 6 will be about sixteen kilobits per second.
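The frame-to-frame token matching performed by the 3D geometry module is not spelled out in the text; a greedy nearest-neighbour association is the simplest placeholder (a real tracker would also predict token positions from the vehicle motion before matching):

```python
def match_tokens(prev, curr, max_dist=10.0):
    """Greedily associate each token in the previous frame with its nearest
    unclaimed token in the current frame, within a distance gate."""
    matches = []
    used = set()
    for i, (px, py) in enumerate(prev):
        best, best_d = None, max_dist
        for j, (cx, cy) in enumerate(curr):
            if j in used:
                continue
            d = ((px - cx) ** 2 + (py - cy) ** 2) ** 0.5
            if d < best_d:
                best, best_d = j, d
        if best is not None:
            matches.append((i, best))
            used.add(best)       # each current token matched at most once
    return matches
```

Matched token pairs across frames are what feed the triangulation step; unmatched current-frame tokens would start new tracks.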
Once the data is received by means of a receiver 11 at the base station 2, the 3D scene tokens are used to synthesise a 3D surface representation of the viewed scene. This is done by means of a 3D surface generator 12. The method for obtaining a points-only set of scene tokens is described in the aforementioned published paper. First, the 3D tokens are reprojected back onto the current image plane, utilising the most recent vehicle motion and attitude data. A Delaunay triangulation is then performed on the image plane through the projected tokens. This triangulation ensures that a corresponding 3D surface is reconstructed that is single valued in depth. The surface may then be visualised by means of an orthogonal grid of contours (formed at the intersections of the surface with an equispaced set of orthogonal vertical planes). Furthermore, potential obstacles and regions unsuitable for driving may be inferred from the 3D surface and highlighted, if required.
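The triangulate-then-interpolate idea can be sketched with an off-the-shelf Delaunay routine (here SciPy's, purely for illustration; the token positions and depths below are made-up values). Because the triangulation is built on the 2D image plane, the reconstructed surface is automatically single valued in depth:

```python
import numpy as np
from scipy.spatial import Delaunay

# Projected tokens on the image plane, each with an associated depth.
pts = np.array([[0, 0], [1, 0], [0, 1], [1, 1], [0.5, 0.5]], float)
depths = np.array([5.0, 6.0, 6.5, 7.0, 5.5])

tri = Delaunay(pts)  # triangulate in the image plane, not in 3D

def depth_at(x, y):
    """Interpolate depth at image point (x, y) using barycentric
    coordinates within the containing triangle; None outside the hull."""
    s = int(tri.find_simplex(np.array([[x, y]]))[0])
    if s < 0:
        return None
    t = tri.transform[s]
    b = t[:2] @ (np.array([x, y]) - t[2])
    bary = np.append(b, 1.0 - b.sum())
    return float(depths[tri.simplices[s]] @ bary)
```

Sampling `depth_at` along an equispaced set of vertical planes would give the orthogonal contour grid described in the text.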
This information is then presented as a dynamic display on the operator console 3. The operator is then able to drive the vehicle by using slave controls and observing the display console. The required vehicle steering and braking demands are transmitted back to the vehicle 1. The provision of 3D information to the operator will enable the nature of the terrain ahead of the vehicle to be assessed and potential obstacles can be located and avoided.
Figure 3 shows the different data transformations that are necessary to obtain the required picture display. The signals from the television camera produce a series of video frames 14 at a rate of thirty per second. The frames are depicted as they are spaced in sequence along the time coordinate 16. From each spaced video frame, the front end processor acts to produce sets of image tokens 17. The sets of image tokens are then delivered to the 3D geometry module to contribute towards the production of 3D scene tokens 18.
The vehicle 1 is also equipped with auxiliary sensors which provide information on, for example, vehicle speed and attitude. The information 19 from these auxiliary sensors is processed to form a vehicle motion vector 21 which will have characteristics that will similarly vary along the time coordinate. The information in the vehicle motion vector is transmitted by means of the radio link 6 at a rate of thirty frames per second to form a received vehicle motion vector 22 at the base station.
The information 19 from the auxiliary sensors is also delivered to the 3D geometry module to contribute towards the production of the 3D scene tokens 18. These tokens 18 are varied at a rate of thirty frames per second, they are then sampled at a rate of two frames per second and the resulting data is transmitted by means of the radio link 6 to the base station.
The information received at the base station, consisting of the sampled 3D scene tokens 23 at a rate of two frames per second and the received vehicle motion vector 22 at thirty frames per second, is then combined to provide a series of projected image token frames 24. This is done by reprojecting the 3D tokens back onto the current image plane utilising the most recent vehicle motion and attitude data. A Delaunay triangulation 26 is performed on the image plane through the projected tokens. This triangulation ensures that a corresponding 3D surface is reconstructed that is single valued in depth. The surface is then visualised by means of an orthogonal grid of contours to create the 3D surface contours and navigable regions at a rate of thirty frames per second. This contour information 27 can be displayed on the video screen to give a somewhat simplified view of the terrain but one that will still allow the operator to drive the vehicle.
The foregoing description of an embodiment of the invention has been given by way of example only, and a number of modifications may be made without departing from the scope of the invention as defined in the appended claims. For instance, the rate of change of picture frame data could readily differ from the rates described in this example, namely thirty and two frames per second. Instead of using auxiliary sensors on the vehicle to give data pertaining to the vehicle position and attitude, this information could be obtained in another way, such as by use of the Structure-from-Motion algorithm already mentioned.
The invention is not necessarily restricted to control of a road vehicle, and it could be used for other purposes such as for transmitting sensory feedback information for the control of a remote manipulator or for remote landing of an unmanned aircraft.

Claims

1. A remote operated vehicle control system comprising a remotely controlled vehicle carrying a television camera, the camera and vehicle being connected by a radio link to a separate base station having a radio receiver, in which the video picture for transmission is divided into separate data sets one part of which exhibits short timescale variations and another part of which exhibits long timescale variations, the two data sets being transmitted at differing video frame rates, and means at the receiver for combining the two transmissions to give a composite picture on a video display of said receiver.
2. A control system as claimed in Claim 1, in which the said video frame rates include a rate of thirty hertz with a second slower rate.
3. A control system as claimed in Claim 1 or 2, in which the picture part having generally static features serves to generate a synthesised scene in the video display.
4. A control system as claimed in Claim 3, in which the selected picture information is processed such that only edge and corner information is used for said slower frame rate transmission.
5. A control system as claimed in any one of Claims 1 to 4, in which the received slower frame rate information is converted to video frame rate for projection into the image plane of the video display.
6. A remote operated vehicle control system substantially as hereinbefore described with reference to any one of the accompanying drawings.
PCT/GB1989/001018 1988-08-27 1989-08-25 Remote operated vehicle control WO1990002370A1 (en)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
GB8820417.7 1988-08-27
GB8820417A GB2222338B (en) 1988-08-27 1988-08-27 Remote operated vehicle control

Publications (1)

Publication Number Publication Date
WO1990002370A1 true WO1990002370A1 (en) 1990-03-08

Family

ID=10642854

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/GB1989/001018 WO1990002370A1 (en) 1988-08-27 1989-08-25 Remote operated vehicle control

Country Status (4)

Country Link
EP (1) EP0383902A1 (en)
JP (1) JPH03501903A (en)
GB (1) GB2222338B (en)
WO (1) WO1990002370A1 (en)

Cited By (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO1994000951A1 (en) * 1992-06-29 1994-01-06 British Telecommunications Public Limited Company Coding and decoding video signals
EP0606173A1 (en) * 1993-01-05 1994-07-13 Sfim Industries Guiding assembly
FR2725102A1 (en) * 1994-09-27 1996-03-29 M5 Soc REMOTE VIDEO-CONTROL PROCESS OF EQUIPMENT IN PARTICULAR VEHICLES, AND IMPLEMENTATION DEVICE
EP0781679A1 (en) * 1995-12-27 1997-07-02 Dassault Electronique Control device for increasing safety in a fast vehicle, especially for a vehicle guided by an operator who may be located inside or outside the vehicle

Families Citing this family (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
IT1249907B (en) * 1991-06-11 1995-03-30 SLOW SCAN REMOTE SURVEILLANCE SYSTEM USING THE MOBILE MOBILE COMMUNICATION SYSTEM.
GB2258114B (en) * 1991-07-26 1995-05-17 Rachel Mary Turner A remote baby monitoring system
SE512171C2 (en) * 1997-07-02 2000-02-07 Forskarpatent I Linkoeping Ab video Transmission
GB2382708B (en) 2001-11-21 2006-03-15 Roke Manor Research Detection of foreign objects on surfaces
US9282144B2 (en) 2011-01-14 2016-03-08 Bae Systems Plc Unmanned vehicle selective data transfer system and method thereof

Citations (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
FR2310049A1 (en) * 1975-04-30 1976-11-26 Ver Flugtechnische Werke Installation for processing image information
GB2075794A (en) * 1980-05-10 1981-11-18 Deutsche Forsch Luft Raumfahrt Method for the transmission and projection of video images, in particular aerial photographs, with a reduced frequency in the image sequence
US4369464A (en) * 1979-07-09 1983-01-18 Temime Jean Pierre Digital video signal encoding and decoding system
US4405943A (en) * 1981-08-19 1983-09-20 Harris Corporation Low bandwidth closed loop imagery control and communication system for remotely piloted vehicle
US4513317A (en) * 1982-09-28 1985-04-23 The United States Of America As Represented By The Administrator Of The National Aeronautics And Space Administration Retinally stabilized differential resolution television display
US4591909A (en) * 1983-04-20 1986-05-27 Nippon Telegraph & Telephone Public Corp. Interframe coding method and apparatus therefor
US4683494A (en) * 1984-08-13 1987-07-28 Nec Corporation Inter-frame predictive coding apparatus for video signal

Family Cites Families (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JPS61267476A (en) * 1985-05-22 1986-11-27 Nec Corp Monitor system
JPS6335094A (en) * 1986-07-30 1988-02-15 Nec Corp Moving image signal coding system

Cited By (11)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO1994000951A1 (en) * 1992-06-29 1994-01-06 British Telecommunications Public Limited Company Coding and decoding video signals
GB2283636A (en) * 1992-06-29 1995-05-10 British Telecomm Coding and decoding video signals
GB2283636B (en) * 1992-06-29 1996-04-24 British Telecomm Coding and decoding video signals
US5841470A (en) * 1992-06-29 1998-11-24 British Telecommunications Public Limited Company Coding and decoding video signals
EP0606173A1 (en) * 1993-01-05 1994-07-13 Sfim Industries Guiding assembly
FR2725102A1 (en) * 1994-09-27 1996-03-29 M5 Soc Method for the video-assisted remote control of machines, especially vehicles, and device for the implementation of this method
EP0704782A1 (en) * 1994-09-27 1996-04-03 Société M5 Remote video control method for machines especially for vehicles and apparatus for carrying out the method
US6304290B1 (en) 1994-09-27 2001-10-16 Societe M 5 Method for the video-assisted remote control of machines, especially vehicles, and device for the implementation of this method
EP0781679A1 (en) * 1995-12-27 1997-07-02 Dassault Electronique Control device for increasing safety in a fast vehicle, especially for a vehicle guided by an operator who may be located inside or outside the vehicle
FR2743162A1 (en) * 1995-12-27 1997-07-04 Dassault Electronique Control device for making safe a fast vehicle, in particular one guided by an operator on board or outside the vehicle
US5987364A (en) * 1995-12-27 1999-11-16 Dassault Electronique Control device for making safe a fast vehicle, in particular guided by an operator on board the vehicle or otherwise

Also Published As

Publication number Publication date
GB2222338A (en) 1990-02-28
EP0383902A1 (en) 1990-08-29
GB8820417D0 (en) 1989-03-30
JPH03501903A (en) 1991-04-25
GB2222338B (en) 1992-11-04

Similar Documents

Publication Publication Date Title
US4855822A (en) Human engineered remote driving system
US5621429A (en) Video data display controlling method and video data display processing system
US5182641A (en) Composite video and graphics display for camera viewing systems in robotics and teleoperation
US6104425A (en) Method and apparatus for transmitting television signals, method and apparatus for receiving television signals, and method and apparatus for transmitting/receiving television signals
US5155683A (en) Vehicle remote guidance with path control
EP1468241B1 (en) Method and system for guiding a remote vehicle via lagged communication channel
US6111979A (en) System for encoding/decoding three-dimensional images with efficient compression of image data
JPH05143709A (en) Video effect device
WO1990002370A1 (en) Remote operated vehicle control
EP0735784B1 (en) Three-dimensional image display device
US20020005891A1 (en) Dual reality system
DE10197255T5 (en) VTV system
AU6669386A (en) Transmitting system and method using data compression
JPH07271434A (en) Environment map preparing method by plural mobile robots
JP2003510864A (en) Method and system for time / motion compensation for a head mounted display
JPH02103097A (en) Graphic display unit
CN1356529A Ground control and monitoring device for a coaxial dual-rotor robot helicopter
CN210804847U (en) Remote control driving system
CN114897935A Virtual-camera-based unmanned aerial vehicle tracking method and system for aerial targets
GB2231220A (en) Position control arrangements for aircraft in formation
CN1049925A Control of a remote-control vehicle
US7333156B2 (en) Sequential colour visual telepresence system
CN110100557A AR-based teleoperation method for targeted hole fertilization of apple fruit trees in cold regions
Freedman et al. TV requirements for manipulation in space
GB2317086A (en) Virtual reality system

Legal Events

Date Code Title Description
AK Designated states

Kind code of ref document: A1

Designated state(s): JP KR US

AL Designated countries for regional patents

Kind code of ref document: A1

Designated state(s): AT BE CH DE FR GB IT LU NL SE

WWE Wipo information: entry into national phase

Ref document number: 1989910132

Country of ref document: EP

WWP Wipo information: published in national office

Ref document number: 1989910132

Country of ref document: EP

WWW Wipo information: withdrawn in national office

Ref document number: 1989910132

Country of ref document: EP