WO2014074080A1 - Landing assistance method for aircrafts - Google Patents

Landing assistance method for aircrafts

Info

Publication number
WO2014074080A1
WO2014074080A1 (PCT/TR2013/000332)
Authority
WO
WIPO (PCT)
Prior art keywords
landing
assistance method
objects
area
pilot
Prior art date
Application number
PCT/TR2013/000332
Other languages
French (fr)
Inventor
Özcan REMZI
Original Assignee
Tusaş - Türk Havacilik Ve Uzay Sanayii A.Ş.
Priority date
Filing date
Publication date
Application filed by Tusaş - Türk Havacilik Ve Uzay Sanayii A.Ş. filed Critical Tusaş - Türk Havacilik Ve Uzay Sanayii A.Ş.
Priority to EP13803304.8A priority Critical patent/EP2917692A1/en
Publication of WO2014074080A1 publication Critical patent/WO2014074080A1/en

Classifications

    • G PHYSICS
    • G05 CONTROLLING; REGULATING
    • G05D SYSTEMS FOR CONTROLLING OR REGULATING NON-ELECTRIC VARIABLES
    • G05D1/00 Control of position, course or altitude of land, water, air, or space vehicles, e.g. automatic pilot
    • G05D1/04 Control of altitude or depth
    • G05D1/06 Rate of change of altitude or depth
    • G05D1/0607 Rate of change of altitude or depth specially adapted for aircraft
    • G05D1/0653 Rate of change of altitude or depth specially adapted for aircraft during a phase of take-off or landing
    • G05D1/0676 Rate of change of altitude or depth specially adapted for aircraft during a phase of take-off or landing specially adapted for landing
    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01C MEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
    • G01C21/00 Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00
    • G01C21/38 Electronic maps specially adapted for navigation; Updating thereof
    • G01C21/3804 Creation or updating of map data
    • G01C21/3833 Creation or updating of map data characterised by the source of data
    • G01C21/3852 Data derived from aerial or satellite images
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T17/00 Three dimensional [3D] modelling, e.g. data description of 3D objects
    • G06T17/05 Geographic models
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T19/00 Manipulating 3D models or images for computer graphics
    • Y GENERAL TAGGING OF NEW TECHNOLOGICAL DEVELOPMENTS; GENERAL TAGGING OF CROSS-SECTIONAL TECHNOLOGIES SPANNING OVER SEVERAL SECTIONS OF THE IPC; TECHNICAL SUBJECTS COVERED BY FORMER USPC CROSS-REFERENCE ART COLLECTIONS [XRACs] AND DIGESTS
    • Y02 TECHNOLOGIES OR APPLICATIONS FOR MITIGATION OR ADAPTATION AGAINST CLIMATE CHANGE
    • Y02A TECHNOLOGIES FOR ADAPTATION TO CLIMATE CHANGE
    • Y02A90/00 Technologies having an indirect contribution to adaptation to climate change
    • Y02A90/10 Information and communication technologies [ICT] supporting adaptation to climate change, e.g. for weather forecasting or climate simulation


Abstract

The present invention relates to systems, and their operating methods, which assist pilots in whiteout and brownout conditions arising from snow and dust during the landing of aircraft, especially helicopters. The aim of the present invention is to provide a landing assistance method that presents the pilot with a virtual world very close to the real one during whiteout and brownout conditions, by combining 3D digital maps with real video images of the landing area.

Description

DESCRIPTION
LANDING ASSISTANCE METHOD FOR AIRCRAFTS
Technical Field
The present invention relates to systems, and their operating methods, which assist pilots in whiteout and brownout conditions arising from snow and dust during the landing of aircraft, especially helicopters.
Prior Art
During the landing of aircraft, especially helicopters, whiteout and brownout conditions occur in snowy or dusty areas and reduce the pilot's visibility. In such cases, so that the pilot can land the aircraft safely, there are methods which present a virtual image corresponding to the real view of the landing area via media such as helmet-mounted goggles or a display screen.
The methods currently in use rely on expensive and complex systems such as 3D cameras, radar and lasers. In such systems, virtual images formed using 3D modelling are presented to the pilot.
The European patent document EP1906151, reflecting the known state of the art, describes a display system that supports helicopter pilots by supplying an exact image of the landing area from high-resolution images captured right before a brownout.
The United States document US7642929, another document within the state of the art, describes a system that assists helicopters during landing.
Brief Description of the Invention
The aim of the present invention is to provide a landing assistance method that presents the pilot with a virtual world very close to the real one during whiteout and brownout conditions, by combining 3D digital maps with real video images of the landing area.
Another aim of this invention is to provide a landing assistance method that shows realistic images/videos by covering the modelled objects on the 3D map with aerial photographs.
Another aim of the present invention is to carry out a landing assistance method that can be implemented with less complex systems than the expensive state-of-the-art systems.
Another aim of the present invention is to recognise objects at the landing area, convert them to 3D models and place them on the 3D digital map as generic models. Another aim of the present invention is to determine the routes of mobile objects at the landing area and place them on a 3D digital map.
Another aim of the present invention is to recognise objects that may move, such as vehicles, human beings and animals, place these on the 3D digital map and warn the pilot of their existence.
Detailed description of the Invention
"A landing assistant method for aircrafts" developed in order to reach the aim of the present invention has been illustrated in the attached figures, wherein said figures show the following: Figure 1. Is the flow diagram of a landing assistant method
The parts in the drawings have each been numbered and the references corresponding to said parts have been given below:
100. Landing Assistant Method
A landing assistance method (100) developed to reach the aims of the present invention comprises the following steps:
- Deciding whether there is sufficient time to carry out a flight over the landing area in order to capture images of it before performing the landing (101),
- If there is sufficient time, flying over the landing area for a preferred time using a preferred flight pattern, and capturing pictures of the area via a camera mounted on the air vehicle during said flight (102),
- If there is not sufficient time (if an immediate landing has to be made), taking pictures of the landing area before whiteout or brownout starts (103),
- If the step of capturing pictures of the area with a camera (102) is applied, using these images to generate a 3D model of the area and combining it with a 3D (three-dimensional) digital map supported by an object and altitude database (104), if one exists,
- If the step of capturing pictures of the area with a camera (102) is not applied, embedding the latest photos taken on a 3D digital map (105),
- Perceiving whiteout and/or brownout conditions (106),
- Showing the virtual world established from the images combined with the 3D digital maps to the pilot via a display device (107),
- Showing the virtual world created from the photos embedded on the 3D digital map and the 3D object models generated by the system (104) to the pilot via a display device (108).
According to the landing assistance method (100) subject to the present invention, pictures are recorded by flying over the landing region of the aircraft for a preferred amount of time using a predefined flight pattern (102). This procedure can only be carried out if there is a sufficient amount of time before landing (101).
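The decision flow of steps (101) to (108) can be sketched as follows. This is an illustrative outline only; the function names and the `LandingContext` fields are assumptions introduced for the sketch, not part of the patent:

```python
from dataclasses import dataclass

@dataclass
class LandingContext:
    time_to_land_s: float   # estimated time remaining before landing must begin
    survey_time_s: float    # time a survey flight over the zone would take
    visibility_ok: bool     # False once whiteout/brownout is perceived (106)

def landing_assist_step(ctx, capture_photos, fly_survey, build_3d_model,
                        embed_on_map, render_virtual_world):
    """One pass through the method's decision flow (steps 101-108)."""
    if ctx.time_to_land_s > ctx.survey_time_s:   # step 101: enough time?
        images = fly_survey()                    # step 102: survey flight
        scene = build_3d_model(images)           # step 104: 3D model + map
    else:
        photos = capture_photos()                # step 103: photos before brownout
        scene = embed_on_map(photos)             # step 105: embed on 3D map
    if not ctx.visibility_ok:                    # step 106: whiteout/brownout
        render_virtual_world(scene)              # steps 107/108: display
    return scene
```

The callbacks stand in for the imaging, modelling and display subsystems, which the patent deliberately leaves unspecified.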
If there is not a sufficient amount of time before landing (101), in other words if a direct landing to the region has to be made, photos of the region are taken before whiteout and brownout conditions arise (103). If detailed images of the landing area have been taken by flying over it before landing, these images are used to generate a 3D model of the area and are combined with a 3D digital map (104), if one exists. When whiteout or brownout conditions are detected before landing, the virtual world created is adjusted according to the position and altitude of the air vehicle and the head angles of the pilot, and is shown via a display device available within the air vehicle (107). The display device can be 3D/2D glasses or any kind of screen. Thus the pilot can observe the landing area even when there is no clear view of it.
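As a toy illustration of adjusting the displayed view to the air vehicle's attitude and the pilot's head angles, one might compute a look direction from the aircraft heading plus the pilot's head yaw and pitch. This is a simplified sketch under stated assumptions; a real system would use full rotation matrices or quaternions and account for roll as well:

```python
import math

def view_direction(heading_deg, pilot_yaw_deg, pilot_pitch_deg):
    """Unit look vector in a local east-north-up frame, combining the
    aircraft heading with the pilot's head yaw and pitch (simplified:
    roll is ignored and yaw angles simply add)."""
    yaw = math.radians(heading_deg + pilot_yaw_deg)
    pitch = math.radians(pilot_pitch_deg)
    return (math.sin(yaw) * math.cos(pitch),   # east component
            math.cos(yaw) * math.cos(pitch),   # north component
            math.sin(pitch))                   # up component
```

The renderer would place a virtual camera at the aircraft's position and altitude and orient it along this vector before drawing the 3D scene.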
When there is no complete image of the landing region (sufficient for 3D model generation) before landing, the photos taken until whiteout or brownout conditions arise (103) are embedded on the 3D digital map (105). When whiteout or brownout conditions are detected, the realistic images that have been established are shown on a display device available within the air vehicle, according to the position and altitude of the air vehicle and the head angles of the pilot (108). The display device can be any kind of 3D/2D glasses or a screen. Thus the pilot can view the landing area.
According to a preferred embodiment of the invention, pattern recognition algorithms are applied to objects that are not present on the map during the steps of combining the camera images with a 3D digital map (104) or embedding the photos on a 3D digital map (105).
Through pattern recognition, objects are identified (trees, buildings, vehicles, people etc.) and converted into generic 3D models. An object which has been identified is modelled as a realistic object by using information such as its distance and angles. The modelled object is embedded in the map as a 3D object. Thus an updated map can be shown to the pilot during the photograph-embedding or image-combining procedures.
Objects for which pattern recognition returns a negative result are symbolised on the map as 2D objects.
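Placing a recognised object on the map from its distance and bearing, with a flat 2D symbol as the fallback for failed recognition, could be sketched as follows. The class list and the model footprints are illustrative assumptions; the patent names the object classes but prescribes no data structures:

```python
import math

# Hypothetical footprints (metres) for the generic 3D models; the patent
# only says recognised classes become generic models, not their dimensions.
GENERIC_MODELS = {"tree": 8.0, "building": 10.0, "vehicle": 2.0, "person": 1.8}

def place_object(aircraft_xy, bearing_deg, range_m, label):
    """Convert a detection given by distance and angle into a map-frame
    object: a generic 3D model when the class is recognised, otherwise
    a 2D symbol (label is None on a negative recognition result)."""
    x = aircraft_xy[0] + range_m * math.sin(math.radians(bearing_deg))
    y = aircraft_xy[1] + range_m * math.cos(math.radians(bearing_deg))
    if label in GENERIC_MODELS:
        return {"pos": (x, y), "kind": label,
                "dims": GENERIC_MODELS[label], "is3d": True}
    return {"pos": (x, y), "kind": "unknown", "is3d": False}  # 2D symbol
```

Each placed object can then be merged into the 3D digital map before the scene is rendered for the pilot.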
According to a preferred embodiment of the present invention, objects that can change place (such as human beings or vehicles) in the photos taken right before whiteout and brownout conditions are determined via pattern recognition algorithms. While the virtual world is being viewed by the pilot, the recognised objects are marked to attract attention. According to a preferred embodiment of the present invention, after the moving objects are recognised and marked, their probable routes and velocities are calculated. The estimated routes, and the objects that could be dangerous, are marked and displayed to the pilot. According to a preferred embodiment of the present invention, the most suitable landing regions are determined by taking into consideration the routes of the mobile objects in the vicinity, the land conditions, buildings, vehicles, human beings and other objects; these regions are then coloured in terms of priority and displayed to the pilot.
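The route and velocity estimation for moving objects could, for instance, use constant-velocity extrapolation from two timestamped observations taken before the brownout. The patent does not prescribe an estimator, so this is a minimal sketch:

```python
def predict_route(track, horizon_s, dt_s=1.0):
    """Constant-velocity extrapolation of a moving object's position.
    track: ((x0, y0, t0), (x1, y1, t1)), two timestamped observations
    in map coordinates (metres, seconds). Returns predicted positions
    at dt_s intervals over the given horizon."""
    (x0, y0, t0), (x1, y1, t1) = track
    vx, vy = (x1 - x0) / (t1 - t0), (y1 - y0) / (t1 - t0)
    steps = int(horizon_s / dt_s)
    return [(x1 + vx * k * dt_s, y1 + vy * k * dt_s)
            for k in range(1, steps + 1)]

def endangers_zone(route, zone_xy, radius_m):
    """Flag an object as dangerous if its predicted route enters a
    circular landing zone, so it can be marked for the pilot."""
    return any((x - zone_xy[0]) ** 2 + (y - zone_xy[1]) ** 2 <= radius_m ** 2
               for x, y in route)
```

Objects flagged this way would be highlighted on the displayed virtual world together with their estimated routes.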
According to the landing assistance method (100) subject to the invention, a single 2D camera is sufficient to capture the images and photos. The pattern recognition and image processing algorithms, and procedures such as route calculation and establishing and displaying a virtual world, can be carried out with a single processor.
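The landing-region prioritisation and colouring described above might be sketched as follows. The patent specifies the inputs (mobile-object routes, land conditions, obstacles) but no scoring formula, so the weights and the three-colour scheme are illustrative assumptions:

```python
def colour_landing_regions(regions, hazards, zone_radius_m=15.0):
    """Rank candidate landing regions and colour them by priority
    (green best, then yellow, then red).
    regions: list of {"centre": (x, y), "slope_deg": float}
    hazards: predicted (x, y) positions of obstacles and moving objects
    Scoring is illustrative: each nearby hazard adds a fixed penalty,
    and steeper terrain is penalised linearly; lower scores are better."""
    scored = []
    for r in regions:
        cx, cy = r["centre"]
        near = sum(1 for hx, hy in hazards
                   if (hx - cx) ** 2 + (hy - cy) ** 2 <= zone_radius_m ** 2)
        scored.append((near * 10.0 + r["slope_deg"], r))
    scored.sort(key=lambda s: s[0])
    colours = ["green", "yellow", "red"]
    return [(r, colours[min(i, 2)]) for i, (_, r) in enumerate(scored)]
```

The coloured regions would then be overlaid on the virtual world so the pilot can pick a touchdown point at a glance.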
Within the scope of this basic concept, it is possible to develop various applications of the landing assistance method (100) for air vehicles subject to the invention; the present invention cannot be limited to the examples explained herein and is as described in the claims.

Claims

1. A landing assistance method (100) for air vehicles which aids in landing of an air vehicle, under conditions which decrease the view of the pilots using said air vehicles, characterized in that it comprises the following steps:
- Deciding if there is sufficient time to carry out a flight over landing area in order to capture images of the landing area before performing the landing (101),
- If there is sufficient time, flying over the landing area during a preferred time and using a preferred flight pattern and capturing pictures of the area via a camera mounted on the helicopter during said flight (102),
- If there is not sufficient time (if an immediate landing has to be done), taking pictures of the landing area before whiteout or brownout starts (103),
- If the step of capturing pictures of the area with a camera (102) is applied, using these images to generate a 3D model of the area and combining it with a 3D (three-dimensional) digital map supported by an object and altitude database (104), if one exists,
- If the step of capturing pictures of the area with a camera (102) is not applied, embedding the latest photos taken on a 3D digital map (105),
- Perceiving whiteout and/or brownout conditions (106),
- Showing the virtual world established through the images combined with the 3D digital maps to the pilot via a display device (107),
- Showing the virtual world created with the photos embedded on the 3D digital map and 3 dimension object models generated by the system (104), to the pilot with a display device (108).
2. A landing assistance method (100) according to claim 1, characterized in that the images displayed to the pilot with a display device are adjusted according to the position and altitude of the air vehicle and the head angles of the pilot.
3. A landing assistance method (100) according to any of the preceding claims, characterized in that objects which are not available on the digital map are identified with pattern recognition.
4. A landing assistance method (100) according to any of the preceding claims, characterized in that the objects which have been determined with pattern recognition, are modelled as 3 dimension real like objects by using information such as their distances and angles and then added to the map.
5. A landing assistance method (100) according to any of the preceding claims, characterized in that the objects which are returned with a negative result from pattern recognition are symbolized as 2D on the map.
6. A landing assistance method (100) according to any of the preceding claims, characterized in that the objects which are mobile or which can move are determined with pattern recognition algorithms and are marked on the digital map.
7. A landing assistance method (100) according to any of the preceding claims, characterized in that the possible routes and velocity of the mobile or moveable objects are calculated and shown on the digital map.
8. A landing assistance method (100) according to any of the preceding claims, characterized in that the most suitable landing regions are determined by taking into consideration the routes of mobile objects, land conditions, buildings, vehicles, human beings and other objects, and are then marked with preferred colours according to their priorities.
9. A landing assistance method (100) according to any of the preceding claims, characterized in that a single 2D camera is used in order to capture images and photographs.
10. A landing assistance method (100) according to any of the preceding claims, characterized in that procedures such as pattern recognition, image processing algorithms, route calculations, creating and displaying a virtual world are carried out with a single processor.
PCT/TR2013/000332 2012-11-07 2013-11-01 Landing assistance method for aircrafts WO2014074080A1 (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
EP13803304.8A EP2917692A1 (en) 2012-11-07 2013-11-01 Landing assistance method for aircrafts

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
TR2012/12844 2012-11-07
TR201212844 2012-11-07

Publications (1)

Publication Number Publication Date
WO2014074080A1 (en) 2014-05-15

Family

ID=49759518

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/TR2013/000332 WO2014074080A1 (en) 2012-11-07 2013-11-01 Landing assistance method for aircrafts

Country Status (2)

Country Link
EP (1) EP2917692A1 (en)
WO (1) WO2014074080A1 (en)

Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20060087452A1 (en) * 2004-10-23 2006-04-27 Eads Deutschland Gmbh Method of pilot support in landing helicopters in visual flight under brownout or whiteout conditions
EP1701133A2 (en) * 2005-03-08 2006-09-13 Northrop Grumman Corporation Geographic information storage transmission and display system
WO2008002875A2 (en) * 2006-06-26 2008-01-03 Lockheed Martin Corporation Method and system for providing a perspective view image by intelligent fusion of a plurality of sensor data
WO2011039666A1 (en) * 2009-10-01 2011-04-07 Rafael Advanced Defense Systems Ltd. Assisting vehicle navigation in situations of possible obscured view

Cited By (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US10198955B1 (en) * 2016-09-08 2019-02-05 Amazon Technologies, Inc. Drone marker and landing zone verification
US10388172B1 (en) 2016-09-08 2019-08-20 Amazon Technologies, Inc. Obstacle awareness based guidance to clear landing space
US10922984B1 (en) 2016-09-08 2021-02-16 Amazon Technologies, Inc. Drone marker and landing zone verification
WO2021156154A1 (en) * 2020-02-05 2021-08-12 Outsight System, method, and computer program product for avoiding ground blindness in a vehicle

Also Published As

Publication number Publication date
EP2917692A1 (en) 2015-09-16


Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 13803304

Country of ref document: EP

Kind code of ref document: A1

WWE Wipo information: entry into national phase

Ref document number: 2013803304

Country of ref document: EP

NENP Non-entry into the national phase

Ref country code: DE