CN103471658A - Autonomous unmanned perception system, autonomous mobile perception terminal, and working method thereof - Google Patents


Info

Publication number
CN103471658A
Authority
CN
China
Prior art keywords
perception
lower case
life
perception terminal
infrared
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
CN2013104497998A
Other languages
Chinese (zh)
Other versions
CN103471658B (en)
Inventor
赵小川
胡江
钱毅
李陈
张敏
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
China North Computer Application Technology Research Institute
Original Assignee
China North Computer Application Technology Research Institute
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by China North Computer Application Technology Research Institute filed Critical China North Computer Application Technology Research Institute
Priority to CN201310449799.8A priority Critical patent/CN103471658B/en
Publication of CN103471658A publication Critical patent/CN103471658A/en
Application granted granted Critical
Publication of CN103471658B publication Critical patent/CN103471658B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Links

Images

Abstract

The invention discloses an autonomous unmanned perception system, an autonomous mobile perception terminal, and a working method thereof. The terminal comprises an upper shell, a lower shell and a bottom cover, arranged in sequence from top to bottom and spliced into a spherical surface. Direction drive wheels are installed at the bottom of the lower shell, each connected to a steering engine, and a disturbance motor drives the upper shell to rotate on the lower shell about its own axis of rotation. The bottom cover fits detachably against the lower shell, so that the exposed parts of the direction drive wheels are housed inside it. A perception sensing unit comprises a visible-light sensor and an infrared sensor installed on the upper shell, and a life detection instrument and an ultrasonic sensor installed on the lower shell, spaced apart. A main control unit of the terminal receives the sensing information output by the perception sensing unit and fuses it. The system can launch the terminal automatically; the terminal can move autonomously and collect environmental information of the target area.

Description

Autonomous unmanned perception system, autonomous mobile perception terminal, and working method thereof
Technical field
The present invention relates to a perception terminal for sensing omnidirectional information around an accident scene, and in particular to an autonomous unmanned perception terminal, an autonomous unmanned perception system incorporating the terminal, and a working method of the terminal. The working method mainly concerns a method for fusing the sensing information.
Background technology
Today, accidents and attacks of all kinds pose a serious threat to social stability and to people's lives. Vigorously developing equipment for responding to such incidents is therefore of great significance for combating terrorist incidents and maintaining social stability. In particular, how to obtain environmental information about an incident quickly and efficiently as it unfolds, so as to provide a basis for decision-making by police and soldiers and thereby reduce casualties, has become an urgent problem in the public safety field.
Summary of the invention
One object of the present invention is to provide a movable autonomous perception terminal that accurately senses the environmental information of a target area in which an accident has occurred.
The technical solution adopted by the present invention is an autonomous perception terminal comprising: an upper shell, a lower shell and a bottom cover arranged in sequence from top to bottom; a disturbance motor; direction drive wheels, each with a correspondingly configured steering engine; and a perception sensing unit. The outer surfaces of the upper shell, the lower shell and the bottom cover are surfaces of revolution and are spliced together into a sphere. Each direction drive wheel is installed at the bottom of the lower shell and is exposed through its bottom wall; each steering engine is installed inside the lower shell, with its output shaft connected to the rotating shaft of the corresponding direction drive wheel. The disturbance motor is installed on a mounting frame arranged inside the lower shell, and its output shaft is fixedly connected to the inner wall of the upper shell so as to drive the upper shell to rotate on the lower shell about its own axis of rotation. The bottom cover fits detachably against the lower shell, so that the parts of the direction drive wheels exposed through the bottom wall are housed inside the bottom cover.
The perception sensing unit comprises a visible-light sensor, an infrared sensor, a life detection instrument and an ultrasonic sensor. The visible-light sensor and the infrared sensor are installed on the upper shell; the life detection instrument and the ultrasonic sensor are installed on the lower shell, spaced apart from each other.
The autonomous perception terminal further comprises a main control unit arranged inside the lower shell and communicatively connected with the perception sensing unit, the steering engines and the disturbance motor, so as to control the switching-on of the perception sensing unit, the motion of the direction drive wheels and the rotation of the upper shell. The main control unit receives the sensing information output by the perception sensing unit and performs information fusion on it.
Preferably, the autonomous perception terminal comprises four direction drive wheels distributed in a square.
Another object of the present invention is to provide an autonomous unmanned perception system comprising the above autonomous perception terminal and a pneumatic catapult for launching it, by which the terminal is propelled to the vicinity of the target area.
A third object of the present invention is to provide an information fusion method for the sensing information, so as to sense the environmental information of the target area accurately.
The technical solution adopted by the present invention is a working method of the above autonomous perception terminal, comprising the following steps:
Step 1: the main control unit switches on the life detection instrument; when it learns that the life detection instrument has sensed the presence of a living body, it drives the direction drive wheels via the steering engines so that the autonomous perception terminal moves toward the target area containing the living body;
Step 2: while the autonomous perception terminal is moving toward the target area, the main control unit switches on each ultrasonic sensor to detect obstacles along the way and avoids them by driving the direction drive wheels via the steering engines;
Step 3: when the autonomous perception terminal reaches the target area, the main control unit switches on the visible-light sensor and the infrared sensor to collect a visible-light image and an infrared image of the target area, respectively;
Step 4: the main control unit performs information fusion on the sensing information, which comprises the visible-light image, the infrared image, the distance information sensed by the ultrasonic sensors, and the life information sensed by the life detection instrument;
The information fusion method is as follows:
Step 41: register the visible-light image and the infrared image so that the two images have the same size; denote the results as the registered visible-light image and the registered infrared image;
Step 42: input the registered visible-light image and the registered infrared image respectively into a dividing filter for frequency division, obtaining the visible-light high-frequency component I_H and visible-light low-frequency component I_L of the registered visible-light image, and the infrared high-frequency component P_H and infrared low-frequency component P_L of the registered infrared image;
Step 43: construct a two-dimensional image texture detection template matrix R, and perform a correlation operation between this template matrix and each of the four images corresponding one-to-one to the visible-light high-frequency, visible-light low-frequency, infrared high-frequency and infrared low-frequency components, obtaining the texture complexity coefficients R_IH, R_IL, R_PH and R_PL for each pixel of the four images;
Step 44: perform information fusion according to the following rules, where H denotes the high-frequency coefficient and L the low-frequency coefficient of the fused image:
H = A × I_H + B × P_H, where A is greater than B if the texture complexity coefficient R_IH is greater than R_PH, and otherwise A is less than B;
L = C × I_L + D × P_L, where C is greater than D if the texture complexity coefficient R_IL is greater than R_PL, and otherwise C is less than D;
here A, B, C and D are fusion coefficients, each equal to 2^n; the n values of A and B are taken from two preset, distinct natural numbers, as are the n values of C and D, and the natural numbers preset for A, B, C and D are chosen from 1, 2, 3 and 4;
Step 45: superimpose the low-frequency coefficient L and the high-frequency coefficient H and restore the image, obtaining the fused image of the visible-light image and the infrared image;
Step 46: superimpose the distance information obtained by the ultrasonic sensors and the life information obtained by the life detection instrument onto the fused image, obtaining the sensor information fusion image.
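As a concrete illustration, steps 42 through 45 can be sketched as follows. The 3×3 mean filter used as the dividing filter and the Laplacian-style texture detection template are assumptions for illustration only (the patent does not fix a particular filter or template), and registration (step 41) is assumed already done, so both inputs have the same shape.

```python
import numpy as np

def _filter3(img, kernel):
    """Correlate a 2-D image with a 3x3 kernel, using edge padding."""
    p = np.pad(img, 1, mode="edge")
    h, w = img.shape
    out = np.zeros((h, w))
    for i in range(3):
        for j in range(3):
            out += kernel[i, j] * p[i:i + h, j:j + w]
    return out

MEAN3 = np.full((3, 3), 1.0 / 9.0)        # assumed dividing (low-pass) filter
TEXTURE = np.array([[0., -1., 0.],        # assumed texture detection
                    [-1., 4., -1.],       # template matrix R
                    [0., -1., 0.]])

def fuse(visible, infrared, n_pair=(2, 1)):
    """Steps 42-45: frequency division, per-pixel texture complexity,
    power-of-two weighting, and restoration by superposition."""
    visible = visible.astype(float)
    infrared = infrared.astype(float)
    # Step 42: low-frequency = smoothed image, high-frequency = residual
    I_L, P_L = _filter3(visible, MEAN3), _filter3(infrared, MEAN3)
    I_H, P_H = visible - I_L, infrared - P_L
    # Step 43: texture complexity coefficients, one per pixel per component
    R_IH, R_IL = abs(_filter3(I_H, TEXTURE)), abs(_filter3(I_L, TEXTURE))
    R_PH, R_PL = abs(_filter3(P_H, TEXTURE)), abs(_filter3(P_L, TEXTURE))
    # Step 44: the larger weight 2^n goes to the component whose texture
    # complexity is larger at that pixel (preset n values, here 2 and 1)
    hi, lo = 2.0 ** n_pair[0], 2.0 ** n_pair[1]
    A = np.where(R_IH > R_PH, hi, lo)
    B = np.where(R_IH > R_PH, lo, hi)
    C = np.where(R_IL > R_PL, hi, lo)
    D = np.where(R_IL > R_PL, lo, hi)
    H = A * I_H + B * P_H
    L = C * I_L + D * P_L
    # Step 45: superimpose the low- and high-frequency coefficients
    return L + H
```

Because I_L + I_H reconstructs the input exactly, fusing an image with itself returns the image scaled by 2^1 + 2^2 = 6, which makes the weighting rule easy to sanity-check.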
Preferably, the ultrasonic sensors are transceiver-integrated. To prevent crosstalk between echoes when the ultrasonic sensors operate together, each is excited with high and low levels carrying a different code, and the echoes are discriminated by autocorrelation.
The beneficial effects of the invention are as follows. The autonomous perception terminal can obtain omnidirectional information around an accident scene in real time. Because it adopts a spherical structure, it can be launched autonomously from a pneumatic catapult, like a shell being fired; and because its lower shell carries direction drive wheels, the part of the terminal other than the bottom cover can move autonomously under the control of the steering engines. By designing a bottom cover that fits detachably to the lower shell, the invention effectively protects the direction drive wheels while the terminal is launched; once the terminal leaves the pneumatic catapult, the loosely fitted bottom cover separates from the lower shell immediately under aerodynamic forces, and the remainder of the terminal can move autonomously after landing in the target area. Moreover, the terminal employs four classes of sensors: the life detection instrument determines the target area, the ultrasonic sensors avoid obstacles, and fusing the sensing information of all four classes yields the environmental information within the target area, effectively improving the accuracy of the image information. This makes the terminal an effective reconnaissance device for scouting areas that personnel cannot easily observe directly, such as around corners, in dark caves, in dangerous rooms, beyond high walls or under vehicles, and for tasks such as security protection, counter-terrorism and assault operations. Finally, using the autonomous perception terminal of the invention can effectively reduce casualties among police and soldiers, embodying the principle of "putting people first".
Description of the drawings
Fig. 1 shows the overall structure of the autonomous perception terminal according to the present invention, with the bottom cover removed;
Fig. 2 is an exploded view of the autonomous perception terminal according to the present invention;
Fig. 3 shows the bottom structure of the lower shell shown in Fig. 2.
Embodiment
The present invention is described in further detail below with reference to the drawings and a specific embodiment.
As shown in Figs. 1 to 3, the autonomous perception terminal of the present invention comprises an upper shell 1, a lower shell 2 and a bottom cover 3 arranged in sequence from top to bottom, a disturbance motor 20, direction drive wheels 9 each with a correspondingly configured steering engine 50, and a perception sensing unit. The outer surfaces of the upper shell 1, the lower shell 2 and the bottom cover 3 are surfaces of revolution and are spliced together into a complete sphere. Each direction drive wheel 9 is installed at the bottom of the lower shell 2 and is exposed through its bottom wall; each steering engine 50 is installed inside the lower shell 2, with its output shaft connected to the rotating shaft of the corresponding direction drive wheel 9 so as to control the wheel's rotation. The disturbance motor 20 is installed on a mounting frame 30 arranged inside the lower shell 2, and its output shaft is fixedly connected to the inner wall of the upper shell 1 so that the upper shell 1 can rotate on the lower shell about its own axis of rotation. The bottom cover 3 fits detachably against the lower shell 2, so that the parts of the direction drive wheels 9 exposed through the bottom wall are housed inside the bottom cover 3 and thus protected from damage when the terminal is launched by the pneumatic catapult. Because the bottom cover 3 is only loosely fitted, it separates from the lower shell 2 in flight after the terminal is launched, without affecting the movement of the main body of the terminal after landing.
As shown in Figs. 1 and 2, the perception sensing unit comprises a visible-light sensor 5, an infrared sensor 6, a life detection instrument 8 and ultrasonic sensors 7. The visible-light sensor 5 and the infrared sensor 6 are installed on the upper shell 1 and, through the rotation of the upper shell, can cover the scene within a 360-degree range of the target area. The life detection instrument 8 and the ultrasonic sensors 7 are installed on the lower shell 2, spaced apart, so as to identify living bodies and avoid obstacles effectively.
The autonomous perception terminal further comprises a main control unit arranged inside the lower shell 2 and communicatively connected with the perception sensing unit, the steering engines 50 and the disturbance motor 20, so as to control the switching-on of each sensor of the perception sensing unit, the motion of the direction drive wheels 9 and the rotation of the upper shell 1. The main control unit receives the sensing information output by each sensor of the perception sensing unit and performs information fusion on it.
To give the autonomous perception terminal greater maneuverability, it may comprise four direction drive wheels 9 distributed in a square.
The autonomous unmanned perception system of the present invention comprises the above autonomous perception terminal and a pneumatic catapult for launching it; the pneumatic catapult propels the autonomous perception terminal to the vicinity of the target area, like a shell being fired.
The working method of the above autonomous perception terminal comprises the following steps:
Step 1: the main control unit switches on the life detection instrument 8; when it learns that the life detection instrument 8 has sensed the presence of a living body, it drives the direction drive wheels via the steering engines so that the autonomous perception terminal moves toward the target area containing the living body;
Step 2: while the autonomous perception terminal is moving toward the target area, the main control unit switches on each ultrasonic sensor 7 to detect obstacles along the way and avoids them by driving the direction drive wheels via the steering engines;
Step 3: when the autonomous perception terminal reaches the target area, the main control unit switches on the visible-light sensor 5 and the infrared sensor 6 to collect a visible-light image and an infrared image of the target area, respectively;
Step 4: the main control unit performs information fusion on the sensing information, which comprises the visible-light image, the infrared image, the distance information sensed by the ultrasonic sensors, and the life information sensed by the life detection instrument;
The information fusion method is as follows:
Step 41: register the visible-light image and the infrared image so that the two images have the same size; denote the results as the registered visible-light image and the registered infrared image. The visible-light image and the infrared image here are the images obtained after analog-to-digital conversion of the analog images collected by the respective sensors;
Step 42: input the registered visible-light image and the registered infrared image respectively into a dividing filter for frequency division, obtaining the visible-light high-frequency component I_H and visible-light low-frequency component I_L of the registered visible-light image, and the infrared high-frequency component P_H and infrared low-frequency component P_L of the registered infrared image;
Step 43: construct a two-dimensional image texture detection template matrix R, and perform a correlation operation between this template matrix and each of the four images corresponding one-to-one to the visible-light high-frequency, visible-light low-frequency, infrared high-frequency and infrared low-frequency components, obtaining the texture complexity coefficients R_IH, R_IL, R_PH and R_PL for each pixel of the four images;
Step 44: perform information fusion according to the following rules, where H denotes the high-frequency coefficient and L the low-frequency coefficient of the fused image:
H = A × I_H + B × P_H, where A is greater than B if the texture complexity coefficient R_IH is greater than R_PH, and otherwise A is less than B;
L = C × I_L + D × P_L, where C is greater than D if the texture complexity coefficient R_IL is greater than R_PL, and otherwise C is less than D;
here A, B, C and D are fusion coefficients, each equal to 2^n; the n values of A and B are taken from two preset, distinct natural numbers, as are the n values of C and D, and the natural numbers preset for A, B, C and D are chosen from 1, 2, 3 and 4. For example, if the preset n values of A and B are 1 and 2, then when A is determined to be greater than B, the fusion coefficients are A = 2^2 and B = 2^1;
Step 45: superimpose the low-frequency coefficient L and the high-frequency coefficient H and restore the image, obtaining the fused image of the visible-light image and the infrared image;
Step 46: superimpose the distance information obtained by the ultrasonic sensors and the life information obtained by the life detection instrument onto the fused image, obtaining the sensor information fusion image.
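The search-navigate-sense-fuse sequence of steps 1 through 4 can be sketched as a small state machine. The phase names and the boolean inputs are illustrative assumptions; the actual sensor I/O and motor control are abstracted away.

```python
from enum import Enum, auto

class Phase(Enum):
    SEARCH = auto()     # step 1: life detection instrument switched on
    NAVIGATE = auto()   # step 2: ultrasonic obstacle avoidance en route
    SENSE = auto()      # step 3: visible-light and infrared imaging
    FUSE = auto()       # step 4: information fusion

def next_phase(phase, life_detected=False, at_target=False):
    """One tick of the assumed mission sequence: stay in SEARCH until a
    living body is sensed, NAVIGATE until the target area is reached,
    then image the area and fuse the sensing information."""
    if phase is Phase.SEARCH:
        return Phase.NAVIGATE if life_detected else Phase.SEARCH
    if phase is Phase.NAVIGATE:
        return Phase.SENSE if at_target else Phase.NAVIGATE
    if phase is Phase.SENSE:
        return Phase.FUSE
    return Phase.FUSE   # fusion is the terminal phase of one mission
```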
Preferably, the ultrasonic sensors are transceiver-integrated. To prevent crosstalk between echoes when the ultrasonic sensors operate together, each is excited with high and low levels carrying a different code, and the echoes are discriminated by autocorrelation. For example, suppose the high/low drive levels of ultrasonic sensor No. 1 are coded 0111100010. The receiver of sensor No. 1 decodes each echo and compares the decoded code with the transmitted levels: if the code is 0111100010, the echo was sent by sensor No. 1 and the echo time is recorded; if not, the echo was sent by another ultrasonic sensor or is noise, and the echo time is not recorded.
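The coded-excitation scheme of this example can be sketched as follows. The 50-sample receive window, the zero background and the alternative code are assumptions for illustration; real hardware would correlate noisy analog echoes rather than clean bit streams.

```python
import numpy as np

# Drive code of ultrasonic sensor No. 1, from the example above.
CODE_1 = np.array([0, 1, 1, 1, 1, 0, 0, 0, 1, 0])

def echo_delay(received, code=CODE_1):
    """Find this sensor's own echo in a received bit stream. Correlation
    with the drive code locates the best-matching position; the echo time
    is recorded only if the decoded bits there reproduce the drive code
    exactly, so echoes of other sensors and noise are rejected."""
    corr = np.correlate(received.astype(float), code.astype(float), mode="valid")
    peak = int(np.argmax(corr))
    segment = received[peak:peak + len(code)]
    return peak if np.array_equal(segment, code) else None

# Assumed receive window: 50 samples, with sensor No. 1's echo
# beginning 17 samples after the window opens.
rx_own = np.zeros(50, dtype=int)
rx_own[17:27] = CODE_1
```

Here `echo_delay(rx_own)` recovers the echo time 17, while a window carrying a differently coded burst, or no burst at all, yields `None` and no echo time is recorded.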
The above is only a preferred embodiment of the present invention and is not intended to limit its scope of practice. All equivalent changes and modifications made according to the content of the claims of the present application shall fall within the technical scope of the present invention.

Claims (5)

1. An autonomous perception terminal, characterized in that it comprises: an upper shell, a lower shell and a bottom cover arranged in sequence from top to bottom; a disturbance motor; direction drive wheels, each with a correspondingly configured steering engine; and a perception sensing unit; wherein the outer surfaces of the upper shell, the lower shell and the bottom cover are surfaces of revolution and are spliced together into a sphere; each direction drive wheel is installed at the bottom of the lower shell and is exposed through its bottom wall; each steering engine is installed inside the lower shell, with its output shaft connected to the rotating shaft of the corresponding direction drive wheel; the disturbance motor is installed on a mounting frame arranged inside the lower shell, and its output shaft is fixedly connected to the inner wall of the upper shell so as to drive the upper shell to rotate on the lower shell about its own axis of rotation; and the bottom cover fits detachably against the lower shell, so that the parts of the direction drive wheels exposed through the bottom wall are housed inside the bottom cover;
the perception sensing unit comprises a visible-light sensor, an infrared sensor, a life detection instrument and an ultrasonic sensor, the visible-light sensor and the infrared sensor being installed on the upper shell, and the life detection instrument and the ultrasonic sensor being installed on the lower shell, spaced apart from each other; and
the autonomous perception terminal further comprises a main control unit arranged inside the lower shell and communicatively connected with the perception sensing unit, the steering engines and the disturbance motor, so as to control the switching-on of the perception sensing unit, the motion of the direction drive wheels and the rotation of the upper shell, the main control unit receiving the sensing information output by the perception sensing unit and performing information fusion on it.
2. The autonomous perception terminal according to claim 1, characterized in that it comprises four direction drive wheels distributed in a square.
3. An autonomous unmanned perception system, characterized in that it comprises the autonomous perception terminal according to claim 1 or 2 and a pneumatic catapult for launching said autonomous perception terminal.
4. A working method of the autonomous perception terminal according to claim 1 or 2, characterized in that it comprises the following steps:
Step 1: the main control unit switches on the life detection instrument; when it learns that the life detection instrument has sensed the presence of a living body, it drives the direction drive wheels via the steering engines so that the autonomous perception terminal moves toward the target area containing the living body;
Step 2: while the autonomous perception terminal is moving toward the target area, the main control unit switches on each ultrasonic sensor to detect obstacles along the way and avoids them by driving the direction drive wheels via the steering engines;
Step 3: when the autonomous perception terminal reaches the target area, the main control unit switches on the visible-light sensor and the infrared sensor to collect a visible-light image and an infrared image of the target area, respectively;
Step 4: the main control unit performs information fusion on the sensing information, which comprises the visible-light image, the infrared image, the distance information sensed by the ultrasonic sensors, and the life information sensed by the life detection instrument;
wherein the information fusion method is as follows:
Step 41: register the visible-light image and the infrared image so that the two images have the same size; denote the results as the registered visible-light image and the registered infrared image;
Step 42: input the registered visible-light image and the registered infrared image respectively into a dividing filter for frequency division, obtaining the visible-light high-frequency component I_H and visible-light low-frequency component I_L of the registered visible-light image, and the infrared high-frequency component P_H and infrared low-frequency component P_L of the registered infrared image;
Step 43: construct a two-dimensional image texture detection template matrix R, and perform a correlation operation between this template matrix and each of the four images corresponding one-to-one to the visible-light high-frequency, visible-light low-frequency, infrared high-frequency and infrared low-frequency components, obtaining the texture complexity coefficients R_IH, R_IL, R_PH and R_PL for each pixel of the four images;
Step 44: perform information fusion according to the following rules, where H denotes the high-frequency coefficient and L the low-frequency coefficient of the fused image:
H = A × I_H + B × P_H, where A is greater than B if the texture complexity coefficient R_IH is greater than R_PH, and otherwise A is less than B;
L = C × I_L + D × P_L, where C is greater than D if the texture complexity coefficient R_IL is greater than R_PL, and otherwise C is less than D;
here A, B, C and D are fusion coefficients, each equal to 2^n; the n values of A and B are taken from two preset, distinct natural numbers, as are the n values of C and D, and the natural numbers preset for A, B, C and D are chosen from 1, 2, 3 and 4;
Step 45: superimpose the low-frequency coefficient L and the high-frequency coefficient H and restore the image, obtaining the fused image of the visible-light image and the infrared image;
Step 46: superimpose the distance information obtained by the ultrasonic sensors and the life information obtained by the life detection instrument onto the fused image, obtaining the sensor information fusion image.
5. The working method according to claim 4, characterized in that the ultrasonic sensors are transceiver-integrated; to prevent crosstalk between echoes when the ultrasonic sensors operate together, each is excited with high and low levels carrying a different code, and the echoes are discriminated by autocorrelation.
CN201310449799.8A 2013-09-27 2013-09-27 Autonomous unmanned perception system, autonomous mobile perception terminal, and working method thereof Active CN103471658B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201310449799.8A CN103471658B (en) 2013-09-27 2013-09-27 Autonomous unmanned perception system, autonomous mobile perception terminal, and working method thereof

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201310449799.8A CN103471658B (en) 2013-09-27 2013-09-27 Autonomous unmanned perception system, autonomous mobile perception terminal, and working method thereof

Publications (2)

Publication Number Publication Date
CN103471658A (en) 2013-12-25
CN103471658B CN103471658B (en) 2015-08-26

Family

ID=49796604

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201310449799.8A Active CN103471658B (en) 2013-09-27 2013-09-27 Autonomous unmanned perception system, autonomous mobile perception terminal, and working method thereof

Country Status (1)

Country Link
CN (1) CN103471658B (en)

Citations (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US5446445A (en) * 1991-07-10 1995-08-29 Samsung Electronics Co., Ltd. Mobile detection system
US20050206729A1 (en) * 2001-07-11 2005-09-22 Chang Industry, Inc. Deployable monitoring device having self-righting housing and associated method
JP2007130691A (en) * 2005-11-08 2007-05-31 Advanced Telecommunication Research Institute International Communication robot
CN201638346U (en) * 2009-09-25 2010-11-17 向凌云 Mobile safety precaution device
CN102161202A (en) * 2010-12-31 2011-08-24 中国科学院深圳先进技术研究院 Full-view monitoring robot system and monitoring robot
CN102774442A (en) * 2012-06-01 2012-11-14 上海大学 Caterpillar track running device used in constrained space
CN202662115U (en) * 2012-05-18 2013-01-09 哈尔滨理工大学科技园发展有限公司 Electronic security robot

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
邢永康 (Xing Yongkang) et al.: "An infrared and visible light image fusion method based on region segmentation", 《重庆理工大学学报(自然科学)》 (Journal of Chongqing University of Technology (Natural Science)), vol. 25, no. 1, 31 January 2011 (2011-01-31) *

Cited By (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN105806317A (en) * 2016-02-28 2016-07-27 西北大学 Cave detection system
CN105806317B (en) * 2016-02-28 2019-04-23 西北大学 A kind of detecting caves system
CN107181500A (en) * 2017-04-27 2017-09-19 北京凌宇智控科技有限公司 A kind of signal receiving device
CN108421178A (en) * 2018-06-18 2018-08-21 中鸿纳米纤维技术丹阳有限公司 A kind of rescue robot Special transmitting formula life discovery module
CN108421178B (en) * 2018-06-18 2023-10-20 中鸿纳米纤维技术丹阳有限公司 Special emission type life discovery module of rescue robot

Also Published As

Publication number Publication date
CN103471658B (en) 2015-08-26

Similar Documents

Publication Publication Date Title
US20220169383A1 (en) Systems and methods for multi-orientation flight
US20210001838A1 (en) Apparatus and methods for obstacle detection
US20170300051A1 (en) Amphibious vertical take off and landing unmanned device with AI data processing apparatus
US20170123425A1 (en) Salient feature based vehicle positioning
CN101825903B (en) Water surface control method for remotely controlling underwater robot
US20160286128A1 (en) Amphibious vtol super drone camera in a mobile case (phone case) with multiple aerial and aquatic flight modes for capturing panoramic virtual reality views, selfie and interactive video
JP2020537217A (en) Object detection and avoidance for aircraft
CN107585222A (en) Unmanned scout car
US20100179691A1 (en) Robotic Platform
CN106335646A (en) Interference-type anti-UAV (Unmanned Aerial Vehicle) system
JP2018537335A (en) Method and system for controlling the flight of an unmanned aerial vehicle
CN206411021U (en) Natural gas leakage detection robot system based on graphical analysis
US20180339768A1 (en) Systems and methods for uav sensor placement
CN206893109U (en) A kind of unmanned plane monitoring system
CN103471658A (en) Autonomic unmanned perception system, automatic moving perception terminal and working method of autonomic unmanned perception system and automatic moving perception terminal
CN107783547A (en) Post disaster relief rotor wing unmanned aerial vehicle obstacle avoidance system and method
DE3843043A1 (en) Management method and device for disaster and environmental protection
JP2019050007A (en) Method and device for determining position of mobile body and computer readable medium
CN107444608A (en) A kind of new unmanned plane
CN205787918U (en) A kind of detection system of the automatic decision unmanned plane direction of motion
CN113391636A (en) Ultrasonic sensing obstacle avoidance's thing networking intelligence patrols and guards against robot based on 5G communication
Kadous et al. Caster: A robot for urban search and rescue
CN207301375U (en) Land sky combined detection system
CN217835833U (en) Scout car
CN112433541A (en) Intelligent management system of plant protection unmanned aerial vehicle based on Beidou navigation

Legal Events

Date Code Title Description
C06 Publication
PB01 Publication
C10 Entry into substantive examination
SE01 Entry into force of request for substantive examination
C14 Grant of patent or utility model
GR01 Patent grant