WO2008133741A2 - Multiple sensor processing - Google Patents

Multiple sensor processing

Info

Publication number
WO2008133741A2
Authority
WO
WIPO (PCT)
Prior art keywords
sensor system
tracks
tasks
predicted quality
track
Prior art date
Application number
PCT/US2007/086590
Other languages
French (fr)
Other versions
WO2008133741A3 (en)
Inventor
Steven T. Cummings
George A. Blaha
Larry L. Stern
Original Assignee
Raytheon Company
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Raytheon Company
Priority to CA2673312A (CA2673312C)
Priority to JP2009543038A (JP5378229B2)
Priority to EP07874345A (EP2122384B1)
Publication of WO2008133741A2
Publication of WO2008133741A3

Classifications

    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01S RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S7/00 Details of systems according to groups G01S13/00, G01S15/00, G01S17/00
    • G01S7/003 Transmission of data between radar, sonar or lidar systems and remote stations
    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01S RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S13/00 Systems using the reflection or reradiation of radio waves, e.g. radar systems; Analogous systems using reflection or reradiation of waves whose nature or wavelength is irrelevant or unspecified
    • G01S13/66 Radar-tracking systems; Analogous systems
    • G01S13/72 Radar-tracking systems; Analogous systems for two-dimensional tracking, e.g. combination of angle and range tracking, track-while-scan radar
    • G01S13/723 Radar-tracking systems; Analogous systems for two-dimensional tracking, e.g. combination of angle and range tracking, track-while-scan radar by using numerical data
    • G01S13/726 Multiple target tracking
    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01S RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S13/00 Systems using the reflection or reradiation of radio waves, e.g. radar systems; Analogous systems using reflection or reradiation of waves whose nature or wavelength is irrelevant or unspecified
    • G01S13/86 Combinations of radar systems with non-radar systems, e.g. sonar, direction finder

Landscapes

  • Engineering & Computer Science (AREA)
  • Radar, Positioning & Navigation (AREA)
  • Remote Sensing (AREA)
  • Computer Networks & Wireless Communication (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Radar Systems Or Details Thereof (AREA)
  • Circuits Of Receivers In General (AREA)
  • Testing, Inspecting, Measuring Of Stereoscopic Televisions And Televisions (AREA)

Abstract

A method of multiple sensor processing includes receiving, at a first sensor system, track data from a second sensor system, comparing track data from the first sensor system to the track data from the second sensor system to determine if a track will be within a field of view of the first sensor system during a time period, determining, at a first sensor system, predicted quality of tracks based on the track data and broadcasting the predicted quality of tracks. The method also includes receiving predicted quality of tracks from the second sensor system and determining a first set of tasks based on the predicted quality of tracks determined by the first sensor system and the predicted quality of tracks received from the second sensor system.

Description

MULTIPLE SENSOR PROCESSING
BACKGROUND
Sensor systems such as radar systems utilize resources to detect objects. Some sensor systems may be adjusted to control the utilization of resources to detect and track objects. One resource may be the type of waveform propagated from the sensor system. Another resource may be the amount of energy available to propagate the waveform. Other resources include the amount of processing time dedicated to processing a sensor contact and the number of objects that can be observed.
The challenge in determining resource utilization arises when resources are in conflict. For example, in a radar system a first waveform may track objects better than a second waveform, yet the radar system expends more energy transmitting the first waveform to track contacts than it expends transmitting the second waveform. Determining how to allocate the resources is a challenge for a single sensor system, and the challenge is compounded in a sensor network environment having more than one sensor system.
SUMMARY
In one aspect, the invention is a method of multiple sensor processing. The method includes receiving, at a first sensor system, track data from a second sensor system, comparing track data from the first sensor system to the track data from the second sensor system to determine if a track will be within a field of view of the first sensor system during a time period, determining, at a first sensor system, predicted quality of tracks based on the track data and broadcasting the predicted quality of tracks. The method also includes receiving predicted quality of tracks from the second sensor system and determining a first set of tasks based on the predicted quality of tracks determined by the first sensor system and the predicted quality of tracks received from the second sensor system.
In another aspect the invention is an article including a machine-readable medium that stores executable instructions used in multiple sensor processing. The instructions cause a machine to receive, at a first sensor system, track data from a second sensor system, compare track data from the first sensor system to the track data from the second sensor system to determine if a track will be within a field of view of the first sensor system during a time period, determine, at a first sensor system, predicted quality of tracks based on the track data and broadcast the predicted quality of tracks. The instructions also cause a machine to receive predicted quality of tracks from the second sensor system and determine a first set of tasks based on the predicted quality of tracks determined by the first sensor system and the predicted quality of tracks received from the second sensor system.
In a further aspect, the invention is an apparatus used in multiple sensor processing. The apparatus includes circuitry to receive, at a first sensor system, track data from a second sensor system, compare track data from the first sensor system to the track data from the second sensor system to determine if a track will be within a field of view of the first sensor system during a time period, determine, at a first sensor system, predicted quality of tracks based on the track data and broadcast the predicted quality of tracks. The apparatus also includes circuitry to receive predicted quality of tracks from the second sensor system and determine a first set of tasks based on the predicted quality of tracks determined by the first sensor system and the predicted quality of tracks received from the second sensor system.
DESCRIPTION OF THE DRAWINGS
FIG. 1 is a block diagram of an example of a sensor network.
FIG. 2 is a block diagram of an example of a sensor system.
FIG. 3 is a flowchart of an example of a process performed by each sensor system in the sensor network.
FIG. 4 is a diagram of a sensor network environment.
FIG. 5 is a flowchart of an example of a process to determine a set of tasks.
FIG. 6 is an example of a list of candidate solutions.
DETAILED DESCRIPTION
Referring to FIG. 1, a sensor network 10 includes sensor systems (e.g., a sensor system 12a, a sensor system 12b, a sensor system 12c and a sensor system 12d) connected by a network (e.g., a wired network, a wireless network or a combination thereof). In one example, the sensor network 10 is a battlefield sensor network where each sensor system 12a-12d is used to detect and track sensor contacts (i.e., tracks), for example, enemy targets, friendly targets, unidentified targets and so forth. The sensor systems 12a-12d may include, for example, ground-based radar systems, air-based radar systems, sea-based radar systems, space-based radar systems or any combination thereof. In one example, the sensor systems 12a-12d may be a mix of different sensor systems having different bandwidths and spectra and may be of different generations of sensor systems.
Each sensor system 12a-12d dedicates a certain amount of processing time to track a contact. Due to the speed at which some tracks may travel and the speed of processing required to monitor the tracks, sensor systems may not be able to track each and every track efficiently enough to meet mission objectives (e.g., defending a ship from attack). In conventional systems, balancing and control of the sensor systems has been performed by a central resource manager that monitors and controls the sensor systems. However, the central resource manager adds to processing response time. In contrast, the sensor network 10 is a distributed sensor architecture having no central resource manager to manage the processing of the sensor systems 12a-12d; rather, the sensor systems contribute to the overall management of the sensor network 10. For example, the sensor system 12a determines how to allocate resources to track a contact based on data received from the other sensor systems 12b-12d. The other sensor systems 12b-12d each likewise determine how to allocate resources to track a contact based on the data received from the other sensor systems.
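For illustration only, the two kinds of data exchanged in such a distributed arrangement (track data in processing blocks 52-56 and predicted quality of tracks in blocks 66-76, described below) can be sketched as simple records. The following Python structures and field names are assumptions made for this sketch and do not appear in the specification.

    from dataclasses import dataclass, field
    from typing import Dict, List, Tuple

    @dataclass
    class TrackData:
        """State reported for one track (sensor contact)."""
        track_id: int
        position: Tuple[float, float, float]   # Cartesian x, y, z (illustrative)
        velocity: Tuple[float, float, float]
        timestamp: float

    @dataclass
    class QualityPrediction:
        """Predicted observation quality and resource cost for one track/option."""
        sensor_id: str
        track_id: int
        option: str                    # e.g. a waveform identifier
        covariance: List[List[float]]  # predicted track error covariance
        resolution_probability: float  # probability the sensor can resolve the track
        resource_cost: float           # fraction of the sensor's duty/occupancy limit

    @dataclass
    class SensorSystem:
        """Per-sensor bookkeeping for the distributed exchange (hypothetical)."""
        sensor_id: str
        local_tracks: Dict[int, TrackData] = field(default_factory=dict)
        remote_tracks: Dict[int, TrackData] = field(default_factory=dict)
        received_predictions: List[QualityPrediction] = field(default_factory=list)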
FIG. 2 depicts a general sensor system, for example, the sensor system 12a (FIG. 1). The sensor system 12a includes a processor 22, a volatile memory 24, a non-volatile memory 26 (e.g., a hard disk), a sensor 28 and a transceiver 30. The non-volatile memory 26 includes computer instructions 34, an operating system 36 and data 38. The sensor 28 detects contacts; for example, the sensor 28 is a radar. The transceiver 30 allows for communications to and from the other sensor systems 12b-12d through the network 14.
Referring to FIG. 3, a process 50 is an example of a process performed by each of the sensor systems 12a-12d, for example, the sensor system 12a, to determine, for example, resource utilization of each sensor system in the sensor network 10. In one example, the computer instructions 34 (FIG. 2) are executed by the processor 22 (FIG. 2) out of the volatile memory 24 (FIG. 2) to perform process 50.
Process 50 receives track data from the other sensor systems (52). For example, the sensor system 12a receives track data from the other sensor systems 12b-12d through the transceiver 30 from the network 14 (FIG. 2). In one example, as in FIG. 4, a track 42 and a track 44 are in a field of view (i.e., viewing angle), Q1, of the sensor system 12a and the track 44 and a track 46 are in a field of view, Q2, of the sensor system 12b. The sensor system 12b would send track data on the track 44 and the track 46 to the sensor system 12a.
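The field-of-view relationships of FIG. 4 (which tracks fall inside Q1 or Q2, now or during a schedule period) reduce to simple geometry. A minimal sketch, assuming a constant-velocity track model and a 2-D, azimuth-only field of view; the function and parameter names are illustrative and not taken from the patent:

    import math

    def in_field_of_view(track_pos, track_vel, track_time, sensor_pos,
                         boresight, width, t_start, t_end, steps=10):
        """True if a constant-velocity track falls inside the sensor's viewing
        angle at any sample time in [t_start, t_end].  Positions are 2-D (x, y),
        angles are in radians, and `width` is the full angular field of view."""
        for k in range(steps + 1):
            t = t_start + (t_end - t_start) * k / steps
            x = track_pos[0] + track_vel[0] * (t - track_time)
            y = track_pos[1] + track_vel[1] * (t - track_time)
            az = math.atan2(y - sensor_pos[1], x - sensor_pos[0])
            offset = (az - boresight + math.pi) % (2 * math.pi) - math.pi
            if abs(offset) <= width / 2.0:
                return True
        return False

A check of this kind supports the determination, described below in processing block 62, of whether a track will be within the field of view during a time period.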
Process 50 compares the track data to sensor specific track data from the sensor system (56). The sensor specific tracks relate to tracks specifically observed by the sensor system even though other tracks may be in the field of view of the sensor system.
For example, the sensor system 12a compares the tracks that the sensor system 12a observed with the tracks observed by the other sensor systems 12b-12d. In one example, as in FIG. 4, the sensor system 12a, for resource allocation reasons, is only observing track 42 even though track 44 is in its field of view Q1. The sensor system 12b is observing the tracks 44, 46.
Process 50 determines if a track will be in range of the sensor system during a future time period (62). For example, the sensor system 12a determines if a track will be in its range for a time period (e.g., a schedule period). In one example, as in FIG. 4, the sensor system 12a would determine if tracks 42, 44, 46 will be in its field of view Q1 in the time period. In this example, the sensor system 12a determines that tracks 42, 44 will remain in the field of view Q1 and that track 46 will be in the field of view Q1 during the time period. Process 50 predicts quality of track observations and resource costs (66). For example, the quality of a track observation may be represented by a track covariance matrix as well as a probability of future resolution. For instance, the track covariance matrix represents the estimated error variances in range, range rate, azimuth, azimuth rate, elevation, and elevation rate of the tracked target as well as the covariances between all of these quantities, e.g., the error bounds of range and range rate due to correlation between these quantities.
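In conventional tracking notation (a standard formulation, not wording from the patent), the quoted quantities form a six-element state vector, and the track covariance matrix is the expected outer product of the estimation error, with the error variances on the diagonal and the covariances, such as the range/range-rate term, off the diagonal:

    \[
    \mathbf{x} = \begin{bmatrix} r & \dot{r} & \theta_{\mathrm{az}} & \dot{\theta}_{\mathrm{az}} & \theta_{\mathrm{el}} & \dot{\theta}_{\mathrm{el}} \end{bmatrix}^{\mathsf{T}},
    \qquad
    \mathbf{P} = E\!\left[ (\mathbf{x} - \hat{\mathbf{x}})(\mathbf{x} - \hat{\mathbf{x}})^{\mathsf{T}} \right],
    \qquad
    P_{ij} = \operatorname{cov}(x_i, x_j).
    \]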
The probability of future resolution refers to the proposed system predicting where the tracked objects will be in the future and estimating the probability of each sensor in the network being able to resolve, that is, independently measure, the position of the tracked objects based on the individual sensor's capabilities. The resource costs may be quantified as the percentage of the total duty or occupancy limit that is expended within a resource period in order to detect the track. If multiple options exist for observing a track (e.g., multiple waveforms), the observation quality and resource costs are predicted for each option. In other examples, measurement quality and resource costs are predicted for more than one point in time.
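A sketch of processing block 66 under simplified assumptions: for each track and each observation option (e.g., waveform) the sensor predicts an observation quality figure and a resource cost expressed as a fraction of the duty/occupancy limit for the resource period. The covariance-shrinkage model, waveform names and numbers below are purely illustrative.

    import numpy as np

    def predict_option(track_cov, waveform_gain, dwell_time, duty_limit):
        """Predicted quality and cost for observing one track with one option.

        track_cov     : current 6x6 track error covariance
        waveform_gain : assumed factor by which the option shrinks the error
        dwell_time    : radar time consumed by the observation (s)
        duty_limit    : total radar time available in the resource period (s)"""
        predicted_cov = track_cov / waveform_gain      # crude update model (assumption)
        quality = float(np.trace(predicted_cov))       # smaller trace = better track
        resource_cost = dwell_time / duty_limit        # fraction of the occupancy limit
        return predicted_cov, quality, resource_cost

    # One prediction per (track, option) pair, matching the requirement that
    # quality and cost be predicted for each available option.
    options = {"wf_long": (4.0, 0.020), "wf_short": (1.5, 0.005)}   # hypothetical waveforms
    track_cov = np.diag([100.0, 25.0, 1e-4, 1e-5, 1e-4, 1e-5])
    predictions = {name: predict_option(track_cov, gain, dwell, duty_limit=1.0)
                   for name, (gain, dwell) in options.items()}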
Process 50 broadcasts the quality of track predictions to the other sensor systems (72). For example, the sensor system 12a broadcasts the quality of track predictions to the sensor systems 12b-12d through the network 14. Process 50 receives quality of track predictions from the other sensor systems (76). For example, the sensor system 12a receives the quality of track predictions from the sensor systems 12b-12d from the network 14. Process 50 determines a set of tasks (82). For example, based on the quality of track predictions received from the sensor systems 12b-12d and the quality of track predictions determined by the sensor system 12a, the sensor system 12a chooses a set of tasks (i.e., a plan) for each sensor system 12a-12d that minimizes a particular cost function (e.g., derived from priorities) while satisfying resource constraints. The set of tasks may include which tracks to observe. The set of tasks may also include what waveform to use to observe a track. In one embodiment, the other sensor systems 12b-12d have each also determined, separately using process 50, the same set of tasks as determined by the sensor system 12a.
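Stated abstractly, processing block 82 selects one observation option per track per sensor so that the cost function is minimized while no sensor exceeds 100% utilization. The greedy routine below is only a stand-in for that problem statement under an assumed data layout; the patent's own method for this step is the Tabu search of FIG. 5, described later.

    def choose_tasks(predictions, priorities, capacity=1.0):
        """Pick, for each (sensor, track), the cheapest option that keeps the
        sensor within its occupancy limit; a placeholder for the full search.

        predictions : dict keyed by (sensor_id, track_id) -> list of
                      (option_name, predicted_error, resource_cost) tuples
        priorities  : dict track_id -> weight used in the cost function"""
        usage = {}        # sensor_id -> accumulated utilization
        plan = {}         # (sensor_id, track_id) -> chosen option
        total_cost = 0.0
        # Visit the highest-priority tracks first.
        for (sensor, track), opts in sorted(
                predictions.items(), key=lambda kv: -priorities.get(kv[0][1], 0.0)):
            for name, err, cost in sorted(opts, key=lambda o: o[2]):
                if usage.get(sensor, 0.0) + cost <= capacity:
                    usage[sensor] = usage.get(sensor, 0.0) + cost
                    plan[(sensor, track)] = name
                    total_cost += priorities.get(track, 0.0) * err
                    break
        return plan, total_cost, usage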
Process 50 executes the set of tasks (84). For example, the sensor system 12a executes the portion of the set of tasks applicable to the sensor system 12a. In one embodiment, the other sensor systems 12b-12d each execute the portion of the set of tasks applicable to their respective sensor system.
Process 50 determines if the set of tasks is current (86). If the set of tasks is not current, process 50 predicts quality of the track observations (66). If process 50 determines that the set of tasks is current, process 50 determines if there is a new track (92). If process 50 determines there is a new track, process 50 compares track data to sensor specific track data (56). If process 50 determines there is no new track, process 50 determines if the set of tasks is current (86).
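Taken together, blocks 52-92 form a re-planning loop: the plan is executed until it is no longer current, and a new track sends processing back to the comparison step. A schematic rendering in Python, with placeholder method and callback names that are not part of the specification:

    import time

    def run_sensor(sensor, get_remote_tracks, plan_is_current, new_track_arrived):
        """Schematic of the process 50 loop for one sensor system (hypothetical API)."""
        sensor.associate(get_remote_tracks())             # blocks 52/56: receive and compare
        while True:
            predictions = sensor.predict_quality()        # block 66
            sensor.broadcast(predictions)                 # block 72
            others = sensor.receive_predictions()         # block 76
            tasks = sensor.determine_tasks(predictions, others)   # block 82
            sensor.execute(tasks)                         # block 84: own portion of the plan
            while plan_is_current(tasks):                 # block 86
                if new_track_arrived():                   # block 92
                    sensor.associate(get_remote_tracks()) # back to block 56
                    break
                time.sleep(0.1)                           # idle until re-planning is needed
            # leaving the inner loop returns control to block 66 (re-predict quality)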
Referring to FIG. 5, an example of a process to determine a set of tasks (82) is a process 100. In general, there may be many combinatorial optimization techniques available to determine the set of tasks. A solution as used herein is the set of tasks. In one example, process 100 is a heuristic process.
Exact solution techniques usually find the globally optimum solution (ideal sensor tasking) if allowed to run to completion. However, for task selection problems that are sufficiently large (e.g., many sensors, many targets) it is generally impractical to allow an exact algorithm to run to completion. Instead, it is advantageous to make use of heuristic solution techniques, which may not be guaranteed to find the globally optimum solution but are designed to find near-optimal solutions quickly. Detailed descriptions of many exact and heuristic solution techniques are available in the prior art. One particular heuristic solution technique, a "Tabu search", may be implemented to solve the sensor task selection problem. The Tabu search is an iterative search method that augments the local search algorithm performed by the sensor system by employing a strategy to find global rather than local (i.e., sensor system specific) optimum solutions.
Process 100 determines an initial solution (102). For example, the initial solution to the sensor task selection problem is generated by choosing a feasible solution (one that can be achieved without exceeding 100% utilization of any one sensor system 12a-12d) in a deterministic way. Specifically, for each sensor system 12a-12d, all available options within a given scheduling period are collected. The options are sorted within each scheduling period by the resource utilization consumed by each option. The option with the least resource utilization is selected for each scheduling period to generate the initial solution. The initial schedule generated will be the initial point in the algorithm from which to proceed and find the "best" solution. The initial sensor task selection will yield a low resource utilization. In other examples, an alternate way of selecting the initial solution would be to use a non-deterministic method of generating the initial schedule. This offers the advantage of starting-point diversity, which may improve solution quality if the cost function has a very large number of local minima over the solution space. An example of a non-deterministic way of selecting the initial solution is to randomly select the options for each scheduling period.
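A sketch of the deterministic initial-solution rule just described, plus the non-deterministic alternative. The layout of the option data (a list of (option_id, resource_utilization) tuples per scheduling period) is an assumption for this sketch.

    import random

    def initial_solution(schedule_options):
        """Pick, for every scheduling period, the option with the least resource
        utilization; schedule_options[p] is a list of (option_id, utilization)."""
        initial = []
        for period_options in schedule_options:
            ranked = sorted(range(len(period_options)),
                            key=lambda i: period_options[i][1])
            initial.append(ranked[0])    # least resource utilization in this period
        return initial

    def random_initial_solution(schedule_options, rng=random.Random()):
        """Non-deterministic alternative: a randomly chosen option per period."""
        return [rng.randrange(len(period_options)) for period_options in schedule_options]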
Process 100 generates a list of schedule options (112). The list of schedule options explored at each iteration is generated by considering the combinatorial neighborhood of the schedule options determined from the previous iteration. Specifically, the combinatorial neighborhood is defined as all options that may be obtained by changing a single option in the schedule by one step as described below. A step is defined as choosing the next higher or next lower resource utilization option.
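A sketch of that neighborhood, assuming each period's options are held in a list sorted by increasing resource utilization so that a "step" is simply moving one index up or down in a single period:

    def neighborhood(solution, schedule_options):
        """All schedules reachable by changing exactly one period's option by one step.

        solution         : chosen option index per scheduling period
        schedule_options : per-period option lists, each sorted by increasing
                           resource utilization (assumption for this sketch)"""
        neighbors = []
        for period, idx in enumerate(solution):
            for step in (-1, +1):          # next lower / next higher utilization option
                j = idx + step
                if 0 <= j < len(schedule_options[period]):
                    candidate = list(solution)
                    candidate[period] = j
                    neighbors.append(candidate)
        return neighbors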
Referring to FIG. 6, a matrix 200 includes a first period schedule 202, a set of second period schedule options (e.g., a second period schedule option 206a, a second period schedule option 206b, a second period schedule option 206c, ..., and a second period schedule option 206M), a set of third period schedule options (e.g., a third period schedule option 212a, a third period schedule option 212b, a third period schedule option 212c, ..., and a third period schedule option 212M) and a set of Nth period schedule options (e.g., an Nth period schedule option 218a, an Nth period schedule option 218b, an Nth period schedule option 218c, ..., and an Nth period schedule option 218M). N represents the number of scheduling periods and M represents the number of schedule options (e.g., utilization choices per period). In FIG. 6, the assumption is that there are M schedule options per period.
However, in other examples, each period may have a different number of schedule options. The schedule options 206a-206M, 212a-212M, 218a-218M represent possible resource utilizations for the corresponding scheduling period. The first period schedule 202 represents the set of tasks to be performed in the first period as determined in processing block 102. Process 100 determines the best period schedule option for each subsequent time period. For example, in the second time period, the second period schedule options 206a-206M are selected one at a time to determine the best schedule option. In one example, the dashed line path represents a "next step" in the schedule option selection. In one iteration, the second period schedule option 206b is chosen. In the next iteration the second period schedule option 206c is chosen. In every iteration, only one scheduling option is changed to pick the utilization closest to the "best" solution.
For a given schedule option, the neighboring schedule options represent either lower or higher resource utilization options. For example, the second period schedule option 206c has a lower resource utilization than the second period schedule option 206b, while the second period schedule option 206a has a higher resource utilization than the second period schedule option 206b.
In one example, choosing schedule options along a solid line path 222 yields the "best" set of schedule options, and the best solution would include the set of best schedule options. For example, the best solution includes the best schedule options such as the second period schedule option 206b, the third period schedule option 212c and the Nth period schedule option 218a. Process 100 evaluates options (116). For example, each candidate neighborhood schedule option is evaluated by computing the future aggregate track error for the particular combination of sensor tasks, using the observation quality predictions made by each sensor. In addition, the feasibility of the schedule options is evaluated by computing the resource usage for the particular combination of sensor tasks.
Process 100 determines a best admissible solution (122). A solution is the best admissible solution once it becomes the "best" neighboring schedule option. In one example, a "best" neighboring schedule option yields lower cost and is feasible (e.g., does not exceed 100% utilization of any one sensor). Process 100 determines if stopping conditions are met (126). For example, iterations are halted at a fixed number of iterations after a locally optimum solution is encountered. In other examples, substantially more sophisticated stopping conditions may be formulated that take into account details of the problem in addition to the computing time available before a solution is required. If the stopping conditions are not met, process 100 updates the conditions (134).
For example, the conditions may include neighboring schedule options which have already been selected by the algorithm, so that those neighboring schedule options are not visited in future iterations of the algorithm. The algorithm is designed to search for the global minimum and not stop at local minima. In order to accomplish this, a higher-cost subsequent schedule option may be selected as the best neighboring schedule option as long as it is feasible to execute.
If stopping conditions are met, process 100 uses a final solution (144). For example, process 100 uses the final schedule options. In one example, if subsequent schedule options become infeasible, the current schedule options are declared the global minima, i.e., the solution to the search.
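One way to tie processing blocks 112-144 together is the loop below: move at every iteration to the best feasible neighbor that has not been visited (even if it is worse than the current schedule, so that local minima can be escaped), and stop a fixed number of iterations after the last improvement or when no feasible move remains. This is a generic Tabu-style sketch under the same assumed data layout as above, not the patent's exact formulation.

    def tabu_search(initial, neighborhood_fn, evaluate_fn, max_stall=20):
        """Iterative improvement with a tabu list of visited schedules.

        neighborhood_fn(sol) -> candidate schedules (block 112)
        evaluate_fn(sol)     -> (cost, feasible)     (block 116)"""
        best = current = initial
        best_cost = evaluate_fn(initial)[0]
        tabu = {tuple(initial)}              # block 134: schedules not to revisit
        stall = 0
        while stall < max_stall:             # block 126: stopping condition
            admissible = [(evaluate_fn(n)[0], n) for n in neighborhood_fn(current)
                          if tuple(n) not in tabu and evaluate_fn(n)[1]]
            if not admissible:               # no feasible moves left (block 144)
                break
            current_cost, current = min(admissible, key=lambda cn: cn[0])
            tabu.add(tuple(current))
            if current_cost < best_cost:     # block 122: best admissible solution
                best, best_cost, stall = current, current_cost, 0
            else:
                stall += 1                   # uphill move allowed to escape local minima
        return best, best_cost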
Process 50 is not limited to use with the hardware and software of FIG. 2; it may find applicability in any computing or processing environment and with any type of machine or set of machines that is capable of running a computer program. Process 50 may be implemented in hardware, software, or a combination of the two. Process 50 may be implemented in computer programs executed on programmable computers/machines that each includes a processor, a storage medium or other article of manufacture that is readable by the processor (including volatile and non-volatile memory and/or storage elements), at least one input device, and one or more output devices. Program code may be applied to data entered using an input device to perform process 50 and to generate output information.
The system may be implemented, at least in part, via a computer program product, (i.e., a computer program tangibly embodied in an information carrier (e.g., in a machine- readable storage device or in a propagated signal)), for execution by, or to control the operation of, data processing apparatus (e.g., a programmable processor, a computer, or multiple computers)). Each such program may be implemented in a high level procedural or object-oriented programming language to communicate with a computer system. However, the programs may be implemented in assembly or machine language. The language may be a compiled or an interpreted language and it may be deployed in any form, including as a stand-alone program or as a module, component, subroutine, or other unit suitable for use in a computing environment. A computer program may be deployed to be executed on one computer or on multiple computers at one site or distributed across multiple sites and interconnected by a communication network. A computer program may be stored on a storage medium or device (e.g., CD-ROM, hard disk, or magnetic diskette) that is readable by a general or special purpose programmable computer for configuring and operating the computer when the storage medium or device is read by the computer to perform process 50. Process 50 may also be implemented as a machine- readable storage medium, configured with a computer program, where upon execution, instructions in the computer program cause the computer to operate in accordance with process 50.
The processes described herein are not limited to the specific embodiments described herein. For example, the processes 50 and 100 are not limited to the specific processing order of FIGS. 3 and 5, respectively. Rather, any of the processing blocks of FIGS. 3 and 5 may be re-ordered, combined or removed, performed in parallel or in serial, as necessary, to achieve the results set forth above.
In other embodiments, one or more of the sensor systems may perform different types of processes to determine the set of tasks than the other sensor systems in the network. In these embodiments, the set of tasks determined by one or more of the sensors is substantially the same as the set of tasks determined by the other sensor systems within an accepted tolerance difference.
In one example, in determining the set of tasks in block 82 (FIG. 3), an embedded resource usage benefit calculation based on the broadcasted track predictions of each sensor system is sent to every other sensor system across the network. The embedded resource usage benefit calculation is performed for each task to reduce the total number of paths through the schedule options by retaining only those tasks which significantly improve the particular cost function. In executing the sensor resource utilization process, it is possible, based on the particulars of the multiple sensor system locations and each sensor system's individual capabilities, that the embedded resource usage benefit calculation can eliminate the need for a search of a large number of schedule options, leaving a single possible feasible solution.
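A sketch of that pruning idea: a benefit figure is computed for every candidate task from the broadcast predictions, and only tasks whose benefit clears a threshold are retained as schedule options. The benefit measure used here (cost-function improvement per unit of resource) and the field names are illustrative assumptions.

    def prune_tasks(tasks, benefit_threshold):
        """Keep only tasks that significantly improve the cost function.

        tasks : list of dicts with hypothetical keys 'id', 'error_reduction'
                (cost-function improvement from performing the task) and
                'resource_cost' (fraction of the occupancy limit consumed)."""
        kept = []
        for task in tasks:
            benefit = task["error_reduction"] / max(task["resource_cost"], 1e-9)
            if benefit >= benefit_threshold:
                kept.append(task)
        # If pruning leaves only one feasible combination, no search is needed.
        return kept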
The processing blocks in FIGS. 3 and 5 associated with implementing the system may be performed by one or more programmable processors executing one or more computer programs to perform the functions of the system. All or part of the system may be implemented as special purpose logic circuitry (e.g., an FPGA (field programmable gate array) and/or an ASIC (application-specific integrated circuit)).
Processors suitable for the execution of a computer program include, by way of example, both general and special purpose microprocessors, and any one or more processors of any kind of digital computer. Generally, a processor will receive instructions and data from a read-only memory or a random access memory or both. Elements of a computer include a processor for executing instructions and one or more memory devices for storing instructions and data.
Elements of different embodiments described herein may be combined to form other embodiments not specifically set forth above. Other embodiments not specifically described herein are also within the scope of the following claims. What is claimed is:

Claims

1. A method of multiple sensor processing comprising: receiving, at a first sensor system, track data from a second sensor system; comparing track data from the first sensor system to the track data from the second sensor system to determine if a track will be within a field of view of the first sensor system during a time period; determining, at a first sensor system, predicted quality of tracks based on the track data; broadcasting the predicted quality of tracks; receiving predicted quality of tracks from the second sensor system; and determining a first set of tasks based on the predicted quality of tracks determined by the first sensor system and the predicted quality of tracks received from the second sensor system.
2. The method of claim 1, further comprising: receiving, at the second sensor system, track data from a first sensor system; comparing track data from the second sensor system to the track data from the first sensor system to determine if a track will be within a field of view of the second sensor system during the time period; receiving predicted quality of tracks from the first sensor system; and determining a second set of tasks based on the predicted quality of tracks determined by the second sensor system and the predicted quality of tracks received from the first sensor system.
3. The method of claim 2 wherein the first set of tasks and the second set of tasks are substantially equivalent.
4. The method of claim 1 wherein determining the first set of tasks comprises determining a set of tasks performed by each sensor system.
5. The method of claim 1 wherein determining the first set of tasks comprises determining a first set of tasks using a tabu algorithm.
6. The method of claim 1 wherein determining a first set of tasks comprises determining utilization of resources.
7. The method of claim 1 wherein determining a first set of tasks comprises determining an embedded resource usage benefit calculation based on the broadcasted track predictions.
8. The method of claim 1 wherein determining the predicted quality of tracks comprises determining quality of track observations of the tracks.
9. The method of claim 8 wherein determining the quality of track observations of the tracks comprises determining kinematic measurement variances of the tracks.
10. The method of claim 1 wherein determining the predicted quality of tracks comprises determining resource costs for the tracks.
11. An article comprising a machine-readable medium that stores executable instructions used in multiple sensor processing, the instructions causing a machine to: receive, at a first sensor system, track data from a second sensor system; compare track data from the first sensor system to the track data from the second sensor system to determine if a track will be within a field of view of the first sensor system during a time period; determine, at a first sensor system, predicted quality of tracks based on the track data; broadcast the predicted quality of tracks; receive predicted quality of tracks from the second sensor system; and determine a first set of tasks based on the predicted quality of tracks determined by the first sensor system and the predicted quality of tracks received from the second sensor system.
12. The article of claim 11, further comprising instructions causing the machine to: receive, at the second sensor system, track data from a first sensor system; compare track data from the second sensor system to the track data from the first sensor system to determine if a track will be within a field of view of the second sensor system during the time period; receive predicted quality of tracks from the first sensor system; and determine a second set of tasks based on the predicted quality of tracks determined by the second sensor system and the predicted quality of tracks received from the first sensor system.
13. The article of claim 12 wherein the first set of tasks and the second set of tasks are substantially equivalent.
14. The article of claim 11 wherein the instructions causing the machine to determine the first set of tasks comprises instructions causing a machine to determine a set of tasks performed by each sensor system.
15. The article of claim 11 wherein instructions causing the machine to determine the first set of tasks comprises instructions causing the machine to determine a first set of tasks using a tabu algorithm.
16. An apparatus used in multiple sensor processing, comprising: circuitry to: receive, at a first sensor system, track data from a second sensor system; compare track data from the first sensor system to the track data from the second sensor system to determine if a track will be within a field of view of the first sensor system during a time period; determine, at a first sensor system, predicted quality of tracks based on the track data; broadcast the predicted quality of tracks; receive predicted quality of tracks from the second sensor system; and determine a first set of tasks based on the predicted quality of tracks determined by the first sensor system and the predicted quality of tracks received from the second sensor system.
17. The apparatus of claim 16 wherein the circuitry comprises at least one of a processor, a memory, programmable logic and logic gates.
18. The apparatus of claim 16, further comprising circuitry to: receive, at the second sensor system, track data from a first sensor system; compare track data from the second sensor system to the track data from the first sensor system to determine if a track will be within a field of view of the second sensor system during the time period; receive predicted quality of tracks from the first sensor system; and determine a second set of tasks based on the predicted quality of tracks determined by the second sensor system and the predicted quality of tracks received from the first sensor system.
19. The apparatus of claim 18 wherein the first set of tasks and the second set of tasks are substantially equivalent.
20. The apparatus of claim 16 wherein the circuitry to determine the first set of tasks comprises circuitry to determine a set of tasks performed by each sensor system.
21. The apparatus of claim 16 wherein circuitry to determine the first set of tasks comprises circuitry to determine a first set of tasks using a tabu algorithm.
PCT/US2007/086590 2006-12-20 2007-12-06 Multiple sensor processing WO2008133741A2 (en)

Priority Applications (3)

Application Number Priority Date Filing Date Title
CA2673312A CA2673312C (en) 2006-12-20 2007-12-06 Multiple sensor processing
JP2009543038A JP5378229B2 (en) 2006-12-20 2007-12-06 Multi-sensor processing
EP07874345A EP2122384B1 (en) 2006-12-20 2007-12-06 Multiple sensor processing

Applications Claiming Priority (4)

Application Number Priority Date Filing Date Title
US87092306P 2006-12-20 2006-12-20
US60/870,923 2006-12-20
US11/941,402 2007-11-16
US11/941,402 US7508335B2 (en) 2006-12-20 2007-11-16 Multiple sensor processing

Publications (2)

Publication Number Publication Date
WO2008133741A2 true WO2008133741A2 (en) 2008-11-06
WO2008133741A3 WO2008133741A3 (en) 2008-12-24

Family

ID=39542024

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/US2007/086590 WO2008133741A2 (en) 2006-12-20 2007-12-06 Multiple sensor processing

Country Status (5)

Country Link
US (1) US7508335B2 (en)
EP (1) EP2122384B1 (en)
JP (1) JP5378229B2 (en)
CA (1) CA2673312C (en)
WO (1) WO2008133741A2 (en)

Families Citing this family (16)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20100013935A1 (en) * 2006-06-14 2010-01-21 Honeywell International Inc. Multiple target tracking system incorporating merge, split and reacquisition hypotheses
US7508335B2 (en) * 2006-12-20 2009-03-24 Raytheon Company Multiple sensor processing
JP4831103B2 (en) * 2008-03-28 2011-12-07 三菱電機株式会社 Radar equipment
IT1401374B1 (en) * 2010-08-09 2013-07-18 Selex Sistemi Integrati Spa THREE-DIMENSIONAL MULTISENSOR TRACKING BASED ON TWO-DIMENSIONAL TRACKS ACQUIRED BY TARGET SENSOR TRACKERS
US8718323B2 (en) 2010-10-19 2014-05-06 Raytheon Company Batch detection association for enhanced target descrimination in dense detection environments
AU2011380291B2 (en) * 2011-10-31 2015-03-12 Fraunhofer-Gesellschaft Zur Foerderung Der Angewandten Forschung E.V. Apparatus and method for analyzing sensor data
JP5991806B2 (en) * 2011-11-10 2016-09-14 三菱電機株式会社 Wake integration apparatus, wake integration system, computer program, and wake integration method
JP6202850B2 (en) * 2013-03-28 2017-09-27 三菱電機株式会社 Target tracking device
US9489195B2 (en) 2013-07-16 2016-11-08 Raytheon Company Method and apparatus for configuring control software for radar systems having different hardware architectures and related software products
US9557406B2 (en) * 2013-07-16 2017-01-31 Raytheon Command And Control Solutions Llc Method, system, and software for supporting multiple radar mission types
US9594160B2 (en) * 2014-02-25 2017-03-14 The Mitre Corporation Track associator
US10101196B2 (en) 2016-02-17 2018-10-16 Qualcomm Incorporated Device for UAV detection and identification
WO2018063713A1 (en) * 2016-09-30 2018-04-05 Qualcomm Incorporated Device for uav detection and identification
US10466346B2 (en) * 2017-11-17 2019-11-05 Gm Global Technology Operations, Llc Method and apparatus for continuous tracking in a multi-radar system
WO2019140185A1 (en) * 2018-01-11 2019-07-18 Shemirade Management Llc Architecture for vehicle automation and fail operational automation
JP7337008B2 (en) * 2020-03-10 2023-09-01 三菱電機株式会社 radar equipment

Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US5138321A (en) * 1991-10-15 1992-08-11 International Business Machines Corporation Method for distributed data association and multi-target tracking
WO2001009641A1 (en) * 1999-07-30 2001-02-08 Litton Systems, Inc. Registration method for multiple sensor radar
JP2002014162A (en) * 2000-06-30 2002-01-18 Nec Eng Ltd Method for calculating most probable positions of plurality of radars
EP1533628A1 (en) * 2003-11-19 2005-05-25 Saab Ab A method for correlating and numbering target tracks from multiple sources
WO2006097771A1 (en) * 2005-03-17 2006-09-21 Bae Systems Plc Networks
US20060220951A1 (en) * 2005-04-04 2006-10-05 Raytheon Company System and method for coherently combining a plurality of radars

Family Cites Families (12)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JPS6435391A (en) * 1987-07-31 1989-02-06 Hitachi Ltd Automatic track information correlation adjustment system for multiple radar system
JPH02141687A (en) * 1988-11-22 1990-05-31 Nec Corp Automatic tracking system of ship-bone radar
US5696503A (en) * 1993-07-23 1997-12-09 Condition Monitoring Systems, Inc. Wide area traffic surveillance using a multisensor tracking system
US5961571A (en) * 1994-12-27 1999-10-05 Siemens Corporated Research, Inc Method and apparatus for automatically tracking the location of vehicles
US6415188B1 (en) * 1998-12-23 2002-07-02 Dennis Sunga Fernandez Method and apparatus for multi-sensor processing
JP3469151B2 (en) * 1999-12-17 2003-11-25 三菱電機株式会社 Operation method of multiple radar cooperation system
JP3642287B2 (en) * 2001-03-16 2005-04-27 三菱電機株式会社 Radar system and radar apparatus
JP4771657B2 (en) * 2001-08-16 2011-09-14 ヴァレオ・レイダー・システムズ・インコーポレーテッド Proximity object detection system
US6885303B2 (en) * 2002-02-15 2005-04-26 Hrl Laboratories, Llc Motion prediction within an amorphous sensor array
JP4693633B2 (en) * 2006-01-11 2011-06-01 三菱電機株式会社 Target tracking device
US7508335B2 (en) * 2006-12-20 2009-03-24 Raytheon Company Multiple sensor processing
US7944981B2 (en) * 2007-05-31 2011-05-17 Motorola Mobility, Inc. Data transmission in a frequency division multiple access communication system

Patent Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US5138321A (en) * 1991-10-15 1992-08-11 International Business Machines Corporation Method for distributed data association and multi-target tracking
WO2001009641A1 (en) * 1999-07-30 2001-02-08 Litton Systems, Inc. Registration method for multiple sensor radar
JP2002014162A (en) * 2000-06-30 2002-01-18 Nec Eng Ltd Method for calculating most probable positions of plurality of radars
EP1533628A1 (en) * 2003-11-19 2005-05-25 Saab Ab A method for correlating and numbering target tracks from multiple sources
WO2006097771A1 (en) * 2005-03-17 2006-09-21 Bae Systems Plc Networks
US20060220951A1 (en) * 2005-04-04 2006-10-05 Raytheon Company System and method for coherently combining a plurality of radars

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
BLACKMAN S ET AL: "MULTIPLE SENSOR TRACKING: ISSUES AND METHODS" DESIGN AND ANALYSIS OF MODERN TRACKING SYSTEMS, XX, XX, 1 January 1999 (1999-01-01), pages 595-659, XP002277517 *

Also Published As

Publication number Publication date
CA2673312C (en) 2016-03-15
WO2008133741A3 (en) 2008-12-24
EP2122384B1 (en) 2012-10-24
US20080150787A1 (en) 2008-06-26
JP5378229B2 (en) 2013-12-25
EP2122384A2 (en) 2009-11-25
US7508335B2 (en) 2009-03-24
CA2673312A1 (en) 2008-11-06
JP2010513932A (en) 2010-04-30

Similar Documents

Publication Publication Date Title
EP2122384B1 (en) Multiple sensor processing
Moo et al. Adaptive radar resource management
Yan et al. Prior knowledge-based simultaneous multibeam power allocation algorithm for cognitive multiple targets tracking in clutter
Jimenez et al. Design of task scheduling process for a multifunction radar
US8860603B2 (en) Method for optimizing the management of radar time for secondary radars operating in modes
WO2007144570A1 (en) Improvements relating to target tracking
Narykov et al. Algorithm for resource management of multiple phased array radars for target tracking
Bozdogan et al. Improved assignment with ant colony optimization for multi-target tracking
JP3339295B2 (en) Sensor group management device
JP2001051051A (en) Radar controller
Byrne et al. Rolling horizon non-myopic scheduling of multifunction radar for search and track
Krout et al. Intelligent ping sequencing for multistatic sonar systems
Ding et al. Benefits of target prioritization for phased array radar resource management
Hanselman et al. Dynamic tactical targeting
Focke et al. Interval Algebra–An effective means of scheduling surveillance radar networks
Charlish et al. Sensor management for radar networks
Boers et al. Adaptive MFR parameter control: fixed against variable probabilities of detection
Xun et al. Control based sensor management for a multiple radar monitoring scenario
Tao et al. An optimal algorithm of time resource for multi-target tracking under active oppressive jamming
Blackman et al. Improved tracking capability and efficient radar allocation through the fusion of radar and infrared search-and-track observations
Focke et al. Implementing Interval Algebra to schedule mechanically scanned multistatic radars
Sanli et al. Joint coverage scheduling and identity management for multiple-target tracking in wireless sensor networks
Jang et al. Heuristic pulse interleaving algorithms for multi-target tracking on pulse doppler phased array radars
JP2001296116A (en) Control device for plural sensors
JP5768436B2 (en) Beam management apparatus, radar apparatus including the same, beam management method and program

Legal Events

Date Code Title Description
WWE  WIPO information: entry into national phase
     Ref document number: 2673312
     Country of ref document: CA

ENP  Entry into the national phase
     Ref document number: 2009543038
     Country of ref document: JP
     Kind code of ref document: A

NENP Non-entry into the national phase
     Ref country code: DE

WWE  WIPO information: entry into national phase
     Ref document number: 2007874345
     Country of ref document: EP

121  Ep: the epo has been informed by wipo that ep was designated in this application
     Ref document number: 07874345
     Country of ref document: EP
     Kind code of ref document: A2