US8719317B1 - Structured data conversion system - Google Patents

Structured data conversion system

Info

Publication number
US8719317B1
Authority
US
United States
Prior art keywords
sensor
data
standard
file
computer program
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active, expires
Application number
US12/930,910
Inventor
Louis M. Crain
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
NAVY USA AS REPRESENTED BY DEPARTMENT OF
US Department of Navy
Original Assignee
US Department of Navy
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by US Department of Navy
Priority to US12/930,910
Assigned to NAVY, U.S.A. AS REPRESENTED BY THE DEPARTMENT OF THE (assignment of assignors interest; see document for details). Assignors: CRAIN, LOUIS M.
Priority to US12/932,342 (US8756259B1)
Application granted
Publication of US8719317B1
Legal status: Active
Adjusted expiration

Classifications

    • H: ELECTRICITY
    • H04: ELECTRIC COMMUNICATION TECHNIQUE
    • H04L: TRANSMISSION OF DIGITAL INFORMATION, e.g. TELEGRAPHIC COMMUNICATION
    • H04L 67/00: Network arrangements or protocols for supporting network services or applications
    • H04L 67/01: Protocols
    • H04L 67/12: Protocols specially adapted for proprietary or special-purpose networking environments, e.g. medical networks, sensor networks, networks in vehicles or remote metering networks

Definitions

  • the value is in seconds before (negative) or after (positive) START_TIME.
  • a zero value places the reference time at the imaging start time, such as would be the case for sensors modeled with instantaneous collections, where START_TIME equals END_TIME.
  • This field shall not be defaulted if either REFERENCE_PIXEL_ROW or REFERENCE_PIXEL_COLUMN (index 05b or 05c) is defaulted.
  • 05b Reference Row 0 0 0 0 0 Reference Pixel Row and Column of Applicability. Reference row and column pixel index relative to the NITF image segment at which the field values in this TRE apply.
  • each button represents whether that module is Required, Conditional, a One or Other module, or an Optional module.
  • the Modules Form 60 B shows the modules required for a Content Level 2 SENSRB.
  • the ten icons to the right of the SENSRB View icon represent operations on various representations of the data.
  • a portion of a sample Name View is shown below in Table III.
  • the user selects the drop down box 46 B below the Data View icon 36 B ( FIG. 3 ) to select the representation desired.
  • the dominant view is the Name View and generally the other views are derived from it.
  • that view can be converted into a Name View by using generic Field Names (the names used in the documentation but not precisely the names used in operation, for example “Latitude” for Geodetic Type and “Geocentric X” for Cartesian Type). Future work includes adding other data views to the system.
  • the erase icons 44 B are good ways to get a fresh start in creating a SENSRB file from scratch when the user needs to start over for some reason. Selecting the erase icons 44 B will delete all ModuleForms and all associated InputForms.
  • the erase icon under the Data View will erase the particular Data View currently being viewed. For example, the user can erase the XML View and go back to the Name View and make changes. By again selecting the XML View, the changes the user has made will be reflected in the XML View.
  • the text window 50 B is always active.
  • the user can edit the file, including insertions, deletions, copy, paste, and selecting all. For example, to copy a Data View into a word processing program, the user touches the Data View text window, enters commands specific to the particular word processing program, then actuates the word processing program, and pastes the selected item.
  • a change made in the text window 50 B of any active component becomes a part of that data and any use of that data will reflect those changes. For example, when the user makes changes to a Name View table, and then “Convert” into SENSRB, the new SENSRB will contain those changes.
  • FIG. 7A illustrates a screen shot of a SENSRB attitude input form 90 in Euler angles, according to embodiments of the invention.
  • FIG. 8 illustrates a screen shot of a SENSRB attitude input form 110 in quaternions, according to embodiments of the invention.
  • the user may open another attitude module and view that same information in another format.
  • the form is pre-populated with the values appropriate for those forms consistent with the first defined form.
  • a flag field 92 , 102 , and 112 is “N” and all of the new fields are gray (undefined).
  • the SENSRB TRE can include multiple attitude modules, thus making it permissible to change the flag field 92 , 102 , and 112 to “Y” and select a default icon 94 , 104 , and 114 and accept the pre-populated values in those fields.
  • FIG. 9 illustrates a screen shot of a SENSRB inconsistent attitude representation 120 , according to embodiments of the invention. This occurs where two or more attitude modules could be defined that are inconsistent representations of the attitude (a Python sketch of such a consistency test follows this list).
  • the sign is changed on one of the terms in the Quaternion Module. The user sees the FIG. 9 screen when attempting to save that module.
  • a consistency window 130 has appeared indicating that, in this case, there are two inconsistent Attitude Modules.
  • the buttons in the window provide a variety of means to address the problem.
  • FIG. 10A illustrates a screen shot of a SENSRB consistency window 130 notification, according to embodiments of the invention.
  • the user sees an Euler icon 132 and a Quaternion icon 134 .
  • the user will select either the Euler icon 132 or the Quaternion icon 134 to indicate which definition should prevail.
  • the consistency window 130 includes two leave inconsistent icons 136 and 138 and two deactivate icons 140 and 142 .
  • the user could elect to leave the two modules inconsistent in which case both modules will be eventually written to the SENSRB file.
  • the deactivate icons cause the flag field 92 , 102 , or 112 in FIGS. 7A , 7 B, or 8 of the corresponding Module to be set to “N”.
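
By way of illustration only, and not as part of the original disclosure, the consistency test behind FIGS. 9 and 10A can be sketched in Python. The yaw-pitch-roll convention, the function names, and the tolerance below are assumptions for illustration:

    import math

    def ypr_to_quaternion(yaw, pitch, roll):
        # Convert yaw-pitch-roll Euler angles (radians, Z-Y-X rotation order,
        # an assumed convention) to a unit quaternion (w, x, y, z).
        cy, sy = math.cos(yaw / 2), math.sin(yaw / 2)
        cp, sp = math.cos(pitch / 2), math.sin(pitch / 2)
        cr, sr = math.cos(roll / 2), math.sin(roll / 2)
        return (cr * cp * cy + sr * sp * sy,
                sr * cp * cy - cr * sp * sy,
                cr * sp * cy + sr * cp * sy,
                cr * cp * sy - sr * sp * cy)

    def attitudes_consistent(euler, quat, tol=1e-9):
        # Two attitude modules agree when they encode the same rotation;
        # q and -q represent the same rotation, so compare up to sign.
        q = ypr_to_quaternion(*euler)
        return (all(abs(a - b) < tol for a, b in zip(q, quat)) or
                all(abs(a + b) < tol for a, b in zip(q, quat)))

    # Flipping the sign of a single quaternion term, as in FIG. 9, is detected:
    q = ypr_to_quaternion(0.1, 0.2, 0.3)
    print(attitudes_consistent((0.1, 0.2, 0.3), q))                          # True
    print(attitudes_consistent((0.1, 0.2, 0.3), (q[0], -q[1], q[2], q[3])))  # False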

Abstract

Embodiments of the invention relate to converting infrared, radar, and photographic sensor data using an apparatus having a graphical user interface, a user-selection device, and an electronic processor configured to convert sensor data. A sensor data conversion computer program (abbreviated “computer program”) is associated with the electronic processor. A sensor standard definition file is consistent with a sensor standard (abbreviated “the standard”) and is configured for input to the computer program. The sensor standard definition file is a programmable external file defining specifics of the standard (elements used to communicate sensor data in the computer program). The sensor standard definition file is editable and extendable to support changes in the standard. A sensor standard rules file, sensor external standard file, and sensor external data file are consistent with the standard and are operatively associated with the computer program.

Description

STATEMENT REGARDING FEDERALLY SPONSORED RESEARCH OR DEVELOPMENT
The invention described herein may be manufactured and used by or for the government of the United States of America for governmental purposes without the payment of any royalties thereon or therefor.
FIELD OF THE INVENTION
The invention generally relates to data conversions, and more particularly, to a structured data conversion system consistent with at least one standard.
BRIEF DESCRIPTION OF THE DRAWINGS
FIG. 1 illustrates a schematic block diagram of a Structured Data Conversion system, according to embodiments of the invention.
FIG. 2 illustrates a screen shot of a Graphical User Interface of a Structured Data Conversion System, according to embodiments of the invention.
FIG. 3 illustrates a screen shot of a Graphical User Interface of a SENSRB Data Conversion System, according to embodiments of the invention.
FIG. 4 illustrates an image of the hierarchical structure of a SENSRB Data Conversion System, according to embodiments of the invention.
FIG. 5 illustrates a screen shot of a SENSRB Modules Form, according to embodiments of the invention.
FIG. 6 illustrates a screen shot of an input form, according to embodiments of the invention.
FIG. 7A illustrates a screen shot of a SENSRB attitude input form in Euler angles, according to embodiments of the invention.
FIG. 7B illustrates a screen shot of a SENSRB attitude input form in unit vectors, according to embodiments of the invention.
FIG. 8 illustrates a screen shot of a SENSRB attitude input form in quaternions, according to embodiments of the invention.
FIG. 9 illustrates a screen shot of a SENSRB inconsistent attitude representation, according to embodiments of the invention.
FIG. 10A illustrates a screen shot of a SENSRB consistency window notification, according to embodiments of the invention.
FIG. 10B illustrates a screen shot of a SENSRB consistency window after recalculating to eliminate inconsistencies, according to embodiments of the invention.
FIG. 11 illustrates a screen shot of a SENSRB module 13 for pixel referenced sets, according to embodiments of the invention.
It is to be understood that the foregoing general description and the following detailed description are exemplary and explanatory only and are not to be viewed as being restrictive of the invention, as claimed. Further advantages of this invention will be apparent after a review of the following detailed description of the disclosed embodiments, which are illustrated schematically in the accompanying drawings and in the appended claims.
DETAILED DESCRIPTION OF EMBODIMENTS OF THE INVENTION
The invention generally relates to data conversion, and more particularly, to a structured data conversion system consistent with at least one standard.
Typically, information standards, including SENSRB, are released on a stand-alone basis. SENSRB is one such standard, specifying structured data for sensors. Users are expected to read the standard, which is generally quite complex, and formulate a means to implement it in support of their activity, either on the creation side (the makers of sensors, in the case of SENSRB) or on the receiving side (those looking to exploit the information for subsequent processes, including targeting). Providing an easy-to-use, functional, intuitive application concurrent with the release of the standard helps simplify the task of understanding and employing the standard for all users. For these reasons, a structured data conversion system is desirable.
Although embodiments of the invention are described in considerable detail, including references to certain versions thereof, other versions are possible. Examples of other versions include performing the steps in an alternate sequence or hosting a program on a different platform. Therefore, the spirit and scope of the appended claims should not be limited to the description of versions included herein.
Embodiments of the invention may take the form of a computer program product on a computer-usable storage medium having computer-usable/readable program instructions embodied in the medium. Any suitable computer readable medium may be utilized, including computer readable storage mediums, such as, for example, hard disks, CD-ROMs, optical storage devices, or magnetic storage devices, or transmission media, such as, for example, those supporting the Internet or an intranet.
Computer-usable/readable program instructions for carrying out operations of embodiments of the invention may be written in an object oriented programming language such as, for example, Python, Visual Basic, or C++. However, computer-usable/readable program instructions for carrying out operations of embodiments of the invention may also be written in conventional procedural programming languages, such as, for example, the “C” programming language. The computer-usable/readable program instructions may execute entirely on the user's computer, partly on the user's computer as a stand-alone software package, partly on the user's computer and partly on a remote computer, or entirely on the remote computer. In the latter scenario, the remote computer may be connected to the user's computer through a local area network (LAN) or a wide area network (WAN), or the connection may be made to an external computer (for example, through the Internet using an Internet Service Provider or any other method known in the art).
Embodiments of the invention are described in part below with reference to flow chart illustrations and/or block diagrams of methods and computer program products according to embodiments of the invention. It will be understood that each block of the flow chart illustrations and/or block diagrams, and combinations of blocks in the flow chart illustrations and/or block diagrams, can be implemented by computer program instructions. These computer program instructions may be provided to a processor of a general purpose computer, special purpose computer, or other programmable data processing apparatus to produce a machine, such that the instructions, which execute via the processor of the computer or other programmable data processing apparatus, create means for implementing the functions/acts specified in the flow chart and/or block diagram block or blocks.
These computer program instructions may also be stored in a computer-readable memory, including RAM, that can direct a computer or other programmable data processing apparatus to function in a particular manner, such that the instructions stored in the computer-readable memory produce an article of manufacture including instruction means that implement the function/act specified in the flow chart and/or block diagram block or blocks.
These computer program instructions may also be loaded onto a computer or other programmable data processing apparatus to cause a series of operational steps to be performed on the computer or other programmable apparatus to produce a computer implemented process such that the instructions that execute on the computer or other programmable apparatus provide steps for implementing the functions/acts specified in the flow chart and/or block diagram block or blocks.
Referring to the accompanying drawings in which like reference numbers indicate like elements, FIG. 1 illustrates a first aspect of a structured data conversion system. Reference character 10 generally indicates an apparatus of embodiments of the invention.
Embodiments of the invention generally relate to a data conversion system associated with at least one standard including: a device 12 adaptable to convert data and output converted data in a tangible medium, at least one standard definition file 14 having instructions consistent with at least one standard, where the at least one standard definition file is operatively associated with the device, at least one standard rules file 16 having instructions consistent with the at least one standard, where the at least one standard rules file is operatively associated with the device, at least one external standard file 18 consistent with the at least one standard, where the at least one external standard file is operatively associated with the device, and at least one external data file 20 consistent with the at least one standard, where the at least one external data file is operatively associated with the device. Both the at least one external standard file 18 and the at least one external data file 20 are input and output files, hence the reason for dual arrowed connectors with the device 12 in FIG. 1. The at least one standard rules file 16 is sometimes referred to as a rules external file without detracting from the merits of the invention.
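As a minimal sketch only (the patent discloses no source code; the Python class and attribute names below are illustrative assumptions), the FIG. 1 arrangement can be modeled as a device holding references to the four standard-consistent files:

    from dataclasses import dataclass
    from pathlib import Path

    @dataclass
    class StructuredDataConversionSystem:
        # Apparatus 10: the conversion device and its four associated files.
        standard_definition_file: Path  # 14: defines the elements of the standard
        standard_rules_file: Path       # 16: standard-specific consistency rules
        external_standard_file: Path    # 18: metadata in the standard's format (input and output)
        external_data_file: Path        # 20: formatted, human-readable records (input and output)

        def read_standard(self) -> str:
            # File 18 is both read and written, hence the dual-arrowed connector in FIG. 1.
            return self.external_standard_file.read_text()

        def write_data_view(self, converted_text: str) -> None:
            # File 20 is likewise an input and output file.
            self.external_data_file.write_text(converted_text)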
Referring to both FIGS. 2 and 3, screen shots of a Graphical User Interface are shown, according to embodiments of the invention. FIG. 2 illustrates a screen shot of a Graphical User Interface of a Structured Data Conversion System, according to embodiments of the invention. FIG. 3 illustrates a screen shot of a Graphical User Interface of a SENSRB Data Conversion System, according to embodiments of the invention. The SENSRB Data Conversion System is a specific implementation of the Structured Data Conversion System.
In embodiments, the invention generally relates to a data conversion apparatus including: an electronic processing device (not shown) adaptable to convert data and output converted data in a tangible medium, a data conversion computer program 12 (FIG. 1) operatively associated with the electronic processing device, where the data conversion computer program uses a graphical user interface 30A and 30B to display on a display device (not shown), where the display device is operatively associated with the electronic processing device, a selection device (not shown) operatively associated with the display device, at least one standard definition file 14 (FIG. 1) consistent with at least one standard, where the at least one standard definition file is operatively associated with the data conversion computer program, at least one standard rules file 16 (FIG. 1) consistent with the at least one standard, where the at least one standard rules file is operatively associated with the data conversion computer program, at least one external standard file 18 (FIG. 1) consistent with the at least one standard, where the at least one external standard file is operatively associated with the data conversion computer program, and at least one external data file 20 (FIG. 1) consistent with the at least one standard, where the at least one external data file is operatively associated with the data conversion computer program.
In embodiments, the selection device is a computer mouse. The selection device may also be a combination of keys associated with a computer keyboard.
In embodiments, the at least one standard is a set of specifications of content and format to be used in requesting information on a topic, the topic being data that is adaptable for mathematical conversion. The at least one standard is application specific, depending on the particular analysis a user wants to perform. For instance, a particular standard may exist for a broad class of sensors (for which the user, during one execution, is providing information on one particular sensor) while another standard may exist for a class of control systems. Thus, one of the at least one standard is applied with the accompanying at least one standard definition file 14, the at least one standard rules file 16, the at least one external standard file 18, and the at least one external data file 20.
As an example, in a specific application for sensors (SENSRB) of the apparatus 10, the standard is a specification of information about a broad class of sensors (including infrared, radar, and photographic) that could exist on virtually any airborne vehicle or satellite, or even on a handheld device carried by a soldier. In any one execution of the apparatus 10, a user is defining information about a particular sensor and about the path of the sensor, such as, for example, on a vehicle as a function of time in three-dimensional space, where the sensor is pointing, and what type of lens it has. Thus, the use could be to define a new sensor for general use by the community, or it could be to provide metadata regarding one specific use of the sensor (such as, for example, a mission to take photographs using the sensor), without detracting from the scope of the invention.
In embodiments, the device is an electronic processor adapted to run a data conversion computer program 12. The data conversion computer program 12 causes a graphical user interface 30A and 30B to display on a display device (not shown). The display device is a computer monitor. The graphical user interface includes icons (32A through 48A and 32B through 48B) displayed using the display device. The graphical user interface 30A and 30B is displayed on the display device when actuated with the selection device. The graphical user interface 30A and 30B includes at least one external standard file icon 18, delineated as an external structure icon in FIG. 2 and as an external SENSRB icon in FIG. 3. The graphical user interface 30A and 30B also includes at least one external data file icon 20.
In embodiments, the at least one standard definition file 14 is a programmable external file associated with the at least one standard, where the at least one standard definition file defines specifics of the at least one standard, where the at least one standard definition file is editable and extendable to support changes in the at least one standard, the specifics being used to define a plurality of elements used to communicate data in the data conversion system 10. The specifics further dictate how data is written, read, and communicated by the apparatus 10.
In embodiments, the at least one standard definition file 14 provides complete documentation of the standard upon a user mouse-over of a section of the standard with a computer mouse.
In embodiments, the at least one standard rules file 16 is an external file used to specify standard-specific rules used during data input to test data accuracy and data consistency.
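A hedged sketch of how such a rules file might drive input checking follows; the one-rule-per-line layout (index, minimum, maximum) is an assumption for illustration, since the patent does not specify the file format:

    def load_rules(path):
        # Parse lines of the assumed form "01m 0 86399.99999999" into range rules.
        rules = {}
        with open(path) as f:
            for line in f:
                index, lo, hi = line.split()
                rules[index] = (float(lo), float(hi))
        return rules

    def check_field(rules, index, value):
        # Return an error message if a value violates its field's range rule.
        lo, hi = rules.get(index, (float("-inf"), float("inf")))
        if not lo <= float(value) <= hi:
            return f"{index}: value {value} outside [{lo}, {hi}]"
        return None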
In embodiments, the at least one external standard file 18 is a text file adapted for input and output of metadata into and out of the data conversion system 10.
In embodiments, the at least one external data file 20 is a text file adapted for input and output of formatted, human readable records into and out of the data conversion system 10. Examples of formatted, human readable records include tab-delimited records and graphical images.
In embodiments, the graphical user interface 30A and 30B includes at least one external standard file icon 18. The graphical user interface 30A and 30B includes at least one external data file icon 20.
In embodiments, the graphical user interface 30A and 30B further includes: an internal structure view icon 32A and 32B operatively associated with the data conversion computer program 12, a plurality of read command icons 34A and 34B operatively associated with the data conversion computer program, where the plurality of read command icons are used to read data into a data view format. Also included is a data view icon 36A and 36B operatively associated with the data conversion computer program 12, the data view icon being used to allow a user to view and manipulate data associated with the data view format, a plurality of write command icons 38A and 38B operatively associated with the data conversion computer program, a plurality of convert icons 40A and 40B operatively associated with the data conversion computer program, a plurality of compare icons 42A and 42B operatively associated with the data conversion computer program, a plurality of erase icons 44A and 44B operatively associated with the data conversion computer program, where the plurality of erase icons are used to allow a user to erase previously read in data, a plurality of display format drop-down list icons 46A and 46B operatively associated with the data conversion computer program, where the plurality of display format drop-down list icons are used to display converted data in different formats, a plurality of release icons 48A and 48B operatively associated with the data conversion computer program, and at least one text window 50A and 50B operatively associated with the data conversion computer program, where the at least one text window is used to display a plurality of data view formats. The at least one text window 50A and 50B is also referred to as an active text window and a visualization window.
In other embodiments, the invention includes a method of converting data, including: providing an electronic processing device adaptable to convert data and output converted data in a tangible medium, providing a data conversion computer program 12 operatively associated with the electronic processing device, the data conversion computer program using a graphical user interface 30A and 30B, providing a display device operatively associated with the electronic processing device, providing a selection device operatively associated with the display device, providing at least one standard definition file 14 consistent with at least one standard, the at least one standard definition file being operatively associated with the data conversion computer program, providing at least one standard rules file 16 consistent with the at least one standard, the at least one standard rules file being operatively associated with the data conversion computer program, providing at least one external standard file 18 consistent with the at least one standard, the at least one external standard file being operatively associated with the data conversion computer program and displayed as an icon on the graphical user interface; providing at least one external data file 20 consistent with the at least one standard, the at least one external data file being operatively associated with the data conversion computer program and displayed as an icon on the graphical user interface, executing the data conversion computer program with the selection device, the executing causing the graphical user interface to be displayed to a user on the display device, converting data by operatively selecting a plurality of icons 32A through 48A and 32B through 48B on the graphical user interface 30A and 30B, by actuating the selection device, and outputting the converted data in the tangible medium.
Output includes hard copy printouts, other computer accessible medium such as computer hard drives or floppy discs, visual display screens, as well as other mediums employing identified target coordinates.
The providing at least one standard definition file step includes providing mouse-over capability that provides complete documentation of the standard, thus removing the need for a user to keep complete documentation of the standard available when converting data.
The converting data step includes providing: an internal structure view icon 32A and 32B operatively associated with the data conversion computer program 12, a plurality of read command icons 34A and 34B operatively associated with the data conversion computer program, the plurality of read command icons used to read data into a data view format, a data view icon 36A and 36B operatively associated with the data conversion computer program, where the data view icon is used to allow a user to view and manipulate data associated with the data view format, a plurality of write command icons 38A and 38B operatively associated with the data conversion computer program, a plurality of convert icons 40A and 40B operatively associated with the data conversion computer program, a plurality of compare icons 42A and 42B operatively associated with the data conversion computer program, a plurality of erase icons 44A and 44B operatively associated with the data conversion computer program, the plurality of erase icons used to allow a user to erase data inputted into the apparatus, a plurality of display format drop-down list icons 46A and 46B operatively associated with the data conversion computer program, the plurality of display format drop-down list icons used to display converted data in different formats, a plurality of release icons 48A and 48B operatively associated with the data conversion computer program, and at least one text window 50A and 50B operatively associated with the data conversion computer program, the at least one text window used to display a plurality of data view formats.
The plurality of compare icons 42A and 42B includes a first compare icon, a second compare icon, and a third compare icon. For example, compare icons (or buttons) 42B in the SENSRB Data Conversion System compare the data in the two sets on either side of the button. Thus, for example, a comparison of two SENSRB files is performed by selecting the first compare icon 42B located between and below the External SENSRB icon 18B and the SENSRB View icon 32B. A comparison of a SENSRB file and a data file is performed by selecting the second compare icon 42B located between and below the SENSRB View icon 32B and the Data View icon 36B. Similarly, a comparison of two data files is performed by selecting the third compare icon 42B located between and below the Data View icon 36B and the External Data icon 20.
Thus, in similar fashion, selecting the first compare icon 42A located between and below the External Structure icon 18 and the Structure View icon 32A with the selection device compares two structure files. Similarly, selecting the second compare icon 42A located between and below the Structure View icon 32A and the Data View icon 36A with the selection device compares a structure file and a data file. Finally, selecting the third compare icon 42A located between and below the Data View icon 36A and the External Data icon 20 with the selection device compares two data files.
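For illustration, a field-wise compare of two record sets might look like the following sketch. SENSRB files are tagged records, but simple name/value mappings suffice to show the operation; the function name is hypothetical:

    def compare_records(left, right):
        # Report fields missing on one side or holding different values.
        differences = []
        for name in sorted(set(left) | set(right)):
            if name not in left:
                differences.append(f"{name}: only in right ({right[name]})")
            elif name not in right:
                differences.append(f"{name}: only in left ({left[name]})")
            elif left[name] != right[name]:
                differences.append(f"{name}: {left[name]} != {right[name]}")
        return differences

    print(compare_records({"Sensor": "SYERS2", "Platform": "U-2"},
                          {"Sensor": "SYERS2A", "Platform": "U-2"}))
    # ['Sensor: SYERS2 != SYERS2A']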
The at least one text window 50A and 50B further displays a plurality of data view formats, an activity log, compare requests, and error messages.
The converting data step further includes inputting data into computer memory accessible by the data conversion computer program by using the selection device to actuate a structure view icon 32A and 32B in the graphical user interface 30A and 30B, thereby inputting data and producing an external structure file 18.
The converting data step further includes reading previously existing data from the at least one external standard file 18 into the graphical user interface 30A and 30B, where the at least one external standard file is data defined in the format of the at least one standard, by using the selection device to actuate an external structure icon in the graphical user interface.
The converting data step further includes converting data from a structure view format to a data view format by using the selection device to actuate a first convert icon in the graphical user interface 30A and 30B.
The converting data step further includes converting data from the data view format to the structure view format by using the selection device to actuate a second convert icon in the graphical user interface 30A and 30B.
Outputting the converted data in a tangible medium includes displaying the converted data on the display device.
There are many instances in engineering when an application or project needs to communicate to other engineers a complete definition of all of the data related to that particular project. To facilitate this for a particular field of engineering, an authorized group can create a “standard” or a “specification” of the various data elements related to that topic, as well as the order and format of how those data elements will be communicated. This is called a Structured Data System.
One particular instance of a Structured Data System specification is the SENSRB standard, as illustrated in FIG. 3. A detailed document, NTB RFC-035d, SENSRB, Compendium of Controlled Extensions (CE) for the National Imagery Transmission Format (NITF), has been defined that specifies the set of data and metadata required to communicate information related to airborne sensors, including cameras, infrared sensors, and radar. This standard defines the Structured Data System for modern sensors.
Standards for data and metadata, such as SENSRB, are often complex and subject to interpretation, which degrades the ability for data to be shared seamlessly among projects, organizations, and contractors. Moreover, the complexity of the standards often results in significant ramp-up time before systems can be fully integrated utilizing those standards.
The apparatus 10 acts as a software implementation of such a standard and changes the user involvement into a straightforward, intuitive engineering process. The ease of use, the additional clarity, the savings in time to implement and to utilize the data, and the generalization of the approach to additional standards provide the Navy and the government with a clear opportunity for substantial savings in both time and money.
A particular strength of this invention is its generality in being able to support a large class of Structured Data Systems. The invention utilizes an external “template” file to define the complete set of data elements including their definitions and specifications within a particular standard. It also utilizes an external “rules” file to define standard-specific relationships between data elements.
The system also serves to provide data conversion facilities so that data that already exists in another standard can be converted or used as a starting point for further distribution or subsequent applications.
The system uses a fully icon-based user interface combined with easy-to-interpret drop-down lists, as well as input and output tables created in real time, so as to make the system unusually easy to use for both novice and expert users alike.
An additional advantage of the system is the methodology by which the software has been implemented. Rather than having the software implement the SENSRB standard directly, the system uses an external file (a text file or spreadsheet file) to define all of the elements of the standard that are to be revealed to the user. As a result, any changes, minor or major, to the standard generally require only the creation of a new version of the external file and do not require additional software development. This external file not only defines the elements of the standard, it also provides the complete documentation, presented upon a user “mouse-over” of that section of the standard, thus removing the requirement for the user to keep the complete documentation available during use of the product.
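A minimal sketch of that mouse-over lookup follows, assuming a hypothetical tab-separated template layout (the patent specifies a text or spreadsheet file but not its exact columns):

    def load_documentation(template_path):
        # Map field index -> definition text, read from the external template file.
        docs = {}
        with open(template_path) as f:
            for line in f:
                index, _, definition = line.partition("\t")
                docs[index.strip()] = definition.strip()
        return docs

    # On mouse-over of a field, the GUI would display its definition, e.g.
    #   docs["01f"] -> "Content Level. Quantifies the level of SENSRB data content ..."
    # Supporting a revised standard then means shipping a new template file, not new code.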
Consistent with the advantage of the external file definition for the standard, the system also utilizes another external file to define standard-specific rules that can be applied during execution of the project to aid the user in detecting errors and inconsistencies in the data. The rules are externally defined, which serves to reduce the amount of standard-specific actions in the software itself. This not only improves ease of maintenance of the standard, it also allows extension of the rules to support new observations derived by use in the community.
The overall flow and use of the system is represented in the icon-based user interface (the graphical user interface). The primary functions of the data conversion elements of the system are invoked and displayed using the graphical user interface. Each of the rectangular icons in the interface represents an action that the user can invoke.
FIG. 3 illustrates the set of major options available to the user in creating, converting, storing, reading, and comparing data in various formats. The various objects in the window are icons, each of which causes a certain action to take place. Starting from the left in FIG. 3, the user can open an externally defined SENSRB file, which can then be viewed and read into the SENSRB View system. The user can select the SENSRB View icon 32B to either create a complete set of standards data from scratch or to view and modify external data that has been read.
The two convert icons 40B convert data to and from the two major classes of formats: the SENSRB tagged record format (and its complementary input mode forms) and a data view format. A data view is an engineering-oriented format such as, for example, a columnar format as one might see in a spreadsheet file or a word processing table. Another data view is an XML format. The specific format is selected from a drop-down list shown in green and located below the data view icon. The user can request that a data view of any format be written to at least one external data file, and similarly at least one external data file can be read into the data view. Yet another data form is a three-dimensional graphical representation of the data in relation to the earth.
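One convert path, from the tagged name/value representation to an XML data view, could be sketched as follows; the element names are illustrative assumptions, and the real SENSRB XML layout is not reproduced here:

    from xml.etree import ElementTree as ET

    def to_xml_view(fields):
        # Build an XML data view from name/value pairs.
        root = ET.Element("SENSRB")
        for name, value in fields.items():
            child = ET.SubElement(root, name.replace(" ", "_"))
            child.text = str(value)
        return ET.tostring(root, encoding="unicode")

    print(to_xml_view({"Sensor": "SYERS2", "Content Level": 2}))
    # <SENSRB><Sensor>SYERS2</Sensor><Content_Level>2</Content_Level></SENSRB>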
The compare buttons 42B compare the data in the two sets on either side of the button, for example a comparison of two SENSRB files, or a comparison of a SENSRB file and a data file, or a comparison of two data files.
The window at the bottom of the interface is an active text window 50B that displays information appropriate to the current activity. Generally this shows the data contained in the data system currently selected (and outlined in red). In other instances, the text window 50B will display supporting information such as an activity log, or error messages, or the results of a compare request.
When the user is creating a complete SENSRB model or viewing an existing model, additional iconic representations are displayed to the user that provide not only control but also user feedback to guide the process. A fully icon-driven and intuitive interface helps in the broad acceptance of the product.
In SENSRB, the at least one standard definition file 14 is labeled sb Records Template and is a computer file. The sb Records Template file defines all parameters applicable to input forms of the apparatus 10. The spreadsheet version of the external file is used to define the specifics of the standards implementation of SENSRB. The table can be edited and extended to support changes to the standard or to support additional standards or engineering data systems.
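To make the template idea concrete, the following hedged sketch parses one Table I-style row into a field specification; the whitespace-separated layout is inferred from the table as printed, and the real sb Records Template file may differ:

    from dataclasses import dataclass

    @dataclass
    class FieldSpec:
        index: str      # e.g. "01l"
        name: str       # e.g. "Start Date"
        size: int       # field width in characters
        data_type: str  # e.g. "D" for date
        req: str        # "R" required, "C"/"c" conditional
        min_value: str
        max_value: str

    def parse_row(row):
        # Parse a row like "01l Start Date 8 D R 19411207 20991231".
        parts = row.split()
        # Everything between the index and the numeric size is the multi-word field name.
        size_pos = next(i for i, p in enumerate(parts[1:], 1) if p.isdigit())
        return FieldSpec(parts[0], " ".join(parts[1:size_pos]), int(parts[size_pos]),
                         parts[size_pos + 1], parts[size_pos + 2],
                         parts[size_pos + 3], parts[size_pos + 4])

    print(parse_row("01l Start Date 8 D R 19411207 20991231"))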
Table I shows an example of the data fields inherent with the sb Records Template file.
TABLE I
Example of the Data Fields Inherent with the sb Records Template File
Index Field Name Size DataType Req RangeA Min Value Max Value Units
01 General Data 1 Z R N, Y
01a Sensor 25 A R =1ACESHY: Airborne Hyperspectral, AG3607:
Airborne Multispectral, ALIRT: Time-gated topographic
LIDAR, Applanix-DSS: Medium format airborne DSS,
ATARS-IRLS: Infrared Line Scanner, ATARS-LAEO:
Low Altitude EO, ATARS-MAEO: Medium Altitude EO,
ATFLIR: Advanced Technology Forward Looking IR,
BuckEye: Digital camera and LiDAR, Constant Hawk:
CH1EO, Bright Start II: Fire Scout, GEOMOS: Gorgon
Stare Mosaic, GIRMOS: Gemini Infrared Multi-Object
Spectrograph, GR-M80: ORS-1 Sat (SYERS2A),
HYCAS: Airborne Hyperspectral, HYDICE:
Hyperspectral Digital Imagery Collection Exp, LIDAR:
Light Detection And Ranging, OBC: Optical Bar
Camera, Predator-EO Nose: Predator EO Nose
camera, Predator-EO Spot: Predator EO Telephoto
camera, Predator-EO Zoom: Predator EO Zoom
camera, Predator-IR Amb: Predator IR Camera,
Predator-IR M50: Predator IR M500 Camera, Predator-
IR M60: Predator IR M600 Camera, RADEOS: Rapid
Deployment Electro-Optical System, Raptor-LOROP:
Raptor Long Range Oblique Photog, RTNESS:
SAR/MTI/EO/IR, Shadow Harvest: Hyperspectral
imaging sensor, SHARP-H: CA279H High Altitude
camera, SHARP-M: CA279M Med Altitude camera,
SPMWIR: Airborne Midwave IR, SPSWIR: Airborne
Shortwave IR, SPVNIR: Airborne Visual/Near IR,
SYERS: Sr. Year EO Recon Sys, SYERS2: Sr. Year
EO Recon Sys 2, SYERS2A: Sr. Year EO Recon Sys
2A, TARS-IRLS: Infrared Line Scanner, TARS-LAEO:
Low Altitude EO, TARS-MAEO: Medium Altitude EO,
Wescam-EO/IR: MX-15 EO/IR, Wescam-V: MX-20EN
Airborne full motion video, Wescam-F: MX-20EW
Airborne Single Frame, Wescam-IR: MX-20IR Airborne
Midwave Infrared, Use Sensor URI
01b Sensor URI 32 A C
01c Platform 25 A R =1Super Hornet: F/A-18E/F, Hornet: F/A-18C/D,
Growler: EA-18G, F-16C: Fighting Falcon, Tornado:
GR4, U-2, Liberty: King Air MC-12W, Global Hawk:
RQ-4A/B/N, Reaper: MQ-9, Predator: MQ-1, Shadow:
RQ-7, STUAS: Small Tactical Unmanned Air Sys, Scan
Eagle: Mini-UAV EO camera, Raven B: RQ-11B, Fire
Scout: MQ-8, Orion: P-3C, Poseidon: P-8, IKONOS:
Commercial Satellite, ARL: RC-7 Crazy Hawk, ORS-1:
Operationally Responsive Space, QuickBird:
Commercial Satellite, Worldview-1: Commercial
Satellite, GeoEye: Commercial Satellite, Wortdview-2:
Commercial Satellite, Use Platform URI
01d Platform URI 32 A C
01e Operational 10 A R =1Air, Space, Maritime, Ground
Domain
01f Content 1 A R =10: Minimal Content, 1: Minimal Content+, 2:
Level Situational Awareness, 3: Situational Awareness+, 4:
Geopositioning, 5: Geopositioning+, 6: Enhanced
Geopositioning, 7: Enhanced Geopositioning+, 8:.
Accurate Geopositioning, 9: Accurate Geopositioning+
01g Geodetic 5 A R =1WGS84
System
01h Geodetic 1 A R =1G: Geodetic, C: Cartesian
Type
01i Elevation 3 A R =1HAE: Ht above ellipsoid, MSL: Above sea level,
Datum AGL: Above ground level
01j Length Unit 2 A R =1SI: Metric, EE: English
01k Angular Unit 3 A R =1DEG: Degrees, RAD: Radians, SMC: Semi-circles
01l Start Date 8 D R 19411207 20991231
01m Start Time 14 T R 0 86399.99999999
01n End Date 8 D R 19411207 20991231
01o End Time 14 T R 0 86399.99999999
01p Generation 2 I R 0 99
Count
01q Generation 8 D c 19411207 20991231
Date
01r Generation 10 t c 0 86399.99999999
Time
02 Sensor Array 1 Z R N, Y
Data
02a Detection 20 A R Visible, Infrared, Multispectral,
Hyperspectral
02b Row 8 I R 1 99999999
Detectors
02c Column 8 I R 1 99999999
Detectors
02d Row Metric 8 y c 0.000001 99999999
02e Column 8 y c 0.000001 99999999
Metric
02f Focal Length 8 y c 0.000001 99999999
02g Row Fov 8 a c 0 270
02h Column Fov 8 a c 0 270
02i Calibrated 1 A C N, Y
03 Sensor 1 Z R N, Y
Calibration
Data
03a Calibration 2 A R mm: millimeters, px: pixels
Unit
03b Principal 9 f c −99999999 99999999 =V03a
Point Offset
X
03c Principal 9 f c −99999999 99999999 =V03a
Point Offset
Y
03d Radial Distort 12 f c −1E99 1E99 =V03a,−2
1
03e Radial Distort 12 f c −1E99 1E99 =V03a,−4
2
03f Radial Distort 12 f c −1E99 1E99 =V03a,−6
3
03g Radial Distort 9 f c −1E99 1E99 =V03a
Limit
03h Decent 12 f c −1E99 1E99 =V03a,−1
Distort 1
03i Decent 12 f c −1E99 1E99 =V03a,−1
Distort 2
03j Affinity 12 f c 0 999999999
Distort 1
03k Affinity 12 f c 0 999999999
Distort 2
03l Calibration 8 D c 19411207 20991231
Date
04 Image 1 Z R N, Y
Formation
Data
04a Method 15 A R Single Frame, Multiple Frames, Video
Frames, Pushbroom, Whiskbroom
04b Mode 3 A R
04c Row Count 8 P R 1 99999999
04d Column 8 P R 1 99999999
Count
04e Row Set 8 P R 1 99999999
04f Column Set 8 P R 1 99999999
04g Row Rate 10 T R −99.9999999 99.9999999
04h Column Rate 10 T R −99.9999999 99.9999999
04i First Pixel 8 P R 0 99999999
Row
04j First Pixel 8 P R 0 99999999
Column
04k Transform 1 Z R 0, 1, 2, 3, 4, 5, 6, 7, 8
Params
04l Transform 12 f c −1E99 1E99
Param1
04m Transform 12 f c −1E99 1E99
Param2
04n Transform 12 f c −1E99 1E99
Param3
04o Transform 12 f c −1E99 1E99
Param4
04p Transform 12 f c −1E99 1E99
Param5
04q Transform 12 f c −1E99 1E99
Param6
04r Transform 12 f c −1E99 1E99
Param7
04s Transform 12 f c −1E99 1E99
Param8
05a Reference 12 T r −9999.999999 9999.999999
Time
05b Reference 8 P r 0 99999999
Row
05c Reference 8 P r 0 99999999
Column
06a =GLatitude, 11 =GI, X R =G-90, −9999999999 =G90,
Geocentric X 9999999999
06b =GLongitude, 12 =GL, X R =G-180, −9999999999 =G180,
Geocentric Y 9999999999
06c =GAltitude, 11 X R −9999999999 9999999999
Geocentric Z
06d Sensor X 8 X R −99999 99999
Offset
06e Sensor Y 8 X R −99999 99999
Offset
06f Sensor Z 8 X R −99999 99999
Offset
07 Attitude Euler 1 Z R N, Y
Angles
07a Sensor Angle 1 A C 1: YPR, 2: RPY, 3: PRY
Model
07b Sensor Angle 10 a C −180 180
1
07c Sensor Angle 9 a C −90 90
2
07d Sensor Angle 10 a C −180 180
3
07e Platform 1 A R N,Y
Relative
07f Platform 9 a c 0 360
Heading
07g Platform 9 a c −90 90
Pitch
07h Platform Roll 10 a c −180 180
08 Attitude Unit 1 Z R N, Y
Vectors
08a =GICX North, 10 F R −1 1
ICX X
08b =GICX East, 10 F R −1 1
ICX Y
08c =GICX Down, 10 F R −1 1
ICX Z
08d =GICY North, 10 F R −1 1
ICY X
08e =GICY East, 10 F R −1 1
ICY Y
08f =GICY Down, 10 F R −1 1
ICY Z
08g =GICZ North, 10 F R −1 1
ICZ X
08h =GICZ East, 10 F R −1 1
ICZ Y
08i =GICZ Down, 10 F R −1 1
ICZ Z
09 Attitude 1 Z R N, Y
Quaternions
09a Attitude Q1 10 F R −1 1
09b Attitude Q2 10 F R −1 1
09c Attitude Q3 10 F R −1 1
09d Attitude Q4 10 F R −1 1
10 Sensor 1 Z R N, Y
Velocity
10a =GVelocity 9 V R −99999 99999
North,
Velocity X
10b =GVelocity 9 V R −99999 99999
East, Velocity
Y
10c =GVelocity 9 V R −99999 99999
Down,
Velocity Z
11 Point Sets 2 I R 0 99
11a Point Set 25 A R Image Center, Image Extent,
Type Ground Area, Ground Control
11b Point Count 3 I R 0 999
11c Row 8 P R 0 99999999
11d Column 8 P R 0 99999999
11e Latitude 10 I r −90 90
11f Longitude 11 L r −180 180
11g Elevation 6 X r =X-1500 =X30000
11h Range 8 X r 0 99999999
12 Time Stamps 2 I R 0 99
Sets
12a Time Stamp 3 B R =F02b, 10c
Type
12b Time Stamp 4 I R 0 9999
Count
12c Time Stamp 12 T R −99999.99999 99999.99999
Time
12d =R12a =R12a =R12a R =R12a =R12a =R12a
13 Pixel 2 I R 0 99
Reference
Sets
13a Pixel 3 B R =F02b, 10c
Reference
Type
13b Pixel 4 I R 0 9999
Reference
Count
13c Pixel 8 P R 1 99999999
Reference
Row
13d Pixel 8 P R 1 99999999
Reference
Column
13e =R13a =R13a =R13a R =R13a =R13a =R13a
14 Uncertainties 3 I R 0 999
14a Uncertainty 11 B R =f02b, 13e
First Type
14b Uncertainty 11 B r =f02b, 13e
Second Type
14c Uncertainty 10 =W14a R =R14a =R14a =UUncertainty,
Value Correlation
15 Additional 3 I R 0 999
Parameters
15a Parameter 25 A R
Name
15b Parameter 3 I R 1 999
Size
15c Parameter 4 I R 0 9999
Count
15d =V15a, =V15b F R −1E99 1E99
Value
Index Field Name Inst Level S1 S2 Notify Definition
01 General Data 0 0 0 0 General Data Flag. Flag field indicating the presence of general data. This module provides identifying
information for the associated NITF image segment and sets reference systems for subsequent module
parameters. ‘Y’ in this field indicates the presence of the Fields 01a to 01r; ‘N’ omits their presence. The
value of this field must be ‘Y’ for the first instance of SENSRB associated with each NITF image
segment. See Z.5.1 General Data Module for additional guidance regarding these fields.
01a Sensor 0 0 0 0 Sensor Registered Name or Model. Identifies the common name for the payload sensor that collected
the image segment. The twenty-five character name must be unique, explicit, and registered with the
NTB.c For sensor payloads made up of multiple sensors, SENSOR URI (index 01b), DETECTION
(index 02a), METHOD (04a), and MODE (04b) fields allow for further characterization of the imagery source.
01b Sensor URI 0 0 0 0 Sensor Uniform Resource Identifier. This optional field allows a unique identifier specific to the collecting
sensor. This field supports serial numbers or, preferably, Uniform Resource Identifiers (URIs), which can
facilitate access to more detailed sensor information, such as geometric or radiometric calibration data,
via the internet. See Z.5.1.X Uniform Resource Identifiers.
01c Platform 0 0 0 0 Platform Common Name. Identifies the platform type upon which the sensor is operating. The twenty-five
character name must be unique, explicit, and registered with the NTB.c
01d Platform URI 0 0 0 0 Platform Uniform Resource Identifier. This optional field allows a unique platform identification, especially
for non-aircraft platforms where ACFTB TRE AC_TAIL_NO does not apply. The thirty-two byte field
accommodates a Uniform Resource Identifier (URI), which can facilitate access to more detailed
platform information via the internet. See Z.5.1.X Uniform Resource Identifiers.
01e Operational 0 0 0 0 Operational Domain. Specifies the sensor platform's domain of operation during the collection, providing
Domain some indication of the imagery perspective.
01f Content Level 0 0 0 0 Content Level. Quantifies the level of SENSRB data content to enhance data discovery; see Z.5.1.2
Application-Required Content Level for examples. This single value allows users to imply and/or infer if
imagery will meet certain exploitation requirements and discover data with a specific content level.
01g Geodetic System 0 0 0 0 Geodetic Reference System. Specifies the geodetic system to which the geocoordinates in this TRE are
referenced. The default is WGS84 for the World Geodetic System-1984.c (See Z.4.6.1 Geospatial
Coordinate Systems.)
01h Geodetic Type 0 0 0 0 Geodetic Coordinate Type. Specifies the coordinate system used to report the sensor location and the
reference system for attitudes and velocities. The two allowed field values are: G (Geographic/Geodetic)
and C (geocentric Cartesian); see also Z.4.6.1 Geospatial Coordinate Systems Summary. The local
geographic coordinate frame shall be North-East-Down (NED).
01i Elevation Datum 0 0 0 0 Elevation and Altitude Datum. Specifies the reference datum from which elevations and altitudes will be
reported. The three allowed field values are: HAE (height above ellipsoid), MSL (height above mean sea
level), and AGL (height above ground level). HAE is strongly encouraged; see Z.4.6.1 Geospatial
Coordinate Systems Summary.
01j Length Unit 0 0 0 0 Length Unit System. Specifies the unit system used for the spatial parameters within this TRE. The two
allowed field values are: SI (International System of Units) and EE (English Engineering Unit System). SI
is encouraged. (See Z.4.4.1 Length Units.)
01k Angular Unit 0 0 0 0 Angle Unit Type. Specifies the angular units used for the angular parameters within this TRE (unless
explicitly overridden by the UNITS column). The three allowed field values are: DEG (degrees), RAD
(radians), or SMC (semi-circles). DEG is encouraged. (See Z.4.4.2 Angular Units.)
01l Start Date 0 0 0 0 Imaging Start Date. Date at the start of the NITF image segment collection, formatted as YYYYMMDD
and referenced to UTC. (See Z.4.2 Times and Dates.)
01m Start Time 0 0 0 0 Imaging Start Time. The number of UTC seconds into the day, specified in START_DATE (index 01l),
when the first photon contacted the detector for the first collected pixel stored in the NITF image
segment. The day starts at the UTC zero seconds (0.000000000 s) and ends just before the start of the
next day (86399.99999999 s). The value may be equal to END_TIME for imagery sensors modeled as
instantaneous collectors. (See Z.4.2 Times and Dates.)
01n End Date 0 0 0 0 Imaging End Date. Date at the end of the NITF image segment collection, formatted as YYYYMMDD and
referenced to UTC. Must be the same as the imaging start date (or after, for imagery collections extending
into a subsequent day). (See Z.4.2 Times and Dates.)
01o End Time 0 0 0 0 Imaging End Time. The number of UTC seconds into the day, specified in END_DATE (index 01n), when
the first photon contacted the detector for the last collected pixel stored in the NITF image segment. See
START_TIME (index 01m) for more guidance. (See also Z.4.2 Times and Dates.) END_DATE and
END_TIME may be equal to START_DATE and START_TIME, respectively, for imagery collections
modeled as instantaneously collected.
01p Generation Count 0 0 0 0 Generation Count. The number of times the data contained in this TRE has been resected or adjusted.
Zero (00) shall represent that the data is the original NITF form, which might have undergone
manipulations reflected in Module 4. If subsequent modifications are made to this metadata through
resection or adjustments, this count will be incremented accordingly. (See Z.5.1.X Image Parameter Post-
Collection Adjustments.)
01q Generation Date 0 0 0 0 Generation Date. The date when the current resection or adjustment was made, formatted as
YYYYMMDD. This value is ignored if generation count is zero but must be non-default filled if generation
count is greater than zero. (See Z.4.2 Times and Dates.)
01r Generation Time 0 0 0 0 Generation Time. The UTC time of day when the current resection or adjustment was made, formatted as
HHMMSS.sss (See Z.5.1.X Image Parameter Post-Collection Adjustments for specific instructions
regarding this field's precision). This value is ignored if generation count is zero but must be non-default
filled if generation count is greater than zero.
02 Sensor Array 0 0 0 0 Sensor Array Data Flag. Flag field indicating the presence of data describing the sensor array. This
Data module provides information regarding the image collection hardware and process, as is often needed for
geopositioning. ‘Y’ in this field indicates the presence of the Fields 02a to 02i; ‘N’ omits their presence.
Units depend on the values for LENGTH_UNIT (index 01j) and ANGULAR_UNIT (index 01k). See Z.5.2
Sensor Array Data Module for additional guidance regarding these fields.
02a Detection 0 0 0 0 Detection Type. Specifies the detection spectrum of the sensor array. Currently approved values are:c
02b Row Detectors 0 0 0 0 Number of Detectors Used in Row and Column. The number of detectors used in the “instantaneous”
collection process as counted in the row- and column-aligned dimensions. This number shall correspond
with the row and column physical dimensions of the used sensor array (ROW_METRIC and
COLUMN_METRIC, indices 02d and 02e)
02c Column Detectors 0 0 0 0 Number of Detectors Used in Row and Column. The number of detectors used in the “instantaneous”
collection process as counted in the row- and column-aligned dimensions. This number shall correspond
with the row and column physical dimensions of the used sensor array (ROW_METRIC and
COLUMN_METRIC, indices 02d and 02e).
02d Row Metric 0 0 0 0 Row and Column Physical Dimension. The physical length of the sensor array used in the
“instantaneous” collection process as measured along the row- and column-aligned dimensions. These
dimensions shall correspond with the number of detectors used (ROW_DETECTORS and
COLUMN_DETECTORS, indices 02b and 02c).
02e Column Metric 0 0 0 0 Row and Column Physical Dimension. The physical length of the sensor array used in the
“instantaneous” collection process as measured along the row- and column-aligned dimensions. These
dimensions shall correspond with the number of detectors used (ROW_DETECTORS and
COLUMN_DETECTORS, indices 02b and 02c).
02f Focal Length 0 0 0 0 Best Known Focal Length. The best known value of the effective focal length.
02g Row Fov 0 0 0 0 Field of View along Sensor Array Row and Column. The angle measuring the effective field-of-view
projected onto the sensor array center row and column (e.g. Sensor Horizontal Field of View and Sensor
Vertical Field of View, respectively).
02h Column Fov 0 0 0 0 Field of View along Sensor Array Row and Column. The angle measuring the effective field-of-view
projected onto the sensor array center row and column (e.g. Sensor Horizontal Field of View and Sensor
Vertical Field of View, respectively).
02i Calibrated 0 0 0 0 Focal Length Calibration Flag. Indicates if the focal length and/or fields of view are based on a calibration
process. ‘Y’ in this field will indicate that the focal length and/or the fields of view with the detector metrics
are based on a calibration. ‘N’ in this field will indicate that they are not.
03 Sensor 0 0 0 0 Sensor Calibration Data Flag. Flag field indicating the presence of metric sensor calibration parameters.
Calibration Data This module provides sensor geometric calibration values and coefficients to facilitate precision
geopositioning. ‘Y’ in this field indicates the presence of the Fields 03a to 03l; ‘N’ omits their presence.
Additional definitions and implementation guidance for these parameters are provided in Z.5.3 Sensor
Calibration Data Module.
03a Calibration Unit 0 0 0 0 Calibration Unit System. Identifies the unit system used for the subsequent calibration parameters
(indices 03b to 03i). If ‘mm’, the parameters are referenced in millimeters. If ‘px’, the parameters are
referenced in pixel units.
03b Principal Point 0 0 0 0 Principal Point Offset in x-direction (x0). The number of row-aligned units (+/−) from the center of the
Offset X image where the optical axis of the sensor intersects the effective sensor array; see Z.5.3.2 Principal
Point Offset. Positive values are in the direction of the positive XI axis; see Z.4.6.5 Image Coordinate
System.
03c Principal Point 0 0 0 0 Principal Point Offset in y-direction (y0). The number of column-aligned units (+/−) from the center of the
Offset Y image where the optical axis of the sensor intersects the effective sensor array; see Z.5.3.2 Principal
Point Offset. Positive values are in the direction of the positive YI axis; see Z.4.6.5 Image Coordinate
System.
03d Radial Distort 1 0 0 0 0 First Radial Distortion Coefficient (k1). This value is defined in Z.5.3.3 Radial Distortion.
03e Radial Distort 2 0 0 0 0 Second Radial Distortion Coefficient (k2). This value is defined in Z.5.3.3 Radial Distortion.
03f Radial Distort 3 0 0 0 0 Third Radial Distortion Coefficient (k3). This value is defined in Z.5.3.3 Radial Distortion.
03g Radial Distort 0 0 0 0 Limit of Radial Distortion Fit. The maximum valid radial distance (from the principal point) for the
Limit application of the polynomial distortion correction given in Z.5.3.3 Radial Distortion.
03h Decent Distort 1 0 0 0 0 First Decentering Distortion Coefficient (p1). This value is defined in Z.5.3.4 Additional Distortions.
03i Decent Distort 2 0 0 0 0 Second Decentering Distortion Coefficient (p2). This value is defined in Z.5.3.4 Additional Distortions.
03j Affinity Distort 1 0 0 0 0 First Affinity Distortion Coefficient (b1). This value is defined in Z.5.3.4 Additional Distortions.
03k Affinity Distort 2 0 0 0 0 Second Affinity Distortion Coefficient (b2). This value is defined in Z.5.3.4 Additional Distortions.
03l Calibration Date 0 0 0 0 Calibration Report Date. Date when the calibration values above, indices 03a to 03k, were computed.
(See Z.5.3.6. Calibration Notes.) Field shall be formatted as YYYYMMDD.
04 Image Formation 0 0 0 0 Image Formation Data Flag. Flag field indicating the presence of parameters describing the image
Data formation process. This module provides information regarding how the image array (which may be equal
to the recorded NITF array) was formed from the sensor array. This data is typically needed for
geopositioning. ‘Y’ in this field indicates the presence of the Fields 04a to 04k; ‘N’ omits the presence of
Fields 04a to 04s. See Z.5.4 Image Formation Data Module for additional guidance regarding the
definitions and implementation of these fields.
04a Method 0 0 0 0 Imaging Method. Specifies the method the sensor utilized to collect the image data. Currently approved
values are registered with the NTB.
04b Mode 0 0 0 0 Imaging Mode. A registered three digit code indicating the actual mode used by the sensor to collect or
detect the image data. This may differ from the planned mode given in ACFTB's MPLAN. The three digit
codes, and their meaning, are to be registered with the NTB.
04c Row Count 0 0 0 0 Row and Column Count. The number of rows and columns in the raw image array. One or both of these
values will differ from the numbers of detectors used in rows and columns (indices 02b and 02c) with
scanning sensors or if some sub- or super-sampling process is performed as part of the image formation.
These values shall be equal to ISH NROWS and NCOLS in cases where the raw image array is recorded
directly as the NITF image segment. In cases where the raw image array is modified prior to NITF
generation, these values may differ from those in ISH NROWS and NCOLS. Differences between image
subheader fields and these fields may indicate scaling, rotation, skewing, etc.
04d Column Count 0 0 0 0 Row and Column Count. The number of rows and columns in the raw image array. One or both of these
values will differ from the numbers of detectors used in rows and columns (indices 02b and 02c) with
scanning sensors or if some sub- or super-sampling process is performed as part of the image formation.
These values shall be equal to ISH NROWS and NCOLS in cases where the raw image array is recorded
directly as the NITF image segment. In cases where the raw image array is modified prior to NITF
generation, these values may differ from those in ISH NROWS and NCOLS. Differences between image
subheader fields and these fields may indicate scaling, rotation, skewing, etc.
04e Row Set 0 0 0 0 Row and Column Detection Set. The number of rows and columns collected simultaneously to form the
image array. These fields are applicable to all intended sensors. For instantaneous framing sensors,
these values shall be equal to positive row and column counts (indices 04c and 04d). For a single-line
pushbroom (as an example), one field shall have a meaningful value (equal to the row or column count)
and the other shall be one. These values shall pertain to the NITF image segment (stored pixel data),
where negative column set indicates column aggregation right to left and negative row set indicates row
aggregation bottom to top when pixel data is displayed.
04f Column Set 0 0 0 0 Row and Column Detection Set. The number of rows and columns collected simultaneously to form the
image array. These fields are applicable to all intended sensors. For instantaneous framing sensors,
these values shall be equal to positive row and column counts (indices 04c and 04d). For a single-line
pushbroom (as an example), one field shall have a meaningful value (equal to the row or column count)
and the other shall be one. These values shall pertain to the NITF image segment (stored pixel data),
where negative column set indicates column aggregation right to left and negative row set indicates row
aggregation bottom to top when pixel data is displayed.
04g Row Rate 0 0 0 0 Row and Column Detection Rate. The duration in time to collect a single or simultaneous set of rows
(ROW_SET) and columns (COLUMN_SET) when forming the image array. These fields are applicable to
all intended sensors. For framing sensors, these values shall be zeros. For pushbroom sensors, one field
(in the direction of the scanning) shall have a meaningful value and the other shall have zeros. To
indicate positive or negative scan direction, this field value shall be positive when FIRST_PIXEL_ROW is
stored within first image row (first NITF row) and negative when stored within last image row. Or this field
value shall be positive when FIRST_PIXEL_COLUMN is stored within first image column (first NITF
column) and negative when stored within last image column.
04h Column Rate 0 0 0 0 Row and Column Detection Rate. The duration in time to collect a single or simultaneous set of rows
(ROW_SET) and columns (COLUMN_SET) when forming the image array. These fields are applicable to
all intended sensors. For framing sensors, these values shall be zeros. For pushbroom sensors, one field
(in the direction of the scanning) shall have a meaningful value and the other shall have zeros. To
indicate positive or negative scan direction, this field value shall be positive when FIRST_PIXEL_ROW is
stored within first image row (first NITF row) and negative when stored within last image row. Or this field
value shall be positive when FIRST_PIXEL_COLUMN is stored within first image column (first NITF
column) and negative when stored within last image column.
04i First Pixel Row 0 0 0 0 Row and Column of First Collected Pixel. The row and column of the NITF array (image segment)
containing the first collected pixel or line set of the image array. These fields are applicable to all intended
sensors to relate the image array to the NITF image segment. For framing sensors, these values shall be
top left (00000001, 00000001) of the displayed image array unless rotated for NITF processing. For
pushbroom, one field shall have a meaningful value (typically either one or the row or column count) and the other
shall be zeros.
04j First Pixel 0 0 0 0 Row and Column of First Collected Pixel. The row and column of the NITF array (image segment)
Column containing the first collected pixel or line set of the image array. These fields are applicable to all intended
sensors to relate the image array to the NITF image segment. For framing sensors, these values shall be
top left (00000001, 00000001) of the displayed image array unless rotated for NITF processing. For
pushbroom, one field shall have a meaningful value (typically either one or the row or column count) and the other
shall be zeros.
04k Transform 0 0 0 0 Number of Image Transform Parameters Provided. This flag field indicates the number of subsequent
Params fields (indices 04l to 04s) that are present; ‘0’ omits the presence of these fields. See Z.5.4.X Image
Transformation Parameters for additional information.
04l Transform 0 0 0 0 Image Transform Parameters (h1, h2, h3, h4, h5, h6, h7, h8). These parameters are defined in Z.5.4.X
Param1 Image Transformation Parameters. They provide the elements of the transformation matrix between the
image coordinates associated with the metadata in this TRE and those corresponding to the NITF image
segment's data array. If the metadata values in this TRE correspond with the NITF-stored image array,
these parameters would be the elements of an identity matrix. A Transform Parameter must be provided
if its number is less than or equal to IMAGE_TRANSFORM_PARAM_COUNT. Otherwise the parameter
is not provided. A meaningful value for each provided parameter is required. See section Z.5.7 for additional
information.
04m Transform Param2 0 0 0 0 Image Transform Parameter (h2)
04n Transform Param3 0 0 0 0 Image Transform Parameter (h3)
04o Transform Param4 0 0 0 0 Image Transform Parameter (h4)
04p Transform Param5 0 0 0 0 Image Transform Parameter (h5)
04q Transform Param6 0 0 0 0 Image Transform Parameter (h6)
04r Transform Param7 0 0 0 0 Image Transform Parameter (h7)
04s Transform Param8 0 0 0 0 Image Transform Parameter (h8)
05a Reference Time 0 0 0 0 Reference Time of Applicability. This is the time for which the values in the TRE are to be applied. The
value is in seconds before (negative) or after (positive) START_TIME. A zero value places the reference
time at the imaging start time, such as would be the case for sensors modeled with instantaneous
collections, where START_TIME equals END_TIME. This field shall not be defaulted if either
REFERENCE_PIXEL_ROW or REFERENCE_PIXEL_COLUMN (index 05b or 05c) is defaulted.
05b Reference Row 0 0 0 0 Reference Pixel Row and Column of Applicability. Reference row and column pixel index relative to the
NITF image segment at which the field values in this TRE apply. The pixel indices shall refer to the entire
NITF image segment associated with this TRE. Pixel indexing is per the standard NITF raster order:
column indices increasing to the right and row indices increasing downward from the viewer's
perspective; see Z.4.X Array Definitions. These fields shall not be defaulted if REFERENCE_TIME (index 05a) is
defaulted.
05c Reference 0 0 0 0 Reference Pixel Row and Column of Applicability. Reference row and column pixel index relative to the
Column NITF image segment at which the field values in this TRE apply. The pixel indices shall refer to the entire
NITF image segment associated with this TRE. Pixel indexing is per the standard NITF raster order:
column indices increasing to the right and row indices increasing downward from the viewer's
perspective; see Z.4.X Array Definitions. These fields shall not be defaulted if REFERENCE_TIME (index 05a) is
defaulted.
06a =GLatitude, 0 0 0 0 Sensor or Platform Latitude or ECEF X Position. This value provides the platform or sensor geolocation;
Geocentric X see Z.5.6 Sensor Position Data Module. It shall reflect the geodetic type specified in GEODETIC_TYPE (index 01h).
06b =GLongitude, 0 0 0 0 Sensor or Platform Longitude or ECEF Y Position. This value provides the platform or sensor
Geocentric Y geolocation; see Z.5.6 Sensor Position Data Module. It shall reflect the geodetic type specified in
GEODETIC_TYPE (index 01h).
06c =GAltitude, 0 0 0 0 Sensor or Platform Altitude or ECEF Z Position. This value provides the platform or sensor geolocation;
Geocentric Z see Z.5.6 Sensor Position Data Module. It shall reflect the geodetic type specified in GEODETIC_TYPE (index 01h).
06d Sensor X Offset 0 0 0 0 Sensor X Position Offset Relative to Platform Coordinate System. The location of the sensor perspective
center relative to the above-reported position; measured respectively in the platform's X, Y, and Z
direction. See Z.5.6 Sensor Position Data Module for further clarification. These values shall reflect the
units specified in LENGTH_UNIT (index 01j).
06e Sensor Y Offset 0 0 0 0 Sensor Y Position Offset Relative to Platform Coordinate System. The location of the sensor perspective
center relative to the above-reported position; measured respectively in the platform's X, Y, and Z
direction. See Z.5.6 Sensor Position Data Module for further clarification. These values shall reflect the
units specified in LENGTH_UNIT (index 01j).
06f Sensor Z Offset 0 0 0 0 Sensor Z Position Offset Relative to Platform Coordinate System. The location of the sensor perspective
center relative to the above-reported position; measured respectively in the platform's X, Y, and Z
direction. See Z.5.6 Sensor Position Data Module for further clarification. These values shall reflect the
units specified in LENGTH_UNIT (index 01j).
07 Attitude Euler 0 0 0 0 Attitude Euler Angle Flag. Flag field indicating the combined presence of platform and sensor Euler angle
Angles values. This module provides platform and sensor attitude measurements in terms of their respective
Euler angles. The sensor attitude, as provided either by this module or by Modules 8 or 9, is typically
required for geopositioning. If provided, the parameter values in Modules 8 and 9 take precedence over
Module 7. ‘Y’ in this field indicates the presence of the Fields 07a to 07h; ‘N’ omits their presence. The
units depend on the value of ANGULAR_UNIT (index 01k). Additional definitions and implementation
guidance for these fields are provided in Z.5.7 Euler Angles Module.
07a Sensor Angle 0 0 0 0 Type of Sensor Angle Rotations. Specifies the Euler angle model using a coded value. Typically the
Model model type will depend on the sensor's mounting and/or pointing gimbal layout. The definitions for the
types of sensor angle rotations are provided in Z.5.7.2 Sensor Angle Model Type.
07b Sensor Angle 1 0 0 0 0 First Sensor Rotation Angle. Rotations to the sensor coordinate system from the platform or local-level
coordinate system. The rotation definitions are given in Z.5.7.3 Sensor Euler Angle Definitions. The
definitions depend on SENSOR_ANGLE_MODEL (index 07a) and PLATFORM_RELATIVE (index 07e);
see Z.5.7.4. Euler Angle Attitude Notes. All three fields shall be populated with meaningful values when
this module is present.
07c Sensor Angle 2 0 0 0 0 Second Sensor Rotation Angle. Rotations to the sensor coordinate system from the platform or local-level
coordinate system. The rotation definitions are given in Z.5.7.3 Sensor Euler Angle Definitions. The
definitions depend on SENSOR_ANGLE_MODEL (index 07a) and PLATFORM_RELATIVE (index 07e);
see Z.5.7.4. Euler Angle Attitude Notes. All three fields shall be populated with meaningful values when
this module is present.
07d Sensor Angle 3 0 0 0 0 Third Sensor Rotation Angle. Rotations to the sensor coordinate system from the platform or local-level
coordinate system. The rotation definitions are given in Z.5.7.3 Sensor Euler Angle Definitions. The
definitions depend on SENSOR_ANGLE_MODEL (index 07a) and PLATFORM_RELATIVE (index 07e);
see Z.5.7.4. Euler Angle Attitude Notes. All three fields shall be populated with meaningful values when
this module is present.
07e Platform Relative 0 0 0 0 Sensor Angles Relative to Platform Flag. If this flag is set to ‘Y’, the above sensor angles are relative to
the platform coordinate system (see Z.5.7.3 Sensor Euler Angle Definitions); otherwise, the above angles
are relative to the time-relevant, sensor-local NED coordinate system, regardless of the
GEODETIC_TYPE (index 01h). (See Z.5.7.4. Euler Angle Attitude Notes.)
07f Platform Heading 0 0 0 0 Platform Heading Angle. The three Euler angles define the attitude of the platform coordinate system;
see Z.5.7.1 Platform Euler Angle Definitions. They are relative to the time-relevant, platform/sensor-local
NED, regardless of the GEODETIC_TYPE (index 01h). Meaningful values for the three angles are
required if PLATFORM_RELATIVE (index 07e) is set to ‘Y’; otherwise, the angles may be default filled.
07g Platform Pitch 0 0 0 0 Platform Pitch Angle. The three Euler angles define the attitude of the platform coordinate system; see
Z.5.7.1 Platform Euler Angle Definitions. They are relative to the time-relevant, platform/sensor-local
NED, regardless of the GEODETIC_TYPE (index 01h). Meaningful values for the three angles are
required if PLATFORM_RELATIVE (index 07e) is set to ‘Y’; otherwise, the angles may be default filled.
07h Platform Roll 0 0 0 0 Platform Roll Angle. The three Euler angles define the attitude of the platform coordinate system; see
Z.5.7.1 Platform Euler Angle Definitions. They are relative to the time-relevant, platform/sensor-local
NED, regardless of the GEODETIC_TYPE (index 01h). Meaningful values for the three angles are
required if PLATFORM_RELATIVE (index 07e) is set to ‘Y’; otherwise, the angles may be default filled.
08 Attitude Unit 0 0 0 0 Attitude Unit Vector Flag. Flag field indicating the presence of the image coordinate system's unit vector
Vectors components. This module provides the image coordinate system direction cosines relative to the sensor-
local NED or the ECEF coordinate system. The sensor attitude, as provided either by this module or by
Modules 7 or 9, is typically required for geopositioning. The parameter values in this module take
precedence over those in Modules 7 and 9. ‘Y’ in this field indicates the presence of the Fields 08a to 08i;
‘N’ omits their presence. See Z.5.8 Unit Vectors Module for additional information regarding these fields.
08a =GICX North, ICX 0 0 0 0 Image Coordinate (IC) System X Axis Attitude Unit Vector Relative to North (NED) or X (ECEF
X Coordinate System). The coordinates of the image coordinate system-aligned unit vectors (or
equivalently the image coordinate system's direction cosines). These component values are relative to
either the sensor-local NED or the ECEF coordinate system, as specified by GEODETIC_TYPE (index
01h). All fields shall be populated with meaningful values when this module is present. The uncertainties
associated with these values are reported in Module 14 (Uncertainty Data) using a unique method as
described in Z.5.14.X Attitude Unit Vector Uncertainties.
08b =GICX East, ICX 0 0 0 0 Image Coordinate (IC) System X Axis Attitude Unit Vector Relative to East (NED) or Y (ECEF Coordinate
Y System). See 08a
08c =GICX Down, 0 0 0 0 Image Coordinate (IC) System X Axis Attitude Unit Vector Relative to Down (NED) or Z (ECEF
ICX Z Coordinate System). See 08a
08d =GICY North, ICY 0 0 0 0 Image Coordinate (IC) System Y Axis Attitude Unit Vector Relative to North (NED) or X (ECEF
X Coordinate System). See 08a
08e =GICY East, ICY 0 0 0 0 Image Coordinate (IC) System Y Axis Attitude Unit Vector Relative to East (NED) or Y (ECEF Coordinate
Y System). See 08a
08f =GICY Down, 0 0 0 0 Image Coordinate (IC) System Y Axis Attitude Unit Vector Relative to Down (NED) or Z (ECEF
ICY Z Coordinate System). See 08a
08g =GICZ North, ICZ 0 0 0 0 Image Coordinate (IC) System Z Axis Attitude Unit Vector Relative to North (NED) or X (ECEF
X Coordinate System). See 08a
08h =GICZ East, ICZ 0 0 0 0 Image Coordinate (IC) System Z Axis Attitude Unit Vector Relative to East (NED) or Y (ECEF Coordinate
Y System). See 08a
08i =GICZ Down, ICZ 0 0 0 0 Image Coordinate (IC) System Z Axis Attitude Unit Vector Relative to Down (NED) or Z (ECEF
Z Coordinate System). See 08a
09 Attitude 0 0 0 0 Attitude Quaternion Flag. Flag field indicating the presence of a sensor attitude quaternion. This module
Quaternions provides the sensor attitude using a quaternion relative to the sensor-local NED or the ECEF coordinate
system. The sensor attitude, as provided either by this module or by Modules 7 or 8, is typically required
for geopositioning. The parameter values in this module take precedence over those in Module 7, but
those in Module 8 take precedence over these. ‘Y’ in this field indicates the presence of the Fields 09a to
09d; ‘N’ omits their presence. See Z.5.9 Quaternion Module for additional information regarding these fields.
09a Attitude Q1 0 0 0 0 Attitude Quaternion Vector Components. Three vector elements of a normalized quaternion defining the
conceptual sensor attitude relative to either the sensor-local NED or the ECEF coordinate system, as
specified by GEODETIC_TYPE (index 01h).
09b Attitude Q2 0 0 0 0 See 09a
09c Attitude Q3 0 0 0 0 See 09a
09d Attitude Q4 0 0 0 0 Attitude Quaternion Scalar Component. Fourth (scalar) element of a normalized quaternion (absolute
magnitude of the four-element quaternion is unity). See ATTITUDE_Q1 (index 09a) description above.
10 Sensor Velocity 0 0 0 0 Sensor Velocity Data Flag. Flag field indicating the presence of sensor velocity data. This module
provides the sensor velocity components relative to the sensor-local NED or the ECEF coordinate
system, as may be useful with a pushbroom or whiskbroom sensor. ‘Y’ in this field indicates the presence
of the Fields 10a to 10c; ‘N’ omits their presence. See Z.5.10 Sensor Velocity Module for additional
information regarding these fields.
10a =GVelocity North, 0 0 0 0 Sensor North, East, and Down Velocity Vectors. The components of the sensor's velocity
Velocity X vector. The velocity components are relative to either the sensor-local NED or the ECEF coordinate
system, as specified by GEODETIC_TYPE (index 01h). Units are specified by LENGTH_UNIT (index 01j).
10b =GVelocity East, 0 0 0 0 See 10a
Velocity Y
10c =GVelocity Down, 0 0 0 0 See 10a
Velocity Z
11 Point Sets 1 0 0 0 Polygon or Point Set Count Flag. Flag field indicating the number of points-of-interest sets associated
with the NITF image segment. This module provides point sets (a set of one or more points such as
polygon vertices) to identify or bound features within the image and provides geometric information
associated with those points. ‘00’ in this field omits the presence of Fields 11a to 11h. A non-zero value
defines the number of point sets included in the TRE; see Z.4.7 Looping Fields Concept. See Z.5.11
Point Sets Module for additional information regarding these fields.
11a Point Set Type 0 1 1 0 Type of Point Set = S. The type of point of interest set. This field identifies the type of feature being
identified or bound by point set = S.
11b Point Count 1 1 1 0 Number of Points in Set = S. The number of points used to identify or bound the feature identified by
POINT_SET_TYPE = S (index 11a.m). For each occurrence of fields 11a and 11b, fields 11c through 11h
occur POINT_COUNT_ = S (index 11b.m) times.
11c Row 0 2 1 1 Row Location for Point = s. The NITF-stored array coordinates for point = s in point set = S.
11d Column 0 2 1 1 Column Location for Point = s. The NITF-stored array coordinates for point = s in point set = S.
11e Latitude 0 2 1 1 Latitude for Point = s. The estimated geographic latitude coordinate for point = s in point set = S. The
geographic coordinates of the imaged feature or region will be provided using the latitude and longitude,
regardless of the value specified in GEODETIC_TYPE (index 01h). Fields needing data that is
unavailable should be default filled.
11f Longitude 0 2 1 1 Longitude for Point = s. The estimated geographic longitude coordinate for point = s in point set = S. The
geographic coordinates of the imaged feature or region will be provided using the latitude and longitude,
regardless of the value specified in GEODETIC_TYPE (index 01h). Fields needing data that is
unavailable should be default filled.
11g Elevation 0 2 1 1 Elevation for Point = s. The estimated geographic elevation for point = s in point set = S. The geographic
coordinates of the imaged feature or region will be provided using the latitude and longitude, regardless
of the value specified in GEODETIC_TYPE (index 01h). The elevation will be relative to the datum
specified by ELEVATION_DATUM (index 01i). Fields needing data that is unavailable should be default filled.
11h Range 0 2 1 1 Range for Point = s. The measured range (line-of-sight distance from sensor to point) for point = s in point
set = S. Fields needing data that is unavailable should be default filled.
12 Time Stamps 1 0 0 0 Time Stamp Count Flag. Flag field indicating the number of dynamic parameters recorded with time-
Sets stamps in this TRE. This module associates parameter values directly to multiple specific reference
times. ‘00’ in this field omits the presence of Fields 12a to 12d. A non-zero value defines the number of
time-stamped parameters present in the module; see Z.4.7 Looping Fields Concept. The presence of this
module indicates the values stored in the index-referenced fields prior to this module shall be nominal,
average, or approximate values per the entire image segment. See Z.5.12 Time-Stamped Data Module
for additional guidance and possible exception.
12a Time Stamp Type −1 1 1 0 Index of the Time-Stamped Parameter = S. Time-stamped parameter = S's index value. Indexing is limited
to reasonably applicable fields (dynamic parameters) in the value range.
12b Time Stamp 1 1 1 0 Number of Occurrences of Parameter = S. The number of time stamps for which this parameter's value is
Count recorded. This field determines the number of times the two fields below (12c and 12d) repeat to record
the time stamps and values of the mth indexed parameter.
12c Time Stamp Time 0 2 1 1 Instance = s of a Time-Stamp Time. The reference time in seconds relative to START_TIME (index 01m)
associated with the subsequent value of parameter = S. Negative times are possible, if the parameter is
explicitly measured before the imaging start time.
12d =R12a 0 2 1 1 Instance = s of Time-Stamped Parameter = S's Value. The parameter value at the time-stamp time above
(TIME_STAMP_TIME_=s, index 12c.m.n).
13 Pixel Reference 1 0 0 0 Pixel Reference Count Flag. Flag field indicating the number of dynamic parameters recorded with
Sets references to pixel indices. This module associates parameter values directly to multiple NITF image
segment locations. ‘00’ in this field omits the presence of Fields 13a to 13e. A non-zero value defines the
number of pixel-referenced parameters present in the module; see Z.4.7 Looping Fields Concept. The
presence of this module indicates the values stored in the index-referenced fields prior to this module
shall be nominal, average, or approximate values per the entire image segment. See Z.5.13 Pixel-
Referenced Data Module for additional guidance and possible exception.
13a Pixel Reference −1 1 1 0 Index of Pixel-Referenced Parameter = S. Pixel-referenced parameter = S's index value. Indexing is
Type limited to reasonably applicable fields (dynamic parameters) in the range.
13b Pixel Reference 1 1 1 0 Number of Occurrences of Parameter = S. The number of pixel reference locations for which this
Count parameter's value is recorded. This field determines the number of times the three fields (13c, 13d, 13e)
repeat to record the pixel locations and values of parameter = S.
13c Pixel Reference 0 2 1 1 Instance = s of a Reference Pixel Row and Column Index. The row and column index for the reference
Row pixel associated with the subsequent value of parameter = S. The center row or center column index shall
be used, respectively, when the parameter is only column- or row-varying. Each set of pixel coordinates
pertains to the NITF-stored image segment.
13d Pixel Reference 0 2 1 1 Instance = s of a Reference Pixel Row and Column Index. The row and column index for the reference
Column pixel associated with the subsequent value of parameter = S. The center row or center column index shall
be used, respectively, when the parameter is only column- or row-varying. Each set of pixel coordinates
pertains to the NITF-stored image segment.
13e =R13a 0 2 1 1 Instance = s of Pixel-Referenced Parameter = S's Value. The parameter value associated with the
reference pixel location defined above (indices 13c.m.n and 13d.m.n).
The at least one external rules file 16 is used to specify standard specific rules to be applied during data input to test data accuracy and data consistency. There are three classes of data in the at least one external rules file 16: constants, module rules, and Tagged Record Extension (TRE) rules. Constants modify program behavior and define values that can be used elsewhere in the rules. Module rules define the rules for data accuracy within any single module and are invoked when the user completes a module. TRE rules define rules for data consistency and module completeness and are generally applied across modules. TRE rules are invoked when the user has completed module input and saves the TRE.
An example of the standard rules file 16 associated with SENSRB is shown in Table II; a sketch of how such a file might be parsed follows the table.
Table II: Example Standard Rules File Associated with SENSRB
  • /Constants
  • baseDirectory, d:
  • eps, 1E-6
  • /ModuleRules
  • 01, GeneralData, 01p, 01q, 01r
  • 02, SensorArray, 02d, 02e, 02f, 02g, 02h
  • 04, ImageFormation, 04k, 04l, 04m, 04n, 04o, 04p, 04q, 04r, 04s
  • 05, ReferenceTime, 05a, 05b, 05c
  • 06, SensorPosition, 06a, 06b, 06c, 11000, 6352800
  • 07, EulerAngle, 07e, 07f, 07g, 07h
  • 08, UnitVector, 08a, 08b, 08c, 08d, 08e, 08f, 08g, 08h, 08i, eps
  • 09, Quaternion, 09a, 09b, 09c, 09d, eps
  • /TRERules
  • 00, GroundPts, 11, 11c, 11d, 11e, 11f
  • =1, ModuleOr2, 12, 13, 04a, Pushbroom
  • =1, ModuleOr2, 12, 13, 04a, Whiskbroom
  • >, ModuleReq, 14, 09f
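The layout of Table II suggests a line-oriented parse: section markers begin with a slash, and each record is a comma-separated list. The following Python sketch shows one plausible way such a file could be read into its classes of rules; the bullets in Table II are treated as list rendering rather than file content, and the names here (load_rules, sensrb_rules.txt) are illustrative assumptions, not the program's actual code.

# Sketch of a rules-file reader for the Table II layout (hypothetical names).
def load_rules(path):
    sections = {}
    current = None
    with open(path, "r") as f:
        for raw in f:
            line = raw.strip()
            if not line:
                continue
            if line.startswith("/"):           # e.g. /Constants, /ModuleRules, /TRERules
                current = line[1:]
                sections[current] = []
            elif current is not None:
                # e.g. "07, EulerAngle, 07e, 07f, 07g, 07h"
                sections[current].append([t.strip() for t in line.split(",")])
    return sections

rules = load_rules("sensrb_rules.txt")
for record in rules.get("ModuleRules", []):
    print(record[0], record[1], record[2:])    # module index, rule name, field arguments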
FIG. 4 illustrates an image of the hierarchical structure of a SENSRB Data Conversion System, according to embodiments of the invention. Module 1 60A, Module 2 60B, and Module 3 60C depict graphic representations of the fifteen modules in the first instantiation of a SENSRB data set. The hierarchical structure is further defined by showing input forms 70A through 70O associated with Module 2 60B.
For a user looking to create a SENSRB file from scratch, or looking to explore the details of the SENSRB specification, the user presses the SENSRB View icon 32B (shown in FIG. 3). When there is no SENSRB data yet, a new, first instance of a Modules Form 60B is shown in FIG. 5. When SENSRB data does exist at this level, pressing the SENSRB View icon 32B will open all instances of the Modules Form 60B.
The Modules Form 60B is a graphic representation of the fifteen modules in the first instantiation of a SENSRB data set. Each of the fifteen named rectangles is a button that opens a new or existing module. The buttons are color-schemed. When a button is outlined in red on the screen, data exists within that module and that data will be presented to the user when that button is pressed. The SENSRB standard is composed of fifteen modules per TRE. Other standards, however, may have more or fewer modules than the SENSRB standard without detracting from the scope of the invention.
The color of each button represents whether that module is Required, Conditional, a One or Other module, or an Optional module. Initially, the Modules Form 60B shows the modules required for a Content Level 2 SENSRB.
When a general data icon button 62 is selected in FIG. 5, an input form 70 is opened and displayed to the user as shown in FIG. 6. The terminology 1.01 as shown in FIG. 6 means Modules Form (instance) 1, module 01. The data associated with the first module set is extremely important and drives all subsequent forms in terms of elements including units, geodetic coordinate type, Content Level, and Sensor and Platform types. As a result, the form is presented automatically, and the buttons on the Modules Form for other modules are not active until the first form has been fully defined and saved (using either the Save 72 or Save and Close 74 buttons on the bottom of InputForm 1.01).
The user can enter data using a keyboard and move between fields using the tab or enter key (use shift-tab or shift-enter to move in the upward direction) or computer mouse. For numeric fields, the Value Range includes a “min to max” specification of the permissible range of acceptable values. For alphanumeric fields, the Value Range includes a drop down list of possible, generally NTB registered, values for that field.
The Units column 76 defines the system of units that should be used to enter data for that field. Unitless values, including integers and pixels, are represented as a “-”. All other fields have a variety of units that can be used to enter or view the data. For example, when entering a velocity, the possible units are meters/sec, ft/sec, km/hr, miles/hr, and knots. Even though the final results will be stored in the default units as defined in form 1.01, the user may enter or view any input in any units without concern for its final state. In addition, a Value Range column 78 will adjust, when appropriate, to show the min and max range in the selected units.
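As an illustration of this units handling, the Python sketch below converts a velocity entered in any supported unit to a default of meters/sec for storage, and re-expresses a min/max Value Range in the selected unit for display. The conversion factors are standard; the function names are assumptions for illustration only.

# Sketch: normalize a velocity field to default units and rescale its Value Range.
TO_MPS = {
    "meters/sec": 1.0,
    "ft/sec": 0.3048,
    "km/hr": 1000.0 / 3600.0,
    "miles/hr": 1609.344 / 3600.0,
    "knots": 1852.0 / 3600.0,
}

def to_default(value, unit):
    # Store in the default units regardless of the units used for entry.
    return value * TO_MPS[unit]

def range_in_unit(min_default, max_default, unit):
    # Re-express a "min to max" Value Range in the user-selected unit.
    factor = TO_MPS[unit]
    return min_default / factor, max_default / factor

print(to_default(100.0, "knots"))              # 51.44... meters/sec
print(range_in_unit(0.0, 500.0, "miles/hr"))   # (0.0, 1118.46...)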
For any form, those fields that have a gray background are fields that have yet to be defined. All fields, with the exception of “NA” fields, must be input, and thus turned non-gray, in order to be identified as defined, that is, a value having been entered. Another way to cause a value to be “defined” is to move the insertion point into the field and then hit the enter key (tab will only move to the next field). The input form 70 will not close until all values have been defined. The Default button 80 can be used to cause all fields to be designated as “defined.”
The user may have as many modules (Input Forms) open as needed. Similarly, the user may have as many Tagged Record Extensions, TREs (Module Forms), open as needed. However, all Input Forms must be closed, generally by selecting the “Save and Close” button 74, before the user can close the corresponding parent Module Form.
Modules 11, 12, 13, and 14 (shown in FIG. 5) contain looping fields, that is, potentially multiple instantiations of the high level and/or internal data sets. When the user enters the value for a Set or Count, the Input Form will automatically expand or contract to support the new definition. Scroll bars are added to the form when it extends beyond the height of the screen. The user may also reduce the size of a window by dragging the mouse down when at the top of a window. The representation of the form is modified slightly to provide indents and full Index descriptions to aid in understanding of the information.
Moving the mouse over Definition column 82 of a field, the third column of the Input Form 70, will provide a pop-up of the full specification description of a particular item.
When the input mode is completed by filling in the necessary fields for the application and closing the Input Form 70, the user presses the Save and Close button 74 on the bottom right of the Input Form 70. When the Module Form selected is Instance 1 of the Modules Form 60B, the entire SENSRB and all subsequent TREs will be written into the top form, the SENSRB Data Conversion form. When the Module Form selected is an instance higher than 1, then only that subordinate SENSRB TRE will be written (appended to any prior SENSRB characters already in that high-level window).
The character based SENSRB header will be displayed in the window and the drop down list will show “TRE View”. When error messages are created, accompanied by a “bell” sound, they will be displayed in the “Input Mode” view. Short error messages are displayed in a pop-up window.
When the SENSRB information has been created, pressing the write button 38B to the left of the SENSRB View (shown in FIG. 3) writes the SENSRB information to a file. A browser window permits the user to specify the name and location of the file and the default file suffix will be “.txt.” Once the file is written, it is automatically released so that other applications may use it.
To view an existing, stored SENSRB file, the user touches the “External SENSRB” icon and a browser window will allow the user to select the file, which is then displayed in the visualization window.
To read an existing SENSRB file into the input forms, the user selects the read icon 34B to the left of the SENSRB View to read a selected External SENSRB file, or to select a file when one has not been selected, into the complete set of Module Forms and Input Forms. The full set of existing Module Forms will be displayed with red borders on the individual modules that have data.
When errors are detected during the Read process, the particular Input Form that contains the first error will be presented and the field(s) that are not properly defined will be shown with a light red background and an error message will be displayed for each error found on that particular form. The system will not move to subsequent modules once an error has been detected because typically a character based input file will generate many errors from a single character offset error.
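This stop-at-first-error behavior follows from the fixed-width character encoding of a TRE: a single field of the wrong width shifts every subsequent field. The Python sketch below illustrates the idea with a hypothetical three-field layout; the real SENSRB field table is far larger.

# Sketch: fixed-width read that stops at the first bad field, since one
# width error misaligns everything after it. Field table is illustrative.
FIELDS = [("01a Sensor", 25, "str"),
          ("01f Content Level", 1, "int"),
          ("01l Start Date", 8, "int")]

def read_tre(tre):
    pos, values, errors = 0, {}, []
    for name, width, kind in FIELDS:
        chunk = tre[pos:pos + width]
        if len(chunk) < width:
            errors.append(name + ": truncated record")
            break                          # later fields would be offset anyway
        if kind == "int" and not chunk.strip().isdigit():
            errors.append(name + ": non-numeric value " + repr(chunk))
            break
        values[name] = chunk.strip()
        pos += width
    return values, errors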
When a SENSRB model has been created, the user can compare the results of that TRE to another TRE stored on disk. When necessary, the user may release any external file currently open by selecting the release icon 48B (FIG. 3) and then selecting the External SENSRB icon 32B, or simply selecting the Compare button icon 42B, and selecting the file. A compare operation may be performed between the two character-based TREs and the region surrounding the first differences will be displayed in the visualization window.
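A character-level compare of two TRE strings that reports the region surrounding the first difference can be sketched as follows; the 20-character display window is an arbitrary choice for illustration, not the tool's disclosed behavior.

# Sketch: locate the first differing character and return its neighborhood.
def first_difference(left, right, window=20):
    n = min(len(left), len(right))
    for i in range(n):
        if left[i] != right[i]:
            lo = max(0, i - window)
            return i, left[lo:i + window], right[lo:i + window]
    if len(left) != len(right):            # one TRE is a prefix of the other
        return n, left[max(0, n - window):], right[max(0, n - window):]
    return None                            # identical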
The ten icons to the right of the SENSRB View icon represent operations on various representations of the data. To get data into data view format, the user presses the Convert icon 40B (FIG. 3) after creating a SENSRB model (it is not necessary to create the actual SENSRB format or to create a file). The result is a format of the SENSRB called a Name View. A portion of a sample Name View is shown below in Table III.
TABLE III
Sample of Name View Format
1.01 General Data Y
1.01a Sensor SHARP
1.01b Sensor URI
1.01c Platform F-18
1.01d Platform URI
1.01e Operation Domain Airborne
1.01f Content Level 6
1.01g Geodetic System WGS84
1.01h Geodetic Type G
1.01i Elevation Datum HAE
1.01j Length Unit EE
1.01k Angular Unit DEG
1.01l Start Date 20100319
1.01m Start Time 000000000012.3
1.01n End Date 20100319
1.01o End Time 000000000012.3
1.01p Generation Count 00
1.01q Generation Date --------
1.01r Generation Time ----------
1.02 Sensor Array Data Y
1.02a Detection Visible
1.02b Row Detectors 00001024
1.02c Column Detectors 00001024
1.02d Row Metric 000.9442
1.02e Column Metric 000.9442
1.02f Focal Length 3.937008
1.02g Row Fov --------
1.02h Column Fov --------
1.02i Calibrated N
1.03 Sensor Calibration Data N
1.04 Image Formation Data Y
1.04a Method Single Frame
1.04b Mode 001
1.04c Row Count 00001024
The Index View format is a two-column table similar to the Name View, with the names of the fields removed.
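Because the Index View is the Name View minus its field-name column, the conversion is a per-line projection. A Python sketch, assuming the tab-delimited three-column Name View layout of Table III (index, field name, value):

# Sketch: derive the two-column Index View from a tab-delimited Name View.
def name_view_to_index_view(name_view):
    rows = []
    for line in name_view.splitlines():
        parts = line.split("\t")
        if len(parts) >= 3:                # index, field name, value
            rows.append(parts[0] + "\t" + parts[-1])
        elif line:
            rows.append(line)              # leave unrecognized lines intact
    return "\n".join(rows)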
Table IV shows a portion of a sample XML representation of the TREs and the modules within each TRE.
TABLE IV
Portion of Sample XML View
<?xml version=“1.0” encoding=“UTF-8”?>
<!-- Processed by SENSRB Data Conversion System -->
<Document>
 <name>SENSRB</name>
 <Folder>
  <name>TRE1</name>
  <Folder>
   <name>Module01</name>
   <1.01>Y</1.01>
   <1.01a>SHARP   </1.01a>
   <1.01b>  </1.01b>
   <1.01c>F-18 </1.01c>
   <1.01d>  </1.01d>
   <1.01e>Airborne </1.01e>
   <1.01f>6</1.01f>
   <1.01g>WGS84</1.01g>
   <1.01h>G</1.01h>
   <1.01i>HAE</1.01i>
   <1.01j>EE</1.01j>
   <1.01k>DEG</1.01k>
   <1.01l>20100319</1.01l>
   <1.01m>000000000012.3</1.01m>
   <1.01n>20100319</1.01n>
   <1.01o>000000000012.3</1.01o>
   <1.01p>00</1.01p>
   <1.01q>--------</1.01q>
   <1.01r>----------</1.01r>
  </Folder>
  <Folder>
   <name>Module02</name>
   <1.02>Y</1.02>
   <1.02a>Visible </1.02a>
   <1.02b>00001024</1.02b>
    <1.02c>00001024</1.02c>
To switch between data views, the user selects the drop down box 46B below the Data View icon 36B (FIG. 3) to select the representation desired. The dominant view is the Name View and generally the other views are derived from it. However, when only an Index View or an XML View exists, for example in an external data file, that view can be converted into a Name View by using generic Field Names (the names used in the documentation but not precisely the names used in operation, for example “Latitude” for Geodetic Type and “Geocentric X” for Cartesian Type). Future work includes adding other data views to the system.
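The generic-name fallback can be pictured as a lookup keyed by field index that supplies the documentation name when promoting an Index View to a Name View. A Python sketch, with only a few indices shown for illustration:

# Sketch: promote a two-column Index View to a Name View using generic
# documentation field names (table abbreviated for illustration).
GENERIC_NAMES = {
    "01a": "Sensor", "01l": "Start Date", "06a": "Latitude",
    "06b": "Longitude", "06c": "Altitude",
}

def index_view_to_name_view(index_view):
    rows = []
    for line in index_view.splitlines():
        index, _, value = line.partition("\t")
        key = index.split(".")[-1]         # "1.01a" -> "01a"
        name = GENERIC_NAMES.get(key, "Unknown Field")
        rows.append(index + "\t" + name + "\t" + value)
    return "\n".join(rows)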
The user converts data view to and from the SENSRB view by selecting the convert icons 40B (FIG. 3) to convert from a SENSRB to a Data View and vice versa. When creating a SENSRB from a Data View, both a character based TRE is created as well as the complete set of module forms and input forms.
The user compares data view format to SENSRB view format by selecting the compare icon 42B (FIG. 3) below the convert icons 40B. Although these are very different types of files, a report is generated in a Name View style of format, a sample of which is shown below in TABLE V.
TABLE V
Sample of Data View to SENSRB View Comparison
SENSRB to Data - Comparison Results: Differences Have Been Detected
Index FieldName Value Left Value Right
1.01l Start Date 20100319 20100312
1.01m Start Time 12.3 16.875
1.01n End Date 20100319 20100312
1.01o End Time 12.3 16.875
1.04 Image Formation Data Y N
1.04a Method Single Frame [not-defined]
1.04b Mode 001 [not-defined]
1.04c Row Count 00001024 [not-defined]
1.04d Column Count 00001024 [not-defined]
1.04e Row Set 00001024 [not-defined]
1.04f Column Set 00001024 [not-defined]
1.04g Row Rate 0 [not-defined]
1.04h Column Rate 0 [not-defined]
1.04i First Pixel Row 00000000 [not-defined]
1.04j First Pixel Column 00000000 [not-defined]
1.04k Transform Params 0 [not-defined]
The user may write any format data view to diskette by first selecting the format, or view, that the user wishes to save. The user then selects the write icon 38B (FIG. 3). A dialog box permits the user to name the file and select its file location. The file is released after writing. A Name View is a tab-delimited file, so the user can subsequently open that file in a spreadsheet program, for example, creating a multi-column worksheet.
The user may read any format external file into data view by selecting the external data icon 20 (FIG. 3) and selecting the file, or simply selecting the read icon 34B (FIG. 3). The file will be analyzed to determine the type of file and then it will be read into the appropriate View. After the file has been read, the user selects the Name View when the user wishes to convert that into SENSRB View.
The erase icons 44B (FIG. 3) provide a fresh start when the user needs to create a SENSRB file from scratch. Selecting the erase icons 44B will delete all ModuleForms and all associated InputForms. The erase icon under the Data View will erase the particular Data View currently being viewed. For example, the user can erase the XML View and go back to the Name View and make changes. By again selecting the XML View, the changes the user has made will be reflected in the XML View.
Data may be modified in the active text window 50B. The active text window 50B is used to show all of the views. A red box surrounds whichever data type is currently active, for example, External SENSRB, SENSRB, Data View, or External Data. By selecting each major icon, the user will cause the active text window 50B to change to show the contents of the selected icon.
The text window 50B is always active. The user can edit the file, including insertions, deletions, copy, paste, and selecting all. For example, to copy a Data View into a word processing program, the user touches the Data View text window, enters commands specific to the particular word processing program, then actuates the word processing program, and pastes the selected item.
A change made in the text window 50B of any active component becomes a part of that data and any use of that data will reflect those changes. For example, when the user makes changes to a Name View table, and then “Convert” into SENSRB, the new SENSRB will contain those changes.
Three attitude input forms are unique in that they provide three different means of defining the same information in a SENSRB file. A person having ordinary skill in the art will recognize that attitude input forms are also called attitude modules. The user may input the attitude data using any of these modules: Euler Angles, Unit Vectors, and Quaternions.
FIG. 7A illustrates a screen shot of a SENSRB attitude input form 90 in Euler angles, according to embodiments of the invention.
FIG. 7B illustrates a screen shot of a SENSRB attitude input form 100 in unit vectors, according to embodiments of the invention.
FIG. 8 illustrates a screen shot of a SENSRB attitude input form 110 in quaternions, according to embodiments of the invention.
Using the SENSRB Data Conversion Tool, the user may open another attitude module and view that same information in another format. In all views, the form is pre-populated with the values appropriate for those forms consistent with the first defined form. As shown in FIGS. 7A, 7B, and 8, a flag field 92, 102, and 112 is “N” and all of the new fields are gray (undefined).
The SENSRB TRE can include multiple attitude modules, thus making it permissible to change the flag field 92, 102, and 112 to “Y” and select a default icon 94, 104, and 114 and accept the pre-populated values in those fields.
FIG. 9 illustrates a screen shot of a SENSRB inconsistent attitude representation 120, according to embodiments of the invention. This occurs where two or more attitude modules could be defined that are inconsistent representations of the attitude. In FIG. 9, the sign is changed on one of the terms in the Quaternion Module. The user sees the FIG. 9 screen when attempting to save that module.
A consistency window 130 has appeared indicating that, in this case, there are two inconsistent Attitude Modules. The buttons in the window provide a variety of means to address the problem.
FIG. 10A illustrates a screen shot of a SENSRB consistency window 130 notification, according to embodiments of the invention. In the example shown in FIG. 10A, the user sees an Euler icon 132 and a Quaternion icon 134. The user will select either the Euler icon 132 or the Quaternion icon 134 to indicate which definition should prevail. Additionally, the consistency window 130 includes two leave inconsistent icons 136 and 138 and two deactivate icons 140 and 142. The user could elect to leave the two modules inconsistent, in which case both modules will eventually be written to the SENSRB file. The deactivate icons cause the flag field 92, 102, or 112 in FIGS. 7A, 7B, or 8 of the corresponding Module to be set to “N”.
Selecting the quaternion icon 134 in FIG. 10A displays FIG. 10B, which illustrates a screen shot of a SENSRB consistency window after recalculating to eliminate inconsistencies 150, according to embodiments of the invention. Additionally, selecting the quaternion icon results in gray fields in the updated Euler form. An Euler icon 152 and a Quaternion icon 154 are displayed in green to indicate consistency.
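The consistency test between attitude modules amounts to comparing the rotation each module encodes. The Python sketch below derives Euler angles from a quaternion and flags disagreement beyond a tolerance such as the eps constant of Table II. A conventional aerospace yaw-pitch-roll sequence with a scalar-last quaternion is assumed here; the SENSRB standard's own angle and quaternion conventions (Z.5.7 and Z.5.9) govern the actual tool.

import math

# Sketch: quaternion (q1, q2, q3 vector, q4 scalar) to yaw/pitch/roll,
# assuming a conventional NED aerospace rotation sequence.
def quaternion_to_euler(q1, q2, q3, q4):
    yaw = math.atan2(2 * (q4 * q3 + q1 * q2), 1 - 2 * (q2 * q2 + q3 * q3))
    pitch = math.asin(max(-1.0, min(1.0, 2 * (q4 * q2 - q3 * q1))))
    roll = math.atan2(2 * (q4 * q1 + q2 * q3), 1 - 2 * (q1 * q1 + q2 * q2))
    return yaw, pitch, roll

def consistent(euler_angles, quaternion, eps=1e-6):
    # Compare the module-entered angles against those derived from the quaternion.
    derived = quaternion_to_euler(*quaternion)
    return all(abs(a - b) <= eps for a, b in zip(euler_angles, derived))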
FIG. 11 illustrates a screen shot of a SENSRB module 13 for pixel referenced sets 160, according to embodiments of the invention. FIG. 11 is a sample of a user data form for module 13 including multiple instantiations of data sets 162A, 162B, 162C, 162D, and 162E and data counts 164A, 164B, 164C, and 164D. Data forms are created dynamically because the contents are defined from the external sb Records Template file and extended in real-time as a result of user input. For example, the number of fields (rows) in a form would increase when an instantiation value, a sets parameter, is changed from the default value of one to, as an example, four 162A as shown in sample Module 13 160. In addition, index values 166, field names 168, drop-down menus 170, minimum/maximum range, and even pop-up descriptions presented in the form change during the course of user input.
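The arithmetic behind that dynamic expansion is simple: each set contributes its header fields plus its count of repeated per-point fields. A Python sketch for a Module 13-style layout (two header fields 13a/13b per set, three repeated fields 13c/13d/13e per reference point); the field counts here are illustrative assumptions.

# Sketch: number of input rows a looping module needs for its sets.
def module_rows(set_counts, header_fields=2, per_point_fields=3):
    return sum(header_fields + count * per_point_fields
               for count in set_counts)

print(module_rows([4]))       # one set of four points -> 14 rows
print(module_rows([2, 3]))    # two sets -> 19 rows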
Converted data may also be represented in graphical form such as, for example, a representation of the three-dimensional location and orientation of sensor-related information relative to the earth.
Where a range of values is provided, it is understood that each intervening value, to the tenth of the unit of the lower limit unless the context clearly dictates otherwise, between the upper and lower limits of that range is also specifically disclosed. Each smaller range between any stated value or intervening value in a stated range and any other stated or intervening value in that stated range is encompassed within the invention. The upper and lower limits of these smaller ranges may independently be included or excluded in the range, and each range where either, neither or both limits are included in the smaller ranges is also encompassed within the invention, subject to any specifically excluded limit in the stated range. Where the stated range includes one or both of the limits, ranges excluding either or both of those included limits are also included in the invention.
While the invention has been described, disclosed, illustrated and shown in various terms of certain embodiments or modifications which it has presumed in practice, the scope of the invention is not intended to be, nor should it be deemed to be, limited thereby and such other modifications or embodiments as may be suggested by the teachings herein are particularly reserved especially as they fall within the breadth and scope of the claims here appended.

Claims (7)

What is claimed is:
1. A data conversion apparatus for converting infrared, radar, and photographic sensor data with a non-transitory electronic-processor-readable medium executing a sensor data conversion computer program, comprising:
an electronic processor configured to convert sensor data and output said converted sensor data in a tangible medium;
an infrared, radar, and photographic sensor data conversion computer program operatively associated with said electronic processor, said infrared, radar, and photographic sensor data conversion computer program using a graphical user interface to display on a visual display device, wherein said visual display device is operatively associated with said electronic processor;
a user-selection device operatively associated with said visual display device;
at least one sensor standard definition file consistent with at least one sensor standard, said at least one standard definition file configured for input to said infrared, radar, and photographic sensor data conversion computer program, wherein said at least one sensor standard definition file is a programmable external file defining specifics of said at least one sensor standard, wherein said specifics define a plurality of elements used to communicate sensor data in said infrared, radar, and photographic data conversion computer program, wherein said at least one sensor standard definition file is editable and extendable to support changes in said at least one sensor standard;
at least one sensor standard rules file consistent with said at least one sensor standard, said at least one sensor standard rules file operatively associated with said infrared, radar, and photographic sensor data conversion computer program;
at least one sensor external standard file consistent with said at least one sensor standard, said at least one sensor external standard file operatively associated with said infrared, radar, and photographic sensor data conversion computer program; and
at least one sensor external data file consistent with said at least one sensor standard, said at least one sensor external data file operatively associated with said infrared, radar, and photographic sensor data conversion computer program.
2. The apparatus according to claim 1, wherein said graphical user interface is icon-based and displayed using said visual display device, said graphical user interface further comprising:
at least one external standard file icon;
at least one external data file icon;
an internal structure view icon operatively associated with said infrared, radar, and photographic sensor data conversion computer program;
a plurality of read command icons operatively associated with said infrared, radar, and photographic sensor data conversion computer program, said plurality of read command icons used to read data into a data view format;
a data view icon operatively associated with said infrared, radar, and photographic sensor data conversion computer program, said data view icon used to allow a user to view and manipulate data associated with said data view format;
a plurality of write command icons operatively associated with said infrared, radar, and photographic sensor data conversion computer program;
a plurality of convert icons operatively associated with said infrared, radar, and photographic sensor data conversion computer program;
a plurality of compare icons operatively associated with said infrared, radar, and photographic sensor data conversion computer program;
a plurality of erase icons operatively associated with said infrared, radar, and photographic sensor data conversion computer program, said plurality of erase icons used to allow a user to erase data inputted into said apparatus;
a plurality of display format drop-down list icons operatively associated with said infrared, radar, and photographic sensor data conversion computer program, said plurality of display format drop-down list icons used to display converted data in different formats;
a plurality of release icons operatively associated with said infrared, radar, and photographic sensor data conversion computer program; and
at least one text window operatively associated with said infrared, radar, and photographic sensor data conversion computer program, said at least one text window used to display a plurality of data view formats.
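For illustration only, the following minimal Tkinter sketch shows an icon-based layout in the spirit of claim 2: read/write/convert/compare/erase command buttons, a display-format drop-down list, and a text window for the data view formats. The toolkit, button labels, and format names are assumptions; the claim does not prescribe any of them.

# Illustrative sketch only: a minimal button-based layout in the
# spirit of claim 2, written with Tkinter. Labels and the format list
# are hypothetical; the patent does not prescribe a toolkit.
import tkinter as tk
from tkinter import ttk

root = tk.Tk()
root.title("Structured Data Conversion (sketch)")

toolbar = tk.Frame(root)
toolbar.pack(side=tk.TOP, fill=tk.X)

# Read / write / convert / compare / erase command "icons".
for label in ("Read", "Write", "Convert", "Compare", "Erase"):
    tk.Button(toolbar, text=label).pack(side=tk.LEFT)

# Display-format drop-down list used to show converted data in
# different formats.
fmt = ttk.Combobox(toolbar, values=["Raw", "Engineering units", "Hex"])
fmt.current(0)
fmt.pack(side=tk.LEFT, padx=4)

# Text window used to display the data view formats.
text = tk.Text(root, width=80, height=24)
text.pack(fill=tk.BOTH, expand=True)
text.insert("1.0", "SENSOR_NAME = EO-IR-1\nFOCAL_LENGTH = 88.4 mm\n")

root.mainloop()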
3. The apparatus according to claim 1, wherein said at least one sensor standard definition file provides complete documentation of said at least one sensor standard upon a user mouse-over of a section of said at least one sensor standard in said graphical user interface.
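For illustration only, a mouse-over documentation display such as claim 3 recites could be sketched with Tkinter's <Enter>/<Leave> event bindings; the documentation text and widget layout below are invented for the sketch.

# Illustrative sketch only: showing a standard element's documentation
# while the pointer hovers over it, in the spirit of claim 3.
import tkinter as tk

DOCS = {"FOCAL_LENGTH": "Lens focal length in millimeters; see the "
                        "sensor standard for the authoritative definition."}

root = tk.Tk()
status = tk.Label(root, anchor="w")
status.pack(side=tk.BOTTOM, fill=tk.X)

field = tk.Label(root, text="FOCAL_LENGTH", relief=tk.GROOVE, padx=8)
field.pack(padx=20, pady=20)
# Display the element's documentation on mouse-over; clear it on leave.
field.bind("<Enter>", lambda e: status.config(text=DOCS["FOCAL_LENGTH"]))
field.bind("<Leave>", lambda e: status.config(text=""))

root.mainloop()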
4. The apparatus according to claim 1, wherein said at least one sensor standard rules file is an external file used to specify sensor standard-specific rules applied during data input to test data accuracy and data consistency.
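For illustration only, the following sketch shows one way externally specified rules could be applied to an input record to test accuracy and consistency, per claim 4. The rule syntax (one boolean expression per line over element names) is an assumption, not the patent's format.

# Illustrative sketch only: checking an input record against externally
# specified rules. The expression-per-line rule syntax is hypothetical.

def check_rules(record, rule_lines):
    """Return a list of (rule, 'pass'/'fail'/'untestable') results."""
    results = []
    for rule in rule_lines:
        rule = rule.strip()
        if not rule or rule.startswith("#"):
            continue
        try:
            # Evaluate the rule with the record's elements as variables.
            ok = bool(eval(rule, {"__builtins__": {}}, record))
            results.append((rule, "pass" if ok else "fail"))
        except Exception:
            results.append((rule, "untestable"))  # element missing, etc.
    return results

record = {"FOCAL_LENGTH": 88.4, "ROW_FOV": 10.2, "COL_FOV": 7.7}
rules = ["FOCAL_LENGTH > 0", "ROW_FOV >= COL_FOV", "PIXEL_PITCH > 0"]
for rule, status in check_rules(record, rules):
    print(f"{status:10s} {rule}")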
5. The apparatus according to claim 1, wherein said at least one sensor external standard file is a text file configured for input and output of metadata into and out of said infrared, radar, and photographic sensor data conversion computer program.
6. The apparatus according to claim 1, wherein said at least one sensor external data file is a text file configured for input and output of formatted, human readable records into and out of said infrared, radar, and photographic sensor data conversion computer program.
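For illustration only, the following sketch round-trips metadata through a human-readable text file in the spirit of claims 5 and 6. The "name = value" layout is an assumption; the claims require only a text file configured for input and output.

# Illustrative sketch only: reading and writing formatted, human-readable
# records through a text file. The layout below is hypothetical.

def write_records(path, record):
    # Output: one "name = value" line per element.
    with open(path, "w") as f:
        for name, value in record.items():
            f.write(f"{name} = {value}\n")

def read_records(path):
    # Input: parse the same layout back into a record.
    record = {}
    with open(path) as f:
        for line in f:
            if "=" in line:
                name, value = (s.strip() for s in line.split("=", 1))
                record[name] = value
    return record

write_records("out_metadata.txt",
              {"SENSOR_NAME": "EO-IR-1", "FOCAL_LENGTH": "88.4"})
assert read_records("out_metadata.txt")["SENSOR_NAME"] == "EO-IR-1"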
7. The apparatus according to claim 1, wherein said at least one sensor standard definition file further comprises:
a plurality of rows;
a plurality of columns associated with said plurality of rows; and
wherein the intersections of said plurality of rows and said plurality of columns define said specifics of said at least one sensor standard and are discrete sensor parameters associated with said at least one sensor standard.
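For illustration only, the following sketch reads a definition file laid out as rows and columns whose row/column intersections are discrete sensor parameters, per claim 7. The column headings and CSV layout are assumptions.

# Illustrative sketch only: a row-and-column definition grid. Each row
# is one standard element, each column one attribute, and each
# intersection a discrete sensor parameter. Headings are hypothetical.
import csv, io

DEFINITION = """\
name,type,units,count
SENSOR_NAME,string,none,1
FOCAL_LENGTH,float,millimeters,1
ROW_FOV,float,degrees,1
"""

parameters = list(csv.DictReader(io.StringIO(DEFINITION)))
for p in parameters:
    print(p["name"], "->", p["type"], p["units"])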
US12/930,910 2010-12-30 2010-12-30 Structured data conversion system Active 2031-12-24 US8719317B1 (en)

Priority Applications (2)

Application Number Priority Date Filing Date Title
US12/930,910 US8719317B1 (en) 2010-12-30 2010-12-30 Structured data conversion system
US12/932,342 US8756259B1 (en) 2010-12-30 2011-02-10 Structured data conversion system

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
US12/930,910 US8719317B1 (en) 2010-12-30 2010-12-30 Structured data conversion system

Related Child Applications (1)

Application Number Title Priority Date Filing Date
US12/932,342 Division US8756259B1 (en) 2010-12-30 2011-02-10 Structured data conversion system

Publications (1)

Publication Number Publication Date
US8719317B1 true US8719317B1 (en) 2014-05-06

Family

ID=50552943

Family Applications (2)

Application Number Title Priority Date Filing Date
US12/930,910 Active 2031-12-24 US8719317B1 (en) 2010-12-30 2010-12-30 Structured data conversion system
US12/932,342 Active 2033-03-07 US8756259B1 (en) 2010-12-30 2011-02-10 Structured data conversion system

Family Applications After (1)

Application Number Title Priority Date Filing Date
US12/932,342 Active 2033-03-07 US8756259B1 (en) 2010-12-30 2011-02-10 Structured data conversion system

Country Status (1)

Country Link
US (2) US8719317B1 (en)

Patent Citations (15)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20020049777A1 (en) * 2000-03-17 2002-04-25 Yasuhiro Terayama File conversion method, file converter, and file display system
US7321901B1 (en) * 2000-09-29 2008-01-22 Microsoft Corporation Application service provider model for manufacturers product specification data
US7043716B2 (en) * 2001-06-13 2006-05-09 Arius Software Corporation System and method for multiple level architecture by use of abstract application notation
US20030055836A1 (en) * 2001-09-14 2003-03-20 David Dubovsky Methods for generating data structures for use with environment based data driven automated test engine for GUI applications
US7562352B2 (en) * 2002-07-08 2009-07-14 Hitachi, Ltd. Data format conversion method and equipment, and controller management system using data format conversion equipment
US7933755B1 (en) * 2003-11-17 2011-04-26 The Mathworks, Inc. Conversion of model components into references
US20050144166A1 (en) * 2003-11-26 2005-06-30 Frederic Chapus Method for assisting in automated conversion of data and associated metadata
US20050216917A1 (en) * 2004-03-23 2005-09-29 Srivatsa Krishnaswamy Method and system for data object transformation
US20060107997A1 (en) * 2004-11-22 2006-05-25 Fujikura Ltd. Electrode, photoelectric conversion element, and dye-sensitized solar cell
US20070297684A1 (en) * 2004-12-16 2007-12-27 Mitsubishi Electric Corporation Data Conversion Apparatus, Data Conversion Method, and Data Conversion System
US7603368B2 (en) * 2007-03-30 2009-10-13 Fmr Llc Mapping data on a network
US20090204921A1 (en) * 2008-02-07 2009-08-13 Vestyck Anthony R System and Method for Organizing, Managing, and Using Electronic Files
US20100169353A1 (en) * 2008-12-31 2010-07-01 Ebay, Inc. System and methods for unit of measurement conversion and search query expansion
US8326850B2 (en) * 2009-07-30 2012-12-04 Fujitsu Limited Data converting apparatus, method, and computer product
US20110078211A1 (en) * 2009-09-30 2011-03-31 Albrecht Gass Systems and methods for analyzing and transforming an application from a source installation to a target installation

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
Qiming Zhou et al., Digital Image Processing and Interpretation, Hong Kong, 1999. *

Cited By (12)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20090265684A1 (en) * 2008-04-18 2009-10-22 Ids Scheer Aktiengesellschaft Systems and methods for graphically developing rules for transforming models between description notations
US9405513B2 (en) * 2008-04-18 2016-08-02 Software Ag Systems and methods for graphically developing rules for transforming models between description notations
US20140156672A1 (en) * 2011-05-03 2014-06-05 Space-Time Insight Space-time-nodal type signal processing
US9052907B2 (en) 2011-10-25 2015-06-09 Software Ag Selective change propagation techniques for supporting partial roundtrips in model-to-model transformations
US20160019684A1 (en) * 2014-07-15 2016-01-21 Microsoft Corporation Wide field-of-view depth imaging
US9805454B2 (en) * 2014-07-15 2017-10-31 Microsoft Technology Licensing, Llc Wide field-of-view depth imaging
US9823089B1 (en) * 2016-06-21 2017-11-21 Amazon Technologies, Inc. Unmanned aerial vehicle sensor calibration as part of departure from a materials handling facility
US9972212B1 (en) 2016-06-21 2018-05-15 Amazon Technologies, Inc. Unmanned aerial vehicle camera calibration as part of departure or arrival at a materials handling facility
US9969486B1 (en) 2016-06-21 2018-05-15 Amazon Technologies, Inc. Unmanned aerial vehicle heat sensor calibration
US10032275B1 (en) 2016-06-21 2018-07-24 Amazon Technologies, Inc. Unmanned aerial vehicle sensor calibration during flight
US10220964B1 (en) 2016-06-21 2019-03-05 Amazon Technologies, Inc. Unmanned aerial vehicle sensor calibration validation before flight
US10302452B1 (en) 2016-06-21 2019-05-28 Amazon Technologies, Inc. Unmanned aerial vehicle sensor calibration via sensor channel

Also Published As

Publication number Publication date
US8756259B1 (en) 2014-06-17

Similar Documents

Publication Publication Date Title
US8719317B1 (en) Structured data conversion system
Shahbazi et al. Development and evaluation of a UAV-photogrammetry system for precise 3D environmental modeling
CN102567967B (en) For the air of spatial image and the technology of solar correction
Roth et al. PhenoFly Planning Tool: flight planning for high-resolution optical remote sensing with unmanned areal systems
Meister et al. Outdoor stereo camera system for the generation of real-world benchmark data sets
Grenzdörffer et al. Photogrammetric image acquisition and image analysis of oblique imagery
US20230162449A1 (en) Systems and methods for data transmission and rendering of virtual objects for display
US20130159924A1 (en) Systems and methods for efficient spatial feature analysis
Manconi et al. Optimization of unmanned aerial vehicles flight planning in steep terrains
Duarte et al. Open-source GIS application for UAV photogrammetry based on MicMac
CN111612901A (en) Extraction feature and generation method of geographic information image
Warren et al. Long-range stereo visual odometry for extended altitude flight of unmanned aerial vehicles
Milosavljević et al. A method for estimating surveillance video georeferences
Mårtensson et al. Height uncertainty in digital terrain modelling with unmanned aircraft systems
Roncella et al. Photogrammetric digital surface model reconstruction in extreme low-light environments
Elhadary et al. The influence of flight height and overlap on UAV imagery over featureless surfaces and constructing formulas predicting the geometrical accuracy
Alves Júnior et al. High accuracy mapping with cartographic assessment for a fixed-wing remotely piloted aircraft system
Chevallier et al. Linear contrails detection, tracking and matching with aircraft using geostationary satellite and air traffic data
Congedo Land cover classification of cropland: a tutorial using the semi-automatic classification plugin for QGIS
US20180285640A1 (en) System for target material detection
Candan et al. Methodology of real-time 3D point cloud mapping with UAV lidar
Teague et al. Imagery synthesis for drone celestial navigation simulation
Saur et al. Experimental application of simulation tools for evaluating UAV video change detection
Kim et al. Virtual testbed for monocular visual navigation of small unmanned aircraft systems
Lee et al. Validation of object detection in UAV-based images using synthetic data

Legal Events

Date Code Title Description
AS Assignment

Owner name: NAVY, U.S.A. AS REPRESENTED BY THE DEPARTMENT OF THE

Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNOR:CRAIN, LOUS M.;REEL/FRAME:025722/0396

Effective date: 20101221

STCF Information on status: patent grant

Free format text: PATENTED CASE

MAFP Maintenance fee payment

Free format text: PAYMENT OF MAINTENANCE FEE, 4TH YEAR, LARGE ENTITY (ORIGINAL EVENT CODE: M1551)

Year of fee payment: 4

MAFP Maintenance fee payment

Free format text: PAYMENT OF MAINTENANCE FEE, 8TH YEAR, LARGE ENTITY (ORIGINAL EVENT CODE: M1552); ENTITY STATUS OF PATENT OWNER: LARGE ENTITY

Year of fee payment: 8