WO2009073009A1 - Method and apparatus for projecting viewable data onto an imaged object - Google Patents

Method and apparatus for projecting viewable data onto an imaged object

Info

Publication number
WO2009073009A1
Authority
WO
WIPO (PCT)
Prior art keywords
image
interest
image data
imager
predetermined criteria
Prior art date
Application number
PCT/US2007/025033
Other languages
French (fr)
Inventor
Craig Schwartz
Larry Elliott
Original Assignee
Craig Schwartz
Larry Elliott
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Craig Schwartz and Larry Elliott
Priority to PCT/US2007/025033
Publication of WO2009073009A1

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06VIMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V10/00Arrangements for image or video recognition or understanding
    • G06V10/10Image acquisition
    • G06V10/12Details of acquisition arrangements; Constructional details thereof
    • G06V10/14Optical characteristics of the device performing the acquisition or on the illumination arrangements
    • G06V10/143Sensing or illuminating at different wavelengths
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F18/00Pattern recognition

Definitions

  • the present invention relates generally to imaging technology and, more particularly, to a system and method for projecting viewable data onto an object.
  • an "object” may be an inanimate object or a life form.
  • a thermal image can be used to see invisible heat variations of a target object.
  • To view the thermal image, the user must obtain a thermal imager and look through its viewer, or view the video output of the thermal imager on a remote TV or computer monitor.
  • direct or backscatter X-ray images can be used to see internal or hidden features of a target object with the aid of an appropriate imager and an associated viewer or remote monitor. It would be desirable to obtain and view images in a manner more convenient to users.
  • a system and method for identifying an object of interest includes at least one imager configured to capture an image of at least one object in a field of view and generate image data from the captured image.
  • An image processing unit compares the generated image data to predetermined criteria, and produces an output when the generated image data meet the predetermined criteria.
  • At least one image projector is configured to respond to the output by displaying a viewable image onto an object of interest whose captured image resulted in meeting the predetermined criteria.
  • the system and method preferably operate in real time when imaging objects that are moving or are expected to move so that the viewable image displayed onto the object of interest follows the object when it moves.
  • the system and method may be configured to project different viewable images and/or more than one viewable image on selected and/or different objects.
  • a system and method for highlighting an object of interest includes capturing an image of at least one object of interest in a field of view with at least one imager; generating image data from the captured image; transforming at least a portion of the image data into a viewable format of preselected configuration; and displaying with at least one image projector an image in accordance with the preselected configuration onto the object of interest.
  • the displayed image may take a variety of forms and serve a variety of purposes.
  • FIG. 1 is a block diagram of a display system consistent with the present invention.
  • Fig. 2 is an example of an arrangement of optics for use in the display system of Fig. 1.
  • Figs. 3A-3D are examples of adjustments made for aligning the field of view of the imager with the projection of the image projector of the display system of Fig. 1.
  • Fig. 4 is an example of an area that can be covered using the display system of Fig. 1.
  • Fig. 5 is an example of a thermal image of a human.
  • Figs. 6A-6D show an example of imaging, processing, and projecting a vector outline image on an object of interest consistent with the present invention.
  • Figs. 7A-7D show an example of imaging, processing, and projecting a raster line image on an object of interest consistent with the present invention.
  • Fig. 8 is an example of a control panel that can be used in the display system of Fig. 1.
  • Fig. 9 is an example of projecting an image on objects of interest at a distance consistent with the present invention.
  • Fig. 10 is an example of highlighting objects of interest in the example of Fig. 9.
  • Fig. 11 is an example of providing a frame to the highlighted objects of interest in the example of Fig. 10.
  • Figs. 12A-12C show examples of varying frame shapes that can be projected in the display system of Fig. 1.
  • Fig. 13 is an example of an alternative application of the system of Fig. 1 for controlling a fire.
  • Fig. 14 is an example of an alternative application of the system of Fig. 1 for controlling an air mass.
  • Fig. 15 is an example of an application of the display system of Fig. 1 for identifying stress areas in a bridge.
  • Figs. 16A-16B are examples of an application of the display system of Fig. 1 for identifying hot spots in an electrical power apparatus.
  • Fig. 17 is an example of an application of the display system of Fig. 1 for displaying the contents of a container.
  • Fig. 18 is a block diagram of a display system consistent with the present invention that uses multiple imagers.
  • an observer can see properties of an object (which may be a life form) that otherwise may be difficult or impossible to see with the naked eye.
  • properties are extracted from data that is provided by either a thermal imager, an x-ray machine or any other examining device capable of revealing properties that are contained in or radiating from the object or life form that are not visible to the human eye.
  • properties can also be, for example, the contrasting phenomenon created by the object and its physical surroundings, as detected by the examining device.
  • the detected properties are displayed onto the object or life form by the projection of light.
  • This projection of light onto the object or life form can either be a direct representation of the data obtained from the examining device or a pertinent extraction thereof.
  • the properties displayed onto the object or life form are preferably displayed so as to be in direct dimensional proportion to the properties that are found by the examining device to be contained in or radiating from the object or life form. The result of the projection enables anyone in the proximity of the projection to see the properties displayed onto the object or life form that is being detected by the imager.
  • Fig. 1 is a block diagram of a display system consistent with the present invention.
  • the display system includes an object of interest 10 (hereinafter object 10), at least one imager 20, at least one image projector 30, an image processing unit 40, a control panel 50, and a mechanical adjuster 60.
  • object 10 can be any type of object or life form that can be viewed and captured by the imager 20.
  • the object 10 may be humans, animals, buildings, containers, bridges, machinery, vehicles, electrical power apparatuses, etc.
  • the imager 20 can be implemented, for example, as a thermal imager, an X-ray machine, or any other type of imaging device that can detect and capture characteristics of an object that may not necessarily be seen with the naked eye, such as multi-spectral imagers, radio-wave imagers, electromagnetic field imagers, ultrasonic imagers, ultraviolet imagers, gamma ray imagers, microwave imagers, radar imagers, magnetic resonance imagers (MRIs), and infrared imagers (near, mid, and far, which is the thermal infrared imager).
  • the image projector 30 can be implemented, for example, as a laser projector or video projector.
  • the image processing unit 40 preferably includes processing hardware, such as a CPU, microprocessor, or multi-processor unit, software configured to transform image data captured by the imager 20 into projection data that can be displayed by the image projector 30, and memory or storage for storing the software and other instructions used by the image processing unit 40 to perform its functions.
  • the image processing unit 40 can be configured with commercially available software applications, such as the LD2000 from Pangolin Laser Systems Inc. Where multiple imagers are used together (see, e.g., Fig. 18), the image processing unit may be configured to mix the image data from the imagers and output composite image data for projection of a viewable composite image onto the object being examined. Alternatively, the image processing unit can be configured to respond separately to the imaging data received from each imager, and output image data to the projector accordingly.
  • the image processing unit 40 may be configured to perform an automatic screening or discriminating function by comparing the image data generated by the imager 20 to predetermined criteria, e.g., historical data, such as may be contained in a data library 41, and to produce an output automatically when the generated image data meet the predetermined criteria.
  • the image projector 30 would respond to the output by displaying a viewable image onto an object of interest whose captured image resulted in meeting the predetermined criteria.
  • the image processing unit may be configured to cause the projected image to blink at least initially so as to call attention to the highlighted object of interest.
  • the system operates in real time so that the viewable image displayed onto the object of interest follows the object when it moves.
  • the image displayed by the projector onto the object of interest may be in any color or form including, but not limited to, one or more dots; lines (curved, squiggle, straight, cross-hair); geometric shapes (e.g., circle, oval, triangle, square, polygon); a predetermined pattern; company logo; movie or TV or literary characters (persons or animals) or their costumes; text message; or any combination; outline or raster representations of the object of interest; etc.
  • the generated image data may comprise biometric data (including thermal or facial configuration data); body shape data; body stance data; behavioral data; etc.
  • shape data or thermal image data may be used.
  • the viewable image may be projected onto the center of data mass of the object of interest.
  • one or more viewable images may be displayed onto an object of interest; and objects in different classes may be illuminated with different images so as to distinguish them from one another (e.g., to distinguish enemy combatants from friendly combatants for the purpose of targeting weapons).
  • all combatants may be imaged thermally and illuminated to indicate their locations, but only enemy combatants would be illuminated by an additional image (e.g., a laser-generated dot or circle) to set them apart from the friendly combatants.
  • Each of the "friendlies" could be outfitted with a device that creates a distinctive image, for example, a specific article that blocks a certain recognizable shape (e.g., a star) in his thermal image, or another imaged "tag" that informs the system of his location, with such distinctive characteristic preventing the projection of the secondary image on him that is reserved only for non-friendly combatants.
  • the invention may be used to project a predetermined image onto an object of interest to highlight the object for various purposes.
  • a dot or a cross-hair image may be projected onto the object for targeting by a weapon.
  • the image displayed onto the object of interest may be in any color or form including, but not limited to, one or more dots; lines (curved, squiggle, straight, cross-hair); geometric shapes (e.g., circle, oval, triangle, square, polygon); a predetermined pattern; company logo; movie or TV or literary characters (persons or animals) or their costumes; text message; or any combination; outline or raster representations of the object of interest; etc.
  • the control panel 50 preferably includes a display, such as an LCD, plasma, or CRT screen, and an input unit, such as a keyboard, pointing device, and/or touch pad.
  • the display of the control panel 50 shows the image captured by the imager 20.
  • the input unit includes various controls that permit the user to make changes to the display system, such as the field of view of the imager 20, the positioning of the imager 20 and the image projector 30, and the addition of elements to be projected by the image projector 30.
  • the image projector 30 can be mounted on top of the imager 20, although other configurations, such as side by side, are also possible. Regardless of the arrangement between them, the mechanical adjuster 60 adjusts the relative positioning of the imager 20 with respect to the image projector 30.
  • the mechanical adjuster 60 adjusts the vertical, horizontal and axial (azimuth) positioning of the imager 20 and/or the image projector 30.
  • the imager 20 and the image projector 30 are properly aligned when the image captured by the imager 20 is aligned with the image projected by the image projector 30.
  • the adjustment by the mechanical adjuster 60 can be made to either the imager 20 or the image projector 30 or to both.
  • the adjustment of the mechanical adjuster 60 can be done manually by a user or can be done automatically through inputs made to the control panel 50.
  • the control panel 50 can be used to provide electronic adjustments, independent of the mechanical adjuster 60, to provide further refinements to the alignment of the imager 20 and the image projector 30.
  • Fig. 2 is an example of an arrangement of optics for use in the display system of Fig. 1.
  • the display system can be configured to include an optical system comprising a mirror 72 and a transmitter/reflector 74.
  • the transmitter/reflector 74 is designed to transmit or pass through certain electromagnetic waves and to reflect certain other electromagnetic waves.
  • the transmitter/reflector 74 can have a certain threshold such that electromagnetic waves with a wavelength under the threshold (e.g., visible light) are reflected, and electromagnetic waves with a wavelength greater than the threshold (e.g., thermal waves) are transmitted.
  • the imager 20 receives electromagnetic waves having a 9 micron wavelength, which are transmitted through the transmitter/reflector 74.
  • the image projector 30, such as a laser projector, projects an image comprising electromagnetic waves having a 0.5 micron wavelength onto the mirror 72, which reflects the electromagnetic waves to the transmitter/reflector 74. Because the electromagnetic waves from the image projector 30 are sufficiently short, i.e., shorter than the threshold of the transmitter/reflector 74, the transmitter/reflector 74 reflects the light waves from the image projector toward the object imaged by the imager 20.
  • Figs. 3A-3D are examples of adjustments made for aligning the field of view of the imager with the projection of the image projector of the display system of Fig. 1.
  • the double, solid line box corresponds to the optical field of view of the imager 20
  • the dashed-line box corresponds to the perimeter of the projection of the image projector 30.
  • the projection of the image projector 30 is off-axis from the optical field of view of the imager 20.
  • the mechanical adjuster 60 is used to change the axial (azimuth) positions of the imager 20 and the image projector 30 with respect to each other.
  • the projection of the image projector 30 is smaller in the vertical and horizontal directions with respect to the optical field of view of the imager 20.
  • an electronic adjustment of the projection of the image projector 30 can be made.
  • the electronic adjustment can be made, for example, through the control panel 50 or through a direct adjustment on the image projector 30.
  • the electronic adjustment can be used to adjust the vertical and horizontal size of the projection of the image projector 30.
  • the electronic adjustment can also be made to adjust the vertical and horizontal size of the imager 20, i.e., the field of view of the imager 20, through the control panel 50 or through direct adjustment of the imager 20.
  • the projection of the image projector 30 is too low and too far to the left from the optical field of view of the imager 20.
  • the projection of the image projector 30 is adjusted to center the projection horizontally and vertically. This adjustment can be done using the mechanical adjuster 60 and/or the electronic adjustment.
  • Fig. 3D shows the projection of the image projector 30 properly aligned with the optical field of view of the imager 20.
  • the image projector 30 can project an image onto the object 10 that is in direct proportion dimensionally to the object 10 itself. There is alignment when the dashed-line box is within the double, solid line box.
  • Fig. 4 is an example of an area that can be covered using the display system of Fig. 1.
  • if the imager 20 is implemented as a thermal imager, such as the Raytheon 640x480 Common Uncooled Engine, then with a horizontal field of view of 45 degrees, the imager 20 can detect objects or activity up to 2000 feet away. At this distance, the field of view would measure 1500 feet x 1125 feet. At ground level, this would cover 1,500,000 square feet. In a vertical plane at 2000 feet, the imager would cover 1,687,500 square feet.
  • the images projected by the image projector 30 can be seen very clearly at distances of better than 2000 feet.
  • the image projector 30 projects a sharp image that does not need to be focused.
  • the laser used is preferably in the green wavelength, around 532 nm.
  • the color green is preferable because it is the brightest color perceptible to the human eye, although other visible colors can be used.
  • the field of view, with a display system viewing at 45 degrees, can be expanded to 360 degrees by using multiple units side by side, each viewing 45 degrees, until full 360-degree coverage is obtained.
  • the imager 20 can be implemented with a lens assembly that allows only a 3 to 6 degree horizontal field of view, but provides the ability to capture images at greater distances. Such an implementation could be useful at border crossings. At a 3 to 6 degree field of view, the imager 20 can detect a human presence up to and sometimes well over a mile away. In addition, even low-powered lasers emitted by the image projector 30 can be seen at these distances.
  • Fig. 5 is an example of a thermal image of a human.
  • the imager 20, implemented as a thermal imager, captures the thermal image of a human.
  • the captured image is processed by the image processing unit 40 and provided to the image projector 30, which projects the thermal image of the human directly onto the human.
  • Figs. 6A-6D show an example of imaging, processing, and projecting a vector outline image on an object of interest consistent with the present invention.
  • Fig. 6A shows the video output from the imager 20, such as when implemented as a thermal imager.
  • the video output from the imager 20 can be displayed on the display of the control panel 50.
  • FIG. 6B shows the image of the object 10 captured by the imager 20 after converting the analog signal provided by the imager 20 into a digital signal and adjusting the contrast and brightness so that the highest contrast can be seen against the background.
  • the analog to digital conversion and brightness and contrast adjustment are performed by the image processing unit 40.
  • a vector outline is generated where white meets black.
  • the generation of the vector outline can also be performed by the image processing unit 40, and can be implemented in the image processing unit 40 with a vector graphics software program as is known in the art. Other means, including appropriate software, may be used to extract a silhouette of the object of interest needed to generate a vector outline for projection.
  • Figs. 7A-7D show an example of imaging, processing, and projecting a raster line image on an object of interest consistent with the present invention.
  • Figs. 7A and 7B are the same as Figs. 6A and 6B, respectively, described above. Accordingly, descriptions of Figs. 7A and 7B are omitted.
  • in Fig. 7C, instead of generating a vector outline where white meets black, as shown in Fig. 6C, raster lines are generated wherever white is present.
  • the generation of raster lines can be performed by the image processing unit 40, and can be implemented in the image processing unit 40 with a raster graphics software program as is known in the art.
  • the image data corresponding to the raster lines generated by the image processing unit is provided to the image projector 30, which projects the raster lines over the object 10 that was imaged by the imager 20, as shown in Fig. 7D.
  • the image projector 30 thus visibly illuminates the body of each object 10 captured by the imager 20.
  • the outlining and illuminating, as well as any other type of image projection, can be performed in real time.
  • the video output of the imager 20, while it is imaging, is provided in real time to the image processing unit 40, which processes these video frames one by one in real time, such as with a video-to-vector graphics software program.
  • the image processing unit 40 analyzes each frame of video one by one in real time and creates a vector line(s) (or raster line or other type of image for projection) wherever white meets black on that frame.
  • the created vector line (or raster line or other type of image projection) replaces the frames of video one by one in real time with vector outline frames (or raster line frames or other type of image projection frames).
  • These newly created graphics frames are delivered electronically one by one in real time to the image projector 30, which in turn projects them directly over the object 10 that is being detected by the imager 20.
  • Fig. 8 is an example of a control panel that can be used in the display system of Fig. 1.
  • the control panel 50 includes a display 51, graphics keys 52, blink key 53, reset key 54, perimeter key 55, and pan and tilt key 56.
  • the display 51 can be implemented, for example, as a CRT, LCD, plasma, or other type of video display.
  • the graphics keys 52, blink key 53, reset key 54, perimeter key 55, and pan and tilt key 56 can be implemented as buttons on a panel separate from the display 51 or as a touch panel on the display 51 itself.
  • the graphics keys 52 can be used to block out portions of the image captured by the imager 20 and to add images to the image captured by the imager 20.
  • the graphics keys 52 include two different sized circles, two different sized rectangles, and four arrows.
  • the circles and arrows are graphics that can be added to the image captured by the imager 20, and the solid rectangles are graphics that can be used to block out portions of the image captured by the imager.
  • other shapes can be used for the graphics keys 52, both for graphics to be added to the image and for blocking out part of the image.
  • the added graphics may be of any size, such as to surround the object of interest (as shown in Fig. 8) or to be wholly within the outline of the object of interest.
  • the graphics keys 52 can also include a changeable size tool that permits the user to demarcate the size of an image portion deleted or an image added.
  • the position of the deleted image portion or the added image can be set using the pan and tilt key 56.
  • a pointing device such as a mouse or pen device can be used to set the position. It is also possible to permit a user to touch the location at which the selected graphic is placed.
  • the blink key 53 is selected when the user wants the projected image in a particular area to blink. To do so, the user can touch the area of the video screen (or demarcate the area with a changeable size tool in conjunction with a pointing device) and then select the blink key 53. This action causes the projected image in that area to blink, which is useful in drawing a viewer's attention to the blinking object.
  • the reset key 54 removes any image portions deleted and any images added by the graphics keys 52.
  • the perimeter key 55 adds a frame to the view on the display 51 and to the image projected by the image projector 30. The frame added by the perimeter key corresponds to the field of view of the imager 20.
  • the pan and tilt key 56 can be used, for example, to move the position of the imager 20 (and correspondingly the position of the image projector 30), to change the size of the field of view of the imager 20, and to move the placement of objects added to the display 51.
  • a portion of a building is shown to include five human objects that are identifiable by the imager 20, such as by their heat signature when the imager 20 is implemented as a thermal imager.
  • the display 51 also includes two particular human objects that have circular images added by the graphics keys 52. The user may add these circular images to identify high-value objects among the objects captured by the imager 20, so that when the image projector 30 displays the image with the added circles onto the building itself, including the human objects, anyone viewing the image displayed by the image projector 30 will see the circles around the high-value objects, and thus be able to discriminate objects of interest from objects that are not of interest.
  • the circled objects can be enemy combatants and the non-circled objects can be friendly combatants.
  • a frame can be added to the overall image.
  • the frame provides an outline of the actual image captured by the imager 20, i.e., the field of view of the imager 20.
  • the frame can be useful as it shows viewers exactly how much or how little the imager 20 is seeing.
  • Fig. 9 is an example of projecting an image on objects of interest at a distance consistent with the present invention.
  • a vehicle in which the display system has been implemented is positioned at night at a distance from the same building shown in Fig. 8.
  • the imager 20 can identify objects, in this case human objects, at a distance and illuminate them with the image projector 30.
  • a laser emitted by the image projector 30 can be in the near-infrared range, around 940 nm, which is invisible to the naked eye, thus allowing only those with standard night vision capabilities to view the projection.
  • Fig. 10 is an example of highlighting objects of interest in the example of Fig. 9.
  • Fig. 10 shows two specific objects that are surrounded by circles, which are graphics added using the graphics keys 52 of the control panel 50.
  • the image processing unit 40 can be configured to follow a highlighted object (e.g., an object around which a graphic is added) if the object moves while being imaged by the imager 20.
  • the image processing unit 40 can process the image so that the circles remain around the moving objects.
  • Fig. 11 is an example of providing a frame to the highlighted objects of interest in the example of Fig. 10.
  • the frame in Fig. 11 shows how much of the building is being imaged by the imager 20.
  • Figs. 12A-12C show examples of varying frame shapes that can be projected in the display system of Fig. 1.
  • the horizontal and vertical size of this projected window can be adjusted independently to fit the specific needs of the operator.
  • in Fig. 12A, the image projector 30 displays a full screen, which is the default size of the projected window.
  • Fig. 12B shows the display of a panoramic view in which the height of the projection window is made smaller.
  • in Fig. 12C, the image projector displays a vertical view in which the width of the projection window is narrowed, such as if only a tall building needs to be examined. With these various window dimensions set, the image projector 30 does not project beyond those dimensions even though the imager 20 may capture an image larger than the window dimensions.
  • Fig. 13 is an example of an alternative application of the system of Fig. 1 for controlling a fire.
  • the system including the image processing unit 40 and the imager 20 can be suspended over an object on fire, such as a ship 82.
  • the display system can be suspended, for example, by a helicopter, a balloon, an airplane, or other aerial vehicle.
  • the imager 20 provides a thermal image of the ship 82, which identifies the hot spots, i.e., the fire locations, to the image processing unit 40.
  • the image processing unit 40 can be configured to identify the hot spots from the thermal image and provide that information to water cannon and guidance assemblies 80.
  • the image processing unit 40 can be configured to map digitally the perimeter of the entire theater of combustion including all hot spots and any thermal data relevant to this unstable condition. Based on this information, the assemblies 80 can be automatically directed to position and provide water to the most needed spots on the ship 82 and thus effectively and efficiently put out the fire on the ship. The identified hot spots can also determine the force at which the assemblies 80 provide water to the fire. Although assemblies 80 are described as using water, it should be understood that other fire retardants can be used.
  • Fig. 14 is an example of an alternative application of the system of Fig. 1 for controlling an air mass.
  • the system here would be carried by an aerial vehicle that is capable of positioning the system over a cold air mass 84 and a warm air mass 86.
  • the cold air mass 84 is on a trajectory course towards a warm air mass 86 or vice versa.
  • a hurricane or other violent weather front may start to form.
  • the imager 20, implemented as a thermal imager, with an aerial view of the air masses 84, 86 provides thermal data to the image processing unit 40.
  • the image processing unit can be configured to map digitally the entire thermal domain relevant to this weather event and calculate where the image projector 30, implemented as a powerful overhead laser, would best be directed in order to warm part or all of the cold air mass 84 so as to mitigate or stop the inevitable weather condition.
  • Fig. 15 is an example of an application of the display system of Fig. 1 for identifying stress areas in a bridge.
  • the imager 20 images at least a portion of the bridge. If implemented as a thermal imager, the image captured by the imager 20 would highlight the areas of the bridge that are mechanically stressed.
  • the image is then processed by the image processing unit 40, which provides the processed image to the image projector 30, and the image projector 30 projects the image onto the bridge so that viewers can witness exactly where on the bridge the stress spots are located.
  • Figs. 16A-16B are examples of an application of the display system of Fig. 1 for identifying hot spots in an electrical power apparatus.
  • the imager 20 images at least a portion of the electrical power apparatus. If implemented as a thermal imager, the image captured by the imager 20 would highlight the areas of the electrical power apparatus that correspond to hot spots. The image is then processed by the image processing unit 40, which provides the processed image to the image projector 30, and the image projector 30 projects the image onto the electrical power apparatus so that viewers can witness exactly where on the electrical power apparatus the hot spots are located.
  • Fig. 17 is an example of an application of the display system of Fig. 1 for displaying the contents of a container.
  • the imager 20 is preferably implemented as an X-ray device.
  • the display system can be used to detect and display the contents of a shipping container 86.
  • the shipping container 86 passes through an X-ray area 22, which corresponds to a region that can be captured by the imager 20.
  • the X-ray image data is provided to the image processing unit 40, which transforms the X-ray image data into an image that can be projected by the image projector 30.
  • the image projector 30 projects the image onto the side of the container 86 so that viewers can witness the shape and position of the contents of the container without having to open the container.
  • the display system can be configured to remember first findings and display them longer, i.e., not display the image strictly in real time. For example, if a person is detected and that person recognizes that his position is now being displayed, he would likely try to duck out of the sight of the imager 20, which would in turn stop the display system from displaying his position further.
  • the display system can be configured to remember the last position that was displayed by the image projector 30 and direct the image projector 30 to continue displaying that specific area for a predetermined period of time. This would give the viewers additional time to evaluate these sightings.
  • Embodiments according to the invention have wide-ranging applications, including but not limited to the intelligence, security and military fields, such as (purely by way of example):

Abstract

A system and method for highlighting an object of interest includes capturing an image of at least one object of interest in a field of view with at least one imager; generating image data from the captured image; transforming at least a portion of the image data into a viewable format; and displaying with at least one image projector a resulting image onto the object of interest. The projected image may be a preselected image. An image processing unit may compare the generated image data to predetermined criteria, and produce an output when the generated image data meet the predetermined criteria. The system and method preferably operate in real time so that the viewable image displayed onto the object of interest follows the object when it moves.

Description

METHOD AND APPARATUS FOR PROJECTING VIEWABLE DATA ONTO AN IMAGED OBJECT
FIELD OF THE INVENTION
[0001] The present invention relates generally to imaging technology and, more particularly, to a system and method for projecting viewable data onto an object. As used herein, an "object" may be an inanimate object or a life form.
BACKGROUND OF THE INVENTION
[0002] Many objects possess properties or are associated with data that are not readily apparent when the object is viewed with the naked eye, but can be seen with technological assistance. For example, a thermal image can be used to see invisible heat variations of a target object. To view the thermal image, the user must obtain a thermal imager and look through the viewer of the thermal imager; or view the video output of the thermal imager on a remote TV or computer monitor. Similarly, direct or backscatter X-ray images can be used to see internal or hidden features of a target object with the aid of an appropriate imager and an associated viewer or remote monitor. It would be desirable to obtain and view images in a manner more convenient to users.
SUMMARY OF THE INVENTION
[0003] According to an aspect of the invention, a system and method for identifying an object of interest includes at least one imager configured to capture an image of at least one object in a field of view and generate image data from the captured image. An image processing unit compares the generated image data to predetermined criteria, and produces an output when the generated image data meet the predetermined criteria. At least one image projector is configured to respond to the output by displaying a viewable image onto an object of interest whose captured image resulted in meeting the predetermined criteria. The system and method preferably operate in real time when imaging objects that are moving or are expected to move so that the viewable image displayed onto the object of interest follows the object when it moves. The system and method may be configured to project different viewable images and/or more than one viewable image on selected and/or different objects.
[0004] According to another aspect of the invention, a system and method for highlighting an object of interest includes capturing an image of at least one object of interest in a field of view with at least one imager; generating image data from the captured image; transforming at least a portion of the image data into a viewable format of preselected configuration; and displaying with at least one image projector an image in accordance with the preselected configuration onto the object of interest. The displayed image may take a variety of forms and serve a variety of purposes. [0005] Further features, aspects and advantages of the present invention will become apparent from the detailed description of preferred embodiments that follows, when considered together with the accompanying figures of drawing.
BRIEF DESCRIPTION OF THE DRAWINGS
[0006] Fig. 1 is a block diagram of a display system consistent with the present invention.
[0007] Fig. 2 is an example of an arrangement of optics for use in the display system of
Fig. 1
[0008] Figs. 3A-3D are examples of adjustments made for aligning the field of view of the imager with the projection of the image projector of the display system of Fig. 1.
[0009] Fig. 4 is an example of an area that can be covered using the display system of
Fig. 1.
[0010] Fig. 5 is an example of a thermal image of a human.
[0011] Figs. 6A-6D show an example of imaging, processing, and projecting a vector outline image on an object of interest consistent with the present invention.
[0012] Figs. 7A-7D show an example of imaging, processing, and projecting a raster line image on an object of interest consistent with the present invention.
[0013] Fig. 8 is an example of a control panel that can be used in the display system of
Fig. 1.
[0014] Fig. 9 is an example of projecting an image on objects of interest at a distance consistent with the present invention. [0015] Fig. 10 is an example of highlighting objects of interest in the example of Fig.
9.
[0016] Fig. 11 is an example of providing a frame to the highlighted objects of interest in the example of Fig. 10.
[0017] Figs. 12A-12C show examples of varying frame shapes that can be projected in the display system of Fig. 1.
[0018] Fig. 13 is an example of an alternative application of the system of Fig. 1 for controlling a fire.
[0019] Fig. 14 is an example of an alternative application of the system of Fig. 1 for controlling an air mass.
[0020] Fig. 15 is an example of an application of the display system of Fig. 1 for identifying stress areas in a bridge.
[0021] Figs. 16A-16B are examples of an application of the display system of Fig. 1 for identifying hot spots in an electrical power apparatus.
[0022] Fig. 17 is an example of an application of the display system of Fig. 1 for displaying the contents of a container.
[0023] Fig. 18 is a block diagram of a display system consistent with the present invention that uses multiple imagers.
DETAILED DESCRIPTION OF PREFERRED EMBODIMENTS
[0024] In a display system consistent with the present invention, an observer can see properties of an object (which may be a life form) that otherwise may be difficult or impossible to see with the naked eye. Such properties are extracted from data that is provided by either a thermal imager, an x-ray machine or any other examining device capable of revealing properties that are contained in or radiating from the object or life form that are not visible to the human eye. These properties can also be, for example, the contrasting phenomenon created by the object and its physical surroundings, as detected by the examining device.
[0025] The detected properties are displayed onto the object or life form by the projection of light. This projection of light onto the object or life form can either be a direct representation of the data obtained from the examining device or a pertinent extraction thereof. Furthermore, the properties displayed onto the object or life form are preferably displayed so as to be in direct dimensional proportion to the properties that are found by the examining device to be contained in or radiating from the object or life form. The result of the projection enables anyone in the proximity of the projection to see the properties displayed onto the object or life form that is being detected by the imager.
[0026] Fig. 1 is a block diagram of a display system consistent with the present invention. As shown in Fig. 1, the display system includes an object of interest 10 (hereinafter object 10), at least one imager 20, at least one image projector 30, an image processing unit 40, a control panel 50, and a mechanical adjuster 60. The object 10 can be any type of object or life form that can be viewed and captured by the imager 20. For example, the object 10 may be humans, animals, buildings, containers, bridges, machinery, vehicles, electrical power apparatuses, etc.
[0027] The imager 20 can be implemented, for example, as a thermal imager, an X-ray machine, or any other type of imaging device that can detect and capture characteristics of an object that may not necessarily be seen with the naked eye, such as multi-spectral imagers, radio-wave imagers, electromagnetic field imagers, ultrasonic imagers, ultraviolet imagers, gamma ray imagers, microwave imagers, radar imagers, magnetic resonance imagers (MRIs), and infrared imagers (near, mid, and far, which is the thermal infrared imager). The image projector 30 can be implemented, for example, as a laser projector or video projector. An exemplary commercially available laser projector is the Colorburst by Lumalaser. As illustrated in Fig. 18, a plurality of imagers of different type may be used together, such as an X-ray imager, a back scatter imager, and a thermal imager. [0028] The image processing unit 40 preferably includes processing hardware, such as a CPU, microprocessor, or multi-processor unit, software configured to transform image data captured by the imager 20 into projection data that can be displayed by the image projector 30, and memory or storage for storing the software and other instructions used by the image processing unit 40 to perform its functions. To transform the image data captured by the imager 20 into projection data that can be displayed by the image projector 30, the image processing unit 40 can be configured with commercially available software applications, such as the LD2000 from Pangolin Laser Systems Inc. Where multiple imagers are used together (see, e.g., Fig. 18), the image processing unit may be configured to mix the image data from the imagers and output composite image data for projection of a viewable composite image onto the object being examined. Alternatively, the image processing unit can be configured to respond separately to the imaging data received from each imager, and output image data to the projector accordingly.
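Where multiple imagers feed the image processing unit 40, the mixing step described above can be pictured as a weighted blend of registered frames. The following is a minimal sketch, assuming each imager delivers a single-channel frame as a NumPy array already registered to a common field of view; the function name and weights are illustrative, not taken from the patent.

```python
import numpy as np

def composite_frames(frames, weights=None):
    """Blend registered single-channel frames from multiple imagers
    into one composite frame for projection (weighted average)."""
    stack = np.stack([f.astype(np.float32) for f in frames])
    if weights is None:
        weights = np.ones(len(frames), dtype=np.float32)
    weights = np.asarray(weights, dtype=np.float32)
    weights = weights / weights.sum()
    # Contract the weight vector against the frame axis -> (H, W) composite.
    composite = np.tensordot(weights, stack, axes=1)
    return composite.clip(0, 255).astype(np.uint8)

# Example: mix a thermal frame and an X-ray frame, thermal weighted heavier.
thermal = np.random.randint(0, 256, (480, 640), dtype=np.uint8)
xray = np.random.randint(0, 256, (480, 640), dtype=np.uint8)
projection_input = composite_frames([thermal, xray], weights=[0.7, 0.3])
```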
[0029] In accordance with certain embodiments, the image processing unit 40 may be configured to perform an automatic screening or discriminating function by comparing the image data generated by the imager 20 to predetermined criteria, e.g., historical data, such as may be contained in a data library 41, and to produce an output automatically when the generated image data meet the predetermined criteria. In that case the image projector 30 would respond to the output by displaying a viewable image onto an object of interest whose captured image resulted in meeting the predetermined criteria. The image processing unit may be configured to cause the projected image to blink at least initially so as to call attention to the highlighted object of interest. Preferably, the system operates in real time so that the viewable image displayed onto the object of interest follows the object when it moves. The image displayed by the projector onto the object of interest may be in any color or form including, but not limited to, one or more dots; lines (curved, squiggle, straight, cross-hair); geometric shapes (e.g., circle, oval, triangle, square, polygon); a predetermined pattern; company logo; movie or TV or literary characters (persons or animals) or their costumes; text message; or any combination; outline or raster representations of the object of interest; etc.
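One way to picture the screening function of paragraph [0029] is as a comparison of an image-derived feature vector against stored library entries, emitting a projection command only on a match. In this sketch the feature encoding, library record layout, and threshold are hypothetical placeholders, not the patent's actual criteria.

```python
import numpy as np

# Hypothetical library record: a reference feature vector (e.g., a stored
# thermal signature) plus a match threshold.
LIBRARY = [
    {"label": "person", "reference": np.array([0.9, 0.4, 0.7]), "threshold": 0.25},
]

def screen(feature_vector, library=LIBRARY):
    """Compare image-derived features to predetermined criteria; return a
    projection command when a criterion is met, otherwise None."""
    for entry in library:
        distance = np.linalg.norm(feature_vector - entry["reference"])
        if distance <= entry["threshold"]:
            # Blink at least initially to call attention to the object.
            return {"match": entry["label"], "overlay": "outline", "blink": True}
    return None  # no output: the projector displays nothing for this object

print(screen(np.array([0.88, 0.42, 0.71])))  # meets the criteria -> command
print(screen(np.array([0.10, 0.10, 0.10])))  # does not -> None
```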
[0030] In such embodiments, where the object of interest is a person, the generated image data may comprise biometric data (including thermal or facial configuration data); body shape data; body stance data; behavioral data; etc. Where the object of interest is an inanimate object, such as a vehicle, shape data or thermal image data may be used. The viewable image may be projected onto the center of data mass of the object of interest. Depending on the controlling sets of predetermined criteria utilized, and the imager and projector configurations employed, one or more viewable images may be displayed onto an object of interest; and objects in different classes may be illuminated with different images so as to distinguish them from one another (e.g., to distinguish enemy combatants from friendly combatants for the purpose of targeting weapons). For example, all combatants may be imaged thermally and illuminated to indicate their locations, but only enemy combatants would be illuminated by an additional image (e.g., a laser-generated dot or circle) to set them apart from the friendly combatants. Each of the "friendlies" could be outfitted with a device that creates a distinctive image, for example, a specific article that blocks a certain recognizable shape (e.g., a star) in his thermal image, or another imaged "tag" that informs the system of his location, with such distinctive characteristic preventing the projection of the secondary image on him that is reserved only for non-friendly combatants.
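The friend/foe behavior above reduces to a simple rule: every detected combatant receives a locator image, and the secondary image is suppressed for any object carrying the friendly "tag". A sketch under that reading, with hypothetical detection records:

```python
def overlays_for(detections):
    """Assign projected overlays: every detection gets a locator outline;
    only objects without a detected 'friendly' tag get the secondary dot."""
    commands = []
    for det in detections:
        overlays = ["outline"]            # projected on all combatants
        if not det.get("friendly_tag"):   # tag seen in the thermal image
            overlays.append("dot")        # secondary image, non-friendlies only
        commands.append({"id": det["id"], "overlays": overlays})
    return commands

detections = [
    {"id": 1, "friendly_tag": True},   # e.g., star-shaped blocker in image
    {"id": 2, "friendly_tag": False},
]
print(overlays_for(detections))
```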
[0031] In other embodiments, which need not (but may) provide a screening or discriminating function, the invention may be used to project a predetermined image onto an object of interest to highlight the object for various purposes. For example, a dot or a cross-hair image may be projected onto the object for targeting by a weapon. Or, as noted above, the image displayed onto the object of interest may be in any color or form including, but not limited to, one or more dots; lines (curved, squiggle, straight, cross-hair); geometric shapes (e.g., circle, oval, triangle, square, polygon); a predetermined pattern; company logo; movie or TV or literary characters (persons or animals) or their costumes; text message; or any combination; outline or raster representations of the object of interest; etc. [0032] The control panel 50 preferably includes a display, such as an LCD, plasma, or CRT screen, and an input unit, such as a keyboard, pointing device, and/or touch pad. The display of the control panel 50 shows the image captured by the imager 20. The input unit includes various controls that permit the user to make changes to the display system, such as the field of view of the imager 20, the positioning of the imager 20 and the image projector 30, and the addition of elements to be projected by the image projector 30. [0033] In general, the image projector 30 can be mounted on top of the imager 20, although other configurations, such as side by side, are also possible. Regardless of the arrangement between them, the mechanical adjuster 60 adjusts the relative positioning of the imager 20 with respect to the image projector 30. To obtain a proper alignment between the image projector 30 and the imager 20, the mechanical adjuster 60 adjusts the vertical, horizontal and axial (azimuth) positioning of the imager 20 and/or the image projector 30. The imager 20 and the image projector 30 are properly aligned when the image captured by the imager 20 is aligned with the image projected by the image projector 30. The adjustment by the mechanical adjuster 60 can be made to either the imager 20 or the image projector 30 or to both. In addition, the adjustment of the mechanical adjuster 60 can be done manually by a user or can be done automatically through inputs made to the control panel 50. As will be described herein, the control panel 50 can be used to provide electronic adjustments, independent of the mechanical adjuster 60, to provide further refinements to the alignment of the imager 20 and the image projector 30.
[0034] Fig. 2 is an example of an arrangement of optics for use in the display system of Fig. 1. As shown in Fig. 2, the display system can be configured to include an optical system comprising a mirror 72 and a transmitter/reflector 74. The transmitter/reflector 74 is designed to transmit or pass through certain electromagnetic waves and to reflect certain other electromagnetic waves. For example, the transmitter/reflector 74 can have a certain threshold such that electromagnetic waves with a wavelength under the threshold (e.g., visible light) are reflected, and electromagnetic waves with a wavelength greater than the threshold (e.g., thermal waves) are transmitted.
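The transmitter/reflector's behavior reduces to a wavelength threshold. A toy model follows; the 1.0 micron cutoff is an assumed value chosen only to separate the 0.5 micron laser light from the 9 micron thermal radiation in this example.

```python
def route(wavelength_um, threshold_um=1.0):
    """Model the transmitter/reflector 74: wavelengths below the threshold
    (e.g., visible laser light) are reflected; longer wavelengths (e.g.,
    thermal radiation) are transmitted through."""
    return "reflected" if wavelength_um < threshold_um else "transmitted"

print(route(0.5))  # projector's 0.5 micron laser -> reflected toward object
print(route(9.0))  # 9 micron thermal radiation -> transmitted to the imager
```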
[0035] As shown in Fig. 2, the imager 20, such as a thermal imager, receives electromagnetic waves having a 9 micron wavelength, which are transmitted through the transmitter/reflector 74. The image projector 30, such as a laser projector, projects an image comprising electromagnetic waves having a 0.5 micron wavelength onto the mirror 72, which reflects the electromagnetic waves to the transmitter/reflector 74. Because the electromagnetic waves from the image projector 30 are sufficiently short, i.e., shorter than the threshold of the transmitter/reflector 74, the transmitter/reflector 74 reflects the light waves from the image projector toward the object imaged by the imager 20. [0036] Figs. 3A-3D are examples of adjustments made for aligning the field of view of the imager with the projection of the image projector of the display system of Fig. 1. As shown in Figs. 3A-3D, the double, solid line box corresponds to the optical field of view of the imager 20, and the dashed-line box corresponds to the perimeter of the projection of the image projector 30. In Fig. 3A, the projection of the image projector 30 is off-axis from the optical field of view of the imager 20. To correct for this misalignment, the mechanical adjuster 60 is used to change the axial (azimuth) positions of the imager 20 and the image projector 30 with respect to each other.
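The mechanical and electronic adjustments of paragraphs [0036]-[0038] amount to calibrating a mapping from imager pixels to projector coordinates. One common way to express such a calibration is a 2D affine fit over matched points; the sketch below assumes that approach and invents its own calibration data, so it illustrates the idea rather than the patent's exact method.

```python
import numpy as np

def fit_affine(imager_pts, projector_pts):
    """Fit a 2D affine map (scale, rotation/azimuth, offset) from imager
    pixel coordinates to projector coordinates via least squares."""
    A = np.hstack([imager_pts, np.ones((len(imager_pts), 1))])
    coeffs, *_ = np.linalg.lstsq(A, projector_pts, rcond=None)
    return coeffs  # 3x2 matrix; apply as [x, y, 1] @ coeffs

def to_projector(pt, coeffs):
    return np.append(np.asarray(pt, dtype=float), 1.0) @ coeffs

# Example: projection is 95% of the imager's scale, shifted right and down.
src = np.array([[0, 0], [640, 0], [0, 480], [640, 480]], dtype=float)
dst = src * 0.95 + np.array([10.0, 6.0])
coeffs = fit_affine(src, dst)
print(to_projector([320, 240], coeffs))  # imager center -> ~[314, 234]
```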
[0037] In Fig. 3B, the projection of the image projector 30 is smaller in the vertical and horizontal directions with respect to the optical field of view of the imager 20. To correct for this misalignment, an electronic adjustment of the projection of the image projector 30 can be made. The electronic adjustment can be made, for example, through the control panel 50 or through a direct adjustment on the image projector 30. The electronic adjustment can be used to adjust the vertical and horizontal size of the projection of the image projector 30. The electronic adjustment can also be made to adjust the vertical and horizontal size of the imager 20, i.e., the field of view of the imager 20, through the control panel 50 or through direct adjustment of the imager 20.
[0038] In Fig. 3C, the projection of the image projector 30 is too low and too far to the left from the optical field of view of the imager 20. To correct for this position misalignment, the projection of the image projector 30 is adjusted to center the projection horizontally and vertically. This adjustment can be done using the mechanical adjuster 60 and/or the electronic adjustment.
[0039] Fig. 3D shows the projection of the image projector 30 properly aligned with the optical field of view of the imager 20. By making this alignment, the image projector 30 can project an image onto the object 10 that is in direct proportion dimensionally to the object 10 itself. There is alignment when the dashed-line box is within the double, solid line box.
[0040] Fig. 4 is an example of an area that can be covered using the display system of Fig. 1. In general, the wider the field of view of the imager 20, the shorter the distance at which the imager 20 can effectively detect objects. Conversely, the narrower the field of view of the imager 20, the farther the distance at which the imager 20 can effectively detect objects. In Fig. 4, if the imager 20 is implemented as a thermal imager, such as the Raytheon 640x480 Common Uncooled Engine, then with a horizontal field of view of 45 degrees, the imager 20 can detect objects or activity up to 2000 feet away. At this distance, the field of view would measure 1500 feet x 1125 feet. At ground level, this would cover 1,500,000 square feet. In a vertical plane at 2000 feet, the imager would cover 1,687,500 square feet.
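For a flat target square to the optical axis, the footprint follows from simple trigonometry: width = 2 x distance x tan(FOV/2). The idealized formula below yields figures of the same order as, though not identical to, the 1500 x 1125 foot figures quoted above, which presumably reflect the specific sensor geometry.

```python
import math

def coverage(distance_ft, hfov_deg, aspect=(4, 3)):
    """Approximate the imager's footprint at a given distance from its
    horizontal field of view, assuming a flat target and a 4:3 sensor."""
    width = 2 * distance_ft * math.tan(math.radians(hfov_deg) / 2)
    height = width * aspect[1] / aspect[0]
    return width, height, width * height

w, h, area = coverage(2000, 45)
print(f"{w:.0f} ft x {h:.0f} ft, about {area:,.0f} sq ft")
```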
[0041] At night or at twilight, the images projected by the image projector 30 can be seen very clearly at distances of better than 2000 feet. When implemented as a laser projector, the image projector 30 projects a sharp image that does not need to be focused. To be visible, the laser used is preferably in the green wavelength, around 532 nm. The color green is preferable because it is the brightest color perceptible to the human eye, although other visible colors can be used. The field of view, with a display system viewing at 45 degrees, can be expanded to 360 degrees by using multiple units side by side, each viewing 45 degrees, until full 360-degree coverage is obtained. [0042] The imager 20 can be implemented with a lens assembly that allows only a 3 to 6 degree horizontal field of view, but provides the ability to capture images at greater distances. Such an implementation could be useful at border crossings. At a 3 to 6 degree field of view, the imager 20 can detect a human presence up to and sometimes well over a mile away. In addition, even low-powered lasers emitted by the image projector 30 can be seen at these distances.
[0043] Fig. 5 is an example of a thermal image of a human. As shown in Fig. 5, the imager 20, implemented as a thermal imager, captures the thermal image of a human. The captured image is processed by the image processing unit 40 and provided to the image projector 30, which projects the thermal image of the human directly onto the human. [0044] Figs. 6A-6D show an example of imaging, processing, and projecting a vector outline image on an object of interest consistent with the present invention. Fig. 6A shows the video output from the imager 20, such as when implemented as a thermal imager. The video output from the imager 20 can be displayed on the display of the control panel 50. [0045] Fig. 6B shows the image of the object 10 captured by the imager 20 after converting the analog signal provided by the imager 20 into a digital signal and adjusting the contrast and brightness so that the highest contrast can be seen against the background. The analog to digital conversion and brightness and contrast adjustment are performed by the image processing unit 40. With this contrast against the background, as shown in Fig. 6C, a vector outline is generated where white meets black. The generation of the vector outline can also be performed by the image processing unit 40, and can be implemented in the image processing unit 40 with a vector graphics software program as is known in the art. Other means, including appropriate software, may be used to extract a silhouette of the object of interest needed to generate a vector outline for projection. [0046] The image data corresponding to the vector outline generated by the image processing unit is provided to the image projector 30, which projects the outline over the object 10 that was imaged by the imager 20, as shown in Fig. 6D. The image projector 30 thus visibly outlines the body of each object 10 captured by the imager 20. [0047] Figs. 7A-7D show an example of imaging, processing, and projecting a raster line image on an object of interest consistent with the present invention. Figs. 7A and 7B are the same as Figs. 6A and 6B, respectively, described above. Accordingly, descriptions of Figs. 7A and 7B are omitted. In Fig. 7C, instead of generating a vector outline where white meets black, as shown in Fig. 6C, raster lines are generated wherever white is present. The generation of raster lines can be performed by the image processing unit 40, and can be implemented in the image processing unit 40 with a raster graphics software program as is known in the art.
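The white-meets-black boundary trace of paragraph [0045] corresponds closely to standard contour extraction. Below is a minimal sketch using OpenCV, with a synthetic bright blob standing in for a thermal silhouette; the threshold value and test frame are illustrative assumptions, and cv2.findContours returns polyline vertices of the kind a vector (laser) projector can draw.

```python
import cv2
import numpy as np

def vector_outline(gray_frame, thresh=128):
    """Binarize the frame so the object is white on black, then trace the
    white/black boundary as contours (polyline vertices)."""
    _, binary = cv2.threshold(gray_frame, thresh, 255, cv2.THRESH_BINARY)
    contours, _ = cv2.findContours(binary, cv2.RETR_EXTERNAL,
                                   cv2.CHAIN_APPROX_SIMPLE)
    return contours

# Synthetic example: a filled bright circle stands in for a warm body.
frame = np.zeros((480, 640), dtype=np.uint8)
cv2.circle(frame, (320, 240), 80, 255, thickness=-1)
outline = vector_outline(frame)
print(len(outline), "contour(s),", len(outline[0]), "vertices")
```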
[0048] The image data corresponding to the raster lines generated by the image processing unit is provided to the image projector 30, which projects the raster lines over the object 10 that was imaged by the imager 20, as shown in Fig. 7D. The image projector 30 thus visibly illuminates the body of each object 10 captured by the imager 20.

[0049] Accordingly, using the display system of Fig. 1, it is possible to outline the object 10 imaged by the imager 20, as shown in Figs. 6A-6D, or to illuminate the object 10, as shown in Figs. 7A-7D. In addition, the outlining and illuminating, as well as any other type of image projection, can be performed in real time. To do so, the video output of the imager 20, while it is imaging, is provided in real time to the image processing unit 40, which processes the video frames one by one in real time, such as with a video-to-vector graphics software program. The image processing unit 40 analyzes each frame of video in real time and creates one or more vector lines (or raster lines or another type of image for projection) wherever white meets black on that frame. The created vector lines (or raster lines or other projection image) replace the frames of video one by one in real time with vector outline frames (or raster line frames or other image projection frames). These newly created graphics frames are delivered electronically, one by one and in real time, to the image projector 30, which in turn projects them directly over the object 10 that is being detected by the imager 20.
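The frame-by-frame replacement of paragraph [0049] could be sketched as follows. Again this is a sketch only; send_to_projector is a hypothetical, caller-supplied hook, since no projector interface is specified here.

import cv2
import numpy as np

def run_realtime(send_to_projector, camera_index=0):
    # Read video from the imager, convert each frame to a graphics frame,
    # and deliver the frames one by one to the projector in real time.
    cap = cv2.VideoCapture(camera_index)
    while True:
        ok, frame = cap.read()
        if not ok:
            break                              # end of the video stream
        gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
        outlines = vector_outline(gray)        # sketch shown earlier
        canvas = np.zeros_like(frame)          # all-black output frame
        cv2.drawContours(canvas, outlines, -1, (0, 255, 0), 2)
        send_to_projector(canvas)              # replaces the video frame
    cap.release()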
[0050] Fig. 8 is an example of a control panel that can be used in the display system of Fig. 1. As shown in Fig. 8, the control panel 50 includes a display 51, graphics keys 52, blink key 53, reset key 54, perimeter key 55, and pan and tilt key 56. The display 51 can be implemented, for example, as a CRT, LCD, plasma, or other type of video display. The graphics keys 52, blink key 53, reset key 54, perimeter key 55, and pan and tilt key 56 can be implemented as buttons on a panel separate from the display 51 or as a touch panel on the display 51 itself.
[0051] The graphics keys 52 can be used to block out portions of the image captured by the imager 20 and to add images to the image captured by the imager 20. As shown in Fig. 8, the graphics keys 52 include two different sized circles, two different sized rectangles, and four arrows. The circles and arrows are graphics that can be added to the image captured by the imager 20, and the solid rectangles are graphics that can be used to block out portions of the image captured by the imager 20. It should be understood that other shapes can be used for the graphics keys 52, both for graphics to be added to the image and for blocking out parts of the image. The added graphics may be of any size, such as to surround the object of interest (as shown in Fig. 8) or to fit wholly within the outline of the object of interest. They may be of any shape or design, including, for example, one or more dots; lines (curved, squiggled, straight, cross-hair); geometric shapes (e.g., circle, oval, triangle, square, polygon); a predetermined pattern; a company logo; movie or TV characters or their costumes; a text message; or any combination of these. The graphics keys 52 can also include a changeable size tool that permits the user to demarcate the size of a deleted image portion or an added image. The position of the deleted image portion or the added image can be set using the pan and tilt key 56. Alternatively, a pointing device such as a mouse or pen device can be used to set the position. It is also possible to permit a user to touch the location at which the selected graphic is placed.
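The block-out and add-graphic operations could be realized as simple mask and draw steps on the outgoing graphics frame, for example as below. This is an illustrative sketch; the shapes and coordinates are user-chosen values, not fixed by this description.

import cv2

def apply_overlays(canvas, added_circles, blocked_rects):
    # added_circles: list of ((cx, cy), radius) highlight graphics.
    # blocked_rects: list of (x, y, w, h) regions excluded from projection.
    for (cx, cy), r in added_circles:
        cv2.circle(canvas, (cx, cy), r, (0, 255, 0), 2)  # added graphic
    for x, y, w, h in blocked_rects:
        canvas[y:y + h, x:x + w] = 0  # nothing is projected in this region
    return canvas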
[0052] The blink key 53 is selected when the user wants the projected image in a particular area to blink. To do so, the user can touch the area of the video screen (or demarcate the area with the changeable size tool in conjunction with a pointing device) and then select the blink key 53. This action causes the projected image in that area to blink, which is useful in drawing a viewer's attention to the blinking object.

[0053] The reset key 54 removes any image portions deleted and any images added by the graphics keys 52. The perimeter key 55 adds a frame to the view on the display 51 and to the image projected by the image projector 30. The frame added by the perimeter key 55 corresponds to the field of view of the imager 20. The pan and tilt key 56 can be used, for example, to move the position of the imager 20 (and correspondingly the position of the image projector 30), to change the size of the field of view of the imager 20, and to move the placement of objects added to the display 51.
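One simple way to realize the blink behavior is to suppress the marked region on alternating intervals, as in the purely illustrative sketch below; the frame rate and blink period are assumed values.

def apply_blink(canvas, blink_rects, frame_idx, fps=30, period_s=1.0):
    # Suppress the projected image inside each user-marked region during
    # the off half of every blink period.
    half = max(1, int(fps * period_s / 2))
    if (frame_idx // half) % 2:                    # off phase
        for x, y, w, h in blink_rects:
            canvas[y:y + h, x:x + w] = 0
    return canvas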
[0054] In the exemplary image shown in the display 51 in Fig. 8, a portion of a building is shown to include five human objects that are identifiable by the imager 20, such as by their heat signature when the imager 20 is implemented as a thermal imager. The display 51 also includes two particular human objects that have circular images added by the graphics keys 52. The user may add these circular images to identify high value objects from among the objects captured by the imager 20. When the image projector 30 displays the image, with the added circles, onto the building itself, anyone viewing the projection will see the circles around the high value objects and thus be able to discriminate objects of interest from objects that are not of interest. For example, in a military context, the circled objects can be enemy combatants and the non-circled objects can be friendly combatants. In addition to the circular images, a frame can be added to the overall image. The frame provides an outline of the actual image captured by the imager 20, i.e., the field of view of the imager 20. The frame can be useful as it shows viewers exactly how much or how little the imager 20 is seeing.
[0055] Fig. 9 is an example of projecting an image on objects of interest at a distance consistent with the present invention. As shown in Fig. 9, a vehicle in which the display system has been implemented is positioned at night at a distance from the same building shown in Fig. 8. Through the use of the system, the imager 20 can identify objects, in this case human objects, at a distance and illuminate them with the image projector 30. For covert operations, the laser emitted by the image projector 30 can be in the near-infrared range, around 940 nm, which is invisible to the naked eye and thus allows only those with standard night vision capabilities to view the projection.
[0056] Fig. 10 is an example of highlighting objects of interest in the example of Fig. 9. In particular, Fig. 10 shows two specific objects that are surrounded by circles, which are graphics added using the graphics keys 52 of the control panel 50. The image processing unit 40 can be configured to follow a highlighted object (e.g., an object around which a graphic is added) if the object moves while being imaged by the imager 20. For example, if the objects surrounded by circles in Fig. 10 are moving, the image processing unit 40 can process the image so that the circles remain around the moving objects.

[0057] Fig. 11 is an example of providing a frame to the highlighted objects of interest in the example of Fig. 10. In particular, the frame in Fig. 11 shows how much of the building is being imaged by the imager 20.
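Following a highlighted object could be as simple as re-centering the added graphic on the nearest detected blob in each new frame, as sketched below under that assumption; more robust tracking schemes are of course possible.

import math
import cv2

def follow_highlight(highlight_pos, contours):
    # Move the circle's center to the centroid of the nearest detected
    # object so the graphic stays around the object as it moves.
    best, best_d = highlight_pos, float("inf")
    for c in contours:
        m = cv2.moments(c)
        if m["m00"] == 0:
            continue
        cx, cy = m["m10"] / m["m00"], m["m01"] / m["m00"]  # blob centroid
        d = math.hypot(cx - highlight_pos[0], cy - highlight_pos[1])
        if d < best_d:
            best, best_d = (int(cx), int(cy)), d
    return best  # new center for the projected circle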
[0058] Figs. 12A-12C show examples of varying frame shapes that can be projected in the display system of Fig. 1. In the display system of Fig. 1, the horizontal and vertical size of this projected window (field of view) can be adjusted independently to fit the specific needs of the operator. In Fig. 12A, the image projector 30 displays a full screen, which is the default size of the projected window. Fig. 12B shows the display of a panoramic view in which the height of the projection window is made smaller. In Fig. 12C, the image projector displays a vertical view in which the width of the projection window is narrowed, such as if only a tall building needs to be examined. With these various window dimensions set, the image projector 30 does not project beyond those dimensions even though the imager 20 may capture an image larger than the window dimensions.
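The adjustable projection window amounts to blanking everything outside operator-set dimensions before the frame reaches the projector, for example as in this illustrative sketch:

import numpy as np

def clip_to_window(canvas, x, y, w, h):
    # Keep only the operator-set window; the projector then never projects
    # beyond those dimensions even if the imager captured a larger scene.
    clipped = np.zeros_like(canvas)
    clipped[y:y + h, x:x + w] = canvas[y:y + h, x:x + w]
    return clipped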
[0059] Fig. 13 is an example of an alternative application of the system of Fig. 1 for controlling a fire. As shown in Fig. 13, the system including the image processing unit 40 and the imager 20 can be suspended over an object on fire, such as a ship 82. The display system can be suspended, for example, from a helicopter, a balloon, an airplane, or another aerial vehicle. If implemented as a thermal imager, the imager 20 provides a thermal image of the ship 82, which identifies the hot spots, i.e., the fire locations, to the image processing unit 40. The image processing unit 40 can be configured to identify the hot spots from the thermal image and provide that information to water cannon and guidance assemblies 80. More specifically, the image processing unit 40 can be configured to digitally map the perimeter of the entire theater of combustion, including all hot spots and any thermal data relevant to this unstable condition. Based on this information, the assemblies 80 can be automatically aimed and directed to deliver water to the spots on the ship 82 where it is most needed, and thus effectively and efficiently put out the fire. The identified hot spots can also determine the force at which the assemblies 80 deliver water to the fire. Although the assemblies 80 are described as using water, it should be understood that other fire retardants can be used.
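The hot-spot mapping could be sketched as a threshold-and-centroid pass over the thermal image, as below. This is a minimal sketch: the threshold value and any temperature-to-pixel calibration are assumptions, and the aiming of the assemblies 80 is left abstract.

import cv2

def hot_spots(thermal_gray, hot_thresh=200):
    # Threshold the thermal image, then report each hot region's centroid
    # (an aim point) and area (which could scale the water force).
    _, hot = cv2.threshold(thermal_gray, hot_thresh, 255, cv2.THRESH_BINARY)
    contours, _ = cv2.findContours(hot, cv2.RETR_EXTERNAL,
                                   cv2.CHAIN_APPROX_SIMPLE)
    spots = []
    for c in contours:
        m = cv2.moments(c)
        if m["m00"] > 0:
            spots.append((m["m10"] / m["m00"],   # x aim point
                          m["m01"] / m["m00"],   # y aim point
                          cv2.contourArea(c)))   # hot-spot size
    return spots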
[0060] Fig. 14 is an example of an alternative application of the system of Fig. 1 for controlling an air mass. Like the system in Fig. 13, the system here would be carried by an aerial vehicle capable of positioning the system over a cold air mass 84 and a warm air mass 86. In the example of Fig. 14, the cold air mass 84 is on a trajectory toward the warm air mass 86, or vice versa. When this condition exists, a hurricane or other violent weather front may start to form. As shown in Fig. 14, the imager 20, implemented as a thermal imager with an aerial view of the air masses 84, 86, provides thermal data to the image processing unit 40. The image processing unit 40 can be configured to digitally map the entire thermal domain relevant to this weather event and calculate where the image projector 30, implemented as a powerful overhead laser, would best be directed in order to warm part or all of the cold air mass 84 and thereby mitigate or stop the impending weather condition.
[0061] Fig. 15 is an example of an application of the display system of Fig. 1 for identifying stress areas in a bridge. As shown in Fig. 15, the imager 20 images at least a portion of the bridge. If implemented as a thermal imager, the image captured by the imager 20 would highlight the areas of the bridge that are mechanically stressed. The image is then processed by the image processing unit 40, which provides the processed image to the image projector 30, and the image projector 30 projects the image onto the bridge so that viewers can see exactly where on the bridge the stress spots are located.

[0062] Figs. 16A and 16B are examples of an application of the display system of Fig. 1 for identifying hot spots in an electrical power apparatus. As shown in Figs. 16A and 16B, the imager 20 images at least a portion of the electrical power apparatus. If implemented as a thermal imager, the image captured by the imager 20 would highlight the areas of the electrical power apparatus that correspond to hot spots. The image is then processed by the image processing unit 40, which provides the processed image to the image projector 30, and the image projector 30 projects the image onto the electrical power apparatus so that viewers can see exactly where the hot spots are located. Thus, using the display system of Fig. 1 for bridges and electrical power apparatuses, multiple users can see on the objects themselves exactly where items of interest are located.
[0063] Fig. 17 is an example of an application of the display system of Fig. 1 for displaying the contents of a container. In this example, the imager 20 is preferably implemented as an X-ray device. In this implementation, the display system can be used to detect and display the contents of a shipping container 86. In particular, the shipping container 86 passes through an X-ray area 22, which corresponds to a region that can be captured by the imager 20. The X-ray image data is provided to the image processing unit 40, which transforms the X-ray image data into an image that can be projected by the image projector 30. The image projector 30 projects the image onto the side of the container 86 so that viewers can witness the shape and position of the contents of the container without having to open the container. [0064] It would be desirable in some instances to have the display system configured to remember first findings and display them longer, i.e., not display the image in real time. For example, if a person is detected and that person recognizes that his position is now being displayed, he would likely try to duck out of the sight of the imager 20, which would in turn stop the display system from displaying his position further. By using a first glance capture mode, the display system can be configured to remember the last position that was displayed by the image projector 30 and direct the image projector 30 to continue displaying that specific area for a predetermined period of time. This would give the viewers additional time to evaluate these sightings.
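The first glance capture mode of paragraph [0064] reduces to remembering the last detected position for a predetermined hold time, as in the following sketch; the ten-second hold is an arbitrary assumed value.

import time

class FirstGlanceHold:
    # Remember the last detected position and keep projecting onto it for
    # hold_s seconds after the target drops out of the imager's view.
    def __init__(self, hold_s=10.0):
        self.hold_s = hold_s
        self.last_pos = None
        self.last_seen = 0.0

    def update(self, detected_pos):
        now = time.monotonic()
        if detected_pos is not None:
            self.last_pos, self.last_seen = detected_pos, now
        elif self.last_pos is not None and now - self.last_seen > self.hold_s:
            self.last_pos = None  # hold period expired
        return self.last_pos      # aim point for the image projector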
[0065] Embodiments according to the invention have wide-ranging applications, including but not limited to the intelligence, security and military fields, such as (purely by way of example):
> To make visible the location of a weapon or contraband carried and concealed by a person.
> To make visible the location of a particular moving or stationary vehicle (or type of vehicle) in the field.
> To make visible a specific person or persons of interest either alone or in a crowd, e.g., using facial recognition software, or behavioral data to identify a person acting suspiciously, or body shape data to identify a person concealing an explosive vest, and projecting a viewable image onto the head or torso of the person of interest.
> To make visible the location of a person of interest in order to assist in the apprehension of that person or to track that person's movements.
> To make visible the location of a person of interest in order to assist in the search and rescue of that person.
> To make visible a laser dot on a specific location on an object, device, vehicle, or person for purposes of military targeting, such as to assist laser-guided munitions.
> To make visible the location of vehicles, humans, animals, or structures of interest in order to assist in the gathering of intelligence.
> To make visible the boundaries of either the projector's projection area or the field of view of a camera.
> To make visible the contents of a shipping container.
> To make visible the temperature differences on the human body.
> To make visible humans and vehicles in a theater of operations, and to differentiate the humans from the vehicles by projecting one color of light onto the humans and another color of light onto the vehicles.
> To make visible the location of aircraft in the air.
> To make visible the location of boats and ships in the water.
> To make visible the graphics of an event that has already taken place by recording such data at its original occurrence and projecting it later, when practical or in a more convenient location, for the purposes of training, or to locate weapons, objects or devices that have been dropped or hidden, or to locate the hidden contents of a vehicle or a container (cargo, luggage, etc.) that has passed through an imaging portal or station.
[0066] The foregoing description of preferred embodiments of the invention has been presented for purposes of illustration and description. It is not intended to be exhaustive or to limit the invention to the precise forms disclosed; modifications and variations are possible in light of the above teachings or may be acquired from practice of the invention. The embodiments (which can be practiced separately or in combination) were chosen and described in order to explain the principles of the invention and their practical application, and to enable one skilled in the art to make and use the invention in various embodiments and with various modifications suited to the particular uses contemplated. It is intended that the scope of the invention be defined by the claims appended hereto and their equivalents.

[0067] International application No. PCT/US2006/021450 is incorporated by reference herein in its entirety.

Claims

WHAT IS CLAIMED IS:
1. A system for identifying an object of interest, comprising: at least one imager configured to capture an image of at least one object in a field of view and generate image data from the captured image; an image processing unit configured to compare the generated image data to predetermined criteria, and to produce an output when the generated image data meet the predetermined criteria; and at least one image projector configured to respond to the output of the image processing unit by displaying a viewable image onto an object of interest, whose captured image resulted in meeting the predetermined criteria.
2. A system according to claim 1, wherein the image processing unit and the image projector are configured to operate in real time so that the viewable image displayed onto the object of interest follows the object of interest when it moves.
3. A system according to claim 2, wherein the object of interest is a person, and the generated image data comprises biometric data.
4. A system according to claim 3, wherein the biometric data comprises facial configuration data.
5. A system according to claim 4, wherein the viewable image is projected onto the head of the object of interest.
6. A system according to claim 2, wherein the object of interest is a person, and the generated image data comprises one or more selected from the group consisting of body shape data, body stance data, and behavioral data.
7. A system according to claim 2, wherein the generated image data comprises object shape data.
8. A system according to claim 7, wherein the viewable image is centered on the center of data mass of the object of interest.
9. A system according to claim 2, wherein the viewable image comprises one or more selected from the group consisting of a dot, a line, a geometric shape, a predetermined pattern, a logo, a character, text, numeral(s), an outline of the object of interest, and a raster image of the object of interest.
10. A system according to claim 9, wherein the viewable image is caused to blink.
11. A system according to claim 2, comprising a plurality of imagers, wherein: each imager is configured to capture its own image and generate its own image data, the image data generated by each imager being different from the others; and the image processing unit is configured to compare the generated image data from each imager to relevant predetermined criteria, and to produce an output to the image projector when the generated image data from any imager meet the relevant predetermined criteria.
12. A system according to claim 2, comprising a plurality of imagers, wherein: each imager is configured to capture its own image and generate its own image data, the image data generated by each imager being different from the others; and the image processing unit is configured to combine the image data generated by the imagers into composite image data, to compare the composite image data to relevant predetermined criteria, and to produce an output to the image projector when the composite image data meet the relevant predetermined criteria.
13. A system according to claim 2, wherein the imager is configured to capture images of one or more objects in a field of view and generate a set of image data from each captured image; the image processing unit is configured to compare the sets of generated image data to predetermined criteria, and to produce an output for each set of generated image data that meets the predetermined criteria; and the image projector is configured to respond to each output of the image processing unit by displaying a respective viewable image onto each object of interest, whose captured images resulted in meeting the predetermined criteria.
14. A system according to claim 13, further comprising a control panel configured to provide image controls in response to inputs made through the control panel, wherein each image control is configured to adjust the operation of at least one of the imager, the image processing unit, and the image projector.
15. A system according to claim 14, wherein at least one of the image controls enables selective non-projection of a viewable image on selected objects of interest.
16. A system according to claim 14, wherein at least one of the image controls enables selective projection of a different and/or an additional viewable image on selected objects of interest.
17. A system according to claim 2 or claim 13, wherein: the predetermined criteria comprise a plurality of sets of predetermined criteria; the image processing unit is configured to compare the generated image data to the sets of predetermined criteria, and to produce a unique output that corresponds to the set of predetermined criteria met by the generated image data; and the image projector is configured to respond to a unique output by projecting a unique viewable image that corresponds to the output onto the object whose captured image resulted in meeting that set of predetermined criteria.
18. A system according to claim 17 capable of distinguishing objects in at least two classes, the objects in a first class possessing a common property, not possessed by the objects in other classes, which yields a distinctive characteristic in the generated image data of objects in said first class, wherein a first set of predetermined criteria produces no output for a set of generated image data that includes the distinctive characteristic, so that the viewable image unique to said first set is not projected onto objects in said first class.
19. A system according to claim 18, wherein the imager captures at least thermal images.
20. A system according to claim 2 or claim 14 for targeting at least one object of interest, further comprising a controller configured to aim a weapon at the viewable image displayed onto the targeted object(s) of interest.
21. A method for identifying an object of interest, comprising: capturing an image of at least one object in a field of view with at least one imager; generating image data from the captured image; comparing the generated image data to predetermined criteria, and producing an output when the generated image data meet the predetermined criteria; and in response to the output, displaying with at least one image projector a viewable image onto an object of interest, whose captured image resulted in meeting the predetermined criteria.
22. A method according to claim 21 operating in real time so that the viewable image displayed onto the object of interest follows the object of interest when it moves.
23. A method according to claim 22, wherein the object of interest is a person, and the generated image data comprises biometric data.
24. A method according to claim 23, wherein the biometric data comprises facial configuration data.
25. A method according to claim 24, wherein the viewable image is projected onto the head of the object of interest.
26. A method according to claim 22, wherein the object of interest is a person, and the generated image data comprises one or more selected from the group consisting of body shape data, body stance data, and behavioral data.
27. A method according to claim 22, wherein the generated image data comprises object shape data.
28. A method according to claim 27, wherein the viewable image is centered on the center of data mass of the object of interest.
29. A method according to claim 22, wherein the viewable image comprises one or more selected from the group consisting of a dot, a line, a geometric shape, a predetermined pattern, a logo, a character, text, numeral(s), an outline of the object of interest, and a raster image of the object of interest.
30. A method according to claim 29, wherein the viewable image is caused to blink.
31. A method according to claim 22 utilizing a plurality of imagers, comprising: capturing a separate image with each imager; generating separate image data from each captured image, the image data generated from each captured image being different from the others; and comparing the generated image data from each imager to relevant predetermined criteria, and producing an output to the image projector when the generated image data from any imager meet the relevant predetermined criteria.
32. A method according to claim 22 utilizing a plurality of imagers, comprising: capturing a separate image with each imager; generating separate image data from each captured image, the image data generated from each captured image being different from the others; combining the image data generated by the imagers into composite image data; and comparing the composite image data to relevant predetermined criteria, and producing an output to the image projector when the composite image data meet the relevant predetermined criteria.
33. A method according to claim 22, comprising: capturing images of one or more objects in a field of view; generating a set of image data from each captured image; comparing the sets of generated image data to predetermined criteria, and producing an output for each set of generated image data that meets the predetermined criteria; and in response to each output, displaying a respective viewable image onto each object of interest, whose captured images resulted in meeting the predetermined criteria.
34. A method according to claim 33, further comprising selectively preventing projection of a viewable image on selected objects of interest.
35. A method according to claim 34, further comprising selectively projecting a different and/or an additional viewable image on selected objects of interest.
36. A method according to claim 22 or claim 34 for targeting at least one object of interest, further comprising aiming a weapon at the viewable image displayed onto the targeted object(s) of interest.
37. A method according to claim 22 or claim 33, wherein: the predetermined criteria comprise a plurality of sets of predetermined criteria; the generated image data is compared to the sets of predetermined criteria; at least one output is produced depending on which set(s) of predetermined criteria is met by the generated image data, each set of predetermined criteria, when met, resulting in an output unique to that set; and a unique viewable image that corresponds to the output is projected onto the object whose captured image resulted in meeting that set of predetermined criteria.
38. A method according to claim 33, wherein: the objects comprise at least two classes of objects, the objects in a first class possessing a common property, not possessed by the objects in other classes, which yields a distinctive characteristic in the generated image data of objects in said first class; and a first set of predetermined criteria produces no output for a set of generated image data that includes the distinctive characteristic, so that the viewable image unique to said first set is not projected onto objects in said first class.
39. A method according to claim 38 for targeting objects of interest in a field having different types of objects, the objects in said first class being friendly objects, comprising aiming a weapon at the viewable images displayed onto the objects not in said first class.
40. A method according to claim 39, wherein the objects comprise military equipment.
41. A method according to claim 39, wherein the objects comprise combatants, and the imager captures at least thermal images of each combatant.
42. A method according to claim 38, wherein the objects comprise persons, and the imager captures at least thermal images of each person.
43. A system for highlighting an object of interest, comprising: an imager configured to capture an image of at least one object of interest in a field of view and generate image data from the captured image; an image processing unit that transforms at least a portion of the image data into a viewable format of preselected configuration; and an image projector that displays an image in accordance with the preselected configuration onto the object of interest.
44. A system according to claim 43, wherein the displayed image comprises a laser-generated dot or cross-hair for targeting by a weapon.
45. A system according to claim 43, wherein the displayed image comprises a geometric shape.
46. A system according to claim 43, wherein the displayed image comprises the likeness of a person or an animal.
47. A system according to claim 43, wherein the displayed image comprises a text message.
48. A system according to claim 43, wherein the displayed image comprises one or more lines.
49. A system according to claim 43, wherein the displayed image comprises a pattern.
50. A system according to claim 43, wherein the displayed image comprises a logo.
51. A method of highlighting an object of interest, comprising: capturing an image of at least one object of interest in a field of view with at least one imager; generating image data from the captured image; transforming at least a portion of the image data into a viewable format of preselected configuration; and displaying with at least one image projector an image in accordance with the preselected configuration onto the object of interest.
52. A method according to claim 51, wherein the displayed image comprises a laser-generated dot or cross-hair, further comprising targeting a weapon on the displayed image.
PCT/US2007/025033 2007-12-07 2007-12-07 Method and apparatus for projecting viewable data onto an imaged object WO2009073009A1 (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
PCT/US2007/025033 WO2009073009A1 (en) 2007-12-07 2007-12-07 Method and apparatus for projecting viewable data onto an imaged object


Publications (1)

Publication Number Publication Date
WO2009073009A1

Family ID: 40717998

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/US2007/025033 WO2009073009A1 (en) 2007-12-07 2007-12-07 Method and apparatus for projecting viewable data onto an imaged object

Country Status (1)

Country Link
WO (1) WO2009073009A1 (en)


Patent Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20040165766A1 (en) * 1996-10-08 2004-08-26 Yoshihiro Goto Method and apparatus for forming and displaying projection image from a plurality of sectional images
US20020136435A1 (en) * 2001-03-26 2002-09-26 Prokoski Francine J. Dual band biometric identification system
US20030086603A1 (en) * 2001-09-07 2003-05-08 Distortion Graphics, Inc. System and method for transforming graphical images
US20050031165A1 (en) * 2003-08-08 2005-02-10 Lockheed Martin Corporation. Method and apparatus for tracking an object

Cited By (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
DE102010019880A1 (en) * 2010-05-07 2011-11-10 Smiths Heimann Gmbh Device for checking an object, in particular for checking persons for suspicious objects
EP2567255B1 (en) * 2010-05-07 2016-03-09 Smiths Heimann GmbH Device for examining an object, in particular for inspecting persons for suspicious items
US8937657B2 (en) 2012-07-15 2015-01-20 Erik Klass Portable three-dimensional metrology with data displayed on the measured surface


Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application (Ref document number: 07867660; Country of ref document: EP; Kind code of ref document: A1)
NENP Non-entry into the national phase (Ref country code: DE)
122 Ep: pct application non-entry in european phase (Ref document number: 07867660; Country of ref document: EP; Kind code of ref document: A1)