WO2002090887A2 - Digital camera for and method of obtaining overlapping images - Google Patents
Digital camera for and method of obtaining overlapping images
- Publication number
- WO2002090887A2 (PCT/US2002/014566)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- large area
- sub
- area object
- arrays
- Prior art date
Classifications
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T3/00—Geometric image transformation in the plane of the image
- G06T3/40—Scaling the whole image or part thereof
- G06T3/4038—Scaling the whole image or part thereof for image mosaicing, i.e. plane images composed of plane sub-images
- G—PHYSICS
- G01—MEASURING; TESTING
- G01C—MEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
- G01C11/00—Photogrammetry or videogrammetry, e.g. stereogrammetry; Photographic surveying
- G01C11/02—Picture taking arrangements specially adapted for photogrammetry or photographic surveying, e.g. controlling overlapping of pictures
Definitions
- This invention is related generally to photogrammetry and more specifically to large format digital cameras for acquiring images of large area objects or large areas of large objects, such as aerial images of portions of the surface of the earth.
- Each photodetector in the array produces an electric signal, the voltage or current of which is a function of the intensity of all the light that is focused onto that photodetector.
- if, for example, an entire house in a distant mountain scene were focused onto a single photodetector, the electric signal (voltage or current) level produced by that photodetector would be indicative only of the average light energy reflected from the entire house. Therefore, such signal output might enable one to discern that there is something in the resulting mountain picture where the house is located, but it could not produce a recognizable house shape, much less details of doors, windows, roof lines, and the like.
- no amount of enlargement or "blowing up" of the picture could produce more detail. It would just produce a larger spot in a larger image of the mountain scene, because the resolution is limited to a single electric signal for the entire house.
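The averaging effect described above can be sketched in a few lines of Python; this is an illustrative toy model, not part of the patent, and the intensity values are invented:

```python
# Toy model of the resolution limit: if a whole sub-area (e.g., a house)
# falls on a single photodetector, the detector's output is simply the
# mean light intensity, and no enlargement can recover the averaged-away
# detail.
def photodetector_output(intensities):
    """Single electric signal: average of all light focused on the detector."""
    return sum(intensities) / len(intensities)

# A bright house (200) against dark terrain (20) collapses to one value:
signal = photodetector_output([20, 20, 200, 200, 20, 20])  # -> 80.0
```

Enlarging the resulting picture only enlarges the single 80.0-valued spot; the distinct 200-valued house pixels no longer exist anywhere in the data.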
- Rectangular array sizes of about 1,000 x 1,000, i.e., about 1,000,000 photodetectors, are now fairly common and readily available, while arrays of 5,000 x 5,000, i.e., about 25,000,000 photodetectors, or more are considered to be "large" and expensive.
- such array sizes are impressive compared to only a few years ago, and sizes of photodetector arrays will no doubt continue to grow in coming years. They are, nonetheless, still limited.
- Several high-end camera manufacturers do make digital cameras with 10,000 x 10,000 arrays, i.e., about 100,000,000 photodetectors, but the costs of such cameras are prohibitive for most purposes.
- the read-out time alone can reduce their usefulness to situations in which there is no movement of the object or between the object and the camera, because any such movement before all the electric signals from all the photodetectors in the array can be read could introduce distortions in the resulting picture or image.
- Such large detector arrays for large format photogrammetry applications have to be nearly flawless, i.e., with few, if any, bad photodetectors, because there is no redundancy, and bad photodetectors in the array will leave flaws in images being acquired by the detector array. The requirement for such flawlessness puts further pressure on the cost of such cameras.
- Scanning cameras are ones in which a linear array of photodetectors is moved in the stationary image plane of a camera to cover the entire image format or view.
- a similar effect, but different method, is to move (rotate) the camera, usually in an arc about an axis that extends through the lens and is parallel to the scene or object and to the image plane. Cameras that use this latter technique of scanning the object by rotating the camera are called panoramic cameras.
- the image of the object (earth or other planet or moon surface), therefore, is collected in individual linear paths corresponding to the linear array of photodetectors being swept over the object, and a plurality of such paths are then assembled together to produce the image.
- the object (earth, etc.) must not “move” as it is being scanned, i.e., the satellite with the camera remains a fixed distance from the center of mass of the earth during the satellite's movement and scan of the earth's surface.
- the problem with such "push-broom" scanning is that it is difficult to know the camera path accurately, which results in a notorious lack of accuracy in stereo measurements or mensuration in the resulting images.
- a point on the ground in the path of the multiple "push-broom" image acquisitions is imaged as many times as there are linear arrays in the camera.
- a problem is that each image line has a separate perspective center, so the resulting composite image of the object has many perspective centers.
- PCT Patent Cooperation Treaty
- PCT/DE00/01163, International Publication No. WO 00/66976
- by W. Teuchert and W. Mayr, wherein the linear arrays are replaced by a plurality of small rectangular arrays that populate the focal plane.
- Three groups, each comprising multiple detectors, are used, so that one group is forward-looking, one group is nadir-looking (looking straight down), and the third group is rearward-looking.
- This camera is also mounted on an airplane or spacecraft, and, as it flies, the image taking is repeated rapidly so that spaces between images acquired by individual detectors in a group are filled or partially filled by images acquired by the next row of detectors in the group, etc., until the three group images are filled and synthesized together from the individual detectors in the group.
- This system requires the motion of the airplane or spacecraft, and it cannot be used to image a moving object. It also has the problem of multiple perspective centers.
- a general object of this invention is to provide a method and camera apparatus that can produce high resolution, large format, digital images of large area objects.
- a more specific object of the present invention is to provide a large format, digital camera system and method for acquiring large format, digital images of large area objects, but without the requirement and costs of large, expensive, detector arrays.
- Another object of the present invention is to provide a large format, digital camera system and method for acquiring large format, digital images of objects that are either moving or stationary.
- Another specific object of the present invention is to provide a method and camera apparatus for acquiring instantaneous, large format, digital images of an object so that movement of the object or of the camera apparatus does not cause blurring or distortion of the images.
- Still another object of the present invention is to provide a method and camera apparatus for acquiring large format, digital images of a large area object inexpensively enough that there can be a degree of redundancy in order to produce high quality, large format, digital images without having to rely on flawless detector arrays.
- Yet another object of the present invention is to provide a method and camera apparatus for acquiring large format, digital images of large area objects with fast and efficient digital data read-out and processing.
- a further object of the present invention is to provide a method and camera apparatus with high resolution, wide angle geometry for a wide field of view and for good stereo digital imaging.
- a still further object of the present invention is to provide a method and digital camera apparatus for producing a large image format that has a single perspective center.
- a method of this invention may comprise digital camera systems in which two-dimensional arrays are exposed to multiple portions or sub-areas of large area objects to acquire overlapping sub-images of such sub-areas.
- the digital sub-images can then be pieced or stitched together in a digital mosaic to form a digital macro-image of the large area object.
- the sub-images can be acquired by a single, two-dimensional, detector array and exposed sequentially to the different sub-areas by different lens systems.
- clusters comprising multiple, spaced apart, two-dimensional detector arrays are exposed to different sub-areas of the large area object by a single lens system either sequentially or simultaneously to acquire the overlapping sub-images.
- a large format, digital camera system comprises more than one sub-camera system, and each sub-camera system has its own lens system with at least one detector array in the focal plane of each lens system.
- the lens systems have the same focal length and are mounted closely together in a manner that makes their focal planes coplanar and their optical axes parallel to each other.
- the detector array(s) of the different sub-cameras are exposed by the respective single lens systems of the sub-cameras to overlapping sub-areas of large area objects so that sub-images acquired by different sub-cameras have overlapping areas to facilitate piecing or stitching them together into a seamless macro-image of the large area object.
- the shutters and image acquisition systems of the respective sub-cameras can be actuated in timed sequence coordinated with the speed of movement to make the perspective centers of the sub-camera images match the perspective centers of the other sub-camera images. Therefore, when the macro-image is pieced or stitched together from the sub-images, it has a single perspective center. If it does not move, the shutters and image acquisition systems can be actuated either sequentially or simultaneously, but the macro-image will have a plurality of perspective centers, i.e., one perspective center for the images acquired by each sub-camera system.
- a large format, digital camera comprising four sub-camera systems will, if not moving with shutters actuating in the timed sequence, have four perspective centers.
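The timed shutter sequence described above can be sketched as follows; the function name, lens offsets, and ground speed are illustrative assumptions, not values from the patent:

```python
# Hypothetical sketch: fire each sub-camera's shutter when its lens passes
# over the same ground point, so the stitched macro-image has a single
# perspective center. Lens offsets are measured along the flight direction.
def shutter_delays(lens_offsets_m, ground_speed_mps):
    """Delay (seconds) for each sub-camera relative to the first lens fired."""
    ref = min(lens_offsets_m)  # the lens that reaches the ground point first
    return [(x - ref) / ground_speed_mps for x in lens_offsets_m]

# Four sub-cameras with lenses spaced 0.1 m apart, aircraft at 50 m/s:
delays = shutter_delays([0.0, 0.1, 0.2, 0.3], 50.0)
```

With the camera stationary, the delays are irrelevant and the four sub-images keep four separate perspective centers, as the text notes.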
- More detector arrays can be added to each sub-camera system to increase the format size of the large format, digital camera system, or the detector arrays can be larger to increase the format size without adding more sub-cameras. Of course, adding more sub-cameras can also increase format size.
- Repeat image acquisition of the same object through different color filters to provide overlapping color sub-images can add color to the macro-image.
- addition of another sub-camera system with overlapping color detector arrays or near infrared arrays can add color or near infrared imaging capabilities.
- the color and/or near infrared arrays can be two-dimensional or, if the camera moves, linear, and can, but do not have to, be of lower resolution than the monochromatic or panchromatic sub-image arrays, thus can have different focal lengths.
- the lower resolution color overlay is registered to the higher resolution monochromatic or panchromatic macro-image, and the high resolution monochromatic or panchromatic macro-image content is used to segment and enhance the quality of the color overlay.
- the sub-image read-out from the various detector arrays is accomplished simultaneously in a parallel manner for speed.
- Digital correlated double sampling is used to convert the individual photodetector output signals twice per pixel using an analog-to-digital converter at double the sampling rate and to perform the subtraction of the residual reset signal from the actual light-exposed signal in the digital domain.
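As a rough sketch of this digital correlated double sampling scheme (the function names and ADC count values are hypothetical, and real implementations work on sensor hardware, not Python lists):

```python
# Each pixel is digitized twice - once at its reset level, once after
# exposure - and the difference is taken in the digital domain, which
# cancels the residual reset offset of the photodetector.
def cds_pixel(adc_reset_counts, adc_signal_counts):
    """Return the reset-corrected pixel value."""
    return adc_signal_counts - adc_reset_counts

def cds_frame(reset_frame, signal_frame):
    """Apply the digital subtraction to every pixel of a frame."""
    return [cds_pixel(r, s) for r, s in zip(reset_frame, signal_frame)]

# Example: reset offsets of roughly 100 counts removed from exposed values.
corrected = cds_frame([100, 102, 98], [612, 598, 610])  # -> [512, 496, 512]
```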
- An accelerometer on a dampened camera carrier platform outputs signals during sub-image acquisition, which are used to evaluate vibration or movement magnitude. Sub-images are rejected and the shutter is re-actuated to re-acquire images in short order when the magnitude of vibrations or movement is large enough to degrade the sub-image quality.
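A minimal sketch of this accept/reject logic, assuming a simple peak-acceleration threshold; the threshold value, retry count, and callable interfaces are illustrative assumptions, not from the patent:

```python
# Reject a sub-image and re-actuate the shutter when accelerometer readings
# taken during the exposure exceed a vibration threshold.
def exposure_ok(accel_samples_g, threshold_g=0.05):
    """True if the peak acceleration magnitude stayed below the threshold."""
    return max(abs(a) for a in accel_samples_g) < threshold_g

def acquire(read_sensor, read_accel, max_tries=3):
    """Re-acquire until an exposure passes the vibration check."""
    for _ in range(max_tries):
        image = read_sensor()          # expose and read out the sub-image
        accel = read_accel()           # accelerometer trace for the exposure
        if exposure_ok(accel):
            return image
    return image  # best effort after max_tries attempts
```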
- Figure 1 is a perspective, diagrammatic, view of a digital camera with a single detector array and multiple lenses acquiring a large format image of a large area object according to this invention, such as, for example, an image of a geographical area of the earth with the digital camera mounted on a helicopter;
- Figures 2a-d illustrate a plurality of sub-images of overlapping sub-portions of the large area object acquired by the digital camera of Figure 1;
- Figures 3a-f illustrate the process of stitching the sub-images of Figures 2a-d together into a large format macro-image
- Figure 4 is a perspective view of a large format, digital camera of this invention with a single lens system and multiple detector arrays in the focal plane;
- Figure 5 is a view of overlapping sub-images produced by shifting the optical path of the camera system of Figure 4;
- Figure 6 is an isometric view of an embodiment of the large format digital camera system of this invention comprising a plurality of sub-camera systems, each having one or more detector arrays exposed by a lens system;
- Figure 7 is a view of overlapping digital sub-images produced by the digital camera system of Figure 6 to form a composite, large format macro-image;
- Figure 8 is a view similar to Figure 7, but with a different example number and pattern of sub-images
- Figure 9 is a view similar to Figure 7, but with yet another different example number and pattern of sub-images;
- Figure 10 is a diagrammatic view of the detector arrays of a large format, color, digital camera system of this invention.
- Figure 11 is a diagrammatic view of the monochromatic or panchromatic, large format, macro-image formed by combining sub-images acquired from sub-camera systems of the large format, color digital camera system of Figure 10;
- Figures 12a-c are blue, red, and green sub-images acquired from blue, red, and green detector arrays of the camera system of Figure 10;
- Figure 13 is a near infrared sub-image acquired from the near infrared detector array of the camera system of figure 10;
- Figure 14 is a function block diagram of the digital processing of photodetector signals of the large format, digital cameras of this invention.
- Figure 15 is a diagrammatic view of another embodiment large format, color, digital camera system of this invention.
- Figure 16 is a diagrammatic view of example detector arrays of a large format, digital camera system of this invention to facilitate describing the signal processing and stitching of acquired sub-images into large format, macro-images according to this invention;
- Figure 17 illustrates one phase of the signal processing and stitching process of this invention
- Figure 18 illustrates another phase of the signal processing and stitching process of this invention.
- Figure 19 illustrates a large format, macro-image comprising overlapping sub-images according to this invention.
- Figure 20 illustrates a dampened camera platform apparatus with accelerometer read-outs of residual vibration and movement for use in determining whether sub-images obtained by sub-camera systems of this invention are too degraded by vibrations and/or movement for use in making a large format, macro-image.
- FIG. 1 One of the simpler, yet effective, large format, digital camera systems of this invention is illustrated diagrammatically in Figure 1.
- the large area camera system 10 in Figure 1 is represented by a plurality of lens systems 12, 14, 16, 18, all of which have the same focal length F and are positioned in a common plane P, and a photodetector array 20 positioned in the focal plane of the lens systems 12, 14, 16, 18, i.e., at a distance F from the plane P.
- the photodetector array 20 is connected electrically by suitable conductors 22 to a control circuit 24, which includes at least a microprocessor, input/output circuitry, memory, and power supply for driving the photodetector array 20, reading sub-image data out of the photodetector array 20, and storing such sub-image data.
- Other data processing functions, such as combining sub-images and/or image display functions, can be accomplished in the camera system 10 or with other peripheral data processing equipment.
- the large format camera system 10 in Figure 1 is illustrated simplistically in the process of acquiring a large format digital image (not shown in Figure 1) of a large area 30 of an object 32, for example, a semi-developed geographical area on the earth's surface that includes a stream 34, railroad 36, three building structures 38, 40, 42, two cross streets 44, 46, and a bridge 48, where cross street 46 crosses the stream 34.
- the components of the camera system 10, and large area 30 of the object 32 are not shown in actual sizes, proportions, or details, which would not be practical or even possible within the constraints of the drawing format, as would be understood by persons skilled in the art.
- the camera system 10 is shown with four lens systems 12, 14, 16, 18 positioned in the common plane P and configured to focus four parts or sub-areas 50, 52, 54, 56, respectively, of the large area 30 onto the detector array 20.
- any plurality of lens systems would work for purposes of this invention.
- the first lens system 12 focuses a first sub-area 50 onto the detector array 20, and then the second lens system 14 focuses a second sub-area 52 onto the detector array 20, the third lens system 16 focuses a third sub-area 54 onto the detector array 20, and the fourth lens system 18 focuses a fourth sub-area 56 onto the detector array 20, all in sequential order.
- the lens systems 12, 14, 16, 18 are set so that marginal edge portions of the sub-areas 50, 52, 54, 56 overlap each other. Therefore, such overlapping marginal edge area 60 is common to sub-areas 50, 52. Likewise, overlapping marginal edge area 62 is common to sub-areas 50, 54, marginal edge area 64 is common to sub-areas 52, 56, and marginal edge area 66 is common to sub-areas 54, 56.
- Each lens system 12, 14, 16, 18 includes a shutter that can be opened and closed independently of the other lens systems, so that the detector array 20 can be exposed to only one of the sub-areas 50, 52, 54, 56 at a time.
- shutters and their actuators are not shown in Figure 1 because of drawing size constraints, but shutters and shutter actuators are well-known to persons skilled in the art, who would be able to implement them without difficulty to practice this invention. Therefore, there is no need to describe shutters in lens systems or their actuators for purposes of describing this invention. Suffice it to say that such shutter actuators are connected to the control circuit 24 by suitable electric leads or conductors 26 so that read-out of image data from the detector array 20 can be coordinated with actuation of the respective shutters in lens systems 12, 14, 16, 18, as will be described in more detail below.
- the control circuit 24 reads out the sub-image 70 ( Figure 2a) of sub-area 50 from the detector array 20.
- the detector array 20 comprises an array of individual photodetectors, e.g., semiconductor devices that output an electric signal, the magnitude of which is dependent on the intensity of light energy incident on such photodetector. Therefore, the signal from each photodetector in the array 20 is indicative of light energy intensity from a pixel area of the sub-area 50, and the signals from all of the individual photodetectors in the array are indicative of light energy intensity from all of the pixel areas of the sub-area 50.
- the signals from the photodetectors in the detector array 20, together, are indicative of the pattern of light energy from the sub-area 50, so a sub-image 70 ( Figure 2) of the sub-area 50 can be produced from such signals.
- the signals are amplified, digitized, processed, and stored, as is well-known to persons skilled in the art.
- the shutter for the second lens system 14 is actuated to acquire and store a sub-image 72 ( Figure 2b) of the sub-area 52 in digital form. Acquisitions of sub-image 74 ( Figure 2c) of sub-area 54 and of sub-image 76 ( Figure 2d) of sub-area 56 by lens system 16 and lens system 18, respectively, are accomplished in a similar manner.
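The sequential acquisition just described can be sketched as a simple loop; the shutter objects and read-out callable are hypothetical placeholders for the control circuit 24's actual interfaces:

```python
# One shared detector array is exposed through each lens system in turn,
# and its sub-image is read out before the next shutter fires.
def acquire_sub_images(shutters, read_out_array):
    """Expose the shared detector array through each lens shutter in sequence."""
    sub_images = []
    for shutter in shutters:           # e.g., lens systems 12, 14, 16, 18
        shutter.open()                 # expose the array to one sub-area
        shutter.close()
        sub_images.append(read_out_array())  # read-out (~0.5 s per array)
    return sub_images
```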
- the read-out time for a detector array less than 5,000 x 5,000 is typically less than one second, such as about one-half a second.
- the preferred method is to use known or calculated mechanical measurements or relationships to piece the respective sub-images 70, 72, 74, 76 together on a pixel-by-pixel basis in a common, master X-Y coordinate system and then to use software-identifiable, distinct features or parts of features in the overlapping, marginal edge areas 60, 62, 64, 66 as guide markers for final pixel adjustments to align and "stitch" together the adjacent sub-images to form the large format, macro-image 80.
- one of the sub-images, for example sub-image 70, is chosen, and its pixel array is adopted as the master X-Y coordinate system 82.
- For piecing the second sub-image 72 together with the first, or master, sub-image 70, it is first joined as closely as possible using known or measured mechanical dimensions and relationships based on the relative positions and orientations of the respective lens systems 12, 14 (Figure 1), the size of the detector array 20, the distances of the detector array 20 from the lens systems 12, 14, and the like.
- This initial placement of the second sub-image 72 by its pixel array coordinate system x-y is accurate enough to get very close to registering the common features 38, 40, 44 together in the overlapping marginal edge portion 60.
- the master coordinates of pixels of distinct features 38, 40, 44 in the first sub-image 70 should be very close to the same as the coordinates for pixels of those same features 38', 40', 44' in the second sub-image 72, since both of these sub-images 70, 72 are acquired with the same detector array 20 (Figure 1).
- the parallaxes due to the different locations of the respective lens systems 12, 14 and other inaccuracies in measurements, lengths, and the like cause slight variations or displacements between such features, as illustrated in Figure 3a.
- the software can identify distinct features in the overlapping marginal edge portion 60 of both sub-images 70, 72.
- Corners, such as Q1, Q1', and Q2', are usually easiest to identify, because the sharp light intensity differences, and thus electric signal differences, between adjacent photodetectors at such corners cause larger digital value differences for adjacent pixels of the sub-images at such corners.
- other distinct features or variations in pixel values can also be located with software for this synthesizing or stitching process.
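A toy illustration of locating such high-contrast tie points by adjacent-pixel value differences; the patent does not specify a detector, so this simple gradient score is an assumed stand-in:

```python
# Corners produce large digital value differences between adjacent pixels,
# so a gradient-magnitude score can flag candidate tie points in the
# overlapping marginal edge area.
def gradient_score(img, x, y):
    """Sum of absolute differences to the right and lower neighbours."""
    return abs(img[y][x] - img[y][x + 1]) + abs(img[y][x] - img[y + 1][x])

def candidate_corners(img, threshold):
    """All pixel positions whose local gradient meets the threshold."""
    h, w = len(img), len(img[0])
    return [(x, y) for y in range(h - 1) for x in range(w - 1)
            if gradient_score(img, x, y) >= threshold]
```

On a real sub-image the candidates would then be matched between the two overlapping sub-images before the coordinate adjustment step.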
- Once several distinct and matching points, such as Q1, Q1', and Q2', are identified in the respective sub-images 70, 72, their coordinates can be adjusted to register with each other. For example, the master X-Y coordinates of points Q1 and Q1' are compared to each other and their X-Y differences determined.
- Likewise, the master X-Y coordinates of points Q2 and Q2' are compared and their X-Y differences determined. With that difference information, the master X-Y pixels for those points Q1' and Q2' of the second sub-image 72 can be adjusted to match the master X-Y coordinates of the corresponding points Q1 and Q2 in the first sub-image 70.
- Application of ordinary geometry and trigonometry can then be used to shift, including rotating if necessary, the entire local coordinate system 84 of the second sub-image 72 and thereby adjust the coordinates of all the pixels in the second sub-image 72 to maintain their individual spatial relationships with the adjusted points Q1' and Q2'.
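The geometry and trigonometry step above can be sketched as recovering a rotation and translation from two matched tie points and applying it to every pixel coordinate of the second sub-image; the function names are illustrative:

```python
import math

# From two matched tie points (e.g., Q1<->Q1', Q2<->Q2'), recover the rigid
# transform that registers the second sub-image's local x-y coordinates
# onto the master X-Y system, then apply it to any pixel coordinate.
def fit_rigid(p1, p2, q1, q2):
    """Rotation angle + translation mapping (q1, q2) onto (p1, p2)."""
    ang = (math.atan2(p2[1] - p1[1], p2[0] - p1[0])
           - math.atan2(q2[1] - q1[1], q2[0] - q1[0]))
    c, s = math.cos(ang), math.sin(ang)
    tx = p1[0] - (c * q1[0] - s * q1[1])
    ty = p1[1] - (s * q1[0] + c * q1[1])
    return ang, tx, ty

def apply_rigid(pt, ang, tx, ty):
    """Transform one local pixel coordinate into the master system."""
    c, s = math.cos(ang), math.sin(ang)
    return (c * pt[0] - s * pt[1] + tx, s * pt[0] + c * pt[1] + ty)
```

Applying `apply_rigid` to every pixel coordinate preserves the spatial relationships within the second sub-image while registering its tie points to the master system.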
- the two sub-images 70, 72 are stitched together geometrically to form the top half 86 of the large format, macro-image, as shown in Figure 3b.
- the remaining two sub-images 74, 76 can be stitched together in a similar manner, as will be described below. However, before doing so, it may be desirable to perform any radiometric adjustments necessary to match gray scales of pixels in the second sub-image 72 with those of the first sub-image 70, so that one part of the half-image 86 does not look darker or brighter than the other part.
- Such gray scale variations in this camera system 10 are usually not a significant problem, because all the digital sub-images 70, 72, 74,76 are created with the same detector array 20 ( Figure 1).
- radiometric adjustments can be made with software by comparing the output signal intensity of at least one, but preferably more, pixels of the first sub-image 70 with matching pixels of the second sub-image 72 in the overlapping marginal edge area 60. Whatever adjustments of signal intensity are needed to make the compared pixels in the second sub-image 72 match or nearly match the corresponding pixels in the first sub-image 70 can then be applied also to the other pixels in the second sub-image.
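A minimal sketch of such a radiometric adjustment, assuming a single multiplicative gain estimated from the overlap region (a simplification of whatever per-pixel correction an implementation might actually use):

```python
# Estimate a gray-scale gain from corresponding pixels in the overlapping
# marginal edge area, then apply it to every pixel of the second sub-image.
def radiometric_gain(overlap_first, overlap_second):
    """Ratio of mean gray levels over the shared overlap pixels."""
    return sum(overlap_first) / sum(overlap_second)

def apply_gain(sub_image, gain):
    """Scale all pixel values of the second sub-image by the gain."""
    return [round(v * gain) for v in sub_image]

# If the overlap reads about 10% darker in the second sub-image, brighten it:
gain = radiometric_gain([110, 120, 130], [100, 109, 118])
```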
- the third sub-image is also brought into the master X-Y coordinate system 82 based on mechanical relationships and/or measurements. Then, distinct, corresponding features or points, e.g., Q3, Q3' and Q4, Q4', of the upper half-image 86 and the third sub-image 74 are identified in the overlapping marginal edge area 62. Adjustments are made, as described above, to stitch the third sub-image 74 together with the upper half-image 86 to produce the three-fourths-image 88, shown in Figure 3d. Radiometric adjustments can also be made, as explained above.
- the fourth sub-image is also brought into the master X- Y coordinate system 82 based on mechanical relationships and/or measurements.
- the distinct, corresponding features or points, e.g., Q5, Q5' and Q6, Q6', of the three-fourths-image 88 and the fourth sub-image 76 are identified in either one or both of the overlapping marginal edge areas 64, 66. Adjustments are made, as described above, to stitch the fourth sub-image 76 together with the three-fourths-image 88 to produce the full, large format, macro-image 80, shown in Figure 3f.
- This large format, macro-image 80 is, of course, the large image of the large area object 30 in Figure 1.
- Radiometric adjustments can also be made, as explained above. Also, if perfect registration of the points Q5, Q5' and Q6, Q6' or other identified points in overlapping marginal edge areas 64, 66 cannot be made due to distortions or other inaccuracies in the processes of the previous stitchings of sub-images 70, 72, 74 or other causes, the software can make blending adjustments to the pixel coordinates to achieve a fit of all the sub-images 70, 72, 74, 76 together into the large, macro-image 80 without concentrating any noticeable distortions.
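The blending idea can be sketched as distributing the residual misfit linearly along an ordered run of pixel coordinates rather than concentrating it at one seam; this linear model is an assumed simplification, not the patent's specified method:

```python
# Spread a residual registration error (dx, dy) gradually across an ordered
# list of pixel coordinates, so no single location shows the full distortion.
def distribute_residual(coords, residual):
    """Apply 0..100% of (dx, dy) linearly over the ordered coordinates."""
    n = len(coords)
    return [(x + residual[0] * i / (n - 1), y + residual[1] * i / (n - 1))
            for i, (x, y) in enumerate(coords)]
```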
- the large format image 80 of the large area object 30 can be acquired as described above with a readily available, normal-size detector array 20, such as a 1,000 x 1,000 array or a 2,000 x 2,000 array, instead of a very expensive, large detector array, yet the resolution is four times better than would be attainable with a conventional digital camera with such a normal-size detector array.
- the digital camera system 10 could be equipped with more lens systems to create even more sub-images and thereby make even a larger format camera for even better resolution.
- a single lens system 102 is used to focus a large area object (not shown in Figure 4, but similar to the large area object in Figure 1) onto a plurality of detector arrays, for example, the nine detector arrays 104, 106, 108, 110, 112, 114, 116, 118, 120, all positioned in the focal plane 122 of the lens system 102.
- Each of the detector arrays can be a normal size, such as 1,000 x 1,000 or 2,000 x 2,000, but such nine normal-size detector arrays 104, 106, 108, 110, 112, 114, 116, 118, 120 provides nine times more pixels than one of such arrays, thus nine times better resolution.
- Each of the plurality of detector arrays 104, 106, 108, 110, 112, 114, 116, 118, 120 produces a sub-image of a part or sub-area of the large area object, and such sub-images can be combined together in a mosaic by software to produce the large format image of the large area object.
- a rotation of the prism (not shown) at 90, 135, and 180 degrees would cause shifts to the left, down, and left + down, respectively.
- Each detector array 104, 106, 108, 110, 112, 114, 116, 118, 120 has its sub-image read out before the prism shift and after each prism shift.
- the read-out time is typically less than one (1) second, such as, for example, about one-half second.
- this large format camera system 150 comprises a plurality, e.g., four, connected sub-camera systems designated as 1, 2, 3, and 4.
- Each sub-camera system 1, 2, 3, 4 in the macro-camera system 150 is, itself, a large format camera system similar to the large format camera system 100 in Figure 4, i.e., each has a plurality of detector arrays exposed to the large area object 230.
- the first sub-camera system 1 comprises a plurality (e.g., six) of detector arrays 152 positioned spaced apart from each other in the focal plane 154 of the lens system 156.
- the spaces 158 between the detector arrays 152 are preferably almost, but not quite, as long or as wide as a detector array 152, so that sub-images from the other sub-camera systems 2, 3, and 4, with similarly sized and shaped detector arrays, can fill the gaps between sub-images acquired by detector arrays 152, with some overlap in the marginal edge portions for stitching together to make a large area macro-image in a manner similar to that described above for camera system 10 and as will be explained in more detail below for this camera system 150.
- When all six of the detector arrays 152 of sub-camera 1 are exposed to the large area object 230 by lens system 156, each of the six detector arrays will create a sub-image of a part or sub-area 1' of the large area object 230, with some overlap at marginal edge portions into adjacent sub-areas, as explained above.
- Those sub-images acquired by the sub-camera system 1 are designated 1" in the macro-image 232 in Figure 7.
- the large format, macro-image 232 in Figure 7 is a macro-image of the large area object 230 in Figure 6, pieced or stitched together from the sub-images 1", 2", 3", 4" acquired by all the sub-camera systems 1, 2, 3, 4.
- Although the center 234 of the large area object 230 is at the perspective center 155 of the sub-camera system 1, i.e., on the optic axis 151 of the sub-camera system 1, it is not included in any of the sub-images 1" in Figure 7 acquired by the sub-camera system 1 of Figure 6, because the optic axis 151 extends through the focal plane 154 in a space 158 between the detector arrays 152 of sub-camera system 1.
- the other sub-camera systems 2, 3, 4 have their respective lens systems 166, 176, 186, all with the same focal length as lens system 156 of sub-camera system 1.
- the lens systems are mounted adjacent each other, preferably in a common plane, so that the optic axis 151, 161, 171, 181 of each lens system 156, 166, 176, 186, respectively, is substantially parallel to the optic axes of the other lens systems.
- each sub-camera system 2, 3, 4 has one or more detector arrays 162, 172, 182, respectively mounted in the respective focal plane 164, 174, 184 of lens systems 166, 176, 186.
- sub-camera system 2 has four detector arrays 162
- sub-camera system 3 has three detector arrays 172
- sub-camera system 4 has two detector arrays 182, all positioned so that sub-images acquired by them fill gaps between sub-images acquired by sub-camera system 1 and by other sub-camera systems with some overlap in marginal edge portions to accommodate stitching sub-images together, as described above.
- One of the advantages of the large format, digital camera of this invention, including, but not limited to the camera system 150 in Figure 6, is that larger format macro-images can be acquired, if desired, by either adding more detector arrays, or using larger detector arrays, or both, without having to add more lens systems.
- the optic axes 151, 161, 171, 181 of all the sub-camera systems 1, 2, 3, 4 are substantially parallel to each other, as explained above.
- the shutters ofthe respective lens systems 156, 166, 176, 186 can be actuated in a timed sequence coordinated with the speed of travel, as will be explained in more detail below.
- the optic axis of each lens system 156, 166, 176, 186, which intersects the optic center 155, 165, 175, 185 of its respective image plane 154, 164, 174, 184, will also extend through the same center 234 of the large area object 230, so that the perspective center 155, 165, 175, 185 of each sub-camera system 1, 2, 3, 4 is the same in order to have the corresponding image center 234" in Figure 7 the same for all the sub-images acquired by all the sub-camera systems 1, 2, 3, 4.
- the optic axis 151, 161, 171, 181 of each sub-camera system 1, 2, 3, 4 extends through the object center 234 as its respective shutter is actuated. Consequently, when the sub-images 1", 2", 3", 4" are acquired in this manner, they will all have the same perspective center 234" corresponding to the object center 234 and to the respective optic centers 155, 165, 175, 185 of the sub-cameras 1, 2, 3, 4.
- the lens systems 156, 166, 176, 186 can be mounted in a common plane 140, and the focal planes 154, 164, 174, 184 can be co-planar.
- the lens systems 156, 166, 176, 186 and their focal planes 154, 164, 174, 184 do not have to be positioned in a straight line as shown in Figure 6, but could instead be positioned at corners of a rectangle, square, or any other pattern, but preferably coplanar and close together to minimize the distance between parallel axes.
- sub-images 1" of sub-areas 1' acquired by sub-camera system 1 are stitched together, as described above, in a mosaic with sub-images 2", 3", 4" of respective sub-areas 2', 3', 4' acquired by sub-camera systems 2, 3, 4, respectively, as shown in Figure 7 to complete the large format, macro-image 230" of large area object 230 of Figure 6.
- an advantage of this kind of large format, digital camera system 150 is that it can create a large format image 230" with a plurality of common, conventional size, detector arrays and with one common optical center 234".
- this large format camera system 150 can accommodate acquisition of all the sub-images with all the detector arrays 152, 162, 172, 182 simultaneously. Stitching them together to get the macro-image 230" can be done later. Therefore, with sequential shutter actuation so that there is one perspective center 234 common to all the sub-cameras 1, 2, 3, 4 and to all the sub-images, the resulting macro-image 230" of the large format camera system 150 is much like a picture or image acquired with an ordinary film-based camera or an ordinary small format digital camera.
- the object 230 can be either stationary or moving at moderate speeds in relation to the camera system 150, and the resulting large format image 230" will not be distorted any more than would be a picture or image acquired by an ordinary film-based camera or a small format digital camera.
- the large format, digital camera 150 could be used in a stationary position to make a large format, digital image of a painting or of a landscape scene, or even scenes in which an object is moving at a moderate speed, where the large format is desired for better resolution of details than can be acquired with ordinary size format digital cameras.
- an example size for each detector array 152, 162, 172, 182 may be 2,000 x 3,000 detectors. Allowing for several hundred pixel rows and columns of overlap in marginal edge areas for stitching sub-images together, as described above, will produce a large format image 230" of about 9,000 x 8,500 pixels. Different numbers and arrangements or patterns of detector arrays in sub-camera systems can be used to produce different sized macro-images.
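The arithmetic behind that example can be sketched as follows; the 5 x 3 grid layout and the 250-pixel overlap are assumptions chosen to reproduce the approximate figures quoted above, not values stated in this description.

```python
def mosaic_size(rows, cols, tile_h, tile_w, overlap):
    # Adjacent tiles share `overlap` pixel rows/columns for stitching,
    # so every tile after the first contributes (size - overlap) pixels.
    height = rows * tile_h - (rows - 1) * overlap
    width = cols * tile_w - (cols - 1) * overlap
    return height, width

# Fifteen 2,000 x 3,000 tiles in an assumed 5 x 3 grid with a
# 250-pixel stitching overlap reproduce the ~9,000 x 8,500 figure.
print(mosaic_size(rows=5, cols=3, tile_h=2000, tile_w=3000, overlap=250))
# (9000, 8500)
```

Different grid layouts or overlap widths simply change the two terms in the formula.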
- the macro-image 236 in Figure 8 can be produced by a first sub-camera system with twelve detector arrays, a second sub-camera system with nine detector arrays, a third sub-camera system with eight detector arrays, and a fourth sub-camera system with six detector arrays.
- these example macro-images 236" and 238" of Figures 8 and 9 are produced with a common optical center for all sub-cameras in the system.
- each of the sub-camera systems 1, 2, 3, 4 of the large format, digital camera system 150 in Figure 6 preferably has an electronically controlled shutter (not shown, but well-known to persons skilled in the art) in its lens system 156, 166, 176, 186, which can be actuated simultaneously.
- Such electronic shutter actuation signals can also be used to synchronize detector array exposure time with the shutter actuations, which is important, especially if the camera system 150 is mounted on a moving vehicle, such as an aircraft or spacecraft.
- each ofthe detector arrays 152, 162, 172, 182 is connected to a separate electronic data processing module, such as sensor electronics, analog electronics, digital electronics, and computer interface to produce individual data streams from the respective detector arrays 152, 162, 172, 182 to the data storage media.
- This preferred arrangement facilitates parallel capture of sub-image data from the individual detector arrays or of small groups of the detector arrays, which is quicker than serial or sequential read-out of data from the detector arrays (also much quicker than read-out of data from a large detector array). Such quicker read-out capability also enables shorter time intervals from one simultaneous exposure of the detector arrays to the next.
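The parallel capture described here can be sketched with a thread pool, one worker standing in for each sensor-electronics module; the fifteen-array count and the `read_out` stub are illustrative assumptions, not part of the patent.

```python
from concurrent.futures import ThreadPoolExecutor

def read_out(array_id):
    # Stand-in for one sensor-electronics module streaming its
    # detector array to the storage media over its own data path.
    return (array_id, "stored")

# All detector arrays are read out in parallel rather than one after
# another, shortening the interval between successive exposures.
with ThreadPoolExecutor(max_workers=15) as pool:
    streams = list(pool.map(read_out, range(15)))

print(len(streams))  # 15 independent data streams
```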
- This capability is important for any application where multiple macro-images have to be acquired quickly one after the other, such as when a path of macro-images of the earth's surface is being acquired with the camera system 150 from an airplane. Only medium accuracy is needed for the shutter synchronization for simultaneous exposure of the detector arrays 152, 162, 172, 182, because the sub-image 1", 2", 3", 4" stitching process compensates for small errors in the synchronization.
- the method and apparatus of this invention can also facilitate quick acquisition of large format, color digital images. It is conventional for photodetectors, such as charge-coupled devices (CCD's) to be arranged to produce color digital images directly. However, that approach is typically at the expense of geometric resolution. Therefore, there is a trade-off between: (i) creating color directly with a smaller area photodetector (CCD) array; or (ii) using black and white photodetectors (CCD's) along with multiple cones (lens to focal plane optical path) and color (wavelength) filters.
- Color information acquisition with a large format digital camera system 250 illustrated in Figure 10 is similar to the large format camera system 150 in Figures 6 - 9, but is equipped with additional sub-camera systems, including lens and detector arrays, for color and near infrared (nir).
- There are eight sub-camera systems 1, 2, 3, 4, r, g, b, nir. Only the light sensor surfaces of the detector arrays of the sub-camera systems are shown in Figure 10, which is sufficient for purposes of this description, since the lens systems and other elements of the sub-camera systems can readily be understood by reference back to other embodiments described above.
- the sub-camera systems 1, 2, 3, 4 of the large format, color, digital camera system have their respective detector arrays 252, 262, 272, 282 positioned in co-planar focal planes 254, 264, 274, 284, and the respective lens systems (not shown in Figure 10) have the same focal lengths, as described above for camera system 150 in Figure 6.
- optical or perspective centers 255, 265, 275, 285 for all of the sub-camera systems 1, 2, 3, 4 in Figure 10 are the same, i.e., are on the optic axes of the respective lens systems (not shown in Figure 10) of the respective sub-camera systems 1, 2, 3, 4 and correspond to the same center of the large area object being imaged (not shown in Figure 10), as also described above for camera system 150 in Figure 6.
- the sub-images 262", 272", 282" of sub-areas of the large area object being imaged (not shown in Figure 10) produced by the detector arrays 262, 272, 282, when combined together with the sub-images 252" of sub-areas of the large area object, fill the gaps between the sub-images 252" to produce the large format, macro-image 240", as shown in Figure 11.
- the overlapping sub-images 252", 262", 272", 282" are stitched together, as described above, to produce the large format, macro-image of the large area object with a high resolution.
- the blue (b), red (r), green (g), and near infrared (nir) detector arrays 256, 266, 276, 286, respectively, in the large format, color camera system 250, are also exposed to the large area object.
- Their resulting sub-images 256", 266", 276", 286" shown in Figures 12a-c and 13, respectively, are then also combined with the large format, macro-image 240" to produce colored and/or near infrared large format macro-images.
- the color sub-images can be produced at the same resolution as the black and white (non-color) sub-images 252", 262", 272", 282", if desired. However, such high resolution is usually not necessary for color purposes.
- each color detector array 256, 266, 276, 286, which is about the same size (e.g., 2,000 x 3,000) as the black and white detector arrays 252, 262, 272, 282, is exposed not just to a sub-area, but to the entire large area object (not shown in Figure 10). Therefore, color sub-images 256", 266", 276", 286" are at a lower resolution than the black and white sub-images 252", 262", 272", 282".
- the blue, red, green, and near infrared sub-cameras that produce the color sub-images 256", 266", 276", 286" have to be configured differently than the black and white sub-cameras to produce such lower resolution, color, sub-images 256", 266", 276", 286" on the same size detector arrays.
- the focal lengths of the lens systems for the color sub-cameras could be shorter than the focal lengths of the black and white sub-cameras in order to expose the color detector arrays 256, 266, 276, 286 to the entire large area object.
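Under a simple pinhole model with the same pixel pitch on the color and panchromatic arrays, the shorter color focal length follows from a proportion; the 100 mm panchromatic focal length and 8,500-pixel macro-image width below are illustrative assumptions, not values stated in this description.

```python
def color_focal_length(f_pan_mm, macro_width_px, color_array_px):
    # The ground span that fills macro_width_px pixels of the stitched
    # panchromatic image must fit onto color_array_px pixels of a single
    # color array, so the focal length shrinks by the same linear ratio.
    return f_pan_mm * color_array_px / macro_width_px

# e.g., a 3,000-pixel-wide color array covering an 8,500-pixel macro-image
print(round(color_focal_length(100.0, 8500, 3000), 2))  # 35.29 (mm)
```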
- optical and perspective centers 257, 267, 277, 287 of the color sub-cameras and resulting perspective sub-image centers 257", 267", 277", 287" are the same as each other and the same as those of the black and white sub-cameras.
- each of such color sub-images can be superimposed onto composite, large format, black and white image 240" of Figure 11 by stitching, as described above.
- the macro-image 240" can be colored.
- the black and white pixels are assigned the color values ofthe dominant (strongest or most intense) color pixels in corresponding geometric locations.
- the nir pixel values are assigned to the black and white pixels at corresponding geometric locations.
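The value assignment in the two preceding bullets can be sketched as a nearest-location lookup: each high-resolution pixel takes the value of the low-resolution color (or nir) pixel at the corresponding geometric position. The 2:1 resolution ratio and pixel values below are invented for illustration.

```python
def assign_color(pan, color, scale):
    # pan: high-resolution 2-D intensity grid; color: lower-resolution
    # 2-D grid of values (e.g., (r, g, b) tuples or nir intensities);
    # scale: number of pan pixels per color pixel along each axis.
    return [[color[i // scale][j // scale] for j in range(len(pan[0]))]
            for i in range(len(pan))]

pan = [[10, 20], [30, 40]]   # 2 x 2 high-resolution pixels
color = [[(255, 0, 0)]]      # one low-resolution color pixel covering them
print(assign_color(pan, color, scale=2))
# [[(255, 0, 0), (255, 0, 0)], [(255, 0, 0), (255, 0, 0)]]
```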
- the above-described merging of the high-resolution panchromatic or black and white macro-image 240" with the lower resolution color and/or near infrared sub-images 256", 266", 276", 286" is preferably, but not necessarily, based on exploiting the high resolution image content of the panchromatic or black and white macro-image for an image segmentation based on edge detection in order to avoid lower resolution color mismatch blurring or bleeding across sharp edges or borders of features in the macro-image 240".
- software used to detect distinct features for the stitching process of the panchromatic or black and white macro-image 240" can also detect edges of features.
- the software can also identify edges of features in the sub-images 70, 72, 74, 76 and in the macro-image 80, such as the edges of the stream 34, railroad 36, buildings 38, 40, 42, streets 44, 46, bridge 48, and the like.
- the break between, for example, dominant color pixels of water in the stream 34 segment of macro-image 80 and color pixels of the adjacent land segment of macro-image 80 would be based on the pixels in the high resolution image 80 that identify the edges of the stream 34 rather than on the lower resolution pixels of color and/or nir sub-images.
- the break between the blue color (stream 34) segment and the green color (adjacent land) segment will be made by assigning blue values to the high resolution pixels on the one side ofthe stream 34 edge and assigning green values to the high resolution pixels on the other side ofthe stream 34 edge.
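The stream/land example can be sketched in one dimension: the position of the color break comes from the edge found in the high-resolution panchromatic row, and each side simply inherits its dominant color. The pixel values and edge index here are invented for illustration.

```python
def segment_colorize(pan_row, edge_index, left_color, right_color):
    # The color break is placed at the edge detected in the
    # high-resolution panchromatic row, not at the coarse boundary
    # implied by the low-resolution color pixels.
    return [left_color if j < edge_index else right_color
            for j in range(len(pan_row))]

row = [12, 14, 13, 90, 92, 91]   # sharp intensity edge at index 3
print(segment_colorize(row, edge_index=3,
                       left_color="blue", right_color="green"))
# ['blue', 'blue', 'blue', 'green', 'green', 'green']
```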
- CDS (Correlated Double Sampling)
- a preferred, but not essential, feature of this invention is to convert the CCD output signals twice per pixel using an analog to digital (A/D) converter at the double sampling rate and to perform the subtraction of the two sampled output signals in the digital domain, instead of the analog domain.
- the subtraction can be performed in a digital signal processor, which may also be used for other purposes, such as data compression, or in a programmable logic device used to provide signals necessary to control the CCD.
- the voltage of the output signal 300 of the CCD is sampled first at 301, after deleting the charge of the previous exposure, i.e., of the previous pixel information, to obtain its reset level. Then, the actual pixel information is shifted to the output gate and the output signal 300 is sampled again at 302.
- the difference between the reset voltage 301 and the exposed pixel voltage 302 is the voltage value that is indicative of the actual intensity of the light energy incident on the actual pixel position of the CCD during the exposure. [0081] However, rather than making an analog subtraction of these two voltage levels 301, 302, the subtraction is preferably done digitally.
- the analog output signal 300 of the CCD is preamplified and conditioned in a preamplifier circuit 304 and converted to digital format in the analog to digital converter 306.
- such preamplifier circuit 304 and analog to digital converter 306 circuits and devices are well-known in the art.
- the digital signal processor or programmable logic device 308 then performs the subtraction digitally to read out a digital value of the light intensity that is incident on the CCD.
- the analog to digital converter 306 has to be used at twice the sampling rate.
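A minimal sketch of that digital subtraction, assuming the A/D converter delivers one digitized reset sample and one digitized exposed sample per pixel; the sample values are invented for illustration.

```python
def digital_cds(reset_levels, exposed_levels):
    # With the converter run at twice the pixel rate, each pixel yields
    # two digital samples; their difference is the light-induced signal,
    # computed here in the digital rather than the analog domain.
    return [exposed - reset
            for reset, exposed in zip(reset_levels, exposed_levels)]

print(digital_cds([100, 102, 99], [340, 512, 99]))  # [240, 410, 0]
```

Subtracting per pixel in this way also removes the pixel-to-pixel variation of the reset level, which is the point of correlated double sampling.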
- An alternative to the large format, color, digital camera system 250 of Figure 10 is the large format, color, digital camera system 310 shown in Figure 15, especially for kinematic (moving) digital imaging systems, e.g., aerial imaging systems in airplanes and the like.
- the monochromatic, high resolution sub-camera systems 1, 2, 3, 4 of this camera system 310 are shown, for example, arranged much like those in the camera system embodiment 150 of Figures 6 and 7, thus the same component designation numbers are used in Figure 15 for those components and no further explanation of the acquisition of the monochromatic or panchromatic sub-images 1", 2", 3", 4" and stitching to produce the monochromatic or panchromatic macro-image 230" is needed.
- another sub-camera system 5 with a set of color linear detector arrays, e.g., blue 312, red 314, green 316, is positioned adjacent, or intermingled with, the monochromatic sub-camera systems 1, 2, 3, 4, as shown in Figure 15.
- the lens system for the color sub-camera system 5 is not shown in Figure 15, because persons skilled in the art know how to focus the large area object 230 (Figure 6) onto the linear color arrays 312, 314, 316 (Figure 15). Suffice it to say that the full width of the large area object that comprises the macro-image 230" should be detected by the color linear arrays 312, 314, 316, as the camera system 310 moves as indicated by arrow 318 (Figure 15) over the large area object 230 (Figure 6).
- the color sub-images acquired by the linear color detector arrays 312, 314, 316 do not have to be the same resolution as the sub-images 1", 2", 3", 4" acquired by the monochromatic or panchromatic detector arrays 152, 162, 172, 182. Therefore, the densities of photodetectors in the linear color arrays 312, 314, 316 can be more or less than the detector arrays 152, 162, 172, 182.
- the optical and perspective center 315 of the color sub-camera system 5 should be the same as the optical and perspective centers 155, 165, 175, 185 of the sub-camera systems 1, 2, 3, 4, so its shutter should be operated in timed sequence related to travel speed, as explained above.
- the color sub-images acquired by linear color detector arrays 312, 314, 316 can be added by merging them together with the macro-image 230" in the same manner as described above for the large format, color, digital camera system 250 of Figures 10 - 13.
- a sub-camera system similar to, or the same as, the color sub-camera system 5 of camera system 310, with one or more linear arrays of monochromatic, panchromatic, or even color photodetector capabilities, can also be used to fill holes, spaces, or imperfections in or between the other sub-images 1", 2", 3", 4".
- Figure 16 shows two sub-camera systems 1, 2 (again, no lens systems or other camera components are shown in Figure 16 to avoid unnecessary complexity).
- the sub-camera system 1 is shown, for example, with six detector arrays 322, and the sub-camera system 2 is shown, for example, with four detector arrays 324.
- each detector array is considered to be, for example, a 2,000 x 3,000 array.
- the lenses have, for example, a 100 mm focal length, and the distance between the lenses is, for example, 200 mm.
- Separate respective sensor electronics modules capture and store data from each respective detector array.
- a respective mechanical shutter operates for each respective lens system. Integration and synchronization of all these sub-systems has to be accomplished. Overlap in flight paths is used for acquiring redundant data in marginal edge portions of adjacent macro-images, if it is desired to piece together multiple macro-images.
- the data flow to produce a seamless image is organized in three phases as follows. Phase A: Calibration [0085] A laboratory calibration will be performed for each camera before any production images are made.
- the calibration assesses the optical properties of the camera parts by imaging a known test field in a laboratory environment. This defines where the sub-images of a particular optical system go in the stitched final image.
- ki imaging arrays in each cone i, i = 1, . . ., n
- Camera calibration is done by using a calibration object and image measurements and multi- image overlay (field calibration).
- the camera will be mounted in an airplane and images will be produced by each cone, and within each cone, with each square array (image patch or sub-image). These sub-images will be stored in a digital mass storage medium for subsequent processing.
- the shutter in each cone will be released or actuated sequentially at carefully timed intervals related to speed of flight, so that the optical center 325, 326 of each cone 1, 2 is at the same location along the flight path when the image is taken. That means that if two cones are a distance of 30 centimeters apart in the direction of flight, their shutters will be released one after the other so that the airplane has flown 30 centimeters between the two actuations.
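The 30-centimeter example reduces to a delay of cone separation divided by ground speed; the 70 m/s ground speed used below is only an illustrative assumption.

```python
def shutter_delay_s(separation_m, ground_speed_m_s):
    # Time for the trailing cone's optical center to reach the point
    # along the flight path where the leading cone was exposed.
    return separation_m / ground_speed_m_s

# 30 cm cone separation at an assumed 70 m/s ground speed
print(round(shutter_delay_s(0.30, 70.0) * 1000, 2))  # 4.29 (ms)
```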
- the geometric differences between the sub images ofthe two cones would be minimized and the sub-images all have the same optical center.
- where the sub-camera systems 1, 2 are stationary with respect to the large area object being imaged, or where the shutters are actuated simultaneously, the sub-cameras 1, 2 both have to be oriented and focused in such a manner that their respective optical axes intersect the large area object being imaged at the same center point (e.g., point on the ground) at the same time.
- Phase C: Converting the sub-images into a large, seamless, "stitched" frame image
- a stitched output image is produced from the many sub-images stored in the storage medium.
- the total number of sub-images is k1 + k2 + . . . + kn, if there are n optical sub-systems (cones), and in each cone i there are ki imaging arrays.
- Figure 17 illustrates the basic idea of the data stitching operation. Steps are as follows:
- Cone 1 is selected as the master frame. Sub-images 324" of cone 2 are transformed into the gaps between sub-images 322" of cone 1.
- One of the cones (e.g., cone 1) is called the "master cone" of the camera. Its sub-images 322" will be placed into the proper locations of the output image using the calibration information from Phase A. Sub-images 324" from the other n-1 cones are then used to fill in the empty spaces left by the sub-images 322" from the master cone 1. To fill in those spaces, the sub-images 324" of those secondary cone(s) 2 (or n-1) are matched geometrically and radiometrically with the sub-images 322" of the master cone 1.
- each sub-image 324" of cone 2 is transformed into the system of cone 1 to a sub-pixel accuracy, while the fit was not sub-pixel accurate before (e.g., 4 pixels at an image scale of 1/5000, a speed of 70 m/sec, and 3 msec delays).
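The parenthetical figures are consistent with a short back-of-the-envelope check; the 10-micrometre detector pixel pitch below is an assumption needed to land on roughly 4 pixels, and is not stated in this description.

```python
speed = 70.0           # m/s ground speed
delay = 0.003          # s timing error between cone exposures
scale = 1.0 / 5000.0   # image scale
pixel_pitch = 10e-6    # m, assumed detector pixel pitch

ground_shift = speed * delay          # 0.21 m shift on the ground
image_shift = ground_shift * scale    # 42 micrometres shift in the image
print(round(image_shift / pixel_pitch, 1))  # 4.2 pixels, roughly the 4 quoted
```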
- After the process of image data acquisition, the raw image data get post-processed and converted into the large format, seamless, color frame images by exploiting the computing power of the one or multiple digital data processors of the intelligent image "box".
- the large seamless color frame images are stored on the one or more mass data storage media of the intelligent image "box".
- a vehicle, e.g., an aircraft
- the camera will cover perhaps a 90-degree field-of-view, thus about 500 meters to either side of the flight path from a flight altitude of 500 meters, and images will be overlapping greatly. This redundancy can be used to fill in holes in the final image (see Figure 19), and it can be used to reduce the number of sub-camera systems needed, as compared to still camera applications.
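For a nadir-looking camera over flat terrain, the 90-degree figure implies across-track coverage of about twice the flight altitude; a rough check, treating the field of view as symmetric about nadir:

```python
import math

def swath_m(altitude_m, fov_deg):
    # Total across-track ground coverage of a nadir-looking camera
    # with the given full field of view, assuming flat terrain.
    return 2.0 * altitude_m * math.tan(math.radians(fov_deg / 2.0))

# 90-degree field of view from 500 m: about 500 m to either side of nadir
print(round(swath_m(500.0, 90.0)))  # 1000
```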
- the field-of-view is defined by the optical lens of the camera. To increase this field of view, one can use additional optical systems (cones), or use larger detector arrays, or use more detector arrays in one single cone. These alternatives are supported in this innovation. [0096] Without the kinematics from a moving platform, the motion-based (in-flight) redundancy is not available.
- the large format digital camera systems of this invention can be equipped with a passive platform assembly 400, as shown in Figure 20, to minimize vibrations and to minimize movements caused by the carrier 402 of the camera 410, which may degrade the quality of the images acquired by the camera 410 (motion blur).
- the camera carrier 402 of the passive platform assembly 400 is equipped with small scale acceleration sensors 404, which are mounted on the camera carrier 402 in such a way that residual vibrations and movements of the camera 410 are detected over a short period of time.
- the signals of the small scale acceleration sensors 404 are transmitted to the camera control system (not shown in Figure 20) and evaluated in such a way that a decision can be made whether a serious degradation of the image quality (motion blur) caused by such residual vibrations or movements is expected or is not expected.
- the camera 410 is designed in such a way that multiple exposures of the detector arrays are possible within an extremely short period of time.
- the digital sub-images acquired by these multiple exposures will not be stored unless the evaluation of the signals of the small scale acceleration sensors 404 shows no serious vibrations or movements of the camera carrier 402.
- the series of multiple exposures is repeated until the evaluation of the signals of the small scale acceleration sensors 404 shows that a sub-image is being acquired without quality degradation due to vibrations or movements (motion blur).
- Such a non-degraded sub-image, i.e., the last of the series, is exposed within an extremely short period of time and is stored on digital media or memory during a much longer time.
- the camera carrier 402 of the platform assembly 400 has outer swingers or outriggers 406, 408 with camera 410 looking down and two masses 412, 414 connected rigidly to the respective outriggers 406, 408.
- Two inner masses 416, 418 are hung on an inner swinger bar 420, which is connected pivotally 422 to the camera carrier 402.
- a damper 424 is connected between at least outrigger 406 and at least one of the inner masses 416, 418 or to the inner swinger bar 420 to dampen any vibrations or relative movements between the camera carrier 402 and the inner masses 416, 418, which would otherwise appear as residual vibrations or relative movements in the signals of the acceleration sensors 404, as explained above.
Priority Applications (4)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
EP18169169.2A EP3388784B1 (en) | 2001-05-04 | 2002-05-06 | Method and large format camera for acquiring a large format image of a large area object |
AU2002308651A AU2002308651A1 (en) | 2001-05-04 | 2002-05-06 | Digital camera for and method of obtaining overlapping images |
EP02769388.6A EP1384046B1 (en) | 2001-05-04 | 2002-05-06 | Digital camera for and method of obtaining overlapping images |
DK02769388.6T DK1384046T3 (en) | 2001-05-04 | 2002-05-06 | DIGITAL CAMERA AND PROCEDURE TO OBTAIN OVERLAPPING IMAGES |
Applications Claiming Priority (4)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US28900801P | 2001-05-04 | 2001-05-04 | |
US60/289,008 | 2001-05-04 | ||
US10/140,532 US7009638B2 (en) | 2001-05-04 | 2002-05-06 | Self-calibrating, digital, large format camera with single or multiple detector arrays and single or multiple optical systems |
US10/140,532 | 2002-05-06 |
Publications (3)
Publication Number | Publication Date |
---|---|
WO2002090887A2 true WO2002090887A2 (en) | 2002-11-14 |
WO2002090887A3 WO2002090887A3 (en) | 2003-01-09 |
WO2002090887A8 WO2002090887A8 (en) | 2003-07-03 |
Family
ID=26838261
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/US2002/014566 WO2002090887A2 (en) | 2001-05-04 | 2002-05-06 | Digital camera for and method of obtaining overlapping images |
Country Status (4)
Country | Link |
---|---|
US (2) | US7009638B2 (en) |
EP (2) | EP1384046B1 (en) |
AU (1) | AU2002308651A1 (en) |
WO (1) | WO2002090887A2 (en) |
FR2899344B1 (en) * | 2006-04-03 | 2008-08-15 | Eads Astrium Sas Soc Par Actio | METHOD FOR RESTITUTION OF MOVEMENTS OF THE OPTICAL LINE OF AN OPTICAL INSTRUMENT |
JP2009534772A (en) * | 2006-04-24 | 2009-09-24 | エヌエックスピー ビー ヴィ | Method and apparatus for generating a panoramic image from a video sequence |
US7912296B1 (en) | 2006-05-02 | 2011-03-22 | Google Inc. | Coverage mask generation for large images |
US7965902B1 (en) * | 2006-05-19 | 2011-06-21 | Google Inc. | Large-scale image processing using mass parallelization techniques |
US7916362B2 (en) * | 2006-05-22 | 2011-03-29 | Eastman Kodak Company | Image sensor with improved light sensitivity |
US8762493B1 (en) | 2006-06-22 | 2014-06-24 | Google Inc. | Hierarchical spatial data structure and 3D index data versioning for generating packet data |
JP4470926B2 (en) * | 2006-08-08 | 2010-06-02 | 国際航業株式会社 | Aerial photo image data set and its creation and display methods |
KR101264804B1 (en) * | 2006-08-16 | 2013-05-15 | 삼성전자주식회사 | panorama photography method and apparatus capable of informing optimum position of photographing |
US7873238B2 (en) | 2006-08-30 | 2011-01-18 | Pictometry International Corporation | Mosaic oblique images and methods of making and using same |
US8031258B2 (en) | 2006-10-04 | 2011-10-04 | Omnivision Technologies, Inc. | Providing multiple video signals from single sensor |
US20080100711A1 (en) * | 2006-10-26 | 2008-05-01 | Wisted Jeffrey M | Integrated Multiple Imaging Device |
CN101197044B (en) * | 2006-12-06 | 2011-02-02 | 鸿富锦精密工业(深圳)有限公司 | Image synthesis system and method |
US7575375B2 (en) * | 2007-01-11 | 2009-08-18 | General Electric Company | Systems and methods for reducing movement of an object |
US8593518B2 (en) * | 2007-02-01 | 2013-11-26 | Pictometry International Corp. | Computer system for continuous oblique panning |
KR101456652B1 (en) * | 2007-02-01 | 2014-11-04 | 이섬 리서치 디벨러프먼트 컴파니 오브 더 히브루 유니버시티 오브 예루살렘 엘티디. | Method and System for Video Indexing and Video Synopsis |
US8520079B2 (en) * | 2007-02-15 | 2013-08-27 | Pictometry International Corp. | Event multiplexer for managing the capture of images |
US7978239B2 (en) * | 2007-03-01 | 2011-07-12 | Eastman Kodak Company | Digital camera using multiple image sensors to provide improved temporal sampling |
US20080231698A1 (en) * | 2007-03-20 | 2008-09-25 | Alan Edward Kaplan | Vehicle video control system |
US8098956B2 (en) | 2007-03-23 | 2012-01-17 | Ventana Medical Systems, Inc. | Digital microscope slide scanning system and methods |
US8385672B2 (en) * | 2007-05-01 | 2013-02-26 | Pictometry International Corp. | System for detecting image abnormalities |
US9262818B2 (en) | 2007-05-01 | 2016-02-16 | Pictometry International Corp. | System for detecting image abnormalities |
WO2008157764A1 (en) * | 2007-06-21 | 2008-12-24 | Kirtas Technologies, Inc. | Automated page turning apparatus to assist in viewing pages of a document |
US20090041368A1 (en) * | 2007-08-06 | 2009-02-12 | Microsoft Corporation | Enhancing digital images using secondary optical systems |
US8063941B2 (en) * | 2007-08-06 | 2011-11-22 | Microsoft Corporation | Enhancing digital images using secondary optical systems |
US7859572B2 (en) * | 2007-08-06 | 2010-12-28 | Microsoft Corporation | Enhancing digital images using secondary optical systems |
US7991226B2 (en) * | 2007-10-12 | 2011-08-02 | Pictometry International Corporation | System and process for color-balancing a series of oblique images |
US8531472B2 (en) | 2007-12-03 | 2013-09-10 | Pictometry International Corp. | Systems and methods for rapid three-dimensional modeling with real façade texture |
US20090180085A1 (en) * | 2008-01-15 | 2009-07-16 | Kirtas Technologies, Inc. | System and method for large format imaging |
IL188825A0 (en) * | 2008-01-16 | 2008-11-03 | Orbotech Ltd | Inspection of a substrate using multiple cameras |
US8520054B2 (en) * | 2008-01-23 | 2013-08-27 | Techtol Holdings, Llc | System and method to quickly acquire images |
US8675068B2 (en) | 2008-04-11 | 2014-03-18 | Nearmap Australia Pty Ltd | Systems and methods of capturing large area images in detail including cascaded cameras and/or calibration features |
US8497905B2 (en) | 2008-04-11 | 2013-07-30 | nearmap australia pty ltd. | Systems and methods of capturing large area images in detail including cascaded cameras and/or calibration features |
US8260085B2 (en) * | 2008-06-03 | 2012-09-04 | Bae Systems Information Solutions Inc. | Fusion of image block adjustments for the generation of a ground control network |
JP4513905B2 (en) * | 2008-06-27 | 2010-07-28 | ソニー株式会社 | Signal processing apparatus, signal processing method, program, and recording medium |
US8588547B2 (en) | 2008-08-05 | 2013-11-19 | Pictometry International Corp. | Cut-line steering methods for forming a mosaic image of a geographical area |
US8378290B1 (en) * | 2008-09-02 | 2013-02-19 | Flir Systems, Inc. | Sensor calibration systems and methods for infrared cameras |
US8049163B1 (en) | 2008-09-02 | 2011-11-01 | Flir Systems, Inc. | Calibration systems and methods for infrared cameras |
JP5030906B2 (en) * | 2008-09-11 | 2012-09-19 | 株式会社日立ハイテクノロジーズ | Panorama image synthesis method and apparatus using scanning charged particle microscope |
US8253815B2 (en) * | 2008-09-16 | 2012-08-28 | Altia Systems Inc. | Synchronized multiple imager system and method |
US20100079508A1 (en) * | 2008-09-30 | 2010-04-01 | Andrew Hodge | Electronic devices with gaze detection capabilities |
DE102008058311A1 (en) * | 2008-11-18 | 2010-07-22 | Jena-Optronik Gmbh | Arrangement for missile-based image data recording from the surface of a celestial body |
EP2683156A3 (en) | 2008-12-12 | 2015-02-11 | Testo AG | Thermal imaging camera |
US9091755B2 (en) | 2009-01-19 | 2015-07-28 | Microsoft Technology Licensing, Llc | Three dimensional image capture system for imaging building facades using a digital camera, near-infrared camera, and laser range finder |
US8300108B2 (en) * | 2009-02-02 | 2012-10-30 | L-3 Communications Cincinnati Electronics Corporation | Multi-channel imaging devices comprising unit cells |
US8441532B2 (en) * | 2009-02-24 | 2013-05-14 | Corning Incorporated | Shape measurement of specular reflective surface |
CN102405431B (en) | 2009-03-11 | 2015-09-16 | 美国樱花检验仪器株式会社 | Auto focusing method and autofocus device |
WO2010116369A1 (en) * | 2009-04-07 | 2010-10-14 | Nextvision Stabilized Systems Ltd | Methods of manufacturing a camera system having multiple image sensors |
JP2010250612A (en) * | 2009-04-16 | 2010-11-04 | Canon Inc | Image processing apparatus and image processing method |
US20100265313A1 (en) * | 2009-04-17 | 2010-10-21 | Sony Corporation | In-camera generation of high quality composite panoramic images |
JP4854819B2 (en) * | 2009-05-18 | 2012-01-18 | 小平アソシエイツ株式会社 | Image information output method |
US8401222B2 (en) * | 2009-05-22 | 2013-03-19 | Pictometry International Corp. | System and process for roof measurement using aerial imagery |
US20100316291A1 (en) * | 2009-06-11 | 2010-12-16 | Shulan Deng | Imaging terminal having data compression |
US8817096B1 (en) * | 2009-06-22 | 2014-08-26 | Lockheed Martin Corporation | Imaging systems and methods for generating image data |
US8462209B2 (en) * | 2009-06-26 | 2013-06-11 | Keyw Corporation | Dual-swath imaging system |
US20110169985A1 (en) * | 2009-07-23 | 2011-07-14 | Four Chambers Studio, LLC | Method of Generating Seamless Mosaic Images from Multi-Axis and Multi-Focus Photographic Data |
DE102009050073A1 (en) | 2009-10-20 | 2011-04-21 | Fachhochschule Gelsenkirchen | Image sensor arrangement for acquiring image information for automatic image data processing |
US9330494B2 (en) * | 2009-10-26 | 2016-05-03 | Pictometry International Corp. | Method for the automatic material classification and texture simulation for 3D models |
KR20110050834A (en) * | 2009-11-09 | 2011-05-17 | 삼성전자주식회사 | Appratus and method for registratiing image in portable terminal |
US8542286B2 (en) * | 2009-11-24 | 2013-09-24 | Microsoft Corporation | Large format digital camera with multiple optical systems and detector arrays |
US8665316B2 (en) | 2009-11-24 | 2014-03-04 | Microsoft Corporation | Multi-resolution digital large format camera with multiple detector arrays |
TWI417639B (en) * | 2009-12-30 | 2013-12-01 | Ind Tech Res Inst | Method and system for forming surrounding seamless bird-view image |
IN2012DN06329A (en) | 2010-01-26 | 2015-10-02 | Saab Ab | |
US20110228115A1 (en) * | 2010-03-16 | 2011-09-22 | Microsoft Corporation | Large Format Digital Camera |
US8896668B2 (en) | 2010-04-05 | 2014-11-25 | Qualcomm Incorporated | Combining data from multiple image sensors |
US9001227B2 (en) | 2010-04-05 | 2015-04-07 | Qualcomm Incorporated | Combining data from multiple image sensors |
TW201138432A (en) * | 2010-04-28 | 2011-11-01 | Hon Hai Prec Ind Co Ltd | Monitoring system and monitoring method |
CN102238369A (en) * | 2010-04-29 | 2011-11-09 | 鸿富锦精密工业(深圳)有限公司 | Monitoring system and method |
US8970672B2 (en) | 2010-05-28 | 2015-03-03 | Qualcomm Incorporated | Three-dimensional image processing |
US8477190B2 (en) | 2010-07-07 | 2013-07-02 | Pictometry International Corp. | Real-time moving platform management system |
US10139613B2 (en) | 2010-08-20 | 2018-11-27 | Sakura Finetek U.S.A., Inc. | Digital microscope and method of sensing an image of a tissue sample |
WO2012044297A1 (en) | 2010-09-30 | 2012-04-05 | Empire Technology Development Llc | Automatic flight control for uav based solid modeling |
US8842168B2 (en) * | 2010-10-29 | 2014-09-23 | Sony Corporation | Multi-view video and still 3D capture system |
US20120113213A1 (en) * | 2010-11-05 | 2012-05-10 | Teledyne Dalsa, Inc. | Wide format sensor |
US8866890B2 (en) * | 2010-11-05 | 2014-10-21 | Teledyne Dalsa, Inc. | Multi-camera |
US9124881B2 (en) * | 2010-12-03 | 2015-09-01 | Fly's Eye Imaging LLC | Method of displaying an enhanced three-dimensional images |
US8823732B2 (en) | 2010-12-17 | 2014-09-02 | Pictometry International Corp. | Systems and methods for processing images with edge detection and snap-to feature |
JP5775354B2 (en) | 2011-04-28 | 2015-09-09 | 株式会社トプコン | Takeoff and landing target device and automatic takeoff and landing system |
NL1038793C2 (en) * | 2011-05-03 | 2012-11-06 | Dirk Hendrik Martin Ruiter | MILK INSTALLATION. |
JP5882693B2 (en) * | 2011-11-24 | 2016-03-09 | 株式会社トプコン | Aerial photography imaging method and aerial photography imaging apparatus |
EP2527787B1 (en) | 2011-05-23 | 2019-09-11 | Kabushiki Kaisha TOPCON | Aerial photograph image pickup method and aerial photograph image pickup apparatus |
JP5708278B2 (en) * | 2011-06-08 | 2015-04-30 | ソニー株式会社 | Information processing apparatus and information processing method |
AU2012364820B2 (en) | 2011-06-10 | 2016-11-10 | Pictometry International Corp. | System and method for forming a video stream containing GIS data in real-time |
WO2013003485A1 (en) * | 2011-06-28 | 2013-01-03 | Inview Technology Corporation | Image sequence reconstruction based on overlapping measurement subsets |
IL216515A (en) * | 2011-11-22 | 2015-02-26 | Israel Aerospace Ind Ltd | System and method for processing multi-camera array images |
CN102635059B (en) * | 2012-02-23 | 2014-02-26 | 江西省交通设计研究院有限责任公司 | Bridge investigation method |
US9183538B2 (en) | 2012-03-19 | 2015-11-10 | Pictometry International Corp. | Method and system for quick square roof reporting |
JP6122591B2 (en) | 2012-08-24 | 2017-04-26 | 株式会社トプコン | Photogrammetry camera and aerial photography equipment |
WO2014062885A1 (en) | 2012-10-17 | 2014-04-24 | Bio-Rad Laboratories, Inc. | Image capture for large analyte arrays |
US9036002B2 (en) * | 2012-10-30 | 2015-05-19 | Eastman Kodak Company | System for making a panoramic image |
RU2518365C1 (en) * | 2012-11-22 | 2014-06-10 | Александр Николаевич Барышников | Optical-electronic photodetector (versions) |
US9244272B2 (en) | 2013-03-12 | 2016-01-26 | Pictometry International Corp. | Lidar system producing multiple scan paths and method of making and using same |
US9881163B2 (en) | 2013-03-12 | 2018-01-30 | Pictometry International Corp. | System and method for performing sensitive geo-spatial processing in non-sensitive operator environments |
US9275080B2 (en) | 2013-03-15 | 2016-03-01 | Pictometry International Corp. | System and method for early access to captured images |
US9720223B2 (en) | 2013-03-15 | 2017-08-01 | Lawrence Livermore National Security, Llc | Integrated telescope assembly |
US9753950B2 (en) | 2013-03-15 | 2017-09-05 | Pictometry International Corp. | Virtual property reporting for automatic structure detection |
DE102013103971A1 (en) | 2013-04-19 | 2014-11-06 | Sensovation Ag | Method for generating an overall picture of an object composed of several partial images |
US20140365639A1 (en) | 2013-06-06 | 2014-12-11 | Zih Corp. | Method, apparatus, and computer program product for performance analytics for determining role, formation, and play data based on real-time data for proximity and movement of objects |
US10437658B2 (en) | 2013-06-06 | 2019-10-08 | Zebra Technologies Corporation | Method, apparatus, and computer program product for collecting and displaying sporting event data based on real time data for proximity and movement of objects |
US9715005B2 (en) | 2013-06-06 | 2017-07-25 | Zih Corp. | Method, apparatus, and computer program product improving real time location systems with multiple location technologies |
US9517417B2 (en) | 2013-06-06 | 2016-12-13 | Zih Corp. | Method, apparatus, and computer program product for performance analytics determining participant statistical data and game status data |
US10609762B2 (en) | 2013-06-06 | 2020-03-31 | Zebra Technologies Corporation | Method, apparatus, and computer program product improving backhaul of sensor and other data to real time location system network |
US11423464B2 (en) | 2013-06-06 | 2022-08-23 | Zebra Technologies Corporation | Method, apparatus, and computer program product for enhancement of fan experience based on location data |
US9699278B2 (en) | 2013-06-06 | 2017-07-04 | Zih Corp. | Modular location tag for a real time location system network |
FR3013135B1 (en) * | 2013-11-14 | 2015-12-25 | Imao | VERY HIGH RESOLUTION PHOTOGRAPHIC CAMERA WITH VERY LARGE IMAGE SIZE |
US10007102B2 (en) | 2013-12-23 | 2018-06-26 | Sakura Finetek U.S.A., Inc. | Microscope with slide clamping assembly |
US9612598B2 (en) | 2014-01-10 | 2017-04-04 | Pictometry International Corp. | Unmanned aircraft structure evaluation system and method |
US9292913B2 (en) | 2014-01-31 | 2016-03-22 | Pictometry International Corp. | Augmented three dimensional point collection of vertical structures |
WO2015120188A1 (en) | 2014-02-08 | 2015-08-13 | Pictometry International Corp. | Method and system for displaying room interiors on a floor plan |
DE102014207315A1 (en) * | 2014-04-16 | 2015-10-22 | Spheronvr Ag | camera assembly |
US9626616B2 (en) | 2014-06-05 | 2017-04-18 | Zih Corp. | Low-profile real-time location system tag |
US9668164B2 (en) | 2014-06-05 | 2017-05-30 | Zih Corp. | Receiver processor for bandwidth management of a multiple receiver real-time location system (RTLS) |
GB2542298B (en) | 2014-06-05 | 2021-01-20 | Zebra Tech Corp | Method for iterative target location in a multiple receiver target location system |
US9661455B2 (en) | 2014-06-05 | 2017-05-23 | Zih Corp. | Method, apparatus, and computer program product for real time location system referencing in physically and radio frequency challenged environments |
GB2541834B (en) | 2014-06-05 | 2020-12-23 | Zebra Tech Corp | Receiver processor for adaptive windowing and high-resolution TOA determination in a multiple receiver target location system |
GB2541617B (en) | 2014-06-05 | 2021-07-07 | Zebra Tech Corp | Systems, apparatus and methods for variable rate ultra-wideband communications |
US20150375083A1 (en) | 2014-06-05 | 2015-12-31 | Zih Corp. | Method, Apparatus, And Computer Program Product For Enhancement Of Event Visualizations Based On Location Data |
EP3152585B1 (en) | 2014-06-06 | 2022-04-27 | Zebra Technologies Corporation | Method, apparatus, and computer program product improving real time location systems with multiple location technologies |
US9759803B2 (en) | 2014-06-06 | 2017-09-12 | Zih Corp. | Method, apparatus, and computer program product for employing a spatial association model in a real time location system |
US9858645B2 (en) * | 2014-07-01 | 2018-01-02 | Digitalglobe, Inc. | Automated seamline construction for high-quality high-resolution orthomosaics |
CN105592294B (en) * | 2014-10-21 | 2018-10-02 | 中国石油化工股份有限公司 | A kind of monitoring system of VSP excitations big gun group |
JP6354619B2 (en) * | 2015-02-27 | 2018-07-11 | ブラザー工業株式会社 | Image processing apparatus and computer program |
RU2616103C2 (en) * | 2015-09-11 | 2017-04-12 | Валентина Николаевна Панфилова | Automated method of charting road traffic accident by using global positioning system and cameras |
TWI582388B (en) | 2015-10-16 | 2017-05-11 | 財團法人工業技術研究院 | Image stitching method and image stitching device |
TWI607901B (en) * | 2015-11-06 | 2017-12-11 | 財團法人工業技術研究院 | Image inpainting system area and method using the same |
GB2549446B (en) * | 2015-12-08 | 2020-05-27 | William Mort Hugh | Utility meter register optical reading device |
US10194089B2 (en) | 2016-02-08 | 2019-01-29 | Qualcomm Incorporated | Systems and methods for implementing seamless zoom function using multiple cameras |
WO2017142788A1 (en) | 2016-02-15 | 2017-08-24 | Pictometry International Corp. | Automated system and methodology for feature extraction |
US10671648B2 (en) | 2016-02-22 | 2020-06-02 | Eagle View Technologies, Inc. | Integrated centralized property database systems and methods |
US10546385B2 (en) * | 2016-02-25 | 2020-01-28 | Technion Research & Development Foundation Limited | System and method for image capture device pose estimation |
US10290111B2 (en) | 2016-07-26 | 2019-05-14 | Qualcomm Incorporated | Systems and methods for compositing images |
US10297034B2 (en) | 2016-09-30 | 2019-05-21 | Qualcomm Incorporated | Systems and methods for fusing images |
TWI614500B (en) * | 2016-11-21 | 2018-02-11 | 國立清華大學 | Image registering and stitching method and image detection system for cell detection chip |
US11280803B2 (en) | 2016-11-22 | 2022-03-22 | Sakura Finetek U.S.A., Inc. | Slide management system |
US11158060B2 (en) * | 2017-02-01 | 2021-10-26 | Conflu3Nce Ltd | System and method for creating an image and/or automatically interpreting images |
US11176675B2 (en) | 2017-02-01 | 2021-11-16 | Conflu3Nce Ltd | System and method for creating an image and/or automatically interpreting images |
US20210365725A1 (en) * | 2017-02-01 | 2021-11-25 | Conflu3Nce Ltd | System and method for creating an image and/or automatically interpreting images |
CN110462678B (en) * | 2017-03-30 | 2023-05-02 | 富士胶片株式会社 | Image processing apparatus and image processing method |
US11747172B2 (en) | 2017-10-24 | 2023-09-05 | Deer Technology Ltd. | Utility meter register optical reading device |
CN108444451B (en) * | 2018-03-19 | 2020-10-20 | 中国人民解放军战略支援部队信息工程大学 | Planet surface image matching method and device |
US10153317B1 (en) * | 2018-04-26 | 2018-12-11 | Alentic Microscience Inc. | Image sensors comprising a chamber to confine a sample at a sensor surface of successive light sensitive subareas and non-light sensitive areas |
WO2020024576A1 (en) * | 2018-08-01 | 2020-02-06 | Oppo广东移动通信有限公司 | Camera calibration method and apparatus, electronic device, and computer-readable storage medium |
CN109274923A (en) * | 2018-11-21 | 2019-01-25 | 南京文采工业智能研究院有限公司 | A kind of Intellisense device for industrial equipment |
CN109702319B (en) * | 2019-01-24 | 2020-06-26 | 中国科学院西安光学精密机械研究所 | Online graph splicing method for large-breadth laser processing |
TW202031539A (en) * | 2019-02-25 | 2020-09-01 | 先進光電科技股份有限公司 | Action vehicle auxiliary system |
AT522995B1 (en) | 2019-10-07 | 2021-05-15 | Vexcel Imaging Gmbh | Sensor arrangement |
AT523556A1 (en) * | 2020-02-26 | 2021-09-15 | Vexcel Imaging Gmbh | Image correction procedure |
US11509837B2 (en) | 2020-05-12 | 2022-11-22 | Qualcomm Incorporated | Camera transition blending |
CN112040097B (en) * | 2020-07-24 | 2022-03-04 | 北京空间机电研究所 | Large-breadth camera system with spliced view fields |
CN112212835B (en) * | 2020-09-15 | 2022-11-11 | 广州全成多维信息技术有限公司 | Oblique photography and control method based on single-lens unmanned aerial vehicle |
Citations (8)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
EP0053640A1 (en) * | 1980-12-04 | 1982-06-16 | Interval Eizo Yugen Kaisha | A computerized control for aerial photography |
US4951136A (en) * | 1988-01-26 | 1990-08-21 | Deutsche Forschungs- Und Versuchsanstalt Fur Luft- Und Raumfahrt E.V. | Method and apparatus for remote reconnaissance of the earth |
DE4114304C1 (en) * | 1991-05-02 | 1992-04-30 | Messerschmitt-Boelkow-Blohm Gmbh, 8012 Ottobrunn, De | Area surveillance camera - has detector surfaces separated from imaging areas and in form of parallel strips of optoelectronic devices |
WO1997008511A1 (en) * | 1995-08-24 | 1997-03-06 | Vexcel Imaging Gmbh | System for scanning and digitizing large images using a reseau |
US5611033A (en) * | 1991-12-10 | 1997-03-11 | Logitech, Inc. | Apparatus and method for automerging images by matching features and aligning images |
DE19919487A1 (en) * | 1999-04-29 | 2000-11-23 | Wolf D Teuchert | Recording process and photogrammetric camera therefor |
US6205259B1 (en) * | 1992-04-09 | 2001-03-20 | Olympus Optical Co., Ltd. | Image processing apparatus |
EP1089548A2 (en) * | 1999-09-29 | 2001-04-04 | Xerox Corporation | Optical system |
Family Cites Families (35)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
DE2940871C2 (en) * | 1979-10-09 | 1983-11-10 | Messerschmitt-Bölkow-Blohm GmbH, 8012 Ottobrunn | Photogrammetric method for aircraft and spacecraft for digital terrain display |
JPS5672575A (en) * | 1979-11-19 | 1981-06-16 | Toshiba Corp | Picture input unit |
US4323925A (en) * | 1980-07-07 | 1982-04-06 | Avco Everett Research Laboratory, Inc. | Method and apparatus for arraying image sensor modules |
DE3428325C2 (en) | 1984-08-01 | 1992-01-02 | Wilfried Prof. Dr.-Ing. 3013 Barsinghausen Wester-Ebbinghaus | Arrangement of opto-electrical solid-state sensor surfaces in the photogrammetric imaging system |
FR2577669B1 (en) * | 1985-02-21 | 1992-05-15 | Fuji Photo Film Co Ltd | IMAGE READING METHOD AND APPARATUS |
FR2628276B1 (en) * | 1988-03-02 | 1991-06-28 | France Etat | METHOD FOR REDUCING THROUGHPUT OF A SEQUENCE OF DATA FOR ASSISTING THE RECONSTRUCTION OF AN ELECTRONIC IMAGE FROM A SUB-SAMPLE SIGNAL |
US5016109A (en) * | 1990-07-02 | 1991-05-14 | Bell South Corporation | Apparatus and method for segmenting a field of view into contiguous, non-overlapping, vertical and horizontal sub-fields |
DE4123791C2 (en) | 1991-07-18 | 1995-10-26 | Daimler Benz Aerospace Ag | Digital area camera with multiple optics |
US5317394A (en) * | 1992-04-30 | 1994-05-31 | Westinghouse Electric Corp. | Distributed aperture imaging and tracking system |
FR2696843B1 (en) * | 1992-10-14 | 1994-12-09 | Matra Sep Imagerie Inf | High resolution remote camera for aerial carrier. |
US5757423A (en) * | 1993-10-22 | 1998-05-26 | Canon Kabushiki Kaisha | Image taking apparatus |
US5889553A (en) * | 1993-11-17 | 1999-03-30 | Canon Kabushiki Kaisha | Image pickup apparatus capable of high resolution imaging |
JP3471964B2 (en) * | 1995-03-28 | 2003-12-02 | キヤノン株式会社 | Imaging device |
JP3227478B2 (en) * | 1995-05-17 | 2001-11-12 | シャープ株式会社 | Still image pickup device |
US5604534A (en) * | 1995-05-24 | 1997-02-18 | Omni Solutions International, Ltd. | Direct digital airborne panoramic camera system and method |
JPH09230443A (en) * | 1996-02-20 | 1997-09-05 | Yukio Igami | Omniazimuth simultaneous image pickup method |
US5982951A (en) * | 1996-05-28 | 1999-11-09 | Canon Kabushiki Kaisha | Apparatus and method for combining a plurality of images |
US6137535A (en) * | 1996-11-04 | 2000-10-24 | Eastman Kodak Company | Compact digital camera with segmented fields of view |
DE19714396A1 (en) | 1997-04-08 | 1998-10-15 | Zeiss Carl Fa | Photogrammetric camera used in aircraft or satellite |
US6044181A (en) * | 1997-08-01 | 2000-03-28 | Microsoft Corporation | Focal length estimation method and apparatus for construction of panoramic mosaic images |
TW342969U (en) * | 1997-08-11 | 1998-10-11 | Mustek Systems Inc | Multi-images optics apparatus for mono photo-electric transferring module |
US5864133A (en) * | 1997-08-12 | 1999-01-26 | Mustek System Inc. | Cost-effective optical device |
US6424752B1 (en) * | 1997-10-06 | 2002-07-23 | Canon Kabushiki Kaisha | Image synthesis apparatus and image synthesis method |
US7006132B2 (en) * | 1998-02-25 | 2006-02-28 | California Institute Of Technology | Aperture coded camera for three dimensional imaging |
DE19830036C2 (en) * | 1998-06-26 | 2000-09-07 | Deutsch Zentr Luft & Raumfahrt | Device for geometrically high-resolution image recording on a microsatellite and associated method |
US6611289B1 (en) * | 1999-01-15 | 2003-08-26 | Yanbin Yu | Digital cameras using multiple sensors with multiple lenses |
US7019777B2 (en) * | 2000-04-21 | 2006-03-28 | Flight Landata, Inc. | Multispectral imaging system with spatial resolution enhancement |
US7071980B2 (en) * | 2000-07-27 | 2006-07-04 | Canon Kabushiki Kaisha | Image sensing apparatus |
JP4182638B2 (en) * | 2000-11-07 | 2008-11-19 | コニカミノルタホールディングス株式会社 | Imaging apparatus, method for synthesizing captured image, computer-readable recording medium recording image processing program, and imaging system |
US20040257441A1 (en) * | 2001-08-29 | 2004-12-23 | Geovantage, Inc. | Digital imaging system for airborne applications |
WO2003087929A1 (en) * | 2002-04-10 | 2003-10-23 | Pan-X Imaging, Inc. | A digital imaging system |
US7268804B2 (en) * | 2003-04-04 | 2007-09-11 | Stmicroelectronics, Inc. | Compound camera and method for synthesizing a virtual image from multiple input images |
EP2466871A3 (en) * | 2003-10-22 | 2017-05-03 | Panasonic Intellectual Property Management Co., Ltd. | Imaging apparatus and method for producing the same, portable equipment, and imaging sensor and method for producing the same. |
US7123298B2 (en) * | 2003-12-18 | 2006-10-17 | Avago Technologies Sensor Ip Pte. Ltd. | Color image sensor with imaging elements imaging on respective regions of sensor elements |
US20070188610A1 (en) * | 2006-02-13 | 2007-08-16 | The Boeing Company | Synoptic broad-area remote-sensing via multiple telescopes |
2002
- 2002-05-06 AU AU2002308651A patent/AU2002308651A1/en not_active Abandoned
- 2002-05-06 EP EP02769388.6A patent/EP1384046B1/en not_active Expired - Lifetime
- 2002-05-06 WO PCT/US2002/014566 patent/WO2002090887A2/en not_active Application Discontinuation
- 2002-05-06 EP EP18169169.2A patent/EP3388784B1/en not_active Expired - Lifetime
- 2002-05-06 US US10/140,532 patent/US7009638B2/en not_active Expired - Lifetime

2006
- 2006-03-07 US US11/371,210 patent/US7339614B2/en not_active Expired - Lifetime
Non-Patent Citations (1)
Title |
---|
PATENT ABSTRACTS OF JAPAN vol. 1998, no. 01, 30 January 1998 (1998-01-30) & JP 09 230443 A (IGAMI YUKIO), 5 September 1997 (1997-09-05) * |
Cited By (4)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
EP1377026A2 (en) * | 2002-06-21 | 2004-01-02 | Microsoft Corporation | Image Stitching |
EP1377026A3 (en) * | 2002-06-21 | 2004-03-10 | Microsoft Corporation | Image Stitching |
US10337862B2 (en) | 2006-11-30 | 2019-07-02 | Rafael Advanced Defense Systems Ltd. | Digital mapping system based on continuous scanning line of sight |
CN103040435A (en) * | 2011-10-14 | 2013-04-17 | 上海美沃精密仪器有限公司 | Tilt-shift tomography eye scanning system and method thereof |
Also Published As
Publication number | Publication date |
---|---|
US20020163582A1 (en) | 2002-11-07 |
EP3388784B1 (en) | 2019-07-17 |
EP3388784A1 (en) | 2018-10-17 |
AU2002308651A1 (en) | 2002-11-18 |
US20060215038A1 (en) | 2006-09-28 |
EP1384046A2 (en) | 2004-01-28 |
EP1384046B1 (en) | 2018-10-03 |
WO2002090887A8 (en) | 2003-07-03 |
WO2002090887A3 (en) | 2003-01-09 |
US7339614B2 (en) | 2008-03-04 |
US7009638B2 (en) | 2006-03-07 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
EP1384046B1 (en) | Digital camera for and method of obtaining overlapping images | |
US6211906B1 (en) | Computerized component variable interference filter imaging spectrometer system method and apparatus | |
Schenk | Introduction to photogrammetry | |
US5259037A (en) | Automated video imagery database generation using photogrammetry | |
EP2247094B1 (en) | Orthophotographic image creating method and imaging device | |
US7019777B2 (en) | Multispectral imaging system with spatial resolution enhancement | |
US5187754A (en) | Forming, with the aid of an overview image, a composite image from a mosaic of images | |
Tempelmann et al. | Photogrammetric software for the LH Systems ADS40 airborne digital sensor | |
CN107492069B (en) | Image fusion method based on multi-lens sensor | |
JPH02110314A (en) | Remote investigation method and device for the ground surface |
US7859572B2 (en) | Enhancing digital images using secondary optical systems | |
JP2004531113A (en) | Apparatus and method for omnidirectional three-dimensional image data acquisition with annotation, and method for enlarging the photosensitive area |
CN101916455A (en) | Method and device for reconstructing three-dimensional model of high dynamic range texture | |
TW201803330A (en) | Multi-lines image sensor device, photographing device, moving object detecting device and moving object detecting program | |
DK1384046T3 (en) | DIGITAL CAMERA AND PROCEDURE TO OBTAIN OVERLAPPING IMAGES | |
Madani et al. | DMC practical experience and accuracy assessment | |
Puerta et al. | Photogrammetry as an Engineering Design Tool | |
Leberl et al. | Novel concepts for aerial digital cameras | |
RU44838U1 (en) | AVIATION OPTICAL-ELECTRONIC SYSTEM OF MONITORING AND REGISTRATION | |
CN107843341A (en) | Space-based high-resolution multispectral imaging method and system |
RU2258204C1 (en) | Method of remote inspection of electric circuits by means of thermal-videocamera | |
Torlegård | Sensors for photogrammetric mapping: review and prospects | |
Aso et al. | Aurora stereo observations in Iceland | |
Huang et al. | Cylindrical panoramic cameras-from basic design to applications | |
Klette et al. | Modeling 3D scenes: Paradigm shifts in photogrammetry, remote sensing and computer vision |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
AK | Designated states |
Kind code of ref document: A2
Designated state(s): AE AG AL AM AT AU AZ BA BB BG BR BY BZ CA CH CN CO CR CU CZ DE DK DM DZ EC EE ES FI GB GD GE GH GM HR HU ID IL IN IS JP KE KG KP KR KZ LC LK LR LS LT LU LV MA MD MG MK MN MW MX MZ NO NZ OM PH PL PT RO RU SD SE SG SI SK SL TJ TM TN TR TT TZ UA UG UZ VN YU ZA ZM ZW |
|
AL | Designated countries for regional patents |
Kind code of ref document: A2
Designated state(s): GH GM KE LS MW MZ SD SL SZ TZ UG ZM ZW AM AZ BY KG KZ MD RU TJ TM AT BE CH CY DE DK ES FI FR GB GR IE IT LU MC NL PT SE TR BF BJ CF CG CI CM GA GN GQ GW ML MR NE SN TD TG |
|
121 | EP: The EPO has been informed by WIPO that EP was designated in this application | ||
AK | Designated states |
Kind code of ref document: A3
Designated state(s): AE AG AL AM AT AU AZ BA BB BG BR BY BZ CA CH CN CO CR CU CZ DE DK DM DZ EC EE ES FI GB GD GE GH GM HR HU ID IL IN IS JP KE KG KP KR KZ LC LK LR LS LT LU LV MA MD MG MK MN MW MX MZ NO NZ OM PH PL PT RO RU SD SE SG SI SK SL TJ TM TN TR TT TZ UA UG UZ VN YU ZA ZM ZW |
|
AL | Designated countries for regional patents |
Kind code of ref document: A3
Designated state(s): GH GM KE LS MW MZ SD SL SZ TZ UG ZM ZW AM AZ BY KG KZ MD RU TJ TM AT BE CH CY DE DK ES FI FR GB GR IE IT LU MC NL PT SE TR BF BJ CF CG CI CM GA GN GQ GW ML MR NE SN TD TG |
|
DFPE | Request for preliminary examination filed prior to expiration of 19th month from priority date (PCT application filed before 20040101) | ||
WR | Later publication of a revised version of an international search report | ||
WWE | WIPO information: entry into national phase |
Ref document number: 2002769388 Country of ref document: EP |
|
WWP | WIPO information: published in national office |
Ref document number: 2002769388 Country of ref document: EP |
|
REG | Reference to national code |
Ref country code: DE Ref legal event code: 8642 |
|
NENP | Non-entry into the national phase |
Ref country code: JP |
|
WWW | WIPO information: withdrawn in national office |
Country of ref document: JP |