US5699444A - Methods and apparatus for using image data to determine camera location and orientation - Google Patents

Methods and apparatus for using image data to determine camera location and orientation

Info

Publication number
US5699444A
Authority
US
United States
Prior art keywords
points
images
point
location
cameras
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Expired - Lifetime
Application number
US08/414,651
Inventor
Charles S. Palm
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Benhov GmbH LLC
Original Assignee
Synthonics Inc
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Synthonics Inc filed Critical Synthonics Inc
Priority to US08/414,651
Assigned to SYNTHONICS INCORPORATED reassignment SYNTHONICS INCORPORATED ASSIGNMENT OF ASSIGNORS INTEREST (SEE DOCUMENT FOR DETAILS). Assignors: PALM, CHARLES S.
Application granted
Publication of US5699444A
Assigned to PATENT PORTFOLIOS CONSULTING, INC. reassignment PATENT PORTFOLIOS CONSULTING, INC. ASSIGNMENT OF ASSIGNORS INTEREST (SEE DOCUMENT FOR DETAILS). Assignors: SYNTHONICS TECHNOLOGIES, INC.
Assigned to DIVERSIFIED PATENT INVESTMENTS, LLC. reassignment DIVERSIFIED PATENT INVESTMENTS, LLC. ASSIGNMENT OF ASSIGNORS INTEREST (SEE DOCUMENT FOR DETAILS). Assignors: PATENT PORTFOLIOS CONSULTING, INC.
Assigned to JADE PIXEL, LLC reassignment JADE PIXEL, LLC ASSIGNMENT OF ASSIGNORS INTEREST (SEE DOCUMENT FOR DETAILS). Assignors: DIVERSIFIED PATENT INVESTMENTS, LLC
Assigned to PATENT PORTFOLIOS CONSULTING, INC. reassignment PATENT PORTFOLIOS CONSULTING, INC. CORRECTIVE ASSIGNMENT TO CORRECT THE NAME OF THE ASSIGNOR FOR A TRANSCRIPTIONAL ERROR PREVIOUSLY RECORDED ON REEL 013746 FRAME 0833. ASSIGNOR(S) HEREBY CONFIRMS THE CORRECTION OF A TRANSCRIPTIONAL ERROR IN THE ASSIGNOR'S NAME FROM SYNTHONICS TECHNOLOGIES, INC. TO SYNTHONICS INCORPORATED. Assignors: SYNTHONICS INCORPORATED
Anticipated expiration
Legal status: Expired - Lifetime

Classifications

    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01C MEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
    • G01C11/00 Photogrammetry or videogrammetry, e.g. stereogrammetry; Photographic surveying
    • G01C11/04 Interpretation of pictures
    • G01C11/06 Interpretation of pictures by comparison of two or more pictures of the same area
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00 Image analysis
    • G06T7/50 Depth or shape recovery
    • G06T7/55 Depth or shape recovery from multiple images
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00 Image analysis
    • G06T7/70 Determining position or orientation of objects or cameras
    • G06T7/73 Determining position or orientation of objects or cameras using feature-based methods
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06V IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V10/00 Arrangements for image or video recognition or understanding
    • G06V10/10 Image acquisition
    • G06V10/12 Details of acquisition arrangements; Constructional details thereof
    • G06V10/14 Optical characteristics of the device performing the acquisition or on the illumination arrangements
    • G06V10/147 Details of sensors, e.g. sensor lenses
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N13/00 Stereoscopic video systems; Multi-view video systems; Details thereof
    • H04N13/20 Image signal generators
    • H04N13/204 Image signal generators using stereoscopic image cameras
    • H04N13/239 Image signal generators using stereoscopic image cameras using two 2D image sensors having a relative position equal to or related to the interocular distance
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N13/00 Stereoscopic video systems; Multi-view video systems; Details thereof
    • H04N13/20 Image signal generators
    • H04N13/204 Image signal generators using stereoscopic image cameras
    • H04N13/246 Calibration of cameras
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00 Indexing scheme for image analysis or image enhancement
    • G06T2207/30 Subject of image; Context of image processing
    • G06T2207/30204 Marker
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00 Indexing scheme for image analysis or image enhancement
    • G06T2207/30 Subject of image; Context of image processing
    • G06T2207/30244 Camera pose
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N13/00 Stereoscopic video systems; Multi-view video systems; Details thereof
    • H04N2013/0074 Stereoscopic image analysis
    • H04N2013/0081 Depth or disparity estimation from stereoscopic image signals

Definitions

  • the camera is aimed so as to center the object of interest within the viewing frame.
  • both cameras are pointed at center point T which means that the images of points A, B and C on the builder's square are not in the center of the image.
  • a real world coordinate system is defined with the Y axis running through points A and C and an X axis defined perpendicular to the Y axis through point A in the plane of A, B and C, thus forming an origin 0 at point A.
  • a Z axis is defined perpendicular to the XY plane and running through point A.
  • principal point f1 is located at (X1, Y1, Z1).
  • principal point f2 is located at (X2, Y2, Z2).
  • a camera directed at target point T has both an azimuth and an elevation which can be specified utilizing the coordinate system.
  • the camera may be rotated about the optical axis of the camera differently when the two pictures were taken. In short, there is no guarantee that the camera was horizontal to the XY plane when the picture was taken and thus, the orientation of the images may require correction prior to processing.
  • FIG. 2 illustrates a viewing pyramid formed by the three points A, B and C vis-a-vis the origin O (the principal point of a camera).
  • the viewing pyramid can be viewed as having three surfaces, each corresponding to a surface triangle, namely, triangles AOB, BOC and COA. If one were to view the pyramid shown in FIG. 2 as hollow and made of paper and if one were to cut along the line OA and flatten the resulting pattern, one would achieve a flattened pyramid such as shown in FIG. 3.
  • FIG. 3 will be utilized to describe the process by which camera position is determined in accordance with the invention.
  • the distance OA represents the distance from point A which is at the origin of the coordinate system to point O which is at the principal point of the lens.
  • angles AOB, AOC and BOC are known by virtue of knowing the distance between the principal point and the image plane and the measured distance separating two points on the image plane.
  • FIG. 4 assists in illustrating how this is done.
  • the XY plane constitutes the image plane of the camera.
  • f0 is the principal point of the lens.
  • Images of points A and B are formed on the image plane after passing through the principal point at locations A and B shown on the XY plane.
  • the incoming rays from points A and B are respectively shown at 400 and 410 of FIG. 4.
  • an image plane origin FP0 is defined and an X axis is defined as parallel to the longest dimension of the image aspect ratio.
  • the Y axis is formed perpendicular thereto, and the origin FP0 lies directly under the principal point.
  • rays from points A and B form an angle alpha (α) as they pass through the focal point. The projections of those rays beyond the focal point also diverge at α, which corresponds to ∠AOB of FIG. 3.
  • the distances AFP0 and BFP0 can be determined.
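As a minimal sketch of this angle computation under a pinhole model (the function name and argument conventions are ours; image offsets are assumed to be expressed in the same units as the principal distance f):

```python
import math

def ray_angle(pa, pb, f):
    """Angle (radians) between the rays through the principal point to image
    points pa and pb, each given as (x, y) offsets from the image-plane
    origin FP0; f is the principal distance in the same units."""
    a = (pa[0], pa[1], f)   # direction of the ray through the image of A
    b = (pb[0], pb[1], f)   # direction of the ray through the image of B
    dot = sum(u * v for u, v in zip(a, b))
    norm_a = math.sqrt(sum(u * u for u in a))
    norm_b = math.sqrt(sum(u * u for u in b))
    return math.acos(dot / (norm_a * norm_b))
```

For example, ray_angle((3.0, 1.0), (-2.0, 0.5), 50.0) yields the angle ∠AOB needed to solve the flattened pyramid described next.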
  • the angles separating points A, B and C can be determined in the manner just described.
  • the distances separating points A, B and C are also known, either a priori by placing a calibrated target, such as a carpenter's square, in the scene being photographed, or by measuring, after the images have been formed, the distances between three relatively permanent points in the previously captured scene.
  • the distance OA represents the distance from the principal point of the camera (O) to point A which is the origin of the coordinate system utilized to define camera position.
  • this is done by first assuming a very low estimate for the distance OB, such as the distance Ob1; then, with that assumption, triangle AOB is solved.
  • “Solving a triangle” means establishing (e.g. calculating) values for the length of each side and for each of the angles within the triangle.
  • the first triangle is solved using known, assumed or calculated values.
  • a value for distance OA is calculated.
  • the second triangle BOC is solved and the derived distance OC is then utilized to solve the third triangle COA.
  • the calculated value for OA from the third triangle is compared with the calculated value for OA from the first triangle; the estimate Ob1 is then revised by adding the difference between the two values to Ob1, and the process is repeated.
  • the estimate Ob1 is improved in this way until the difference between the calculated values of OA is reduced to a value less than ε. Once that difference is low enough for the accuracy needed, the iterations cease and the true value of OA is assumed to lie between the values calculated for the first and third triangles.
  • distance Ob1 is the estimate of the length of OB which, at the outset, is deliberately set low.
  • the distance AB is known because the dimensions of a calibrated target are known or because the distance AB has been measured after the images are captured.
  • the value for ∠AOB is calculated from measurements from the image plane as illustrated in FIG. 4 and discussed in connection with equations 1-7. Therefore, ∠OAB can be calculated by the law of sines: ∠OAB = arcsin[(Ob1 · sin ∠AOB) / AB].
  • the first estimate of ∠OBA can then be calculated from the angle sum of the triangle: ∠OBA = 180° − ∠AOB − ∠OAB.
  • Ob1 is assumed to be the distance OB.
  • Distance BC is known from the target or measurements and angle BOC is known from measurements from the image plane.
  • the third triangle can be solved in a manner completely analogous to the solution of the second triangle, substituting the corresponding lengths and angles of the third triangle in equations 8-12.
  • solving the third triangle yields the distance OA calculated as set forth above. This value of OA will have been derived from the calculations for the first, second and third triangles. Note, however, that the distance OA from the third triangle and the distance OA from the first triangle would be identical if the assumed value Ob1 were in fact equal to the real length OB. Since Ob1 was initially assumed to be of very low value, there will generally be a difference between the value of OA from the third triangle and that from the first triangle. The difference between the two calculated lengths is added to the original estimate Ob1 to form the estimate Ob2 for the second iteration.
  • the estimate for the distance OB can be made accurate to whatever degree of resolution one desires by continuing the iterative process until the difference between OA from the first triangle and that from the third triangle is reduced to an acceptable level, ε.
  • the distance OA which results from the iterative process is then equal to the distance of the principal point of the camera shown at O in FIG. 3 to point A which is the origin of the coordinate system defined for this set of measurements.
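The iteration just described can be sketched as follows. This is a non-authoritative illustration: the function and variable names are ours, each triangle is solved by the law of sines consistent with equations 1-12, and the simple arcsin form assumes the acute-angle branch; the near/mid/far analysis below addresses the remaining ambiguity.

```python
import math

def solve_triangle(ground_side, angle_o, near_side):
    """Solve one flattened-pyramid triangle by the law of sines.
    ground_side: known distance between two calibration points (e.g. AB)
    angle_o:     angle at the principal point O, from the image (e.g. ∠AOB)
    near_side:   known or estimated interior side (e.g. Ob1)
    Returns the remaining interior side (e.g. OA for triangle AOB)."""
    far_angle = math.asin(near_side * math.sin(angle_o) / ground_side)
    third_angle = math.pi - angle_o - far_angle
    return ground_side * math.sin(third_angle) / math.sin(angle_o)

def solve_viewing_pyramid(ab, bc, ca, aob, boc, coa, eps=1e-12):
    """Iterate triangles AOB, BOC and COA (FIG. 3) until the two computed
    values of OA agree to within eps."""
    ob = 1e-3 * min(ab, bc, ca)              # deliberately low first estimate Ob1
    while True:
        oa_first = solve_triangle(ab, aob, ob)   # triangle 1: AOB -> OA
        oc = solve_triangle(bc, boc, ob)         # triangle 2: BOC -> OC
        oa_third = solve_triangle(ca, coa, oc)   # triangle 3: COA -> OA again
        delta_oa = oa_third - oa_first
        if abs(delta_oa) < eps:
            return oa_first, ob, oc              # distances OA, OB, OC
        ob += delta_oa                           # revise the estimate one-to-one
```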
  • as shown in FIG. 5, when viewing the points A, B and C from the principal point of the camera, one cannot necessarily determine which of points A, B and C is closest and which is next closest to the camera. For example, in FIG. 5, given that point B1 is closest to the camera, it is possible either that point A is nearer and point C farther, or alternatively, that point C is nearer and point A farther. These differences are reflected in triangle A1B1C1 as compared with triangle A2B1C2.
  • the table shown in FIG. 5 illustrates that the relationship between points A, B and C may in general result in six different permutations. There will always be these combinations of near, mid and far when working toward a solution. Right at the start, one doesn't know which point is closest to the camera and which is furthest and which is midpoint.
  • the difference between OA of the first and third triangles is added to the estimate Ob1 to determine the estimate to be utilized in the next iteration. It is, of course, possible to adjust the estimate by a fraction or a multiple of the difference between the values of OA for the first and third triangles rather than one-to-one. The preferred adjustment, however, is one-to-one.
  • it is preferred that a right-angle calibration target be used, such as an 8 1/2 × 11 sheet of paper or a carpenter's square.
  • the six potential arrangements of near, mid and far for points A, B, C can be viewed as different ways of flattening the pyramid.
  • three sets of flattened pyramids can be formed by using each of the edges OA, OB and OC as the edge which is "opened" (e.g. if the pyramid were formed by folding paper into a pyramid shape, and one edge were cut open and the pyramid unfolded into a pattern like that shown in FIG. 3, three different sets of flattened pyramids would be formed, each by cutting a different edge).
  • Each set has two members corresponding to the two orders in which the triangles may occur. As illustrated in FIG. 3, for example, the triangles are solved in 1-2-3 order. This ordering represents one of the 2 members.
  • the other member is formed by flipping the flattened pyramid over on its face so that triangle 3, as shown in FIG. 3, is put in the triangle 1 position. This member of the set is solved in 3-2-1 order as labeled.
  • the techniques described herein are applicable to images photographed without a calibrated target. By selecting 3 convenient points on the image and physically measuring the distance between them after the image has been captured, the same effect can be achieved as is achieved using a calibrated target at the time the image is captured.
  • Each of the possible solutions for near, mid and far is utilized to generate a set of spheres which are then solved for common points of intersection.
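A sketch of that exhaustive test follows; solve_for_ordering stands for a hypothetical routine that runs the pyramid solution for one near/mid/far assignment and reports its residual error, and the least-error candidate is kept.

```python
from itertools import permutations

def least_error_solution(solve_for_ordering, points=("A", "B", "C")):
    """Try all six near/mid/far orderings of the calibration points
    (cf. the table of FIG. 5) and keep the solution with least error."""
    best_solution, best_residual = None, float("inf")
    for ordering in permutations(points):        # 3! = 6 candidate orderings
        solution, residual = solve_for_ordering(ordering)
        if residual < best_residual:
            best_solution, best_residual = solution, residual
    return best_solution
```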
  • from FIG. 6 one can see that, in addition to the intersection at point O of the three spheres in the +Z half-space, there will be a symmetrical solution in the -Z half-space.
  • the horizontal control grid established by the XY plane is viewed from the +Z direction looking down on the XY plane.
  • there is only one solution, namely the one in the +Z space, and the -Z space solution is eliminated. That then determines the XYZ location of the principal point of the camera.
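In the coordinate system defined above (A at the origin, C on the Y axis at (0, yc, 0), B at (xb, yb, 0) in the XY plane), the simultaneous sphere equations reduce to closed form: subtracting the B and C sphere equations from the A sphere eliminates the quadratic terms, leaving linear equations for x and y, and z follows from the A sphere. A sketch, with names of our choosing:

```python
import math

def camera_principal_point(xb, yb, yc, oa, ob, oc):
    """Intersect spheres centered at A=(0,0,0), B=(xb,yb,0) and C=(0,yc,0)
    with radii OA, OB and OC; the -Z mirror solution is discarded (FIG. 6).
    Assumes a geometrically consistent set of distances."""
    y = (oa**2 - oc**2 + yc**2) / (2.0 * yc)
    x = (oa**2 - ob**2 + xb**2 + yb**2 - 2.0 * y * yb) / (2.0 * xb)
    z = math.sqrt(oa**2 - x**2 - y**2)    # keep the +Z root
    return x, y, z
```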
  • FIG. 7 illustrates how azimuthal and elevational corrections are determined.
  • FIG. 7 illustrates the image plane.
  • Points ABC are the same points ABC utilized to define a coordinate system and to calculate the distance of the camera in that coordinate system.
  • Points A, B and C are illustrated as part of the image shown in the image plane.
  • the center of the image plane (i.e. the center of the picture) serves as the origin of the image plane coordinate system 700.
  • a calibrated target or the three points utilized to establish a coordinate system, A, B and C are typically not at the center of the photograph.
  • the azimuthal correction is essentially that required to displace point A, the image of the origin of the external world coordinate system, so that it lies exactly on top of the photographic location of point A shown to the right of axis 710 of the coordinate system of the image plane.
  • the elevational correction is the angle of elevation or declination required to place the image of point A exactly on top of the photographic location of point A shown below the abscissa of the image plane coordinate system 700.
  • azimuthal and elevational corrections are determined such that if they were applied to the camera, point A, the origin of the real world coordinate system would coincide with point A, the origin as captured on the photograph.
  • FIG. 7 assumes that if A is correctly located, points B and C will be correctly located. However, this is generally not true because of tilt of the camera about the optical axis. Once the points A have been superimposed, one knows where point B should be because of the axis definitions in the real world coordinate system. If the origin of the real world coordinate system is centered on A, and the origin of the image plane coordinate system is now also centered on A by virtue of the azimuthal and elevational corrections applied in connection with FIG. 7, then point B on the image plane should be located where point B in the real world coordinate system is located. This would be the case if the camera were absolutely horizontal when the picture was taken. However, if there is tilt, B will be displaced off the axis.
  • the B point residual error and the C point residual error are utilized as discriminators.
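Under a pinhole model, the two corrections can be sketched as follows (the names are ours): azimuth and elevation steer the optical axis onto the image of point A, and the residual rotation of B about that axis measures the tilt.

```python
import math

def azimuth_elevation(xa, ya, f):
    """Angles (radians) that steer the optical axis onto the image of point A,
    located at offsets (xa, ya) from the image center; f is the principal
    distance in the same units."""
    return math.atan2(xa, f), math.atan2(ya, f)

def tilt(b_observed, b_expected):
    """Rotation about the optical axis that aligns the observed image of B
    with its expected location; both are 2-D vectors measured from point A."""
    return (math.atan2(b_expected[1], b_expected[0])
            - math.atan2(b_observed[1], b_observed[0]))
```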
  • FIG. 8 illustrates the process utilized to fully determine the location and orientation of a camera from the image.
  • in step 800, one determines the location of the calibration points A, B and C and either knows or measures the distances between them (810).
  • the camera location in XYZ coordinates is determined using the technique set forth in FIG. 9. Once the XYZ camera location is determined, corrections are made to azimuth and elevation (830) and then to tilt (840). With azimuth and tilt corrections made, one determines whether the points are correctly located within a desired accuracy ε (850). If they are, the location and orientation of the camera are fully determined (860) and the process ends. If they are not, another iteration of steps 830 and 840 is undertaken to bring the location determination within the desired accuracy.
  • FIG. 9 illustrates the details of block 820 of FIG. 8. Knowing the principal distance of the camera, one measures the three angles AOB, BOC and COA from the image plane (900). A viewing pyramid is constructed with distance OA assumed as the longest dimension (905). The pyramid is flattened and a value estimated for line segment OB which is known to be low (910). Using the estimate for OB, the first triangle is solved (915). Second and third triangles are then sequentially solved using the results of the prior calculations (920 and 925).
  • the value ΔOA is added to the prior estimate of OB to form a new estimate, and a new iteration of steps 915, 920, 925, 930 and 940 occurs. If ΔOA < ε (940), then the viewing pyramid is solved (950) and it is only necessary to resolve the near, mid and far ambiguity (960) before the objective of totally determining the position and orientation of the camera (970) is achieved.
  • the coordinates X0 and Y0 of the point O can be defined with respect to a camera axis by the following (see FIGS. 11 and 12).
  • FIG. 13 illustrates a typical real world situation.
  • the points A, B and C represent the calibrated target or the points measured subsequent to image capture.
  • the coordinate system X, Y and Z is established in accordance with the conventions set forth above with A as the origin.
  • camera positions 1 and 2, illustrated only by their principal points O1 and O2 and their image planes IP1 and IP2, respectively, are positioned with their optical axes pointed at point T, which falls at the center of the field on each image plane.
  • FIG. 14 illustrates hardware utilized to carry out certain aspects of the invention.
  • Camera 1400 is used to capture images to be analyzed in accordance with the invention.
  • Camera 1400 may be a digital still camera or a video camera with a frame grabber. Images from the camera are loaded onto computer 1420 using camera interface 1410. Normally, images loaded through interface 1410 would be stored on hard drive 1423 and then later retrieved for processing in video RAM 1430. However, images can be loaded directly into video RAM if desired.
  • Video RAM 1430 preferably contains sufficient image storage to permit the simultaneous processing of two images from the camera.
  • video display 1440 is preferably a high resolution video display such as a cathode ray tube or a corresponding display implemented in semiconductor technology. Display 1440 is interfaced to the computer bus through display interface 1424 and may be utilized to display individual images, both images simultaneously, or three dimensional wireframes created in accordance with the invention.
  • Keyboard 1450 is interfaced to the bus over keyboard interface 1422 in the usual manner.
  • distance measurements may be conveniently made in numbers of pixels in the vertical and horizontal directions, which may be translated into linear measurements on the display screen knowing the resolution of the display in the vertical and horizontal directions. Numbers of pixels may be readily determined by pointing and clicking on the points under consideration and obtaining the addresses of the pixels clicked upon from the cursor addresses.
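A sketch of that pixel-to-distance conversion (the names and the millimetre scale factors are illustrative; the scale factors come from the known display or sensor resolution):

```python
def image_plane_offsets(pixel1, pixel2, mm_per_pixel_x, mm_per_pixel_y):
    """Convert two clicked pixel addresses (x, y) into horizontal and
    vertical linear offsets using the known resolution scale factors."""
    dx = (pixel2[0] - pixel1[0]) * mm_per_pixel_x
    dy = (pixel2[1] - pixel1[1]) * mm_per_pixel_y
    return dx, dy
```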
  • the techniques set forth herein permit accurate forensic surveying of accident or crime scenes as well as accurate surveying of buildings or construction sites, particularly in the vertical direction which had heretofore been practically impossible.

Abstract

Methods and apparatus for accurately surveying and determining the physical location of objects in a scene are disclosed which use image data captured by one or more cameras and three points from the scene which may either be measured after the images are captured or may be included in the calibrated target placed in the scene at the time of image capture. Objects are located with respect to a three dimensional coordinate system defined with reference to the three points. The methods and apparatus permit rapid set up and capture of precise location data using simple apparatus and simple image processing. The precise location and orientation of the camera utilized to capture each scene is determined from image data, from the three point locations and from optical parameters of the camera.

Description

TECHNICAL FIELD
The invention relates to the field of image processing and more particularly to methods and apparatus for determining camera position and orientation from an image captured with that camera and to accurate surveying using such methods and apparatus.
BACKGROUND ART
Since the invention of the stereoscope in 1847, inventors have attempted to replicate three dimensional (3D) images found in nature. Two dimensional images lack realism due to the absence of depth cues. Many techniques have been devised for producing 3D images with varying degrees of success.
Stereoscopic photographic cameras are known which utilize a single camera body and two objective lenses separated by a fixed distance, usually corresponding to the interocular distance. Other such cameras use a single objective and external arrangements which form two image areas on film positioned on the camera's image plane. Still other arrangements use two separate cameras separated by a fixed distance to form images corresponding to a left and right eye view of the scene being photographed.
Once stereoscopic photographic images of the prior art are developed, they are often viewed through separate eye pieces, one for each eye. Each eye piece projects a view of a respective one of the developed images which the user's eyes would have seen had the eyes viewed the scene directly. Depth is clearly discernable when viewing a stereoscopic image.
There are several problems with prior art techniques for generating three dimensional images. First, the requirement that there be a fixed camera to camera or objective to objective separation limits flexibility in the construction of cameras. The requirement for two objective lenses or two cameras dictates special apparatus in order to capture stereoscopic images.
Another problem with the prior art is that complicated lens arrangements are necessary to view stereoscopic images. Further, in the stereoscopic photographic systems of the prior art, depth was not readily quantifiable.
Calculation of depth is a difficult task when using images captured from different positions vis-a-vis the scene being photographed because the planar relationships which result from projection of a three dimensional scene onto a two dimensional plane do not undergo a linear transformation or mapping compared with the same points projected onto a different image plane. Different portions of a scene viewed from one point relate differently to corresponding points from the same scene viewed from another point. As one changes viewing positions, some portions of a scene become hidden as the view point changes. Planar surfaces which are viewed normally in one view are reduced in extent when viewed obliquely.
In the prior art, methods and apparatus are known for surveying a plot of land to identify the locations of significant features of the plot. Typically, this involves a team of surveyors who go to the plot and make physical measurements of distance and angle using a surveyor's transit theodolite and calibrated standards for measuring distance. Surveys using these techniques are typically baselined against a national grid of survey markers. This technique is subject to errors of various kinds in reading the instruments and in performing calculations.
Aerial surveying is also known. Images are captured from an airplane or other vehicle in transit over an area to be surveyed at positions which are precisely known by modern navigation techniques. Position of significant ground features can then be calculated using sophisticated image processing techniques which often require supercomputers. Aerial surveying techniques have the advantage that they can be accomplished without the need to place people on the ground in the area to be surveyed. Inaccessible terrain can also be surveyed in this way. However, expensive image capture equipment is required and even with very good optics and image processing, the resolution is not always as good as one might like. Also, accurate measurements in the vertical direction are even more difficult to take using aerial techniques.
In forensic investigations such as those of a crime scene or archeological dig, spatial relationships are very important. Such investigations often occur under conditions where some urgency or public necessity exists to vacate the scene of the investigation in a short period of time. If a freeway is blocked for an investigation during rush hour, the need to resume traffic flow is a political necessity. In crime scene analysis, if details are not observed and recorded immediately, valuable evidence may be lost. In such circumstances, there is not time for a careful manual survey and aerial techniques generally lack needed resolution or are too expensive for general application to police investigations.
In a manufacturing environment, it is often desirable to determine the physical details of a product "as built" either for inspection purposes or for documentation with substantial accuracy.
In manufacturing, it is often desirable to capture the physical dimensions of complex objects for purposes of creating a three dimensional (3-D) representation, such as a wireframe, for use in computer assisted design or computer assisted manufacturing (CAD/CAM). In entertainment, it is desirable to use such a 3-D representation for creating animations which result in changes to the position or viewing perspective of a 3-D object.
There is thus a need to accurately capture 3-D information about objects and scenes in ways which are convenient and economical and which don't require sophisticated computing equipment. There is also a need to accurately capture physical dimensions of objects in the vertical direction which might be inaccessible to a physical survey.
Every recorded image, whether it be a photograph, a video frame, a true perspective drawing or other form of recorded image, has associated with it a viewing location and viewing look angles that exactly describe the orientation of the recording mechanism relative to the recorded scene.
When making distance calculations from images captured using cameras, it is necessary to know the location of the camera at the time the picture was taken, or more precisely the front principal point of the camera lens or system of lenses at the time the picture was taken. To calculate distances accurately, it is also desirable to know the azimuth, elevation and rotation angle of the optical axis of the lens or lens system as it emerges from the camera.
In the prior art, camera location was either estimated or known a priori by locating the position from which the picture was taken using surveying techniques. Typically, rotation angle was assumed to be 0 (horizontal) and elevation and azimuth were either measured with varying degrees of accuracy or estimated. Clearly, such surveying and measurement increase the set up time required before capturing images for analysis, often to the point where any hope of accurate measurements would be abandoned in favor of qualitative information which could be gleaned from images captured under uncontrolled conditions.
The need for accurate viewing parameters is being expressed by an ever increasing population of computer users who use digital and analog images for a wide range of purposes, from engineering measurement applications to marketing and sales presentations.
For example, stereo photographs are frequently used to investigate and document accident or crime scenes. The accuracy of the documentation depends to a high degree on knowing exactly the viewing parameters of the cameras at the time the photographs were taken.
Computer-generated renderings are often merged with actual photographs to convey an image of a completed construction project while still in the planning and review stages. In order to make the computer rendering blend into and match the photograph in a visually convincing manner, it is necessary for the viewing parameters of the computer rendering to be exactly the same as the viewing parameters of the camera that took the photograph.
Typically, the viewing parameters for any given recorded image are unknown and difficult to determine with a high degree of accuracy, even when the camera positions are physically measured relative to some established coordinate system. The difficulties arise from the fact that the camera lens principal points are usually located inside the lens structure and therefore inaccessible for purposes of direct measurement. The measurement of viewing angles is even more difficult to accomplish without the use of surveying type tripods, levels and transits.
Photogrammetry is a science that deals with measurements made from photographs. Generally, photogrammetrists use special camera equipment that generates fiducial marks on the photographs to assist in determining the viewing parameters. Non-photogrammetric cameras can be used in some analyses; however, the associated techniques generally require knowing the locations of a large number of calibration points (five or more) that are identifiable in the recorded scene. Generally, the three-dimensional locations of five or more calibration points need to be known in terms of some orthogonal reference coordinate system in order to determine the viewing parameters. The Direct Linear Transform (DLT) is a five-point calibration procedure that is sometimes employed by photogrammetrists. It is usually difficult and expensive to establish the locations of these points, and it is certainly complicated enough to deter a non-technical person from attempting to determine the viewing parameters. Unless a tightly controlled calibration coordinate system is established prior to taking the photographs, it is necessary for the user to know a minimum of nine linear dimensions between the five points. This requirement limits the use of the technique considerably.
In some specialized cases, such as certain aerial surveying applications, conventional photogrammetry can be employed to determine camera parameters using as few as three calibration points. In particular, the Church resection model may be used when the optical axis of an aerial camera lens is within four or five degrees of looking vertically down on the terrain. Angular displacements from the vertical of more than a few degrees result in noticeable mathematical nonlinearities that are associated with transcendental trigonometric functions. Under these conditions, the Church resection model is no longer valid and the three-point calibration procedure no longer applies.
All of the calibration techniques discussed above suffer from a number of disadvantages:
(a) They require calibrated camera equipment;
(b) They require calibration targets consisting of too many points to make the procedures practical for common everyday use by non-professionals;
(c) Techniques which use a three-point calibration target are valid only over a very limited range of off normal camera look angles; and
(d) All of the previous methods for solving viewing parameters employ matrix operations operating on all point data at the same time, thus allowing one poorly defined measurement parameter to inject errors in a relatively unknown and indeterminable manner due to parameter cross-talk effects.
SUMMARY OF THE INVENTION
The problems of the prior art are overcome in accordance with the invention by automatically identifying camera location and orientation based on image content. This can be done either by placing a calibrated target within the field of the camera or by measuring the distances among three relatively permanent points in the scene of images previously captured. Using the points, the location and orientation of a camera at the time a picture was taken can be precisely identified for each picture. Once the location and orientation of the camera are known precisely for each of two or more pictures, accurate 3-D positional information can be calculated for all other identifiable points on the images, thus permitting an accurate survey of the scene or object. The images can be captured by a single camera and then used to generate stereo images or stereo wireframes.
Accordingly, besides the advantages of the simple three-point calibration target described above, several additional objects and advantages of the present invention are:
(a) to provide a decoupling of error terms such that Azimuth, Elevation and Tilt terms do not affect the accuracy of X, Y, and Z terms;
(b) to provide simple procedures that can be applied successfully by non-technical personnel;
(c) to provide an iterative solution such that all viewing parameters are determined to an accuracy in excess of 12 decimal places or the limitations of pixellation error, whichever is larger;
(d) to provide a test of all possible solutions prior to selecting the solution with the least error; and
(e) to provide a surveying system which permits capture of 3-D information at large angles off normal.
The above and other objects and advantages of the invention are achieved by providing a method of measuring the absolute three dimensional location of points, such as point D of FIG. 1, with respect to a coordinate system defined using three points, A, B and C, separated by known distances, using image data. The image data is captured by using one or more cameras of known focal length to capture two images of a scene containing the points A, B, C and D. The location and orientation of the camera(s) at the time each of said images was captured is determined with reference to said coordinate system by using information derived from said images, the known focal length and the known distances. The locations of the cameras at the time the images were captured are then utilized, together with other image data, to determine the location of points such as point D.
The step of using the locations of the cameras at the time the images were captured to determine the location of said point D from image data includes defining an auxiliary coordinate system with origin along the line joining the locations of the cameras, defining the center point of each image as the origin of a set of image reference axes pointing in the X', Y' and Z' directions, respectively, measuring the offset in at least one of the X' and Y' directions of a point on the first image and of the corresponding point of the second image, determining the angles formed between a line joining point D, the focal point of the objective and the image of point D on one of the X' or Y' planes for each of the images, determining the distance h of point D from the line joining the camera locations using the measured offsets, the focal length and the angles, determining the X' and Y' coordinates of point D in the auxiliary coordinate system, and transforming the coordinates (X', Y', h) of the auxiliary coordinate system to a representation in said coordinate system defined using said three points, A, B and C.
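The construction above handles the general converging-camera geometry of FIGS. 10-12. For the simplified special case of two cameras whose optical axes are parallel and perpendicular to the baseline, the distance h follows directly from similar triangles, as this hedged sketch (names ours) illustrates:

```python
def depth_from_disparity(baseline, focal_length, x1, x2):
    """Distance h of a point from the line joining the two principal points,
    for the simplified parallel-axis case. x1 and x2 are the point's image
    offsets along X' (same units as focal_length), one per image."""
    disparity = x1 - x2
    if disparity == 0:
        raise ValueError("zero disparity: point lies at infinity")
    return focal_length * baseline / disparity
```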
The step of determining the location and orientation of said one or more cameras at the time said images were captured with reference to said coordinate system using image data, known focal length and said known distances includes representing the distances between points A, B and C and the focal point O of a camera as a viewing pyramid, modifying the representation of the pyramid to a joined three-triangle flattened representation, selecting a low estimate Ob1 for one interior side of a first triangle of said flattened representation, solving the first triangle using image data, known focal length and said known distances, yielding, inter alia, a first calculated value for length OA given estimate Ob1, solving the second triangle using results obtained, solving the third triangle using results obtained, yielding, inter alia, a second calculated value for length OA, subtracting the second calculated value for length OA from the first calculated value for length OA to produce a difference value, revising the value of estimate Ob1 by adding said difference value to achieve a revised estimate, iterating using the revised estimate until said difference value is less than a desired accuracy, and deriving values for camera location using distances OA, OB and OC.
The process of deriving values for camera location using distances OA, OB and OC comprises simultaneously solving the equations for spheres centered at points A, B and C with respective radii OA, OB and OC.
When one determines the orientation of one or more of the cameras, one calculates the azimuthal and elevational adjustment required to direct the camera to the location of point A and calculates the amount of rotation about the optical axis required to align point B once the camera points at point A. This is done iteratively until the degree of alignment is within the desired degree of accuracy.
The invention can be used to measure the distance between two points especially in a vertical direction, to locate the physical position of objects visible in images accurately, to create a three dimensional wireframe representation and to document the "as built" condition of an object.
The invention is also directed to a method of measuring the absolute three dimensional location O of a camera with respect to a coordinate system defined using three points, A, B and C, separated by known distances, using image data, by capturing an image of a scene containing the points A, B and C using a camera, determining or knowing a priori the focal length of said camera, and determining the location of said camera at the time said image was captured with reference to said coordinate system using information derived from said image, the known focal length and said known distances.
The invention is also directed to a method of measuring distance including vertical height by measuring the absolute three dimensional location of points D, E and F with respect to a coordinate system defined using three points, A, B and C, separated by known distances using image data using techniques described above, by determining distances between points D, E and F, and by using the location of said points D, E and F and the location of cameras at the time images were captured to determine the location of other points. The other points may be optionally located on images different from those used to determine the location of points D, E and F.
The invention is also directed to apparatus for measuring the absolute three dimensional location of a point D with respect to a coordinate system defined using three points, A, B and C, separated by known distances using image data including one or more cameras for capturing images of a scene containing the points A, B, C and D, a memory interfaced to the camera(s) for storing images captured by the camera(s), a computer for processing stored images to determine the location and orientation of the camera(s) at the time each of said images was captured with reference to said coordinate system, using information derived from said images, known focal length and said known distances, and for using the locations of said one or more cameras at the time the images were captured to determine the location of said point D from image data. Location information can be stored in a database which can be used for different purposes. For example, it can be used to store a three dimensional wireframe representation or the locations of points surveyed.
Still other objects and advantages of the present invention will become readily apparent to those skilled in this art from the following detailed description, wherein only the preferred embodiment of the invention is shown and described, simply by way of illustration of the best mode contemplated of carrying out the invention. As will be realized, the invention is capable of other and different embodiments, and its several details are capable of modifications in various obvious respects, all without departing from the invention. Accordingly, the drawing and description are to be regarded as illustrative in nature, and not as restrictive.
BRIEF DESCRIPTION OF THE DRAWINGS
FIG. 1 is an illustration of the capture of two images of a scene, including a building, according to the invention.
FIG. 2 is an illustration of a viewing pyramid of three calibration points as projected through the focal point of a camera.
FIG. 3 is an illustration of a flattened pyramid used for calculation of camera distance.
FIG. 4 is an illustration of viewing angle determination used in calculation of camera distance.
FIG. 5 is an illustration of near, mid and far ambiguity.
FIG. 6 is an illustration of how to resolve near, mid and far ambiguity.
FIG. 7 is an illustration of azimuthal and elevational correction.
FIG. 8 is a flow chart of the algorithm used to determine camera distance and orientation.
FIG. 9 is a flow chart of the algorithm used to calculate camera location.
FIG. 10 is an illustration of how to calculate the distance of a point from a line joining the principal points of two cameras.
FIG. 11 is an illustration of the calculation of the location of a point in the X direction.
FIG. 12 is an illustration of the calculation of the location of a point in the Y direction.
FIG. 13 is an illustration of how to calculate point location generally given a determination of the location and orientation of the camera at the time when two images were captured.
FIG. 14 is an illustration of hardware utilized in accordance with the invention.
DETAILED DISCLOSURE OF THE INVENTION
FIG. 1 illustrates a building 100 in front of which is located a calibrated target, such as a builder's square 110. Pictures of the building are taken from two positions, the first from point f1 and the second from point f2. f1 is the location of the principal point of the lens or lens system of a camera, and the image projected through that point falls on image plane fp1. A second image of the scene is captured from position f2, and the image through principal point f2 is cast upon image plane fp2. The positioning of the cameras is arbitrary. In some circumstances, it is desirable to capture images from two locations using the same camera. In other circumstances, it may be desirable to capture the images using different cameras.
Typically, the camera is aimed so as to center the object of interest within the viewing frame. In the picture shown, both cameras are pointed at center point T which means that the images of points A, B and C on the builder's square are not in the center of the image.
Once images are available in viewable form for analysis, knowing the distance between the principal point and the image plane of the camera (principal distance) and the physical displacement of the points on the reproduced image, one may calculate the angles Af1 B, Bf1 C and Cf1 A because the angles subtended by pairs of points vis-a-vis the principal point are identical whether they are measured in the real scene or on the image plane side of the focal point.
In the implementation of this invention, a real world coordinate system is defined with the Y axis running through points A and C and an X axis defined perpendicular to the Y axis through point A in the plane of A, B and C, thus forming an origin at point A. A Z axis is defined perpendicular to the XY plane and running through point A. By convention, the +Y direction runs from the origin at A to point C, the +X direction runs to the right when standing at the origin and facing the +Y direction, and the +Z direction proceeds vertically from the origin out of the XY plane in the direction indicated by the cross product of a vector in the +X direction with a vector in the +Y direction.
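As a concrete illustration of this axis convention, the following minimal Python sketch (assuming numpy; the function name world_frame and the return layout are illustrative, not from the patent) constructs the unit axis vectors from three measured points:

```python
import numpy as np

def world_frame(A, B, C):
    """Right-handed frame per the convention above: origin at A, +Y from A
    toward C, +X perpendicular to Y within the plane of A, B and C, and
    +Z = X x Y, normal to that plane.  (Which in-plane direction counts as
    "right" depends on which side of the plane is taken as up.)"""
    A, B, C = (np.asarray(p, dtype=float) for p in (A, B, C))
    y = C - A
    y /= np.linalg.norm(y)                 # +Y runs from A toward C
    n = np.cross(B - A, y)                 # a vector normal to plane ABC
    n /= np.linalg.norm(n)
    x = np.cross(y, n)                     # in-plane, perpendicular to Y
    z = np.cross(x, y)                     # +Z out of the XY plane
    return A, x, y, z                      # origin and unit axis vectors

# Example: a carpenter's square laid flat in the scene
origin, ex, ey, ez = world_frame(A=(0, 0, 0), B=(0.3, 0, 0), C=(0, 0.5, 0))
```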
Given this coordinate system, it is desirable to calculate the location of the camera, namely, the location of the principal point of the camera from which an image was captured. Thus, principal point f1 is located at (X1, Y1, Z1). Likewise, the principal point f2 is located at (X2, Y2, Z2).
With respect to that coordinate system, one can see that a camera directed at target point T has both an azimuth and an elevation which can be specified utilizing the coordinate system. In addition, the camera may be rotated about the optical axis of the camera differently when the two pictures were taken. In short, there is no guarantee that the camera was horizontal to the XY plane when the picture was taken and thus, the orientation of the images may require correction prior to processing.
FIG. 2 illustrates a viewing pyramid formed by the three points A, B and C vis-a-vis the point O (the principal point of a camera). The viewing pyramid can be viewed as having three surfaces, each corresponding to a surface triangle, namely, triangles AOB, BOC and COA. If one were to view the pyramid shown in FIG. 2 as hollow and made of paper, and if one were to cut along the line OA and flatten the resulting pattern, one would obtain a flattened pyramid such as shown in FIG. 3.
FIG. 3 will be utilized to describe the process by which camera position is determined in accordance with the invention. The distance OA represents the distance from point A which is at the origin of the coordinate system to point O which is at the principal point of the lens.
At the beginning of the determination, one knows values for angles AOB, AOC and BOC by virtue of knowing the distance between the principal point and the image plane and the measured distance separating two points on the image plane.
FIG. 4 assists in illustrating how this is done. In FIG. 4, the XY plane constitutes the image plane of the camera. f0 is the principal point of the lens. Images of points A and B are formed on the image plane after passing through the principal point, at the locations A and B shown on the XY plane. The incoming rays from points A and B are respectively shown at 400 and 410 of FIG. 4. For purposes of image plane analysis, an image plane origin FP0 is defined, an X axis is defined as parallel to the longest dimension of the image aspect ratio, and the Y axis is formed perpendicular thereto, with the origin FP0 lying directly under the principal point. Rays from points A and B form an angle alpha (∠α) as they pass through the principal point. The projection of those rays beyond the principal point also diverges at ∠α. ∠α corresponds to ∠AOB of FIG. 3.
By taking careful measurements from the image capture medium (e.g. photographic film, digital array etc.), one can determine the distances AFP0 and BFP0.
Calculating the distances AF0 and BF0 by the Pythagorean Theorem, using the known distance F0 FP0 (the distance between the principal point and the focal plane) and the measured distances AFP0 and BFP0, one may determine angle α using the law of cosines as follows:
$$AB^2 = (F_0A)^2 + (F_0B)^2 - 2(F_0A)(F_0B)\cos\alpha \tag{1}$$

$$\alpha = \arccos\left(\frac{(F_0A)^2 + (F_0B)^2 - (AB)^2}{2(F_0A)(F_0B)}\right) \tag{2}$$
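By way of illustration, the following Python sketch applies equations (1) and (2) to image-plane measurements (the function name subtended_angle and the argument layout are assumptions, not from the patent):

```python
import math

def subtended_angle(pa, pb, principal_distance):
    """Angle between the rays to two points, computed from their measured
    image-plane coordinates (same units as the principal distance)."""
    # slant distances F0A and F0B via the Pythagorean theorem
    f0a = math.sqrt(pa[0]**2 + pa[1]**2 + principal_distance**2)
    f0b = math.sqrt(pb[0]**2 + pb[1]**2 + principal_distance**2)
    ab = math.dist(pa, pb)            # chord AB measured on the image plane
    # law of cosines, equations (1) and (2)
    return math.acos((f0a**2 + f0b**2 - ab**2) / (2 * f0a * f0b))

# e.g., two points measured in millimetres on film, principal distance 50 mm
alpha = subtended_angle((12.1, -3.4), (-8.0, 5.5), 50.0)
```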
Thus, by analyzing points in the focal plane, the angles separating points A, B and C can be determined in the manner just described.
The distances separating points A, B and C are also known, either a priori by placing a calibrated target, such as a carpenter's square in the scene being photographed, or by measuring the distances between three relatively permanent points in the scene previously captured after the images have been formed.
In FIG. 3, the distance OA represents the distance from the principal point of the camera (O) to point A, which is the origin of the coordinate system utilized to define camera position. At a high level, this distance is found by first assuming a very low estimate for the distance OB, such as the distance Ob1; then, with that assumption, triangle AOB is solved. "Solving a triangle" means establishing (e.g. calculating) values for the length of each side and for each of the angles within the triangle. With the distance Ob1 assumed, the first triangle is solved using known, assumed or calculated values. In the process, a value for distance OA is calculated. Using the estimate Ob1, the second triangle BOC is solved, and the derived distance OC is then utilized to solve the third triangle COA. When the third triangle is solved, the calculated value for OA of the third triangle is compared with the calculated value of OA of the first triangle, the estimate Ob1 is revised by adding the difference between the value for OA from the third triangle and the value for OA from the first triangle to the estimate Ob1, and the process is repeated. By successive iterations, the estimate Ob1 will be improved until the difference between the calculated values of OA reduces to a value less than ε. When ε is low enough for the accuracy needed, the iterations cease, and the true value of OA is assumed to lie between the values calculated for the first and third triangles.
A calculation of one iteration will illustrate in detail how this is done.
From the law of sines, one knows:

$$\frac{AB}{\sin\angle AOB} = \frac{Ob_1}{\sin\angle OAB} = \frac{OA}{\sin\angle OBA} \tag{3}$$
Distance Ob1 is the estimate of the length of OB which, at the outset, is set to be low. The distance AB is known because the dimensions of a calibrated target are known or because the distance AB has been measured after the images are captured. The value for ∠AOB is calculated from measurements from the image plane as illustrated in FIG. 4 and discussed in connection with equations 1-7. Therefore, ∠OAB can be calculated as follows:

$$\angle OAB = \arcsin\left(\frac{Ob_1 \sin\angle AOB}{AB}\right) \tag{4}$$
Once the first estimate of ∠OAB is known, the first estimate of ∠OBA can be calculated as follows:
$$\angle OBA = 180^\circ - \angle AOB - \angle OAB \tag{5}$$
At this point, one knows all three angles of the first triangle of FIG. 3 and is in a position to calculate a value for OA of the first triangle. Again using the law of sines, OA can be determined as follows:

$$OA = \frac{AB \sin\angle OBA}{\sin\angle AOB} \tag{6}$$
At this point, the first triangle is entirely solved under the assumption that the distance Ob1 is the actual value of length OB.
Turning to the second triangle, Ob1 is assumed to be the distance OB. Distance BC is known from the target or measurements, and angle BOC is known from measurements from the image plane. Thus, there is enough information to solve the second triangle completely, as shown in equations 8-12:

$$\frac{BC}{\sin\angle BOC} = \frac{Ob_1}{\sin\angle OCB} = \frac{OC}{\sin\angle OBC} \tag{8}$$

$$\sin\angle OCB = \frac{Ob_1 \sin\angle BOC}{BC} \tag{9}$$

$$\angle OCB = \arcsin\left(\frac{Ob_1 \sin\angle BOC}{BC}\right) \tag{10}$$

$$\angle OBC = 180^\circ - \angle BOC - \angle OCB \tag{11}$$

$$OC = \frac{BC \sin\angle OBC}{\sin\angle BOC} \tag{12}$$
With the distance OC calculated as shown in equation 12, the same information is available with respect to the third triangle that was available at the beginning of the solution of the second triangle. Therefore, the third triangle can be solved in a manner completely analogous to the solution of the second triangle, substituting the corresponding lengths and angles of the third triangle into equations 8-12.
One result of the solution of the third triangle is the distance OA, which has been calculated as set forth above. This distance OA from the third triangle will have been derived based on calculations from the first, second and third triangles. Note, however, that the distance OA from the third triangle and the distance OA from the first triangle should be identical if the assumed value Ob1 were in fact equal to the real length OB. Since Ob1 was initially assumed to be of very low value, there will generally be a difference between the value of OA from the third triangle as compared with that from the first triangle. The difference between the two calculated lengths is added to the original estimate Ob1 to form an estimate Ob2 for the second iteration.
With the distance assumed to be Ob2, the calculations set forth above for the solution of the first, second and third triangles are repeated and the resulting values for OA from the first and third triangles are compared once again and an adjustment made to the estimate Ob2 based on the difference between the lengths as set forth above.
By successive iteration, the estimate for the distance OB can be made accurate to whatever degree of resolution one desires by continuing the iterative process until the difference between OA from the first triangle and that from the third triangle is reduced to an acceptable level, ε. The distance OA which results from the iterative process is then the distance from the principal point of the camera, shown at O in FIG. 3, to point A, which is the origin of the coordinate system defined for this set of measurements.
If the values for OA from the first and third triangles agree within ε, all of the triangles are solved and therefore the entire viewing pyramid is solved.
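The iteration just described can be summarized in the following Python sketch (a minimal illustration of the procedure, assuming the 1-2-3 solution order; the function name solve_pyramid, the iteration cap and the starting estimate are assumptions, not from the patent):

```python
import math

def solve_pyramid(AB, BC, CA, aob, boc, coa, eps=1e-12, max_iter=100_000):
    """Solve the flattened viewing pyramid of FIG. 3 for OA, OB and OC,
    given the target side lengths and the apex angles (radians) measured
    from the image plane."""
    ob = 1e-6 * AB                              # deliberately low estimate Ob1
    for _ in range(max_iter):
        # first triangle AOB (equations 3-6)
        oab = math.asin(ob * math.sin(aob) / AB)
        oa_first = AB * math.sin(math.pi - aob - oab) / math.sin(aob)
        # second triangle BOC (equations 8-12), reusing the estimate for OB
        ocb = math.asin(ob * math.sin(boc) / BC)
        oc = BC * math.sin(math.pi - boc - ocb) / math.sin(boc)
        # third triangle COA, reusing the derived OC
        oac = math.asin(oc * math.sin(coa) / CA)
        oa_third = CA * math.sin(math.pi - coa - oac) / math.sin(coa)
        diff = oa_third - oa_first
        if abs(diff) < eps:
            return oa_first, ob, oc             # pyramid solved
        ob += diff                              # revise the estimate, iterate
    raise ValueError("iteration did not converge")
```

An infeasible near/mid/far ordering typically surfaces here as a math domain error inside asin, which is exactly the behavior exploited in the ambiguity search discussed below.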
Turning to FIG. 5, when viewing the points A, B and C from the principal point of the camera, one cannot necessarily determine which of points A, B and C are closest and next closest to the camera. For example, in FIG. 5, given that point B1 is closest to the camera, it is possible that either point A is closer and point C farther, or alternatively, that point C is closer and point A farther. These differences are reflected in triangles A1 B1 C1 as compared with triangle A2 B1 C2. The table shown in FIG. 5 illustrates that the relationship between points A, B and C may in general result in six different permutations. There will always be these combinations of near, mid and far when working toward a solution. Right at the start, one doesn't know which point is closest to the camera and which is furthest and which is midpoint.
To avoid incorrect answers, it is desirable to try all combinations. For each of the combinations, one assumes that one knows which point is which and then tries the calculation. If the calculation converges to a potential solution, then that solution is held over for further analysis. If one is close to the plane of a particular triangle, there can be as many as five potential solutions or orientations of the triangle that will give the same relationship of side lengths and viewing pyramid apex angles.
If a particular combination of near, mid and far is not feasible, the calculations do not converge and the process blows up, usually terminating in a math error, typically in a trigonometric function. However, if the calculations proceed normally, then potential solutions are realized and each potential solution is retained for further investigation.
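A sketch of this trial-and-discard search, in the same hedged Python style (it assumes the solve_pyramid() function sketched above and a dictionary layout keyed by point pairs, neither of which comes from the patent):

```python
from itertools import permutations

def candidate_solutions(sides, angles):
    """Try all six near/mid/far orderings of A, B and C; infeasible
    orderings terminate in a math error inside solve_pyramid and are
    simply discarded, as described above.  `sides` and `angles` map
    frozenset point pairs, e.g. frozenset(("A", "B")), to measured values."""
    found = []
    for p, q, r in permutations("ABC"):
        try:
            sol = solve_pyramid(
                sides[frozenset((p, q))], sides[frozenset((q, r))],
                sides[frozenset((r, p))], angles[frozenset((p, q))],
                angles[frozenset((q, r))], angles[frozenset((r, p))])
            found.append(((p, q, r), sol))      # hold over for analysis
        except ValueError:                      # math domain error: infeasible
            continue
    return found
```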
In FIG. 5, it is clear that sometimes there may be degeneracy in which two or more points are located at exactly the same distance from the focal point. That reduces the number of different possible solutions.
During the iterative process, in the example shown above, the difference between OA of the first and third triangles is added to the estimate Ob1 to determine the estimate to be utilized in the next iteration. It is, of course, possible to utilize a factor other than 1 to 1 and to adjust the estimate by a fraction or a multiple of the difference between the values of OA for the first and third triangles. The preferred adjustment, however, is 1 to 1.
When utilizing a calibrated target, it is preferred that a right angle calibration target be used, like an 8 1/2×11 piece of paper or a carpenter's square.
The six potential arrangements of near, mid and far for points A, B, C can be viewed as different ways of flattening the pyramid. Three sets of flattened pyramids can be formed by using each vertex OA, OB and OC as the edge which is "opened" (e.g. If the pyramid were formed by folding paper into a pyramid shape, and one vertex were cut open and the pyramid unfolded into a pattern like that shown in FIG. 3, three different sets of flattened pyramids are formed, each by cutting a different vertex). Each set has two members corresponding to the two orders in which the triangles may occur. As illustrated in FIG. 3, for example, the triangles are solved in 1-2-3 order. This ordering represents one of the 2 members. The other member is formed by flipping the flattened pyramid over on its face so that triangle 3, as shown in FIG. 3 is put in the triangle 1 position. This member of the set is solved in 3-2-1 order as labeled.
The 1-2-3 ordering of the solution of the triangle of a flattened pyramid implicitly assumes that the left (and right) exterior edge (OA in the figure) is the farthest, the next (OB) is intermediate (mid) and OC is closest.
When searching for a solution for each of the possible arrangements of near, mid and far, the algorithm converges only for that (those) solution(s) which are "possible". Usually only one of the 6 combinations is possible. However, sometimes degeneracy occurs when 2 (or 3) points are exactly the same distance away. In such a case, multiple solutions are possible, but they will yield the same result.
Thus convergent solutions will uniquely define the X, Y and Z locations of the camera in the coordinate system defined by the points A, B and C as set forth above.
The techniques described herein are applicable to images photographed without a calibrated target. By selecting 3 convenient points on the image and physically measuring the distance between them after the image has been captured, the same effect can be achieved as is achieved using a calibrated target at the time the image is captured.
To resolve the near, mid and far ambiguities, as shown in FIG. 6, one notes that the principal point of the camera is going to be where the known lengths of OA, OB and OC coincide at point O. For each of the possible solutions for the location of point O, one can then write an equation for a sphere about the point A, about point B and then about point C. The intersection of the spheres can be understood by visualizing two soap bubbles coming together. As they get progressively closer, they can touch at one point and then as one penetrates the other it will generate a circle which will be a locus of points that is common to the two spheres. As long as the spheres are not identically the same size, one bubble will go inside of the other and as it goes inside it will, at worst case, touch again at one point. As it goes out the other side, it will touch at a point, form a circle, and then as it leaves it will touch a diametrically opposite point.
By writing equations for spheres centered at points A, B and C with radii respectively of length OA, OB and OC, one obtains three equations in three unknowns (assuming a rectangular coordinate system).
Each of the possible solutions for near, mid and far is utilized to generate a set of spheres which are then solved for common points of intersection. Looking at FIG. 6, one can see that in addition to the intersection at point O of the three spheres in the +Z plane, there will be a symmetrical solution in the -Z plane. By convention, one assumes that the horizontal control grid established by the XY plane is viewed from the +Z direction looking down on the XY plane. By that convention, there is only one solution, namely the one in the +Z space, and the -Z space solution is eliminated. That then determines the XYZ location of the principal point of the camera.
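The three-sphere intersection can be carried out by subtracting the sphere equations pairwise, which leaves two linear equations whose solution set is a line, and then substituting back into one sphere. The following Python sketch (assuming numpy; camera_from_spheres is an illustrative name, not from the patent) does exactly that and keeps the +Z root per the convention above:

```python
import numpy as np

def camera_from_spheres(A, B, C, rA, rB, rC):
    """Intersect spheres centred at A, B, C with radii OA, OB, OC and
    return the +Z intersection point (the camera's principal point)."""
    A, B, C = (np.asarray(p, dtype=float) for p in (A, B, C))
    # subtracting the sphere about A from those about B and C leaves two
    # linear equations: 2(B - A).P = |B|^2 - |A|^2 - rB^2 + rA^2, etc.
    M = 2.0 * np.array([B - A, C - A])
    b = np.array([B @ B - A @ A - rB**2 + rA**2,
                  C @ C - A @ A - rC**2 + rA**2])
    P0 = np.linalg.lstsq(M, b, rcond=None)[0]   # one point on the solution line
    d = np.cross(B - A, C - A)                  # line direction (plane normal)
    d = d / np.linalg.norm(d)
    # substitute P = P0 + t d back into the sphere about A: quadratic in t
    w = P0 - A
    roots = np.roots([1.0, 2.0 * (w @ d), w @ w - rA**2])
    candidates = [P0 + float(np.real(t)) * d for t in roots]
    return max(candidates, key=lambda p: p[2])  # keep the +Z solution
```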
Once the camera position is determined, there are three possible orientations for the camera which need to be specified. They are (1) the azimuthal rotation, (2) the elevational rotation and (3) the tilt about the optical axis. FIG. 7 illustrates how azimuthal and elevational corrections are determined. FIG. 7 illustrates the image plane. Points A, B and C are the same points utilized to define a coordinate system and to calculate the distance of the camera in that coordinate system, and are illustrated as part of the image shown in the image plane. The center of the image plane (i.e. the center of the picture) is typically placed on the object of interest so that the object of interest appears in the center of the image. A calibrated target, or the three points A, B and C utilized to establish a coordinate system, is typically not at the center of the photograph. The azimuthal correction is essentially that required to displace point A, the image of the origin of the external world coordinate system, so that it lies exactly on top of the photographic location of point A, shown to the right of axis 710 of the coordinate system of the image plane. The elevational correction is the angle of elevation or declination required to place the image of point A exactly on top of the photographic location of point A, shown below the abscissa of the image plane coordinate system 700. In short, azimuthal and elevational corrections are determined such that, if they were applied to the camera, point A, the origin of the real world coordinate system, would coincide with point A, the origin as captured on the photograph.
Mathematically, the differential offset angles that place the image of the origin of the real world coordinate system exactly on point A in the image plane are calculated as the arctangents of the measured image plane offsets of point A over the principal distance f:

$$\Delta_{az} = \tan^{-1}\left(\frac{x_A}{f}\right), \qquad \Delta_{el} = \tan^{-1}\left(\frac{y_A}{f}\right)$$
The corrections required to coalign or superimpose points A are shown in FIG. 7.
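In code, under the arctangent form given above (the function name and signature are assumptions, not from the patent):

```python
import math

def aim_corrections(xa, ya, principal_distance):
    """Azimuth and elevation offsets that would re-aim the camera so the
    image of world origin A lands on its measured image location (xa, ya)."""
    return (math.atan2(xa, principal_distance),
            math.atan2(ya, principal_distance))
```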
FIG. 7 assumes that if A is correctly located, points B and C will be correctly located. However, this is generally not true because of tilt of the camera about the optical axis. Once points A have been superimposed, one knows where point B should be because of the axis definitions in the real world coordinate system. If the origin of the real world coordinate system is centered on A, and the origin of the image plane coordinate system is now also centered on A by virtue of the azimuthal and elevational corrections applied in connection with FIG. 7, then point B on the image plane should be located where point B in the real world coordinate system is located. This would be the case if the camera were absolutely horizontal when the picture was taken. However, if there is tilt, B will be displaced off the axis. On the image plane, one knows the actual angle that the line AB makes to the X axis of the image plane by measurement from the image plane. By taking the viewing pyramid and projecting it onto a projection plane, as is commonly done when projecting three dimensional images onto a two dimensional surface, one can determine what angle BAC should be on the image plane. To correct for camera tilt, one must rotate the image plane about the optical axis. However, doing so potentially changes the location of points A, B and C, requiring another iteration of corrections in which points A are superimposed and the amount of tilt recalculated until the points converge to within an arbitrary amount of error ε1.
Using these techniques, convergence can commonly be achieved to an accuracy of 1 part in 10^14. If there is more than one convergent candidate, the B point residual error and the C point residual error are utilized as discriminators.
FIG. 8 illustrates the process utilized to fully determine the location and orientation of a camera from the image. At step 800, one determines the location of the calibration points A, B and C and either knows or measures the distances between them (810). The camera location in XYZ coordinates is determined using the technique set forth in FIG. 9 (820). Once the XYZ camera location is determined, corrections are made to azimuth and elevation (830) and then to tilt (840). With the azimuth, elevation and tilt corrections made, one determines whether the points are correctly located within a desired accuracy ε (850). If they are, the location and orientation of the camera is fully determined (860) and the process ends. If they are not, another iteration of steps 830 and 840 is undertaken to bring the location determination within the desired accuracy.
FIG. 9 illustrates the details of block 820 of FIG. 8. Knowing the principal distance of the camera, one measures the three angles AOB, BOC and COA from the image plane (900). A viewing pyramid is constructed with distance OA assumed as the longest dimension (905). The pyramid is flattened and a value estimated for line segment OB which is known to be low (910). Using the estimate for OB, the first triangle is solved (915). The second and third triangles are then sequentially solved using the results of the prior calculations (920 and 925). If the value for OA calculated in connection with the first triangle differs from the value for OA calculated from the third triangle (930) by an amount greater than ε (940), the difference ΔOA is added to the prior estimate of OB to form a new estimate, and a new iteration of steps 915, 920, 925, 930 and 940 occurs. If ΔOA < ε (940), then the viewing pyramid is solved (950), and it is only necessary to resolve the near, mid and far ambiguity (960) before the objective of totally determining the position and orientation of the camera (970) is achieved.
If the images had been captured with two cameras aligned as shown in FIG. 10, the location of the point X1, Y1, Z1 would be calculated as follows:
Assume a set of axes with origin at O, the X and Z axes as shown in FIG. 10, and the Y axis perpendicular to the plane of the page. Assume that the images are captured with an objective at point C and an objective at point F in FIG. 10, the distance between C and F being d1 + d2. The camera capturing the image will have a known focal length f, and the image plane corresponding to each of the points at which the image is captured is shown in a heavy line on the X axis. The distance of the point labeled D from the line joining the focal points of the cameras (C and F) can be calculated as follows:
Triangles ABC and CED are similar in a geometric sense and triangles DEF and FHG are also similar.
Because they are similar:

$$\frac{\Delta X_L}{f} = \frac{d_{12}}{h} \tag{20}$$

$$\frac{\Delta X_R}{f} = \frac{d_2 + d_{11}}{h} \tag{21}$$

$$d_{11} + d_{12} = d_1 \tag{22}$$

Equating (20) and (21) as shown in (23) and then subtracting the right hand term from both sides of the equation results in:

$$\frac{d_{12}}{\Delta X_L} = \frac{d_2 + d_{11}}{\Delta X_R} \tag{23}$$

$$\frac{d_{12}\,\Delta X_R - (d_2 + d_{11})\,\Delta X_L}{\Delta X_L\,\Delta X_R} = 0 \tag{24}$$

For (24) to be true, the numerator must equal 0:

$$d_{12}\,\Delta X_R - (d_2 + d_{11})\,\Delta X_L = 0 \tag{25}$$

Solving equation 22 for d11, substituting in equation 25 and moving the right term to the right side of the equation results in:

$$d_{12}\,\Delta X_R = (d_2 + d_1 - d_{12})\,\Delta X_L$$

$$d_{12}(\Delta X_R + \Delta X_L) = (d_2 + d_1)\,\Delta X_L$$

$$d_{12} = \frac{(d_1 + d_2)\,\Delta X_L}{\Delta X_L + \Delta X_R}$$

and, substituting into (20),

$$h = \frac{f\,d_{12}}{\Delta X_L} = \frac{f\,(d_1 + d_2)}{\Delta X_L + \Delta X_R}$$
Once h is known, the coordinates X0 and Y0 of the point D can be defined with respect to a camera axis by the following (see FIGS. 11 and 12):
$$\alpha_x = \tan^{-1}\left(\frac{f}{\Delta X}\right) \tag{26}$$

$$\alpha_y = \tan^{-1}\left(\frac{f}{\Delta Y}\right) \tag{27}$$

$$X_0 = -h\,\cot\alpha_x \tag{28}$$

$$Y_0 = -h\,\cot\alpha_y \tag{29}$$
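Putting the equation for h and equations (26)-(29) together, a minimal Python sketch (the function name and sign handling are illustrative assumptions; the patent's figures fix the actual sign conventions):

```python
import math

def point_in_camera_frame(dx_left, dx_right, dy, f, d1, d2):
    """Depth h of point D from the line joining the principal points,
    then X0 and Y0 per equations (26)-(29)."""
    h = f * (d1 + d2) / (dx_left + dx_right)   # baseline over disparity
    alpha_x = math.atan2(f, dx_left)           # equation (26)
    alpha_y = math.atan2(f, dy)                # equation (27)
    x0 = -h / math.tan(alpha_x)                # equation (28): -h cot(alpha_x)
    y0 = -h / math.tan(alpha_y)                # equation (29): -h cot(alpha_y)
    return x0, y0, h
```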
In capturing images under field conditions, the positioning of cameras as shown in FIG. 10 is rarely so cleanly defined.
FIG. 13 illustrates a typical real world situation. In FIG. 13, the points A, B and C represent the calibrated target or the points measured subsequent to image capture. The coordinate system X, Y and Z is established in accordance with the conventions set forth above, with A as the origin. Camera positions 1 and 2, illustrated only by their principal points O1 and O2 and their image planes IP1 and IP2 respectively, are positioned with their optical axes pointed at point T, which is the center of the field on the image plane. One desires to obtain the coordinates (X1, Y1, Z1) of an arbitrary point P.
This can be accomplished by a two-stage transformation. If one were to draw a line between the principal points O1 and O2 and define a mid-point OM (Xm, Ym, Zm) at the center of that line, and then if one were to perform an azimuthal rotation of camera 1 about principal point O1 so that its optical axis is perpendicular to the line joining O1 and O2, and if the same kind of rotation were applied to camera 2 about principal point O2, then the cameras would be oriented as shown in FIG. 10, and the coordinates for point P could be calculated using equations (20) through (29) as shown above. However, the coordinates so calculated are with reference to point O of FIG. 10, which corresponds to point OM of FIG. 13. To obtain the coordinates of point P with reference to the world coordinate system defined for measurements then requires only a simple coordinate transformation to change the representation from a coordinate system centered at OM to one centered at point A. This is done routinely using well-known mathematics.
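The second-stage transformation is an ordinary rigid-body change of coordinates. A minimal sketch (assuming numpy; to_world, Om and R are illustrative names, with R holding the auxiliary frame's unit axes, expressed in world coordinates, as rows):

```python
import numpy as np

def to_world(p_aux, Om, R):
    """Re-express point coordinates from the auxiliary frame centred at the
    camera mid-point Om into the world frame centred at point A."""
    return (np.asarray(Om, float)
            + np.asarray(R, float).T @ np.asarray(p_aux, float))

# e.g., a point solved as (X0, Y0, h) in the auxiliary frame:
# p_world = to_world((x0, y0, h), Om=(1.2, 3.4, 1.7), R=np.eye(3))
```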
FIG. 14 illustrates hardware utilized to carry out certain aspects of the invention. Camera 1400 is used to capture images to be analyzed in accordance with the invention. Camera 1400 may be a digital still camera or a video camera with a frame grabber. Images from the camera are loaded onto computer 1420 using camera interface 1410. Normally, images loaded through interface 1410 would be stored on hard drive 1423 and then later retrieved for processing in video RAM 1430. However, images can be loaded directly into video RAM if desired. Video RAM 1430 preferably contains sufficient image storage to permit the simultaneous processing of two images from the camera. Video display 1440 is preferably a high resolution video display such as a cathode ray tube or a corresponding display implemented in semiconductor technology. Display 1440 is interfaced to the computer bus through display interface 1424 and may be utilized to display individual images, both images simultaneously, or three dimensional wire frames created in accordance with the invention. Keyboard 1450 is interfaced to the bus over keyboard interface 1422 in the usual manner.
When utilizing a computer implementation such as that shown in FIG. 14, distances may be conveniently measured in numbers of pixels in the vertical and horizontal directions, which may be translated into linear measurements on the display screen knowing the resolution of the display in the vertical and horizontal directions. Numbers of pixels may be readily determined by pointing and clicking on the points under consideration and by obtaining the addresses of the pixels clicked upon from the cursor addresses.
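For instance, a clicked pixel address can be converted to a physical offset on the image plane given the pixel pitch (a sketch; the parameter names and the millimetre units are assumptions, not from the patent):

```python
def pixel_to_image_offset(px, py, pp_x, pp_y, pitch_x_mm, pitch_y_mm):
    """Offset, in mm on the image plane, of a clicked pixel (px, py)
    relative to the pixel (pp_x, pp_y) lying under the principal point."""
    return (px - pp_x) * pitch_x_mm, (py - pp_y) * pitch_y_mm
```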
Thus, by knowing the position and orientation of the cameras or other image capture devices, as determined from images analyzed after capture, one can calculate the precise position of points visible in the images in terms of XYZ real world coordinates in a system centered at point A, thus enabling one to specify the position of those points relative to the real world coordinate system with great accuracy.
The techniques set forth herein permit accurate forensic surveying of accident or crime scenes as well as accurate surveying of buildings or construction sites, particularly in the vertical direction which had heretofore been practically impossible.
In this disclosure, there is shown and described only the preferred embodiment of the invention, but, as aforementioned, it is to be understood that the invention is capable of use in various other combinations and environments and is capable of changes or modification within the scope of the inventive concepts as expressed herein.

Claims (19)

What is claimed is:
1. A method of measuring the absolute three dimensional location of a point D with respect to a coordinate system defined using three points, A, B and C, separated by known distances using image data comprising:
a. capturing two images of a scene containing the points A, B, C and D, using one or more cameras of known principal distance,
b. determining the location and orientation of said one or more cameras at the time each of said images was captured with reference to said coordinate system using 3 to 5 points from said images, principal distance and said known distances,
c. using the locations of the one or more cameras at the time the images were captured to determine the location of said point D from image data.
2. A method of measuring the absolute three dimensional location of a point D with respect to a coordinate system defined using three points, A, B and C, separated by known distances using image data comprising:
a. capturing two images of a scene containing the points A, B, C and D, using one or more cameras of known principal distance,
b. determining the location and orientation of said one or more cameras at the time each of said images was captured with reference to said coordinate system using information derived from said images, principal distance and said known distances, and
c. using the locations of the one or more cameras at the time the images were captured to determine the location of said point D from image data by
c1. defining an auxiliary coordinate system with origin along the line joining the locations of the cameras,
c2. defining the center point of each image as an origin of a set of image reference axes pointing in X', Y' and Z' directions, respectively,
c3. measuring offset in at least one of the X' and Y' directions of a point on the first image and of a corresponding point of a second image,
c4. determining the angles formed between a line joining point D, the principal point of the objective and the image of point D on one of the X' or Y' planes for each of the images,
c5. determining a distance h representing a distance of point D to a line joining principal points of said one or more cameras used to capture said two images using the measured offsets, the focal length and the angles,
c6. determining the X' and Y' coordinates of point D in the auxiliary coordinate system, and
c7. transforming coordinates (X', Y', h) of the auxiliary coordinate system to a representation in said coordinate system defined using said three points, A, B and C.
3. A method of measuring the absolute three dimensional location of a point D with respect to a coordinate system defined using three points, A, B and C, separated by known distances using image data comprising:
a. capturing two images of a scene containing the points A, B, C and D, using one or more cameras of known principal distance,
b. determining the location and orientation of said one or more cameras at the time each of said images was captured with reference to said coordinate system using information derived from said images, principal distance and said known distances by,
b1. representing the distance between point A, B and C and the principal point of a camera O as a viewing pyramid,
b2. modifying the representation of the pyramid to a three triangle flattened representation,
b3. selecting a low estimate Ob1 for one interior side of a first triangle of said flattened representation,
b4. solving the first triangle using image data, principal distance and said known distances, yielding, inter alia, a first calculated value for length OA, given estimate Ob1,
b5. solving the second triangle using results obtained,
b6. solving the third triangle using results obtained, yielding, inter alia, a second calculated value for length OA,
b7. subtracting the second calculated value for length OA from the first calculated value for length OA to produce a difference value,
b8. revising the value of estimate Ob1 by adding said difference value to achieve a revised estimate,
b9. iterating steps b4-b8 using the revised estimate until said difference value is less than a desired accuracy, and
b10. deriving values for camera location using one or more sets of values for distances OA, OB and OC, and
c. using the locations of the one or more cameras at the time the images were captured to determine the location of said point D from image data.
4. The method of claim 3 in which the step of deriving values for camera location using one or more sets of values for distances OA, OB and OC comprises solving simultaneously equations for spheres centered at points A, B and C with respective radii of OA, OB and OC.
5. The method of claim 3, further comprising:
k. determining the orientation of one or more of said cameras by calculating the azimuthal and elevational adjustment required to direct the camera to the location of point A.
6. The method of claim 5, further comprising:
l. determining the orientation of one or more of said cameras by calculating the amount of rotation about the optical axis required to align point B once the camera points at point A.
7. The method of claim 5 further comprising iterating steps k and l until the degree of alignment is within the desired degree of accuracy.
8. The method of claim 1 used to measure the distance between two points.
9. The method of claim 1 used to measure distances in a vertical direction.
10. The method of claim 1 used to locate physical position accurately of objects visible in said images.
11. The method of claim 1 used to create a three dimensional wireframe representation or a three dimensional surface model comprising 3 or 4 vertex surface elements.
12. The method of claim 1 used to document the "as built" condition of an object.
13. A method of measuring the absolute three dimensional location O of a camera with respect to a coordinate system defined using three points, A, B and C, separated by known distances using image data comprising:
a. capturing an image of a scene containing the points A, B, and C, using a camera,
b. determining the principal distance of said camera,
c. determining the location of said camera at the time said image was captured with reference to said coordinate system using 3 to 5 points from said image, principal distance and said known distances.
14. A method of measuring the absolute three dimensional location O of a camera with respect to a coordinate system defined using three points, A, B and C, separated by known distances using image data comprising:
a. capturing an image of a scene containing the points A, B, and C, using a camera of known principal distance,
b. determining the location of said camera at the time said image was captured with reference to said coordinate system using 3 to 5 points from said image, principal distance and said known distances.
15. A method of measuring distance including vertical height comprising:
a. measuring the absolute three dimensional location of points D, E and F with respect to a coordinate system defined using three points, A, B and C, separated by known distances using image data by:
a1. capturing two images of a scene containing the points A, B, C, D, E and F, using one or more cameras of known principal distance,
a2. determining the location and orientation of said one or more cameras at the time each of said images was captured with reference to said coordinate system using points A, B, and C from said images, principal distance and said known distances,
a3. using the locations of the one or more cameras at the time the images were captured to determine the locations of said points D, E and F from image data,
b. determining distances between points D, E and F, and
c. using the location of said points D, E and F and the location of one or more cameras at the time images were captured to determine the location of other points.
16. The method of claim 15 in which the locations of points D, E and F are used to determine the location of said other points using image data from images different from those used to determine the location of points D, E and F.
17. Apparatus for measuring the absolute three dimensional location of a point D with respect to a coordinate system defined using three points, A, B and C, separated by known distances using image data comprising:
a. one or more cameras for capturing images of a scene containing the points A, B, C and D,
b. means for storing images captured by said one or more cameras,
c. means for processing stored images to determine the location and orientation of said one or more cameras at the time each of said images was captured with reference to said coordinate system, using 3 to 5 points from said images, principal distance and said known distances,
d. means for using the locations of said one or more cameras at the time the images were captured to determine the location of said point D from image data.
18. Apparatus as claimed in claim 17 in which the location of point D is stored in a database utilized to store a three dimensional wireframe representation.
19. Apparatus as claimed in claim 18 in which the location of point D is stored in a database of locations of points surveyed.
US08/414,651 1995-03-31 1995-03-31 Methods and apparatus for using image data to determine camera location and orientation Expired - Lifetime US5699444A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
US08/414,651 US5699444A (en) 1995-03-31 1995-03-31 Methods and apparatus for using image data to determine camera location and orientation

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
US08/414,651 US5699444A (en) 1995-03-31 1995-03-31 Methods and apparatus for using image data to determine camera location and orientation

Publications (1)

Publication Number Publication Date
US5699444A true US5699444A (en) 1997-12-16

Family

ID=23642349

Family Applications (1)

Application Number Title Priority Date Filing Date
US08/414,651 Expired - Lifetime US5699444A (en) 1995-03-31 1995-03-31 Methods and apparatus for using image data to determine camera location and orientation

Country Status (1)

Country Link
US (1) US5699444A (en)

Cited By (216)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US5841353A (en) * 1995-08-16 1998-11-24 Trimble Navigation Limited Relating to the determination of verticality in tall buildings and other structures
US5850469A (en) * 1996-07-09 1998-12-15 General Electric Company Real time tracking of camera pose
US5870136A (en) * 1997-12-05 1999-02-09 The University Of North Carolina At Chapel Hill Dynamic generation of imperceptible structured light for tracking and acquisition of three dimensional scene geometry and surface characteristics in interactive three dimensional computer graphics applications
US5991437A (en) * 1996-07-12 1999-11-23 Real-Time Geometry Corporation Modular digital audio system having individualized functional modules
US5999642A (en) * 1996-04-22 1999-12-07 Gilliland; Malcolm T. Method and apparatus for determining the configuration of a workpiece
US6108497A (en) * 1996-11-06 2000-08-22 Asahi Kogaku Kogyo Kabushiki Kaisha Standard measurement scale and markers for defining standard measurement scale
US6144761A (en) * 1997-02-03 2000-11-07 Asahi Kogaku Kogyo Kabushiki Kaisha Photogrammetric analytical measurement system
US6195455B1 (en) * 1998-07-01 2001-02-27 Intel Corporation Imaging device orientation information through analysis of test images
US6201882B1 (en) * 1997-07-23 2001-03-13 Nec Corporation Camera calibration apparatus
US6208348B1 (en) 1998-05-27 2001-03-27 In-Three, Inc. System and method for dimensionalization processing of images in consideration of a pedetermined image projection format
US6249616B1 (en) * 1997-05-30 2001-06-19 Enroute, Inc Combining digital images based on three-dimensional relationships between source image data sets
US6266442B1 (en) 1998-10-23 2001-07-24 Facet Technology Corp. Method and apparatus for identifying objects depicted in a videostream
US6278479B1 (en) * 1998-02-24 2001-08-21 Wilson, Hewitt & Associates, Inc. Dual reality system
US6281904B1 (en) * 1998-06-09 2001-08-28 Adobe Systems Incorporated Multi-source texture reconstruction and fusion
US6304669B1 (en) * 1997-11-10 2001-10-16 Asahi Kogaku Kogyo Kabushiki Kaisha Photogrammetric analytical measurement system
US6333749B1 (en) 1998-04-17 2001-12-25 Adobe Systems, Inc. Method and apparatus for image assisted modeling of three-dimensional scenes
US6359647B1 (en) * 1998-08-07 2002-03-19 Philips Electronics North America Corporation Automated camera handoff system for figure tracking in a multiple camera system
EP1188510A2 (en) * 2000-08-29 2002-03-20 Kvaerner Masa-Yards Oy Welding arrangement and method
US20020050988A1 (en) * 2000-03-28 2002-05-02 Michael Petrov System and method of three-dimensional image capture and modeling
US6385334B1 (en) * 1998-03-12 2002-05-07 Fuji Jukogyo Kabushiki Kaisha System and method for adjusting stereo camera
US20020106109A1 (en) * 2000-08-12 2002-08-08 Retterath James E. System for road sign sheeting classification
US20020120424A1 (en) * 2001-01-03 2002-08-29 Christoph Hauger Method and apparatus for fixing a location
US6503195B1 (en) 1999-05-24 2003-01-07 University Of North Carolina At Chapel Hill Methods and systems for real-time structured light depth extraction and endoscope using real-time structured light depth extraction
US6515659B1 (en) 1998-05-27 2003-02-04 In-Three, Inc. Method and system for creating realistic smooth three-dimensional depth contours from two-dimensional images
US20030025788A1 (en) * 2001-08-06 2003-02-06 Mitsubishi Electric Research Laboratories, Inc. Hand-held 3D vision system
DE10139846C1 (en) * 2001-08-14 2003-02-06 Daimler Chrysler Ag Method for estimating positions and locations uses alignment of image data for a camera of model structures in order to increase long-duration stability and autonomics of aerodynamic vehicles/missiles.
US20030035100A1 (en) * 2001-08-02 2003-02-20 Jerry Dimsdale Automated lens calibration
WO2003041411A1 (en) * 2001-11-08 2003-05-15 Revolution Company, Llc Video system and methods for operating a video system
US20030103651A1 (en) * 2001-12-03 2003-06-05 Kurt Novak Photogrammetric apparatus
US6600511B1 (en) * 1997-01-08 2003-07-29 Pentax Corporation Camera for use in photogrammetric analytical measurement
US6618498B1 (en) 1999-07-07 2003-09-09 Pentax Corporation Image processing computer system for photogrammetric analytical measurement
US6618497B1 (en) 1999-06-24 2003-09-09 Pentax Corporation Photogrammetric image processing apparatus and method
US20030169918A1 (en) * 2002-03-06 2003-09-11 Fuji Jukogyo Kabushiki Kaisha Stereoscopic image characteristics examination system
US6628803B1 (en) 1998-11-25 2003-09-30 Pentax Corporation Device for calculating positional data of standard points of photogrammetric target
US20030185555A1 (en) * 2002-03-28 2003-10-02 Osamu Nonaka Electronic camera and photographing composition determination apparatus mountable on electronic camera
US20030210329A1 (en) * 2001-11-08 2003-11-13 Aagaard Kenneth Joseph Video system and methods for operating a video system
US6650764B1 (en) 1999-03-01 2003-11-18 Pentax Corporation Device for calculating positional data of standard points of a photogrammetric target
US6668082B1 (en) * 1997-08-05 2003-12-23 Canon Kabushiki Kaisha Image processing apparatus
US6674878B2 (en) 2001-06-07 2004-01-06 Facet Technology Corp. System for automated determination of retroreflectivity of road signs and other reflective objects
US6693650B2 (en) 2000-03-17 2004-02-17 Pentax Corporation Image processing computer system for a photogrammetric analytical measurement
US6717683B1 (en) 1998-09-30 2004-04-06 Pentax Corporation Target for photogrammetric analytical measurement system
US20040080758A1 (en) * 2002-10-23 2004-04-29 Fanuc Ltd. Three-dimensional visual sensor
US6754378B2 (en) * 1998-06-11 2004-06-22 Kabushiki Kaisha Topcon Image forming apparatus, image forming method and computer-readable storage medium having an image forming program
US20040119848A1 (en) * 2002-11-12 2004-06-24 Buehler Christopher J. Method and apparatus for computerized image background analysis
US20040130620A1 (en) * 2002-11-12 2004-07-08 Buehler Christopher J. Method and system for tracking and behavioral monitoring of multiple objects moving through multiple fields-of-view
US6762766B1 (en) 1999-07-06 2004-07-13 Pentax Corporation Image processing computer system for photogrammetric analytical measurement
US6768813B1 (en) 1999-06-16 2004-07-27 Pentax Corporation Photogrammetric image processing apparatus and method
US6782123B1 (en) * 1997-02-17 2004-08-24 Compagnie Generale Des Matieres Nucleaires Method and device for mapping radiation sources
US6789039B1 (en) * 2000-04-05 2004-09-07 Microsoft Corporation Relative range camera calibration
US20040179107A1 (en) * 2003-03-10 2004-09-16 Charles Benton Video augmented orientation sensor
US20040183909A1 (en) * 2003-03-21 2004-09-23 Lavision Gmbh Method of determining the imaging equation for self calibration with regard to performing stereo-PIV methods
US20040201756A1 (en) * 2003-04-08 2004-10-14 Vanbree Ken System for accurately repositioning imaging devices
US6833858B1 (en) * 1998-10-02 2004-12-21 Canon Kabushiki Kaisha Image input apparatus
US20050058321A1 (en) * 2003-09-11 2005-03-17 Buehler Christopher J. Computerized method and apparatus for determining field-of-view relationships among multiple image sensors
US6873924B1 (en) * 2003-09-30 2005-03-29 General Electric Company Method and system for calibrating relative fields of view of multiple cameras
US20050078853A1 (en) * 2003-10-10 2005-04-14 Buehler Christopher J. System and method for searching for changes in surveillance video
US20050078852A1 (en) * 2003-10-10 2005-04-14 Buehler Christopher J. Method of counting objects in a monitored environment and apparatus for the same
US20050104878A1 (en) * 1998-05-27 2005-05-19 Kaye Michael C. Method of hidden surface reconstruction for creating accurate three-dimensional images converted from two-dimensional images
US20050104879A1 (en) * 1998-05-27 2005-05-19 Kaye Michael C. Method for minimizing visual artifacts converting two-dimensional motion pictures into three-dimensional motion pictures
US20050123188A1 (en) * 2001-11-23 2005-06-09 Esa Leikas Method and system for the calibration of a computer vision system
US6912293B1 (en) 1998-06-26 2005-06-28 Carl P. Korobkin Photogrammetry engine for model construction
US20050219552A1 (en) * 2002-06-07 2005-10-06 Ackerman Jermy D Methods and systems for laser based real-time structured light depth extraction
US6954217B1 (en) 1999-07-02 2005-10-11 Pentax Corporation Image processing computer system for photogrammetric analytical measurement
US20050231505A1 (en) * 1998-05-27 2005-10-20 Kaye Michael C Method for creating artifact free three-dimensional images converted from two-dimensional images
US20050254726A1 (en) * 2004-02-25 2005-11-17 The University Of North Carolina At Chapel Hill Methods, systems, and computer program products for imperceptibly embedding structured light patterns in projected color images for display on planar and non-planar surfaces
US20050271304A1 (en) * 2004-05-05 2005-12-08 Retterath Jamie E Methods and apparatus for automated true object-based image analysis and retrieval
WO2005124594A1 (en) * 2004-06-16 2005-12-29 Koninklijke Philips Electronics, N.V. Automatic, real-time, superimposed labeling of points and objects of interest within a view
US20060013437A1 (en) * 2004-06-22 2006-01-19 David Nister Method and apparatus for determining camera pose
US20060017807A1 (en) * 2004-07-26 2006-01-26 Silicon Optix, Inc. Panoramic vision system and method
US20060017938A1 (en) * 2004-06-15 2006-01-26 Fumio Ohtomo Three-dimensional surveying instrument and electronic storage medium
US7002551B2 (en) 2002-09-25 2006-02-21 Hrl Laboratories, Llc Optical see-through augmented reality modified-scale display
WO2006022630A1 (en) * 2004-07-26 2006-03-02 Silicon Optix, Inc. Panoramic vision system and method
US20060149458A1 (en) * 2005-01-04 2006-07-06 Costello Michael J Precision landmark-aided navigation
US7103212B2 (en) 2002-11-22 2006-09-05 Strider Labs, Inc. Acquisition of three-dimensional images by an active stereo technique using locally unique patterns
US20060221072A1 (en) * 2005-02-11 2006-10-05 Se Shuen Y S 3D imaging system
US7151562B1 (en) * 2000-08-03 2006-12-19 Koninklijke Philips Electronics N.V. Method and apparatus for external calibration of a camera via a graphical user interface
US20070008515A1 (en) * 2005-07-11 2007-01-11 Kabushiki Kaisha Topcon Geographic data collecting system
US20070010924A1 (en) * 2005-07-11 2007-01-11 Kabushiki Kaisha Topcon Geographic data collecting system
US7193645B1 (en) 2000-07-27 2007-03-20 Pvi Virtual Media Services, Llc Video system and method of operating a video system
US7193633B1 (en) 2000-04-27 2007-03-20 Adobe Systems Incorporated Method and apparatus for image assisted modeling of three-dimensional scenes
US20070081200A1 (en) * 2005-03-16 2007-04-12 Columbia University Lensless imaging with controllable apertures
US20070168146A1 (en) * 2003-03-26 2007-07-19 Assembleon N.V. Method for calibrating a device, method for calibrating a number of devices lying side by side as well as an object suitable for implementing such a method
US20070182818A1 (en) * 2005-09-02 2007-08-09 Buehler Christopher J Object tracking and alerts
US20070196016A1 (en) * 2006-02-21 2007-08-23 I-Hsien Chen Calibration system for image capture apparatus and method thereof
US20070283004A1 (en) * 2006-06-02 2007-12-06 Buehler Christopher J Systems and methods for distributed monitoring of remote sites
US20080071559A1 (en) * 2006-09-19 2008-03-20 Juha Arrasvuori Augmented reality assisted shopping
US7382900B2 (en) 2003-09-18 2008-06-03 Lavision Gmbh Method of determining a three-dimensional velocity field in a volume
US20080303902A1 (en) * 2007-06-09 2008-12-11 Sensomatic Electronics Corporation System and method for integrating video analytics and data analytics/mining
US20090092284A1 (en) * 1995-06-07 2009-04-09 Automotive Technologies International, Inc. Light Modulation Techniques for Imaging Objects in or around a Vehicle
US20090131836A1 (en) * 2007-03-06 2009-05-21 Enohara Takaaki Suspicious behavior detection system and method
US20090222237A1 (en) * 2008-03-03 2009-09-03 Kabushiki Kaisha Topcon Geographical data collecting device
US20090225161A1 (en) * 2008-03-04 2009-09-10 Kabushiki Kaisha Topcon Geographical data collecting device
US20090257620A1 (en) * 2008-04-10 2009-10-15 Michael Alan Hicks Methods and apparatus for auditing signage
US20090312629A1 (en) * 2008-06-13 2009-12-17 Inneroptic Technology Inc. Correction of relative tracking errors based on a fiducial
US20090323121A1 (en) * 2005-09-09 2009-12-31 Robert Jan Valkenburg A 3D Scene Scanner and a Position and Orientation System
US20100002082A1 (en) * 2005-03-25 2010-01-07 Buehler Christopher J Intelligent camera selection and object tracking
US7671728B2 (en) 2006-06-02 2010-03-02 Sensormatic Electronics, LLC Systems and methods for distributed monitoring of remote sites
US20100103173A1 (en) * 2008-10-27 2010-04-29 Minkyu Lee Real time object tagging for interactive image display applications
US7728868B2 (en) 2006-08-02 2010-06-01 Inneroptic Technology, Inc. System and method of providing real-time dynamic imagery of a medical procedure site using multiple modalities
US20100215220A1 (en) * 2007-06-01 2010-08-26 Toyota Jidosha Kabushiki Kaisha Measurement device, measurement method, program, and computer readable medium
US20100321473A1 (en) * 2007-10-04 2010-12-23 Samsung Techwin Co., Ltd. Surveillance camera system
US7907793B1 (en) 2001-05-04 2011-03-15 Legend Films Inc. Image sequence depth enhancement system and method
US7941269B2 (en) 2005-05-06 2011-05-10 Rialcardo Tice B.V. Llc Network-based navigation system having virtual drive-thru advertisements integrated with actual imagery from along a physical route
US20110205340A1 (en) * 2008-08-12 2011-08-25 Iee International Electronics & Engineering S.A. 3d time-of-flight camera system and position/orientation calibration method therefor
US20110305370A1 (en) * 2010-06-14 2011-12-15 Samsung Electronics Co., Ltd. Apparatus and method for depth unfolding based on multiple depth images
US20120007943A1 (en) * 2009-03-31 2012-01-12 Donny Tytgat Method for determining the relative position of a first and a second imaging device and devices therefore
US8152305B2 (en) 2004-07-16 2012-04-10 The University Of North Carolina At Chapel Hill Methods, systems, and computer program products for full spectrum projection
US8160390B1 (en) 1970-01-21 2012-04-17 Legend3D, Inc. Minimal artifact image sequence depth enhancement system and method
US20120194517A1 (en) * 2011-01-31 2012-08-02 Microsoft Corporation Using a Three-Dimensional Environment Model in Gameplay
US20120249749A1 (en) * 2011-03-31 2012-10-04 Ats Automation Tooling Systems Inc. Three dimensional optical sensing through optical media
US8340379B2 (en) 2008-03-07 2012-12-25 Inneroptic Technology, Inc. Systems and methods for displaying guidance data based on updated deformable imaging data
US8385684B2 (en) 2001-05-04 2013-02-26 Legend3D, Inc. System and method for minimal iteration workflow for image sequence depth enhancement
US8401222B2 (en) 2009-05-22 2013-03-19 Pictometry International Corp. System and process for roof measurement using aerial imagery
US20130194428A1 (en) * 2012-01-27 2013-08-01 Qualcomm Incorporated System and method for determining location of a device using opposing cameras
CN103282741A (en) * 2011-01-11 2013-09-04 Qualcomm Incorporated Position determination using horizontal angles
US8554307B2 (en) 2010-04-12 2013-10-08 Inneroptic Technology, Inc. Image annotation in image-guided medical procedures
US8586368B2 (en) 2009-06-25 2013-11-19 The University Of North Carolina At Chapel Hill Methods and systems for using actuated surface-attached posts for assessing biofluid rheology
US8585598B2 (en) 2009-02-17 2013-11-19 Inneroptic Technology, Inc. Systems, methods, apparatuses, and computer-readable media for image guided surgery
US20130308826A1 (en) * 2011-02-17 2013-11-21 Konica Minolta, Inc. Image processing apparatus, non-transitory computer readable recording medium, and image processing method
US8641621B2 (en) 2009-02-17 2014-02-04 Inneroptic Technology, Inc. Systems, methods, apparatuses, and computer-readable media for image management in image-guided medical procedures
US8670816B2 (en) 2012-01-30 2014-03-11 Inneroptic Technology, Inc. Multiple medical device guidance
US8730232B2 (en) 2011-02-01 2014-05-20 Legend3D, Inc. Director-style based 2D to 3D movie conversion system and method
US8774556B2 (en) 2011-11-30 2014-07-08 Microsoft Corporation Perspective correction using a reflection
US8872852B2 (en) 2011-06-30 2014-10-28 International Business Machines Corporation Positional context determination with multi marker confidence ranking
US8934009B2 (en) 2010-09-02 2015-01-13 Kabushiki Kaisha Topcon Measuring method and measuring device
US8942917B2 (en) 2011-02-14 2015-01-27 Microsoft Corporation Change invariant scene recognition by an agent
US20150040072A1 (en) * 2013-08-01 2015-02-05 Ebay Inc. Three dimensional image dimension mapping
US20150098636A1 (en) * 2013-10-09 2015-04-09 United Sciences, Llc Integrated tracking with fiducial-based modeling
US20150097935A1 (en) * 2013-10-09 2015-04-09 United Sciences, Llc Integrated tracking with world modeling
US20150097929A1 (en) * 2013-10-09 2015-04-09 United Sciences, Llc. Display for three-dimensional imaging
US9007404B2 (en) 2013-03-15 2015-04-14 Legend3D, Inc. Tilt-based look around effect image enhancement method
US9007365B2 (en) 2012-11-27 2015-04-14 Legend3D, Inc. Line depth augmentation system and method for conversion of 2D images to 3D images
US20150130928A1 (en) * 2013-11-12 2015-05-14 Trimble Navigation Limited Point-to-point measurements using a handheld device
US20150168263A1 (en) * 2011-09-30 2015-06-18 Lufthansa Technik Ag Endoscopy system and corresponding method for examining gas turbines
US9160979B1 (en) * 2011-05-27 2015-10-13 Trimble Navigation Limited Determining camera position for a photograph having a displaced center of projection
US20150292999A1 (en) * 2012-11-05 2015-10-15 Jfe Steel Corporation Method and apparatus for measuring dynamic panel stiffness of outer panel for automobile parts
US9206023B2 (en) * 2011-08-26 2015-12-08 Crown Equipment Limited Method and apparatus for using unique landmarks to locate industrial vehicles at start-up
US20150369593A1 (en) * 2014-06-19 2015-12-24 Kari MYLLYKOSKI Orthographic image capture system
US9241147B2 (en) 2013-05-01 2016-01-19 Legend3D, Inc. External depth map transformation method for conversion of two-dimensional images to stereoscopic images
US9245916B2 (en) 2013-07-09 2016-01-26 Rememdia LC Optical positioning sensor
US9251582B2 (en) 2012-12-31 2016-02-02 General Electric Company Methods and systems for enhanced automated visual inspection of a physical asset
US9265572B2 (en) 2008-01-24 2016-02-23 The University Of North Carolina At Chapel Hill Methods, systems, and computer readable media for image guided ablation
US9270965B2 (en) 2012-02-06 2016-02-23 Legend 3D, Inc. Multi-stage production pipeline system
US9282321B2 (en) 2011-02-17 2016-03-08 Legend3D, Inc. 3D model multi-reviewer system
US9286941B2 (en) 2001-05-04 2016-03-15 Legend3D, Inc. Image sequence enhancement and motion picture project management system
US9288476B2 (en) 2011-02-17 2016-03-15 Legend3D, Inc. System and method for real-time depth modification of stereo images of a virtual reality environment
US9282947B2 (en) 2009-12-01 2016-03-15 Inneroptic Technology, Inc. Imager focusing based on intraoperative data
US20160134860A1 (en) * 2014-11-12 2016-05-12 Dejan Jovanovic Multiple template improved 3d modeling of imaged objects using camera position and pose to obtain accuracy
US9367650B2 (en) 2014-01-10 2016-06-14 Ebay Inc. Solar installation mapping
US9407904B2 (en) 2013-05-01 2016-08-02 Legend3D, Inc. Method for creating 3D virtual reality from 2D images
EP3051493A1 (en) * 2015-01-27 2016-08-03 Kabushiki Kaisha Topcon Survey data processing device, survey data processing method, and program therefor
US9438878B2 (en) 2013-05-01 2016-09-06 Legend3D, Inc. Method of converting 2D video to 3D video using 3D object models
US9547937B2 (en) 2012-11-30 2017-01-17 Legend3D, Inc. Three-dimensional annotation system and method
US9609307B1 (en) 2015-09-17 2017-03-28 Legend3D, Inc. Method of converting 2D video to 3D video using machine learning
US9612211B2 (en) 2013-03-14 2017-04-04 General Electric Company Methods and systems for enhanced tip-tracking and navigation of visual inspection devices
US9675319B1 (en) 2016-02-17 2017-06-13 Inneroptic Technology, Inc. Loupe display
US9677840B2 (en) 2014-03-14 2017-06-13 Lineweight Llc Augmented reality simulator
US20170243371A1 (en) * 2015-10-30 2017-08-24 Snap Inc. Image based tracking in augmented reality systems
WO2017209213A1 (en) * 2016-05-31 2017-12-07 NEC Solution Innovators, Ltd. Image processing device, image processing method, and computer-readable recording medium
US9901406B2 (en) 2014-10-02 2018-02-27 Inneroptic Technology, Inc. Affected region display associated with a medical device
US9949700B2 (en) 2015-07-22 2018-04-24 Inneroptic Technology, Inc. Medical device approaches
US9984499B1 (en) 2015-11-30 2018-05-29 Snap Inc. Image and point cloud based tracking and in augmented reality systems
CN108257182A (en) * 2016-12-29 2018-07-06 深圳超多维光电子有限公司 Calibration method and device for a three-dimensional camera module
US20180249088A1 (en) * 2015-09-03 2018-08-30 3Digiview Asia Co., Ltd. Method for correcting image of multi-camera system by using multi-sphere correction device
US10068344B2 (en) 2014-03-05 2018-09-04 Smart Picture Technologies Inc. Method and system for 3D capture based on structure from motion with simplified pose detection
US10074381B1 (en) 2017-02-20 2018-09-11 Snap Inc. Augmented reality speech balloon system
US10083522B2 (en) 2015-06-19 2018-09-25 Smart Picture Technologies, Inc. Image based measurement system
US10083524B1 (en) * 2017-04-21 2018-09-25 Octi Systems and methods for determining location and orientation of a camera
US10088317B2 (en) 2011-06-09 2018-10-02 Microsoft Technology Licensing, LLC Hybrid-approach for localization of an agent
US10091491B2 (en) 2012-06-05 2018-10-02 Samsung Electronics Co., Ltd. Depth image generating method and apparatus and depth image processing method and apparatus
US10188467B2 (en) 2014-12-12 2019-01-29 Inneroptic Technology, Inc. Surgical guidance intersection display
US20190108743A1 (en) * 2017-10-08 2019-04-11 Magik Eye Inc. Calibrating a sensor system including multiple movable sensors
US10278778B2 (en) 2016-10-27 2019-05-07 Inneroptic Technology, Inc. Medical device navigation using a virtual 3D space
US10304254B2 (en) 2017-08-08 2019-05-28 Smart Picture Technologies, Inc. Method for measuring and modeling spaces using markerless augmented reality
US10319149B1 (en) 2017-02-17 2019-06-11 Snap Inc. Augmented reality anamorphosis system
US10314559B2 (en) 2013-03-14 2019-06-11 Inneroptic Technology, Inc. Medical device guidance
US10387730B1 (en) 2017-04-20 2019-08-20 Snap Inc. Augmented reality typography personalization system
US10488195B2 (en) 2016-10-25 2019-11-26 Microsoft Technology Licensing, Llc Curated photogrammetry
US10665035B1 (en) 2017-07-11 2020-05-26 B+T Group Holdings, LLC System and process of using photogrammetry for digital as-built site surveys and asset tracking
US10677583B2 (en) 2015-04-17 2020-06-09 Rememdia LC Strain sensor
US10740974B1 (en) 2017-09-15 2020-08-11 Snap Inc. Augmented reality system
WO2020160874A1 (en) * 2019-02-06 2020-08-13 Robert Bosch Gmbh Calibration unit for a monitoring device, monitoring device for man-overboard monitoring and method for calibration
US10769458B2 (en) 2008-02-12 2020-09-08 DBI/CIDAUT Technologies, LLC Determination procedure of the luminance of traffic signs and device for its embodiment
US10788865B1 (en) 2019-04-26 2020-09-29 Dell Products L.P. Information handling system dual pivot hinge signal path
US10904458B2 (en) 2015-09-03 2021-01-26 3Digiview Asia Co., Ltd. Error correction unit for time slice image
US10931883B2 (en) 2018-03-20 2021-02-23 Magik Eye Inc. Adjusting camera exposure for three-dimensional depth sensing and two-dimensional imaging
US10997760B2 (en) 2018-08-31 2021-05-04 Snap Inc. Augmented reality anthropomorphization system
US11002537B2 (en) 2016-12-07 2021-05-11 Magik Eye Inc. Distance sensor including adjustable focus imaging sensor
US11009936B2 (en) 2019-05-02 2021-05-18 Dell Products L.P. Information handling system power control sensor
US11017742B2 (en) 2019-05-02 2021-05-25 Dell Products L.P. Information handling system multiple display viewing angle brightness adjustment
US11019249B2 (en) 2019-05-12 2021-05-25 Magik Eye Inc. Mapping three-dimensional depth map data onto two-dimensional images
US11062468B2 (en) 2018-03-20 2021-07-13 Magik Eye Inc. Distance measurement using projection patterns of varying densities
US20210231810A1 (en) * 2018-05-30 2021-07-29 Maxell, Ltd. Camera apparatus
US11138757B2 (en) 2019-05-10 2021-10-05 Smart Picture Technologies, Inc. Methods and systems for measuring and modeling spaces using markerless photo-based augmented reality process
US11151782B1 (en) 2018-12-18 2021-10-19 B+T Group Holdings, Inc. System and process of generating digital images of a site having a structure with superimposed intersecting grid lines and annotations
US11199397B2 (en) 2017-10-08 2021-12-14 Magik Eye Inc. Distance measurement using a longitudinal grid pattern
US11215711B2 (en) 2012-12-28 2022-01-04 Microsoft Technology Licensing, Llc Using photometric stereo for 3D environment modeling
US11259879B2 (en) 2017-08-01 2022-03-01 Inneroptic Technology, Inc. Selective transparency to assist medical device navigation
US11288831B2 (en) * 2018-12-05 2022-03-29 Vivotek Inc. Information measuring method and information measuring system
US11320537B2 (en) 2019-12-01 2022-05-03 Magik Eye Inc. Enhancing triangulation-based three-dimensional distance measurements with time of flight information
US11341925B2 (en) 2019-05-02 2022-05-24 Dell Products L.P. Information handling system adapting multiple display visual image presentations
US11347331B2 (en) 2019-04-08 2022-05-31 Dell Products L.P. Portable information handling system stylus garage and charge interface
US11423605B2 (en) * 2019-11-01 2022-08-23 Activision Publishing, Inc. Systems and methods for remastering a game space while maintaining the underlying game simulation
US11464578B2 (en) 2009-02-17 2022-10-11 Inneroptic Technology, Inc. Systems, methods, apparatuses, and computer-readable media for image management in image-guided medical procedures
US11474209B2 (en) 2019-03-25 2022-10-18 Magik Eye Inc. Distance measurement using high density projection patterns
US11475584B2 (en) 2018-08-07 2022-10-18 Magik Eye Inc. Baffles for three-dimensional sensors having spherical fields of view
US11474245B2 (en) 2018-06-06 2022-10-18 Magik Eye Inc. Distance measurement using high density projection patterns
US11483503B2 (en) 2019-01-20 2022-10-25 Magik Eye Inc. Three-dimensional sensor including bandpass filter having multiple passbands
US11484365B2 (en) 2018-01-23 2022-11-01 Inneroptic Technology, Inc. Medical image guidance
US11536857B2 (en) 2019-12-19 2022-12-27 Trimble Inc. Surface tracking on a survey pole
US11568614B1 (en) 2021-08-02 2023-01-31 Bank Of America Corporation Adaptive augmented reality system for dynamic processing of spatial component parameters based on detecting accommodation factors in real time
US11580662B2 (en) 2019-12-29 2023-02-14 Magik Eye Inc. Associating three-dimensional coordinates with two-dimensional feature points
US11587265B2 (en) * 2019-08-02 2023-02-21 Samsung Electronics Co., Ltd. Electronic apparatus and control method thereof
US11609345B2 (en) 2020-02-20 2023-03-21 Rockwell Automation Technologies, Inc. System and method to determine positioning in a virtual coordinate system
US11688088B2 (en) 2020-01-05 2023-06-27 Magik Eye Inc. Transferring the coordinate system of a three-dimensional camera to the incident point of a two-dimensional camera
US11710309B2 (en) 2013-02-22 2023-07-25 Microsoft Technology Licensing, Llc Camera/object pose from predicted coordinates

Patent Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US4965840A (en) * 1987-11-27 1990-10-23 State University Of New York Method and apparatus for determining the distances between surface-patches of a three-dimensional spatial scene and a camera system
US4969106A (en) * 1989-02-27 1990-11-06 Camsys, Inc. Computerized method of determining surface strain distributions in a deformed body
US5259037A (en) * 1991-02-07 1993-11-02 Hughes Training, Inc. Automated video imagery database generation using photogrammetry
US5146346A (en) * 1991-06-14 1992-09-08 Adobe Systems Incorporated Method for displaying and printing multitone images derived from grayscale images
US5525883A (en) * 1994-07-08 1996-06-11 Sara Avitzour Mobile robot location determination employing error-correcting distributed landmarks

Non-Patent Citations (6)

* Cited by examiner, † Cited by third party
Title
David F. Rogers et al., Mathematical Elements for Computer Graphics, Second Edition, 1990, pp. 200-207. *
David F. Rogers et al., Mathematical Elements for Computer Graphics, Second Edition, 1990, pp. 200-207.
Manual of Photogrammetry, Fourth Edition, 1980, American Society of Photogrammetry, pp. 54-57. *
Manual of Photogrammetry, Fourth Edition, 1980, American Society of Photogrammetry, pp. 54-57.
Wolfgang Boehm et al., Geometric Concepts for Geometric Design, 1994, Chapter 8: Reconstruction, pp. 71-76. *
Wolfgang Boehm et al., Geometric Concepts for Geometric Design, 1994, Chapter 8: Reconstruction, pp. 71-76.

Cited By (410)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US8160390B1 (en) 1970-01-21 2012-04-17 Legend3D, Inc. Minimal artifact image sequence depth enhancement system and method
US20090092284A1 (en) * 1995-06-07 2009-04-09 Automotive Technologies International, Inc. Light Modulation Techniques for Imaging Objects in or around a Vehicle
US7738678B2 (en) * 1995-06-07 2010-06-15 Automotive Technologies International, Inc. Light modulation techniques for imaging objects in or around a vehicle
US5841353A (en) * 1995-08-16 1998-11-24 Trimble Navigation Limited Relating to the determination of verticality in tall buildings and other structures
US5999642A (en) * 1996-04-22 1999-12-07 Gilliland; Malcolm T. Method and apparatus for determining the configuration of a workpiece
US6101268A (en) * 1996-04-22 2000-08-08 Gilliland; Malcolm T. Method and apparatus for determining the configuration of a workpiece
US5850469A (en) * 1996-07-09 1998-12-15 General Electric Company Real time tracking of camera pose
US5991437A (en) * 1996-07-12 1999-11-23 Real-Time Geometry Corporation Modular digital audio system having individualized functional modules
US6339683B1 (en) 1996-11-06 2002-01-15 Asahi Kogaku Kogyo Kabushiki Kaisha Standard measurement scale and markers for defining standard measurement scale
US6108497A (en) * 1996-11-06 2000-08-22 Asahi Kogaku Kogyo Kabushiki Kaisha Standard measurement scale and markers for defining standard measurement scale
US6600511B1 (en) * 1997-01-08 2003-07-29 Pentax Corporation Camera for use in photogrammetric analytical measurement
US6144761A (en) * 1997-02-03 2000-11-07 Asahi Kogaku Kogyo Kabushiki Kaisha Photogrammetric analytical measurement system
DE19804205B4 (en) * 1997-02-03 2004-04-08 Pentax Corp. Photogrammetric measuring method and photogrammetric measuring device
US6782123B1 (en) * 1997-02-17 2004-08-24 Compagnie Generale Des Matieres Nucleaires Method and device for mapping radiation sources
US6249616B1 (en) * 1997-05-30 2001-06-19 Enroute, Inc Combining digital images based on three-dimensional relationships between source image data sets
US6201882B1 (en) * 1997-07-23 2001-03-13 Nec Corporation Camera calibration apparatus
US6301372B1 (en) 1997-07-23 2001-10-09 Nec Corporation Camera calibration apparatus
US6668082B1 (en) * 1997-08-05 2003-12-23 Canon Kabushiki Kaisha Image processing apparatus
US6304669B1 (en) * 1997-11-10 2001-10-16 Asahi Kogaku Kogyo Kabushiki Kaisha Photogrammetric analytical measurement system
US5870136A (en) * 1997-12-05 1999-02-09 The University Of North Carolina At Chapel Hill Dynamic generation of imperceptible structured light for tracking and acquisition of three dimensional scene geometry and surface characteristics in interactive three dimensional computer graphics applications
WO1999030509A1 (en) * 1997-12-05 1999-06-17 The University Of North Carolina At Chapel Hill Generation of light for tracking and acquisition of three-dimensional scene geometry in computer graphics applications
WO1999030510A1 (en) * 1997-12-05 1999-06-17 The University Of North Carolina At Chapel Hill Generation of light for tracking and acquisition of three dimensional scene geometry in computer graphics applications
US6278479B1 (en) * 1998-02-24 2001-08-21 Wilson, Hewitt & Associates, Inc. Dual reality system
US6498618B2 (en) * 1998-02-24 2002-12-24 Phillip C. Wilson Dual reality system
US6385334B1 (en) * 1998-03-12 2002-05-07 Fuji Jukogyo Kabushiki Kaisha System and method for adjusting stereo camera
US7602404B1 (en) * 1998-04-17 2009-10-13 Adobe Systems, Incorporated Method and apparatus for image assisted modeling of three-dimensional scenes
US6333749B1 (en) 1998-04-17 2001-12-25 Adobe Systems, Inc. Method and apparatus for image assisted modeling of three-dimensional scenes
US7116323B2 (en) 1998-05-27 2006-10-03 In-Three, Inc. Method of hidden surface reconstruction for creating accurate three-dimensional images converted from two-dimensional images
US20050104879A1 (en) * 1998-05-27 2005-05-19 Kaye Michael C. Method for minimizing visual artifacts converting two-dimensional motion pictures into three-dimensional motion pictures
US7116324B2 (en) 1998-05-27 2006-10-03 In-Three, Inc. Method for minimizing visual artifacts converting two-dimensional motion pictures into three-dimensional motion pictures
US6686926B1 (en) 1998-05-27 2004-02-03 In-Three, Inc. Image processing system and method for converting two-dimensional images into three-dimensional images
US6208348B1 (en) 1998-05-27 2001-03-27 In-Three, Inc. System and method for dimensionalization processing of images in consideration of a predetermined image projection format
US6515659B1 (en) 1998-05-27 2003-02-04 In-Three, Inc. Method and system for creating realistic smooth three-dimensional depth contours from two-dimensional images
US20050231505A1 (en) * 1998-05-27 2005-10-20 Kaye Michael C Method for creating artifact free three-dimensional images converted from two-dimensional images
US20050104878A1 (en) * 1998-05-27 2005-05-19 Kaye Michael C. Method of hidden surface reconstruction for creating accurate three-dimensional images converted from two-dimensional images
US6281904B1 (en) * 1998-06-09 2001-08-28 Adobe Systems Incorporated Multi-source texture reconstruction and fusion
US7116818B2 (en) 1998-06-11 2006-10-03 Kabushiki Kaisha Topcon Image forming apparatus, image forming method and computer-readable storage medium having an image forming program
US20040131248A1 (en) * 1998-06-11 2004-07-08 Kabushiki Kaisha Topcon Image forming apparatus, image forming method and computer-readable storage medium having an image forming program
US6754378B2 (en) * 1998-06-11 2004-06-22 Kabushiki Kaisha Topcon Image forming apparatus, image forming method and computer-readable storage medium having an image forming program
US6912293B1 (en) 1998-06-26 2005-06-28 Carl P. Korobkin Photogrammetry engine for model construction
US8542911B1 (en) 1998-06-26 2013-09-24 Carl Phillip Korobkin Photogrammetry engine for model construction
US7720276B1 (en) * 1998-06-26 2010-05-18 Korobkin Carl P Photogrammetry engine for model construction
US6195455B1 (en) * 1998-07-01 2001-02-27 Intel Corporation Imaging device orientation information through analysis of test images
US6359647B1 (en) * 1998-08-07 2002-03-19 Philips Electronics North America Corporation Automated camera handoff system for figure tracking in a multiple camera system
US6717683B1 (en) 1998-09-30 2004-04-06 Pentax Corporation Target for photogrammetric analytical measurement system
US20040150816A1 (en) * 1998-09-30 2004-08-05 Pentax Corporation Target for photogrammetric analytical measurement system
US6833858B1 (en) * 1998-10-02 2004-12-21 Canon Kabushiki Kaisha Image input apparatus
US6453056B2 (en) 1998-10-23 2002-09-17 Facet Technology Corporation Method and apparatus for generating a database of road sign images and positions
US6449384B2 (en) 1998-10-23 2002-09-10 Facet Technology Corp. Method and apparatus for rapidly determining whether a digitized image frame contains an object of interest
US6266442B1 (en) 1998-10-23 2001-07-24 Facet Technology Corp. Method and apparatus for identifying objects depicted in a videostream
US6625315B2 (en) * 1998-10-23 2003-09-23 Facet Technology Corp. Method and apparatus for identifying objects depicted in a videostream
US20070154067A1 (en) * 1998-10-23 2007-07-05 Facet Technology Corp. Method and apparatus for identifying objects depicted in a videostream
US20040062442A1 (en) * 1998-10-23 2004-04-01 Facet Technology Corp. Method and apparatus for identifying objects depicted in a videostream
US7092548B2 (en) * 1998-10-23 2006-08-15 Facet Technology Corporation Method and apparatus for identifying objects depicted in a videostream
US6363161B2 (en) 1998-10-23 2002-03-26 Facet Technology Corp. System for automatically generating database of objects of interest by analysis of images recorded by moving vehicle
US7444003B2 (en) * 1998-10-23 2008-10-28 Facet Technology Corporation Method and apparatus for identifying objects depicted in a videostream
US6628803B1 (en) 1998-11-25 2003-09-30 Pentax Corporation Device for calculating positional data of standard points of photogrammetric target
US6650764B1 (en) 1999-03-01 2003-11-18 Pentax Corporation Device for calculating positional data of standard points of a photogrammetric target
US6503195B1 (en) 1999-05-24 2003-01-07 University Of North Carolina At Chapel Hill Methods and systems for real-time structured light depth extraction and endoscope using real-time structured light depth extraction
US6768813B1 (en) 1999-06-16 2004-07-27 Pentax Corporation Photogrammetric image processing apparatus and method
US6618497B1 (en) 1999-06-24 2003-09-09 Pentax Corporation Photogrammetric image processing apparatus and method
US6954217B1 (en) 1999-07-02 2005-10-11 Pentax Corporation Image processing computer system for photogrammetric analytical measurement
US6762766B1 (en) 1999-07-06 2004-07-13 Pentax Corporation Image processing computer system for photogrammetric analytical measurement
US6618498B1 (en) 1999-07-07 2003-09-09 Pentax Corporation Image processing computer system for photogrammetric analytical measurement
US6693650B2 (en) 2000-03-17 2004-02-17 Pentax Corporation Image processing computer system for a photogrammetric analytical measurement
US20020050988A1 (en) * 2000-03-28 2002-05-02 Michael Petrov System and method of three-dimensional image capture and modeling
US7065242B2 (en) 2000-03-28 2006-06-20 Viewpoint Corporation System and method of three-dimensional image capture and modeling
US7453456B2 (en) 2000-03-28 2008-11-18 Enliven Marketing Technologies Corporation System and method of three-dimensional image capture and modeling
US7474803B2 (en) 2000-03-28 2009-01-06 Enliven Marketing Technologies Corporation System and method of three-dimensional image capture and modeling
US20060232583A1 (en) * 2000-03-28 2006-10-19 Michael Petrov System and method of three-dimensional image capture and modeling
US7003427B2 (en) * 2000-04-05 2006-02-21 Microsoft Corporation Relative range camera calibration
US6789039B1 (en) * 2000-04-05 2004-09-07 Microsoft Corporation Relative range camera calibration
US20050031329A1 (en) * 2000-04-05 2005-02-10 Microsoft Corporation Relative range camera calibration
US7193633B1 (en) 2000-04-27 2007-03-20 Adobe Systems Incorporated Method and apparatus for image assisted modeling of three-dimensional scenes
US7193645B1 (en) 2000-07-27 2007-03-20 Pvi Virtual Media Services, Llc Video system and method of operating a video system
US7151562B1 (en) * 2000-08-03 2006-12-19 Koninklijke Philips Electronics N.V. Method and apparatus for external calibration of a camera via a graphical user interface
US9335255B2 (en) 2000-08-12 2016-05-10 Facet Technology Corp. System and assessment of reflective objects along a roadway
US7995796B2 (en) 2000-08-12 2011-08-09 Facet Technology Corp. System for road sign sheeting classification
US8860944B2 (en) 2000-08-12 2014-10-14 Facet Technology Corp. System and assessment of reflective objects along a roadway
US6891960B2 (en) 2000-08-12 2005-05-10 Facet Technology System for road sign sheeting classification
US20020106109A1 (en) * 2000-08-12 2002-08-08 Retterath James E. System for road sign sheeting classification
US9671328B2 (en) 2000-08-12 2017-06-06 Facet Technology Corp. System and assessment of reflective objects along a roadway
US7515736B2 (en) 2000-08-12 2009-04-07 Facet Technology, Corp. System for road sign sheeting classification
US8660311B2 (en) 2000-08-12 2014-02-25 Facet Technology Corp. System for assessment reflective objects along a roadway
US9989457B2 (en) 2000-08-12 2018-06-05 Mandli Communications, Inc. System and assessment of reflective objects along a roadway
US20050249378A1 (en) * 2000-08-12 2005-11-10 Facet Technology, Inc. System for road sign sheeting classification
US9989456B2 (en) 2000-08-12 2018-06-05 Facet Technology Corp. System for the determination of retroreflectivity of road signs and other reflective objects
EP1188510A3 (en) * 2000-08-29 2002-07-31 Kvaerner Masa-Yards Oy Welding arrangement and method
US6750426B2 (en) 2000-08-29 2004-06-15 Kvaerner Masa-Yards Oy Welding arrangement and method
EP1188510A2 (en) * 2000-08-29 2002-03-20 Kvaerner Masa-Yards Oy Welding arrangement and method
US20020120424A1 (en) * 2001-01-03 2002-08-29 Christoph Hauger Method and apparatus for fixing a location
US6741948B2 (en) * 2001-01-03 2004-05-25 Carl-Zeiss-Stiftung Method and apparatus for fixing a location
US8396328B2 (en) 2001-05-04 2013-03-12 Legend3D, Inc. Minimal artifact image sequence depth enhancement system and method
US8385684B2 (en) 2001-05-04 2013-02-26 Legend3D, Inc. System and method for minimal iteration workflow for image sequence depth enhancement
US9286941B2 (en) 2001-05-04 2016-03-15 Legend3D, Inc. Image sequence enhancement and motion picture project management system
US8401336B2 (en) 2001-05-04 2013-03-19 Legend3D, Inc. System and method for rapid image sequence depth enhancement with augmented computer-generated elements
US7907793B1 (en) 2001-05-04 2011-03-15 Legend Films Inc. Image sequence depth enhancement system and method
US8073247B1 (en) 2001-05-04 2011-12-06 Legend3D, Inc. Minimal artifact image sequence depth enhancement system and method
US8078006B1 (en) 2001-05-04 2011-12-13 Legend3D, Inc. Minimal artifact image sequence depth enhancement system and method
US9615082B2 (en) 2001-05-04 2017-04-04 Legend3D, Inc. Image sequence enhancement and motion picture project management system and method
US6674878B2 (en) 2001-06-07 2004-01-06 Facet Technology Corp. System for automated determination of retroreflectivity of road signs and other reflective objects
US20040156531A1 (en) * 2001-06-07 2004-08-12 Facet Technology Corporation System for automated determination of retroreflectivity of road signs and other reflective objects
US7043057B2 (en) 2001-06-07 2006-05-09 Facet Technology, Corporation System for automated determination of retroreflectivity of road signs and other reflective objects
US20030035100A1 (en) * 2001-08-02 2003-02-20 Jerry Dimsdale Automated lens calibration
US6781618B2 (en) * 2001-08-06 2004-08-24 Mitsubishi Electric Research Laboratories, Inc. Hand-held 3D vision system
US20030025788A1 (en) * 2001-08-06 2003-02-06 Mitsubishi Electric Research Laboratories, Inc. Hand-held 3D vision system
DE10139846C1 (en) * 2001-08-14 2003-02-06 Daimler Chrysler Ag Method for estimating positions and locations uses alignment of image data for a camera of model structures in order to increase long-duration stability and autonomics of aerodynamic vehicles/missiles.
WO2003041411A1 (en) * 2001-11-08 2003-05-15 Revolution Company, Llc Video system and methods for operating a video system
US20030210329A1 (en) * 2001-11-08 2003-11-13 Aagaard Kenneth Joseph Video system and methods for operating a video system
US8675073B2 (en) 2001-11-08 2014-03-18 Kenneth Joseph Aagaard Video system and methods for operating a video system
US20110211096A1 (en) * 2001-11-08 2011-09-01 Kenneth Joseph Aagaard Video system and methods for operating a video system
US20050123188A1 (en) * 2001-11-23 2005-06-09 Esa Leikas Method and system for the calibration of a computer vision system
US7860298B2 (en) * 2001-11-23 2010-12-28 Mapvision Oy Ltd. Method and system for the calibration of a computer vision system
US20030103651A1 (en) * 2001-12-03 2003-06-05 Kurt Novak Photogrammetric apparatus
US20030169918A1 (en) * 2002-03-06 2003-09-11 Fuji Jukogyo Kabushiki Kaisha Stereoscopic image characteristics examination system
US7139424B2 (en) * 2002-03-06 2006-11-21 Fuji Jukogyo Kabushiki Kaisha Stereoscopic image characteristics examination system
US7184089B2 (en) * 2002-03-28 2007-02-27 Olympus Corporation Electronic camera and photographing composition determination apparatus mountable on electronic camera
US20030185555A1 (en) * 2002-03-28 2003-10-02 Osamu Nonaka Electronic camera and photographing composition determination apparatus mountable on electronic camera
US20050219552A1 (en) * 2002-06-07 2005-10-06 Ackerman Jermy D Methods and systems for laser based real-time structured light depth extraction
US7385708B2 (en) 2002-06-07 2008-06-10 The University Of North Carolina At Chapel Hill Methods and systems for laser based real-time structured light depth extraction
US7002551B2 (en) 2002-09-25 2006-02-21 Hrl Laboratories, Llc Optical see-through augmented reality modified-scale display
US7526121B2 (en) * 2002-10-23 2009-04-28 Fanuc Ltd Three-dimensional visual sensor
US20040080758A1 (en) * 2002-10-23 2004-04-29 Fanuc Ltd. Three-dimensional visual sensor
US20040130620A1 (en) * 2002-11-12 2004-07-08 Buehler Christopher J. Method and system for tracking and behavioral monitoring of multiple objects moving through multiple fields-of-view
US20050265582A1 (en) * 2002-11-12 2005-12-01 Buehler Christopher J Method and system for tracking and behavioral monitoring of multiple objects moving through multiple fields-of-view
US20040119848A1 (en) * 2002-11-12 2004-06-24 Buehler Christopher J. Method and apparatus for computerized image background analysis
US7460685B2 (en) 2002-11-12 2008-12-02 Intellivid Corporation Method and apparatus for computerized image background analysis
US7221775B2 (en) 2002-11-12 2007-05-22 Intellivid Corporation Method and apparatus for computerized image background analysis
US20070211914A1 (en) * 2002-11-12 2007-09-13 Buehler Christopher J Method and apparatus for computerized image background analysis
US8547437B2 (en) 2002-11-12 2013-10-01 Sensormatic Electronics, LLC Method and system for tracking and behavioral monitoring of multiple objects moving through multiple fields-of-view
US7103212B2 (en) 2002-11-22 2006-09-05 Strider Labs, Inc. Acquisition of three-dimensional images by an active stereo technique using locally unique patterns
US7071970B2 (en) * 2003-03-10 2006-07-04 Charles Benton Video augmented orientation sensor
US20040179107A1 (en) * 2003-03-10 2004-09-16 Charles Benton Video augmented orientation sensor
US20040183909A1 (en) * 2003-03-21 2004-09-23 Lavision Gmbh Method of determining the imaging equation for self calibration with regard to performing stereo-PIV methods
US20070168146A1 (en) * 2003-03-26 2007-07-19 Assembleon N.V. Method for calibrating a device, method for calibrating a number of devices lying side by side as well as an object suitable for implementing such a method
US7688381B2 (en) * 2003-04-08 2010-03-30 Vanbree Ken System for accurately repositioning imaging devices
US20040201756A1 (en) * 2003-04-08 2004-10-14 Vanbree Ken System for accurately repositioning imaging devices
US7286157B2 (en) 2003-09-11 2007-10-23 Intellivid Corporation Computerized method and apparatus for determining field-of-view relationships among multiple image sensors
US20050058321A1 (en) * 2003-09-11 2005-03-17 Buehler Christopher J. Computerized method and apparatus for determining field-of-view relationships among multiple image sensors
US7382900B2 (en) 2003-09-18 2008-06-03 Lavision Gmbh Method of determining a three-dimensional velocity field in a volume
US20050071105A1 (en) * 2003-09-30 2005-03-31 General Electric Company Method and system for calibrating relative fields of view of multiple cameras
US6873924B1 (en) * 2003-09-30 2005-03-29 General Electric Company Method and system for calibrating relative fields of view of multiple cameras
US7346187B2 (en) 2003-10-10 2008-03-18 Intellivid Corporation Method of counting objects in a monitored environment and apparatus for the same
US20050078853A1 (en) * 2003-10-10 2005-04-14 Buehler Christopher J. System and method for searching for changes in surveillance video
US20050078852A1 (en) * 2003-10-10 2005-04-14 Buehler Christopher J. Method of counting objects in a monitored environment and apparatus for the same
US7280673B2 (en) 2003-10-10 2007-10-09 Intellivid Corporation System and method for searching for changes in surveillance video
US20050254726A1 (en) * 2004-02-25 2005-11-17 The University Of North Carolina At Chapel Hill Methods, systems, and computer program products for imperceptibly embedding structured light patterns in projected color images for display on planar and non-planar surfaces
US7182465B2 (en) 2004-02-25 2007-02-27 The University Of North Carolina Methods, systems, and computer program products for imperceptibly embedding structured light patterns in projected color images for display on planar and non-planar surfaces
US8150216B2 (en) 2004-05-05 2012-04-03 Google Inc. Methods and apparatus for automated true object-based image analysis and retrieval
US8908996B2 (en) 2004-05-05 2014-12-09 Google Inc. Methods and apparatus for automated true object-based image analysis and retrieval
US9424277B2 (en) 2004-05-05 2016-08-23 Google Inc. Methods and apparatus for automated true object-based image analysis and retrieval
US7590310B2 (en) 2004-05-05 2009-09-15 Facet Technology Corp. Methods and apparatus for automated true object-based image analysis and retrieval
US20050271304A1 (en) * 2004-05-05 2005-12-08 Retterath Jamie E Methods and apparatus for automated true object-based image analysis and retrieval
US20100082597A1 (en) * 2004-05-05 2010-04-01 Facet Technology Corp. Methods and apparatus for automated true object-based image analysis and retrieval
US8908997B2 (en) 2004-05-05 2014-12-09 Google Inc. Methods and apparatus for automated true object-based image analysis and retrieval
US8903199B2 (en) 2004-05-05 2014-12-02 Google Inc. Methods and apparatus for automated true object-based image analysis and retrieval
US20060017938A1 (en) * 2004-06-15 2006-01-26 Fumio Ohtomo Three-dimensional surveying instrument and electronic storage medium
WO2005124594A1 (en) * 2004-06-16 2005-12-29 Koninklijke Philips Electronics, N.V. Automatic, real-time, superimposed labeling of points and objects of interest within a view
US20060013437A1 (en) * 2004-06-22 2006-01-19 David Nister Method and apparatus for determining camera pose
US7613323B2 (en) * 2004-06-22 2009-11-03 Sarnoff Corporation Method and apparatus for determining camera pose
US8152305B2 (en) 2004-07-16 2012-04-10 The University Of North Carolina At Chapel Hill Methods, systems, and computer program products for full spectrum projection
US20060017807A1 (en) * 2004-07-26 2006-01-26 Silicon Optix, Inc. Panoramic vision system and method
US7576767B2 (en) 2004-07-26 2009-08-18 Geo Semiconductors Inc. Panoramic vision system and method
WO2006022630A1 (en) * 2004-07-26 2006-03-02 Silicon Optix, Inc. Panoramic vision system and method
US7191056B2 (en) * 2005-01-04 2007-03-13 The Boeing Company Precision landmark-aided navigation
US20060149458A1 (en) * 2005-01-04 2006-07-06 Costello Michael J Precision landmark-aided navigation
US20100098327A1 (en) * 2005-02-11 2010-04-22 Macdonald Dettwiler And Associates Inc. 3D Imaging system
US20060221072A1 (en) * 2005-02-11 2006-10-05 Se Shuen Y S 3D imaging system
US20100098328A1 (en) * 2005-02-11 2010-04-22 Macdonald Dettwiler And Associates Inc. 3D imaging system
US7860301B2 (en) * 2005-02-11 2010-12-28 Macdonald Dettwiler And Associates Inc. 3D imaging system
US8532368B2 (en) * 2005-02-11 2013-09-10 Macdonald Dettwiler And Associates Inc. Method and apparatus for producing 3D model of an environment
US8031909B2 (en) * 2005-02-11 2011-10-04 Macdonald Dettwiler And Associates Inc. Method and apparatus for producing 3D model of an underground environment
US20120120072A1 (en) * 2005-02-11 2012-05-17 Macdonald Dettwiler And Associates Inc. Method and apparatus for producing 3d model of an environment
US8031933B2 (en) * 2005-02-11 2011-10-04 Macdonald Dettwiler & Associates Inc. Method and apparatus for producing an enhanced 3D model of an environment or an object
US7830561B2 (en) * 2005-03-16 2010-11-09 The Trustees Of Columbia University In The City Of New York Lensless imaging with controllable apertures
US20070081200A1 (en) * 2005-03-16 2007-04-12 Columbia University Lensless imaging with controllable apertures
US20110157393A1 (en) * 2005-03-16 2011-06-30 The Trustees Of Columbia University In The City Of New York Lensless imaging with controllable apertures
US8144376B2 (en) * 2005-03-16 2012-03-27 The Trustees Of Columbia University In The City Of New York Lensless imaging with controllable apertures
US8174572B2 (en) 2005-03-25 2012-05-08 Sensormatic Electronics, LLC Intelligent camera selection and object tracking
US8502868B2 (en) 2005-03-25 2013-08-06 Sensormatic Electronics, LLC Intelligent camera selection and object tracking
US20100002082A1 (en) * 2005-03-25 2010-01-07 Buehler Christopher J Intelligent camera selection and object tracking
US7941269B2 (en) 2005-05-06 2011-05-10 Rialcardo Tice B.V. Llc Network-based navigation system having virtual drive-thru advertisements integrated with actual imagery from along a physical route
US8406992B2 (en) 2005-05-06 2013-03-26 Rialcardo Tice B.V. Llc Network-based navigation system having virtual drive-thru advertisements integrated with actual imagery from along a physical route
US8319952B2 (en) 2005-07-11 2012-11-27 Kabushiki Kaisha Topcon Geographic data collecting system
US20070010924A1 (en) * 2005-07-11 2007-01-11 Kabushiki Kaisha Topcon Geographic data collecting system
US20110096319A1 (en) * 2005-07-11 2011-04-28 Kabushiki Kaisha Topcon Geographic data collecting system
US7933001B2 (en) * 2005-07-11 2011-04-26 Kabushiki Kaisha Topcon Geographic data collecting system
US20070008515A1 (en) * 2005-07-11 2007-01-11 Kabushiki Kaisha Topcon Geographic data collecting system
US20070182818A1 (en) * 2005-09-02 2007-08-09 Buehler Christopher J Object tracking and alerts
US9407878B2 (en) 2005-09-02 2016-08-02 Sensormatic Electronics, LLC Object tracking and alerts
US9036028B2 (en) 2005-09-02 2015-05-19 Sensormatic Electronics, LLC Object tracking and alerts
US9881216B2 (en) 2005-09-02 2018-01-30 Sensormatic Electronics, LLC Object tracking and alerts
US8625854B2 (en) 2005-09-09 2014-01-07 Industrial Research Limited 3D scene scanner and a position and orientation system
US20090323121A1 (en) * 2005-09-09 2009-12-31 Robert Jan Valkenburg A 3D Scene Scanner and a Position and Orientation System
US20070196016A1 (en) * 2006-02-21 2007-08-23 I-Hsien Chen Calibration system for image capture apparatus and method thereof
US7825792B2 (en) 2006-06-02 2010-11-02 Sensormatic Electronics Llc Systems and methods for distributed monitoring of remote sites
US8013729B2 (en) 2006-06-02 2011-09-06 Sensormatic Electronics, LLC Systems and methods for distributed monitoring of remote sites
US7671728B2 (en) 2006-06-02 2010-03-02 Sensormatic Electronics, LLC Systems and methods for distributed monitoring of remote sites
US20100145899A1 (en) * 2006-06-02 2010-06-10 Buehler Christopher J Systems and Methods for Distributed Monitoring of Remote Sites
US20070283004A1 (en) * 2006-06-02 2007-12-06 Buehler Christopher J Systems and methods for distributed monitoring of remote sites
US8482606B2 (en) 2006-08-02 2013-07-09 Inneroptic Technology, Inc. System and method of providing real-time dynamic imagery of a medical procedure site using multiple modalities
US7728868B2 (en) 2006-08-02 2010-06-01 Inneroptic Technology, Inc. System and method of providing real-time dynamic imagery of a medical procedure site using multiple modalities
US10733700B2 (en) 2006-08-02 2020-08-04 Inneroptic Technology, Inc. System and method of providing real-time dynamic imagery of a medical procedure site using multiple modalities
US8350902B2 (en) 2006-08-02 2013-01-08 Inneroptic Technology, Inc. System and method of providing real-time dynamic imagery of a medical procedure site using multiple modalities
US10127629B2 (en) 2006-08-02 2018-11-13 Inneroptic Technology, Inc. System and method of providing real-time dynamic imagery of a medical procedure site using multiple modalities
US11481868B2 (en) 2006-08-02 2022-10-25 Inneroptic Technology, Inc. System and method of providing real-time dynamic imagery of a medical procedure site using multiple modalities
US9659345B2 (en) 2006-08-02 2017-05-23 Inneroptic Technology, Inc. System and method of providing real-time dynamic imagery of a medical procedure site using multiple modalities
US20080071559A1 (en) * 2006-09-19 2008-03-20 Juha Arrasvuori Augmented reality assisted shopping
US20090131836A1 (en) * 2007-03-06 2009-05-21 Enohara Takaaki Suspicious behavior detection system and method
US20100215220A1 (en) * 2007-06-01 2010-08-26 Toyota Jidosha Kabushiki Kaisha Measurement device, measurement method, program, and computer readable medium
US8295643B2 (en) * 2007-06-01 2012-10-23 Toyota Jidosha Kabushiki Kaisha Device and associated methodology for measuring three-dimensional positions based on retrieved points from one view angle and positions and postures from another view angle
US20080303902A1 (en) * 2007-06-09 2008-12-11 Sensormatic Electronics Corporation System and method for integrating video analytics and data analytics/mining
US8508595B2 (en) * 2007-10-04 2013-08-13 Samsung Techwin Co., Ltd. Surveillance camera system for controlling cameras using position and orientation of the cameras and position information of a detected object
US20100321473A1 (en) * 2007-10-04 2010-12-23 Samsung Techwin Co., Ltd. Surveillance camera system
US9265572B2 (en) 2008-01-24 2016-02-23 The University Of North Carolina At Chapel Hill Methods, systems, and computer readable media for image guided ablation
US10769458B2 (en) 2008-02-12 2020-09-08 DBI/CIDAUT Technologies, LLC Determination procedure of the luminance of traffic signs and device for its embodiment
US20090222237A1 (en) * 2008-03-03 2009-09-03 Kabushiki Kaisha Topcon Geographical data collecting device
US8280677B2 (en) 2008-03-03 2012-10-02 Kabushiki Kaisha Topcon Geographical data collecting device
US20090225161A1 (en) * 2008-03-04 2009-09-10 Kabushiki Kaisha Topcon Geographical data collecting device
US8717432B2 (en) 2008-03-04 2014-05-06 Kabushiki Kaisha Topcon Geographical data collecting device
US8831310B2 (en) 2008-03-07 2014-09-09 Inneroptic Technology, Inc. Systems and methods for displaying guidance data based on updated deformable imaging data
US8340379B2 (en) 2008-03-07 2012-12-25 Inneroptic Technology, Inc. Systems and methods for displaying guidance data based on updated deformable imaging data
US20090257620A1 (en) * 2008-04-10 2009-10-15 Michael Alan Hicks Methods and apparatus for auditing signage
US8315456B2 (en) * 2008-04-10 2012-11-20 The Nielsen Company Methods and apparatus for auditing signage
US8649610B2 (en) 2008-04-10 2014-02-11 The Nielsen Company (Us), Llc Methods and apparatus for auditing signage
US20090312629A1 (en) * 2008-06-13 2009-12-17 Inneroptic Technology Inc. Correction of relative tracking errors based on a fiducial
JP2011530706A (en) * 2008-08-12 2011-12-22 Iee International Electronics & Engineering S.A. 3D-TOF camera device and position/orientation calibration method therefor
US10795006B2 (en) * 2008-08-12 2020-10-06 Iee International Electronics & Engineering S.A. 3D time-of-flight camera system and position/orientation calibration method therefor
US20110205340A1 (en) * 2008-08-12 2011-08-25 Iee International Electronics & Engineering S.A. 3d time-of-flight camera system and position/orientation calibration method therefor
US20100103173A1 (en) * 2008-10-27 2010-04-29 Minkyu Lee Real time object tagging for interactive image display applications
US9364294B2 (en) 2009-02-17 2016-06-14 Inneroptic Technology, Inc. Systems, methods, apparatuses, and computer-readable media for image management in image-guided medical procedures
US8641621B2 (en) 2009-02-17 2014-02-04 Inneroptic Technology, Inc. Systems, methods, apparatuses, and computer-readable media for image management in image-guided medical procedures
US10136951B2 (en) 2009-02-17 2018-11-27 Inneroptic Technology, Inc. Systems, methods, apparatuses, and computer-readable media for image guided surgery
US8690776B2 (en) 2009-02-17 2014-04-08 Inneroptic Technology, Inc. Systems, methods, apparatuses, and computer-readable media for image guided surgery
US9398936B2 (en) 2009-02-17 2016-07-26 Inneroptic Technology, Inc. Systems, methods, apparatuses, and computer-readable media for image guided surgery
US10398513B2 (en) 2009-02-17 2019-09-03 Inneroptic Technology, Inc. Systems, methods, apparatuses, and computer-readable media for image management in image-guided medical procedures
US8585598B2 (en) 2009-02-17 2013-11-19 Inneroptic Technology, Inc. Systems, methods, apparatuses, and computer-readable media for image guided surgery
US11464575B2 (en) 2009-02-17 2022-10-11 Inneroptic Technology, Inc. Systems, methods, apparatuses, and computer-readable media for image guided surgery
US11464578B2 (en) 2009-02-17 2022-10-11 Inneroptic Technology, Inc. Systems, methods, apparatuses, and computer-readable media for image management in image-guided medical procedures
US20120007943A1 (en) * 2009-03-31 2012-01-12 Donny Tytgat Method for determining the relative position of a first and a second imaging device and devices therefore
US8977075B2 (en) * 2009-03-31 2015-03-10 Alcatel Lucent Method for determining the relative position of a first and a second imaging device and devices therefore
US10648800B2 (en) 2009-05-22 2020-05-12 Pictometry International Corp. System and process for roof measurement using imagery
US9933254B2 (en) 2009-05-22 2018-04-03 Pictometry International Corp. System and process for roof measurement using aerial imagery
US11060857B2 (en) 2009-05-22 2021-07-13 Pictometry International Corp. System and process for roof measurement using imagery
US8401222B2 (en) 2009-05-22 2013-03-19 Pictometry International Corp. System and process for roof measurement using aerial imagery
US10197391B2 (en) 2009-05-22 2019-02-05 Pictometry International Corp. System and process for roof measurement using imagery
US9238869B2 (en) 2009-06-25 2016-01-19 The University Of North Carolina At Chapel Hill Methods and systems for using actuated surface-attached posts for assessing biofluid rheology
US8586368B2 (en) 2009-06-25 2013-11-19 The University Of North Carolina At Chapel Hill Methods and systems for using actuated surface-attached posts for assessing biofluid rheology
US9282947B2 (en) 2009-12-01 2016-03-15 Inneroptic Technology, Inc. Imager focusing based on intraoperative data
US9107698B2 (en) 2010-04-12 2015-08-18 Inneroptic Technology, Inc. Image annotation in image-guided medical procedures
US8554307B2 (en) 2010-04-12 2013-10-08 Inneroptic Technology, Inc. Image annotation in image-guided medical procedures
US8577089B2 (en) * 2010-06-14 2013-11-05 Samsung Electronics Co., Ltd. Apparatus and method for depth unfolding based on multiple depth images
US20110305370A1 (en) * 2010-06-14 2011-12-15 Samsung Electronics Co., Ltd. Apparatus and method for depth unfolding based on multiple depth images
US8934009B2 (en) 2010-09-02 2015-01-13 Kabushiki Kaisha Topcon Measuring method and measuring device
US20130310071A1 (en) * 2011-01-11 2013-11-21 Qualcomm Incorporated Position determination using horizontal angles
CN103282741A (en) * 2011-01-11 2013-09-04 Qualcomm Incorporated Position determination using horizontal angles
CN103282741B (en) * 2011-01-11 2017-04-12 Qualcomm Incorporated Position determination using horizontal angles
US20120194517A1 (en) * 2011-01-31 2012-08-02 Microsoft Corporation Using a Three-Dimensional Environment Model in Gameplay
US8570320B2 (en) * 2011-01-31 2013-10-29 Microsoft Corporation Using a three-dimensional environment model in gameplay
US8730232B2 (en) 2011-02-01 2014-05-20 Legend3D, Inc. Director-style based 2D to 3D movie conversion system and method
US8942917B2 (en) 2011-02-14 2015-01-27 Microsoft Corporation Change invariant scene recognition by an agent
US9619561B2 (en) 2011-02-14 2017-04-11 Microsoft Technology Licensing, Llc Change invariant scene recognition by an agent
US9282321B2 (en) 2011-02-17 2016-03-08 Legend3D, Inc. 3D model multi-reviewer system
US9288476B2 (en) 2011-02-17 2016-03-15 Legend3D, Inc. System and method for real-time depth modification of stereo images of a virtual reality environment
US20130308826A1 (en) * 2011-02-17 2013-11-21 Konica Minolta, Inc. Image processing apparatus, non-transitory computer readable recording medium, and image processing method
US9215441B2 (en) * 2011-02-17 2015-12-15 Konica Minolta, Inc. Image processing apparatus, non-transitory computer readable recording medium, and image processing method
US20120249749A1 (en) * 2011-03-31 2012-10-04 Ats Automation Tooling Systems Inc. Three dimensional optical sensing through optical media
US9551570B2 (en) * 2011-03-31 2017-01-24 Ats Automation Tooling Systems Inc. Three dimensional optical sensing through optical media
US9160979B1 (en) * 2011-05-27 2015-10-13 Trimble Navigation Limited Determining camera position for a photograph having a displaced center of projection
US10088317B2 (en) 2011-06-09 2018-10-02 Microsoft Technology Licensing, LLC Hybrid-approach for localization of an agent
US8872852B2 (en) 2011-06-30 2014-10-28 International Business Machines Corporation Positional context determination with multi marker confidence ranking
US9147379B2 (en) 2011-06-30 2015-09-29 International Business Machines Corporation Positional context determination with multi marker confidence ranking
US10611613B2 (en) 2011-08-26 2020-04-07 Crown Equipment Corporation Systems and methods for pose development using retrieved position of a pallet or product load to be picked up
US9206023B2 (en) * 2011-08-26 2015-12-08 Crown Equipment Limited Method and apparatus for using unique landmarks to locate industrial vehicles at start-up
US9939349B2 (en) * 2011-09-30 2018-04-10 Lufthansa Technik Ag Endoscopy system and corresponding method for examining gas turbines
US20150168263A1 (en) * 2011-09-30 2015-06-18 Lufthansa Technik Ag Endoscopy system and corresponding method for examining gas turbines
US8983227B2 (en) 2011-11-30 2015-03-17 Microsoft Technology Licensing, Llc Perspective correction using a reflection
US8774556B2 (en) 2011-11-30 2014-07-08 Microsoft Corporation Perspective correction using a reflection
US9986208B2 (en) * 2012-01-27 2018-05-29 Qualcomm Incorporated System and method for determining location of a device using opposing cameras
JP2015513235A (en) * 2012-01-27 2015-04-30 Qualcomm, Incorporated System and method for locating a device using an opposing camera
US20130194428A1 (en) * 2012-01-27 2013-08-01 Qualcomm Incorporated System and method for determining location of a device using opposing cameras
CN104067091A (en) * 2012-01-27 2014-09-24 Qualcomm Incorporated System and method for determining location of a device using opposing cameras
US8670816B2 (en) 2012-01-30 2014-03-11 Inneroptic Technology, Inc. Multiple medical device guidance
US9443555B2 (en) 2012-02-06 2016-09-13 Legend3D, Inc. Multi-stage production pipeline system
US9270965B2 (en) 2012-02-06 2016-02-23 Legend 3D, Inc. Multi-stage production pipeline system
US9595296B2 (en) 2012-02-06 2017-03-14 Legend3D, Inc. Multi-stage production pipeline system
US10091491B2 (en) 2012-06-05 2018-10-02 Samsung Electronics Co., Ltd. Depth image generating method and apparatus and depth image processing method and apparatus
US20150292999A1 (en) * 2012-11-05 2015-10-15 Jfe Steel Corporation Method and apparatus for measuring dynamic panel stiffness of outer panel for automobile parts
US9709473B2 (en) * 2012-11-05 2017-07-18 JFE Steel Corporation Method and apparatus for measuring dynamic panel stiffness of outer panel for automobile parts
US9007365B2 (en) 2012-11-27 2015-04-14 Legend3D, Inc. Line depth augmentation system and method for conversion of 2D images to 3D images
US9547937B2 (en) 2012-11-30 2017-01-17 Legend3D, Inc. Three-dimensional annotation system and method
US11215711B2 (en) 2012-12-28 2022-01-04 Microsoft Technology Licensing, Llc Using photometric stereo for 3D environment modeling
US9251582B2 (en) 2012-12-31 2016-02-02 General Electric Company Methods and systems for enhanced automated visual inspection of a physical asset
US11710309B2 (en) 2013-02-22 2023-07-25 Microsoft Technology Licensing, Llc Camera/object pose from predicted coordinates
US10314559B2 (en) 2013-03-14 2019-06-11 Inneroptic Technology, Inc. Medical device guidance
US9612211B2 (en) 2013-03-14 2017-04-04 General Electric Company Methods and systems for enhanced tip-tracking and navigation of visual inspection devices
US9007404B2 (en) 2013-03-15 2015-04-14 Legend3D, Inc. Tilt-based look around effect image enhancement method
US9438878B2 (en) 2013-05-01 2016-09-06 Legend3D, Inc. Method of converting 2D video to 3D video using 3D object models
US9241147B2 (en) 2013-05-01 2016-01-19 Legend3D, Inc. External depth map transformation method for conversion of two-dimensional images to stereoscopic images
US9407904B2 (en) 2013-05-01 2016-08-02 Legend3D, Inc. Method for creating 3D virtual reality from 2D images
US9874433B2 (en) 2013-07-09 2018-01-23 Rememdia LC Optical positioning sensor
US9651365B2 (en) 2013-07-09 2017-05-16 Rememdia LC Optical positioning sensor
US9245916B2 (en) 2013-07-09 2016-01-26 Rememdia LC Optical positioning sensor
US10690479B2 (en) 2013-07-09 2020-06-23 Rememdia LLC Optical positioning sensor
US20150040072A1 (en) * 2013-08-01 2015-02-05 Ebay Inc. Three dimensional image dimension mapping
US10296970B2 (en) 2013-08-01 2019-05-21 Ebay Inc. Bi-directional project information updates in multi-party bidding
US11704723B2 (en) 2013-08-01 2023-07-18 Ebay Inc. Bi-directional project information updates in multi-party bidding
US20150098636A1 (en) * 2013-10-09 2015-04-09 United Sciences, Llc Integrated tracking with fiducial-based modeling
US20150097929A1 (en) * 2013-10-09 2015-04-09 United Sciences, Llc Display for three-dimensional imaging
US20150097935A1 (en) * 2013-10-09 2015-04-09 United Sciences, Llc Integrated tracking with world modeling
US20150130928A1 (en) * 2013-11-12 2015-05-14 Trimble Navigation Limited Point-to-point measurements using a handheld device
US9470511B2 (en) * 2013-11-12 2016-10-18 Trimble Navigation Limited Point-to-point measurements using a handheld device
US9367650B2 (en) 2014-01-10 2016-06-14 Ebay Inc. Solar installation mapping
US10068344B2 (en) 2014-03-05 2018-09-04 Smart Picture Technologies Inc. Method and system for 3D capture based on structure from motion with simplified pose detection
US9677840B2 (en) 2014-03-14 2017-06-13 Lineweight Llc Augmented reality simulator
US20150369593A1 (en) * 2014-06-19 2015-12-24 Kari MYLLYKOSKI Orthographic image capture system
US10820944B2 (en) 2014-10-02 2020-11-03 Inneroptic Technology, Inc. Affected region display based on a variance parameter associated with a medical device
US11684429B2 (en) 2014-10-02 2023-06-27 Inneroptic Technology, Inc. Affected region display associated with a medical device
US9901406B2 (en) 2014-10-02 2018-02-27 Inneroptic Technology, Inc. Affected region display associated with a medical device
US20160134860A1 (en) * 2014-11-12 2016-05-12 Dejan Jovanovic Multiple template improved 3d modeling of imaged objects using camera position and pose to obtain accuracy
US10820946B2 (en) 2014-12-12 2020-11-03 Inneroptic Technology, Inc. Surgical guidance intersection display
US11534245B2 (en) 2014-12-12 2022-12-27 Inneroptic Technology, Inc. Surgical guidance intersection display
US10188467B2 (en) 2014-12-12 2019-01-29 Inneroptic Technology, Inc. Surgical guidance intersection display
EP3051493A1 (en) * 2015-01-27 2016-08-03 Kabushiki Kaisha Topcon Survey data processing device, survey data processing method, and program therefor
CN105825498A (en) * 2015-01-27 2016-08-03 Kabushiki Kaisha Topcon Survey data processing device, survey data processing method, and program therefor
US10677583B2 (en) 2015-04-17 2020-06-09 Rememdia LC Strain sensor
US10083522B2 (en) 2015-06-19 2018-09-25 Smart Picture Technologies, Inc. Image based measurement system
US11103200B2 (en) 2015-07-22 2021-08-31 Inneroptic Technology, Inc. Medical device approaches
US9949700B2 (en) 2015-07-22 2018-04-24 Inneroptic Technology, Inc. Medical device approaches
US20180249088A1 (en) * 2015-09-03 2018-08-30 3Digiview Asia Co., Ltd. Method for correcting image of multi-camera system by using multi-sphere correction device
US10904458B2 (en) 2015-09-03 2021-01-26 3Digiview Asia Co., Ltd. Error correction unit for time slice image
US10778908B2 (en) * 2015-09-03 2020-09-15 3Digiview Asia Co., Ltd. Method for correcting image of multi-camera system by using multi-sphere correction device
US9609307B1 (en) 2015-09-17 2017-03-28 Legend3D, Inc. Method of converting 2D video to 3D video using machine learning
US10102680B2 (en) * 2015-10-30 2018-10-16 Snap Inc. Image based tracking in augmented reality systems
US11315331B2 (en) 2015-10-30 2022-04-26 Snap Inc. Image based tracking in augmented reality systems
US10366543B1 (en) 2015-10-30 2019-07-30 Snap Inc. Image based tracking in augmented reality systems
US9836890B2 (en) * 2015-10-30 2017-12-05 Snap Inc. Image based tracking in augmented reality systems
US11769307B2 (en) 2015-10-30 2023-09-26 Snap Inc. Image based tracking in augmented reality systems
US20170243371A1 (en) * 2015-10-30 2017-08-24 Snap Inc. Image based tracking in augmented reality systems
US10733802B2 (en) 2015-10-30 2020-08-04 Snap Inc. Image based tracking in augmented reality systems
US10997783B2 (en) 2015-11-30 2021-05-04 Snap Inc. Image and point cloud based tracking and in augmented reality systems
US9984499B1 (en) 2015-11-30 2018-05-29 Snap Inc. Image and point cloud based tracking and in augmented reality systems
US10657708B1 (en) 2015-11-30 2020-05-19 Snap Inc. Image and point cloud based tracking and in augmented reality systems
US11380051B2 (en) 2015-11-30 2022-07-05 Snap Inc. Image and point cloud based tracking and in augmented reality systems
US9675319B1 (en) 2016-02-17 2017-06-13 Inneroptic Technology, Inc. Loupe display
US11179136B2 (en) 2016-02-17 2021-11-23 Inneroptic Technology, Inc. Loupe display
US10433814B2 (en) 2016-02-17 2019-10-08 Inneroptic Technology, Inc. Loupe display
WO2017209213A1 (en) * 2016-05-31 2017-12-07 NEC Solution Innovators, Ltd. Image processing device, image processing method, and computer-readable recording medium
JPWO2017209213A1 (en) * 2016-05-31 2019-01-31 NEC Solution Innovators, Ltd. Image processing apparatus, image processing method, and program
US10488195B2 (en) 2016-10-25 2019-11-26 Microsoft Technology Licensing, Llc Curated photogrammetry
US10772686B2 (en) 2016-10-27 2020-09-15 Inneroptic Technology, Inc. Medical device navigation using a virtual 3D space
US11369439B2 (en) 2016-10-27 2022-06-28 Inneroptic Technology, Inc. Medical device navigation using a virtual 3D space
US10278778B2 (en) 2016-10-27 2019-05-07 Inneroptic Technology, Inc. Medical device navigation using a virtual 3D space
US11002537B2 (en) 2016-12-07 2021-05-11 Magik Eye Inc. Distance sensor including adjustable focus imaging sensor
CN108257182A (en) * 2016-12-29 2018-07-06 Shenzhen Super Perfect Optics Ltd. Calibration method and device for a three-dimensional camera module
US10319149B1 (en) 2017-02-17 2019-06-11 Snap Inc. Augmented reality anamorphosis system
US11861795B1 (en) 2017-02-17 2024-01-02 Snap Inc. Augmented reality anamorphosis system
US10614828B1 (en) 2017-02-20 2020-04-07 Snap Inc. Augmented reality speech balloon system
US11748579B2 (en) 2017-02-20 2023-09-05 Snap Inc. Augmented reality speech balloon system
US11189299B1 (en) 2017-02-20 2021-11-30 Snap Inc. Augmented reality speech balloon system
US10074381B1 (en) 2017-02-20 2018-09-11 Snap Inc. Augmented reality speech balloon system
US10387730B1 (en) 2017-04-20 2019-08-20 Snap Inc. Augmented reality typography personalization system
US11195018B1 (en) 2017-04-20 2021-12-07 Snap Inc. Augmented reality typography personalization system
US10083524B1 (en) * 2017-04-21 2018-09-25 Octi Systems and methods for determining location and orientation of a camera
US10665035B1 (en) 2017-07-11 2020-05-26 B+T Group Holdings, LLC System and process of using photogrammetry for digital as-built site surveys and asset tracking
US11259879B2 (en) 2017-08-01 2022-03-01 Inneroptic Technology, Inc. Selective transparency to assist medical device navigation
US11164387B2 (en) 2017-08-08 2021-11-02 Smart Picture Technologies, Inc. Method for measuring and modeling spaces using markerless augmented reality
US10679424B2 (en) 2017-08-08 2020-06-09 Smart Picture Technologies, Inc. Method for measuring and modeling spaces using markerless augmented reality
US10304254B2 (en) 2017-08-08 2019-05-28 Smart Picture Technologies, Inc. Method for measuring and modeling spaces using markerless augmented reality
US11682177B2 (en) 2017-08-08 2023-06-20 Smart Picture Technologies, Inc. Method for measuring and modeling spaces using markerless augmented reality
US11335067B2 (en) 2017-09-15 2022-05-17 Snap Inc. Augmented reality system
US10740974B1 (en) 2017-09-15 2020-08-11 Snap Inc. Augmented reality system
US11721080B2 (en) 2017-09-15 2023-08-08 Snap Inc. Augmented reality system
US10885761B2 (en) * 2017-10-08 2021-01-05 Magik Eye Inc. Calibrating a sensor system including multiple movable sensors
US20190108743A1 (en) * 2017-10-08 2019-04-11 Magik Eye Inc. Calibrating a sensor system including multiple movable sensors
CN111164650A (en) * 2017-10-08 2020-05-15 Magik Eye Inc. Calibrating a sensor system comprising a plurality of movable sensors
US11199397B2 (en) 2017-10-08 2021-12-14 Magik Eye Inc. Distance measurement using a longitudinal grid pattern
US11484365B2 (en) 2018-01-23 2022-11-01 Inneroptic Technology, Inc. Medical image guidance
US11381753B2 (en) 2018-03-20 2022-07-05 Magik Eye Inc. Adjusting camera exposure for three-dimensional depth sensing and two-dimensional imaging
US11062468B2 (en) 2018-03-20 2021-07-13 Magik Eye Inc. Distance measurement using projection patterns of varying densities
US10931883B2 (en) 2018-03-20 2021-02-23 Magik Eye Inc. Adjusting camera exposure for three-dimensional depth sensing and two-dimensional imaging
US20210231810A1 (en) * 2018-05-30 2021-07-29 Maxell, Ltd. Camera apparatus
US11474245B2 (en) 2018-06-06 2022-10-18 Magik Eye Inc. Distance measurement using high density projection patterns
US11475584B2 (en) 2018-08-07 2022-10-18 Magik Eye Inc. Baffles for three-dimensional sensors having spherical fields of view
US11450050B2 (en) 2018-08-31 2022-09-20 Snap Inc. Augmented reality anthropomorphization system
US10997760B2 (en) 2018-08-31 2021-05-04 Snap Inc. Augmented reality anthropomorphization system
US11676319B2 (en) 2018-08-31 2023-06-13 Snap Inc. Augmented reality anthropomorphization system
US11288831B2 (en) * 2018-12-05 2022-03-29 Vivotek Inc. Information measuring method and information measuring system
US11151782B1 (en) 2018-12-18 2021-10-19 B+T Group Holdings, Inc. System and process of generating digital images of a site having a structure with superimposed intersecting grid lines and annotations
US11483503B2 (en) 2019-01-20 2022-10-25 Magik Eye Inc. Three-dimensional sensor including bandpass filter having multiple passbands
US11595638B2 (en) 2019-02-06 2023-02-28 Robert Bosch Gmbh Calibration unit for a monitoring device, monitoring device for man-overboard monitoring, and method for calibration
WO2020160874A1 (en) * 2019-02-06 2020-08-13 Robert Bosch Gmbh Calibration unit for a monitoring device, monitoring device for man-overboard monitoring and method for calibration
US11474209B2 (en) 2019-03-25 2022-10-18 Magik Eye Inc. Distance measurement using high density projection patterns
US11347331B2 (en) 2019-04-08 2022-05-31 Dell Products L.P. Portable information handling system stylus garage and charge interface
US10788865B1 (en) 2019-04-26 2020-09-29 Dell Products L.P. Information handling system dual pivot hinge signal path
US11017742B2 (en) 2019-05-02 2021-05-25 Dell Products L.P. Information handling system multiple display viewing angle brightness adjustment
US11341925B2 (en) 2019-05-02 2022-05-24 Dell Products L.P. Information handling system adapting multiple display visual image presentations
US11009936B2 (en) 2019-05-02 2021-05-18 Dell Products L.P. Information handling system power control sensor
US11138757B2 (en) 2019-05-10 2021-10-05 Smart Picture Technologies, Inc. Methods and systems for measuring and modeling spaces using markerless photo-based augmented reality process
US11527009B2 (en) 2019-05-10 2022-12-13 Smart Picture Technologies, Inc. Methods and systems for measuring and modeling spaces using markerless photo-based augmented reality process
US11019249B2 (en) 2019-05-12 2021-05-25 Magik Eye Inc. Mapping three-dimensional depth map data onto two-dimensional images
US11908042B2 (en) * 2019-08-02 2024-02-20 Samsung Electronics Co., Ltd. Electronic apparatus and control method thereof
US11587265B2 (en) * 2019-08-02 2023-02-21 Samsung Electronics Co., Ltd. Electronic apparatus and control method thereof
US11423605B2 (en) * 2019-11-01 2022-08-23 Activision Publishing, Inc. Systems and methods for remastering a game space while maintaining the underlying game simulation
US11320537B2 (en) 2019-12-01 2022-05-03 Magik Eye Inc. Enhancing triangulation-based three-dimensional distance measurements with time of flight information
US11536857B2 (en) 2019-12-19 2022-12-27 Trimble Inc. Surface tracking on a survey pole
US11580662B2 (en) 2019-12-29 2023-02-14 Magik Eye Inc. Associating three-dimensional coordinates with two-dimensional feature points
US11688088B2 (en) 2020-01-05 2023-06-27 Magik Eye Inc. Transferring the coordinate system of a three-dimensional camera to the incident point of a two-dimensional camera
US11609345B2 (en) 2020-02-20 2023-03-21 Rockwell Automation Technologies, Inc. System and method to determine positioning in a virtual coordinate system
US11568614B1 (en) 2021-08-02 2023-01-31 Bank Of America Corporation Adaptive augmented reality system for dynamic processing of spatial component parameters based on detecting accommodation factors in real time

Similar Documents

Publication Publication Date Title
US5699444A (en) Methods and apparatus for using image data to determine camera location and orientation
US11200734B2 (en) Method for reconstructing three-dimensional space scene based on photographing
US6246412B1 (en) Interactive construction and refinement of 3D models from multiple panoramic images
CN110296691B (en) IMU calibration-fused binocular stereo vision measurement method and system
Pollefeys et al. Self-calibration and metric reconstruction inspite of varying and unknown intrinsic camera parameters
US7233691B2 (en) Any aspect passive volumetric image processing method
US8107722B2 (en) System and method for automatic stereo measurement of a point of interest in a scene
US6271855B1 (en) Interactive construction of 3D models from panoramic images employing hard and soft constraint characterization and decomposing techniques
US6084592A (en) Interactive construction of 3D models from panoramic images
WO1997036147A1 (en) Methods and apparatus for using image data to determine camera location and orientation
US8509522B2 (en) Camera translation using rotation from device
TW565736B (en) Method for determining the optical parameters of a camera
JP3277105B2 (en) Method and apparatus for creating partial solid model
Rawlinson Design and implementation of a spatially enabled panoramic virtual reality prototype
Huang et al. Rotating line cameras: model and calibration
Ahmadabadian Photogrammetric multi-view stereo and imaging network design
Negahdaripour et al. Integrated system for robust 6-dof positioning utilizing new closed-form visual motion estimation methods in planar terrains
Stylianidis et al. Measurements: Introduction to Photogrammetry
Perfant et al. Scene registration in aerial image analysis
Huang et al. Calibration of line-based panoramic cameras
Erdnüß A review of the one-parameter division undistortion model
Scheibe Design and test of algorithms for the evaluation of modern sensors in close-range photogrammetry
Mikhail An introduction to photogrammetry
Scheibe et al. Pose Estimation of Rotating Sensors in the Context of Accurate 3D Scene Modeling.
Srestasathiern Line Based Estimation of Object Space Geometry and Camera Motion

Legal Events

Date Code Title Description
AS Assignment

Owner name: SYNTHONICS INCORPORATED, CALIFORNIA

Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNOR:PALM, CHARLES S.;REEL/FRAME:007488/0551

Effective date: 19950506

STCF Information on status: patent grant

Free format text: PATENTED CASE

FPAY Fee payment

Year of fee payment: 4

AS Assignment

Owner name: PATENT PORTFOLIOS CONSULTING, INC., FLORIDA

Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNOR:SYNTHONICS TECHNOLOGIES, INC.;REEL/FRAME:013746/0833

Effective date: 20021114

AS Assignment

Owner name: DIVERSIFIED PATENT INVESTMENTS, LLC., FLORIDA

Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNOR:PATENT PORTFOLIOS CONSULTING, INC.;REEL/FRAME:014154/0597

Effective date: 20030507

FPAY Fee payment

Year of fee payment: 8

AS Assignment

Owner name: JADE PIXEL, LLC, DELAWARE

Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNOR:DIVERSIFIED PATENT INVESTMENTS, LLC;REEL/FRAME:019224/0406

Effective date: 20060731

AS Assignment

Owner name: PATENT PORTFOLIOS CONSULTING, INC., FLORIDA

Free format text: CORRECTIVE ASSIGNMENT TO CORRECT THE NAME OF THE ASSIGNOR FOR A TRANSCRIPTIONAL ERROR PREVIOUSLY RECORDED ON REEL 013746 FRAME 0833;ASSIGNOR:SYNTHONICS INCORPORATED;REEL/FRAME:019477/0119

Effective date: 20021114

FEPP Fee payment procedure

Free format text: PAT HOLDER NO LONGER CLAIMS SMALL ENTITY STATUS, ENTITY STATUS SET TO UNDISCOUNTED (ORIGINAL EVENT CODE: STOL); ENTITY STATUS OF PATENT OWNER: LARGE ENTITY

FEPP Fee payment procedure

Free format text: PAYER NUMBER DE-ASSIGNED (ORIGINAL EVENT CODE: RMPN); ENTITY STATUS OF PATENT OWNER: LARGE ENTITY

FPAY Fee payment

Year of fee payment: 12