US5889550A - Camera tracking system - Google Patents

Camera tracking system

Info

Publication number
US5889550A
Authority
US
United States
Prior art keywords
markers
camera
target assembly
recording
viewer
Prior art date
Legal status
Expired - Lifetime
Application number
US08/661,201
Inventor
Mark C. Reynolds
Current Assignee
Adaptive Optics Associates Inc
Original Assignee
Adaptive Optics Associates Inc
Priority date
Filing date
Publication date
Application filed by Adaptive Optics Associates Inc
Priority to US08/661,201
Assigned to ADAPTIVE OPTICS ASSOCIATES, INC. ASSIGNMENT OF ASSIGNORS INTEREST (SEE DOCUMENT FOR DETAILS). Assignors: REYNOLDS, MARK C.
Application granted
Publication of US5889550A
Assigned to PNC BANK, NATIONAL ASSOCIATION. SECURITY AGREEMENT. Assignors: ADAPTIVE OPTICS ASSOCIATES, INC.; METROLOGIC INSTRUMENTS, INC.
Assigned to METROLOGIC INSTRUMENTS, INC. RELEASE OF SECURITY INTEREST. Assignors: PNC BANK, NATIONAL ASSOCIATION
Anticipated expiration


Classifications

    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N5/00Details of television systems
    • H04N5/222Studio circuitry; Studio devices; Studio equipment
    • H04N5/2224Studio circuitry; Studio devices; Studio equipment related to virtual studio applications
    • GPHYSICS
    • G01MEASURING; TESTING
    • G01SRADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S5/00Position-fixing by co-ordinating two or more direction or position line determinations; Position-fixing by co-ordinating two or more distance determinations
    • G01S5/16Position-fixing by co-ordinating two or more direction or position line determinations; Position-fixing by co-ordinating two or more distance determinations using electromagnetic waves other than radio waves

Definitions

  • the reconstructions are scored relative to each of the 5 desired marker positions.
  • the initial score is computed based on the distance between each putative position and the known positions of the markers in the previous three frames (during the first four frames the absolute positions of the calibration are used).
  • a scoreboard algorithm with weights is used so that once a 2D data point from one of the two viewer cameras is considered to be a likely candidate for a particular marker, it becomes a less likely candidate for other markers.
  • the ideal reconstruction will have at most one value chosen from each row and each column of the scoreboard;
  • step four gives the first pass assignments of the new 3D locations of the markers. Using this it is possible to compute tentative values for the velocity and acceleration of each marker, since its positions (and velocities and accelerations) in the previous three frames are saved. Once this has been done, the scoreboarding of step 4 is repeated, this time using an aggregate nine dimensional distance as the input score (3 components of position, 3 of velocity and 3 of acceleration). Restriction 2 ensures that, during the initial frames of tracking when the track history is not stable, large velocities and/or accelerations do not cause instability in the track algorithm. In mathematical terms this amounts to bounding the contribution of transients until the steady state regime is entered.
  • Tracks are assigned four grades: "high quality”, “low quality”, “no quality” or “track lost”.
  • a high quality track is one in which all markers were deemed to be successfully reconstructed.
  • a low quality track is one in which at least three markers, but fewer than all five markers, were deemed to be successfully reconstructed.
  • a "no quality” track is one in which some reconstructions took place and no error in the algorithm occurred, but the new marker positions are sufficiently few, or sufficiently far from the old ones that the output is not trusted. "Track lost” implies that either there was total data loss (restriction (1) above was violated), or an error occurred.
  • if the track from step (5) for one particular camera pair was not of "high quality" or "low quality", then steps (2) through (5) are repeated for all other camera pairs until either one of these quality factors is achieved, or until all camera pairs are exhausted.
  • the current coordinates of the known markers can be used to infer the location of the 123 plane (the film plane).
  • the normal vector is then simply the cross product of the 1→2 and the 1→3 vectors.
  • the "pitch" and "yaw" Euler angles can be computed from the normal using the procedure of [1], pp. 217-218.
  • the "roll" angle is obtained by coordinate transforming the 123 plane to be parallel to the z axis and then computing the angle between the base 1→2 vector (which is just the x axis) and the current 1→2 vector. This angle is just the inverse cosine of the dot product of those two vectors normalized to unity. (A code sketch of this computation appears after this list.)
  • the rotation matrix or the quaternion representation can then be derived.
  • the output of the present camera tracking system is a realtime representation of the film plane of the recording camera 24.
  • This realtime information may be delivered in a variety of formats to suit the end-user animation equipment to which the system is connected.
  • This end-user equipment may be a digital media or digital studio software system, such as the GIGTIME™ model animation system from ELECTROGIG, Herengracht 214, 1016 BS Amsterdam, The Netherlands.
  • the information would be delivered as either (a) the 3D coordinates of markers 34, 35, and 36, which define the plane of the right triangle 44 (FIG. 4) which is parallel to the film plane, or (b) as the Euler angles corresponding to the unit normal vector of this plane (the marker 37 (FIG. 4)).
  • This data would be presented as 3 floating point numbers (e.g. IEEE floating point format) at the desired input rate of the end-user equipment.
  • This frame rate would usually be either 24 fps (standard film rate) or the actual sample frame rate of the viewer camera.
  • the camera tracking system of the present invention tracks the location and orientation of the image plane of either a cinematographic or a video camera. Whichever type of recording camera is used, the present invention provides a 3D coordinate data set identifying the spatial location of the recording camera image plane in each film frame or video frame.
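
By way of illustration, the film plane computation described in the bullets above might be sketched as follows in Python with NumPy. This is a minimal sketch: the pitch/yaw convention shown is one common choice (the patent follows [1] for its exact procedure), and the intermediate coordinate transform described for the roll computation is omitted for brevity.

    import numpy as np

    def unit(v):
        return v / np.linalg.norm(v)

    def film_plane_normal(m1, m2, m3):
        # normal vector: cross product of the 1->2 and 1->3 vectors,
        # where m1, m2, m3 are the reconstructed 3D positions of the
        # three base markers (34, 35, 36)
        return unit(np.cross(m2 - m1, m3 - m1))

    def pitch_yaw(n):
        # one common convention for extracting pitch and yaw from a
        # unit normal; the patent's exact procedure follows [1]
        yaw = np.arctan2(n[0], n[2])
        pitch = np.arcsin(np.clip(n[1], -1.0, 1.0))
        return pitch, yaw

    def roll(base_12, current_12):
        # inverse cosine of the dot product of the base and current
        # 1->2 vectors, normalized to unity
        return np.arccos(np.clip(np.dot(unit(base_12), unit(current_12)), -1.0, 1.0))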

Abstract

A camera tracking system determines the three dimensional (3D) location and orientation of the film plane of a camera providing live recording of a subject, thereby defining a 3D coordinate system of the live action scene into which animated objects or characters may be automatically mapped with proper scale and 3D visual perspective by a computer animation system.

Description

TECHNICAL FIELD
This invention relates to the field of cinematography, and more particularly to a system for identifying the spatial position of the film plane of a camera relative to its filmed subject.
BACKGROUND ART
Conventional film making technology has been revolutionized in recent years by advances in animation and in computer generated images. One of the first processes developed to combine live action with artificial environments was the use of a "blue screen". With this process a live performance is filmed against a blue background, and the background is then photographically replaced with art, inserts or animated characters which then become integrated with the filmed live action.
The animated motion picture "Who Framed Roger Rabbit" was produced with a variety of animation processes, including the blue screen process, and is considered a film animation landmark because of its realism. As may be known, the difficulty of integrating live actors into a blue screen artificial environment, or of integrating animated characters into live action footage, is the matching of the three dimensional perspective of the live and artificial action to each other. Human visual acuity is sufficiently precise to detect millimeter offsets in the relative scale, positioning, and dimensional perspective of the live actor to the animated character. These relative characteristics must be accurately matched to obtain realism and present the viewer with a seamless view of the composite image. If the human actor is to be seen shaking hands with the animated character, then precise relative location of that character is essential to prevent having the two figures overlap or miss one another.
In "Roger Rabbit" precision matching was achieved by building a composite image, frame by frame, from layers of filmed live action and computer generated images to visually present the animated characters in proper three dimensional object with their companion human actor. If there were only a single frame of action in which an interchange occurred the composition process could be done manually. However there is typically several minutes, or at least seconds, of exchange between the human and animated actors which, at a viewing speed of 24 frames per second (fps) equates to dozens or even hundreds of frames of film.
Since extreme precision in composition is required for realism, and since "Roger Rabbit" included several minutes of filmed action in which the human actors and animated characters are moving, the composite imagery required an extensive amount of manual labor, and cost. At present the only reliable way of doing this is to use a large amount of labor to do composition manually. There exists, therefore, a need for both method and apparatus which can identify the location of the filmed live elements within the context of a three dimensional setting so as to allow automatic determination of the scale and position of the animated elements to be composed within the live scene. Ideally, this must be accomplished in a noninvasive manner, so as not to introduce unwanted elements into the film environment which must later be removed.
DISCLOSURE OF INVENTION
An object of the present invention is to provide method and apparatus for determining the three dimensional spatial coordinates of the film plane of a camera providing visual recording of subjects on a three dimensional film stage. Another object of the present invention is to provide apparatus for identifying the three dimensional object of subjects being visually recorded by a camera on a three dimensional film stage.
According to the present invention, a camera tracking system includes a motion capture system for recording two dimensional ("2D") projections of the recording camera's location in three dimensional ("3D") space using a plurality of viewer cameras positioned at different sightlines around the recording camera, which track the 3D position of a patterned geometric target mounted to the recording camera, the 2D projections being recorded in successive sample intervals, beginning from a calibrated reference position of the camera, the sample intervals having a periodicity at least equal to the recording intervals of the recording camera so as to associate each recording interval with at least one set of 2D projection data, the calibrated reference position of the camera being established with a calibration procedure performed prior to the recording session to provide a baseline position in relation to which all subsequent camera motion is measured, the tracking system thereafter analyzing each sample interval's 2D projections from each viewer camera to calculate a result set of 3D coordinate data for each recording interval (e.g. film frame), whereby the 3D visual perspective of the subjects being recorded may be readily determined relative to the calibrated reference location.
In further accord with the present invention, the system converts each set of 2D data to a result set of 3D coordinate data through a photogrammetry mathematical process. In still further accord with the present invention, the patterned geometric target includes a plurality of light reflective markers, each assembled at known coordinate locations within the geometric structure of the target assembly, the coordinate positions of each marker being used by the system as the object data set in the mathematical transformation of the 2D projections to the result set of 3D coordinates.
The prior art animation process was to iterate the animated character's scale and position to make it fit the perspective and scale of the human actor. This had to be done frame by frame. The present invention still requires that the initial frame associated with the calibrated reference position be scaled by manual composition to make the seamless fit; thereafter, however, the relative change in the recording camera's 3D orientation detected by the present camera tracking system allows the perspective and scale of the animated character in subsequent frames to be adjusted automatically, as appropriate relative to the initial insert frame, thereby saving substantial animation time and cost. Alternatively stated, the recording camera's 3D location and orientation defines a 3D coordinate system. If the three dimensional location of the camera is known at all times then it becomes possible to insert artificial static or dynamic elements into the live action scene by mapping them into this coordinate system. This is an operation which can now be done by the computer animation system itself. The first step in this process is to precisely locate the film camera(s) themselves.
The present system provides accurate 3D tracking of the recording camera in a non-invasive manner. It does not introduce visual artifacts in filming, nor does it impose any significant constraints on the placement, orientation or operation of the recording camera. In particular, handheld cameras must be accommodated. The output of the tracking system must provide sufficient information to uniquely locate the camera in three dimensional space. In addition, it is highly desirable that the tracking produce its results in "real time". In this case this means that the track output data can be used to drive a display showing some representation of the composite live action and artificial elements.
These and other objects, features, and advantages of the present invention will become more apparent in light of the following detailed description of a best mode embodiment thereof, as illustrated in the accompanying Drawing.
BRIEF DESCRIPTION OF DRAWINGS
FIG. 1 is a figurative illustration of the apparatus of the present invention being used in a film stage setting;
FIG. 2 is an isometric illustration of one element of the apparatus of FIG. 1;
FIG. 3 is a plan view taken along the line 2--2 of FIG. 2;
FIG. 4 is an abstract illustration of the element of FIGS. 2, 3, which is used in the description of the invention;
FIG. 5 is an illustration of a functional feature of the element illustrated in FIGS. 2, 3; and
FIG. 6 is a schematic illustration of another element of the apparatus of FIG. 1.
BEST MODE FOR CARRYING OUT THE INVENTION
FIG. 1 illustratively depicts a film stage 20 in which a human actor 22 is filmed in front of a blue screen 23 with a known model type camera 24 by an operator 26. As the actor 22 (the "filmed object") moves around the set the operator 26 continuously repositions the camera 24 to keep the filmed object within the field of view of the lens 28. Although the present invention may be used with any type of camera, such as a track mounted or boom mounted model, the best mode embodiment is described in terms of a portable camera, which has an unrestricted range of motion and, therefore, represents the greatest test of the invention.
As described in detail hereinafter, the camera tracking system of the present invention captures the three dimensional ("3D") motion of the camera 24 in successive two dimensional ("2D") projections, and converts the 2D projection coordinates into a 3D coordinate value set for the target assembly. The 2D projections are recorded with a time periodicity which is not less than the nominal 24 fps film viewing speed, such that each frame of film has at least one associated 3D coordinate value defining its position and 3D perspective within the recorded field of view. This allows artificial elements to be inserted into the recorded scene with proper 3D perspective and scale by mapping them into this coordinate system through a general mathematical process known as photogrammetry.
The major elements of the present invention include: (i) a target assembly mounted to the camera 24, (ii) a motion capture system, having a plurality of viewer cameras, which record the real time 3D motion of the target assembly as 2D projections, (iii) a calibration procedure which defines the relative spatial positioning of the motion capture system to the target assembly, and (iv) tracking software which converts the recorded 2D projections into real time 3D spatial coordinates of the target assembly. The 3D coordinate information provided by the tracking algorithm is descriptive of the camera's position and orientation, and may be presented in various formats, including: (1) the normal vector to the film plane of the camera; or (2) the Euler angles (the angular coordinates ∠α, ∠β, and ∠δ corresponding to the pitch, yaw, roll) of the film plane of the camera with respect to a reference (calibrated) position; or (3) mathematical representation(s) of the transformation between the initial (calibrated) position/orientation of the camera and its current position, such as the rotation matrix or the "rotator" quaternion.
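By way of illustration, conversions between the format (3) and format (2) representations might be sketched as follows; the Z-Y-X angle convention and the trace-based quaternion formula used here are common choices and are assumptions of this sketch, not necessarily those of the patent.

    import numpy as np

    def euler_zyx(R):
        # yaw, pitch, roll from a 3x3 rotation matrix (Z-Y-X convention)
        yaw = np.arctan2(R[1, 0], R[0, 0])
        pitch = np.arcsin(np.clip(-R[2, 0], -1.0, 1.0))
        roll = np.arctan2(R[2, 1], R[2, 2])
        return yaw, pitch, roll

    def quaternion(R):
        # "rotator" quaternion (w, x, y, z); assumes trace(R) > -1
        w = np.sqrt(1.0 + np.trace(R)) / 2.0
        return np.array([w,
                         (R[2, 1] - R[1, 2]) / (4.0 * w),
                         (R[0, 2] - R[2, 0]) / (4.0 * w),
                         (R[1, 0] - R[0, 1]) / (4.0 * w)])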
Target Assembly
In FIG. 1, the camera 24 includes a camera mounted target assembly 30, which is shown in detail in FIGS. 2 and 3. In the best mode embodiment the assembly 30 is a substantially precise 3D geometric shape comprising an open grid structure of support struts 32 which fixedly position a plurality of light reflective markers, such as the markers 34-38, at specific locations within the absolute coordinate system defined by the assembly's structure. The markers are generally spherical shapes, which are covered with a known type light reflective material, such as SCOTCHLITE™ 8710 reflective transfer film manufactured by the 3M Company. However, any other reflective material having similar reflective brilliance characteristics may be used, as known to those skilled in the art.
The target's geometric shape is defined by the number and coordinate position of the reflective markers. The minimum number of markers required is three, since three points are required to define the 3D position of the camera film plane. However, since camera position may cause one marker to occlude another from the sightline of the motion capture system, four or more markers are preferable. In the best mode embodiment, five markers 34-38 are chosen as an optimal balance between tracking accuracy and computational complexity.
Once the number of markers is selected the marker coordinates are selected with consideration given to the following.
1. The marker placements within the target structure are chosen such that if four or more marker locations are known in 3D space, then errors in the detected position or identity of the markers can be detected and corrected.
2. To provide computational simplicity the marker coordinate values are chosen such that if written in homogeneous coordinate format any four vectors may be assembled into an invertible 4×4 matrix.
3. At least three markers should be located in a plane parallel to the film plane.
In the best mode embodiment, the five markers 34-38 shown in FIGS. 2, 3 have the homogeneous coordinates listed in Table I below. The coordinate values may be best understood by reference to the three axis coordinate system illustration 40 in FIG. 2, where it is assumed that marker 34 is positioned at the coordinate origin 42.
              TABLE I
______________________________________
Marker No.   x      y          z    w
______________________________________
34           0      0          0    1
35           1      0          0    1
36           1      2          0    1
37           0.5    1          3    1
38           0      2          3    1
______________________________________
Referring simultaneously to FIGS. 4, 5, the target assembly 30 is adapted to be mounted to a visible portion of the camera 24. Typically this is the top surface of the case, away from the camera controls. For a given camera model, the mounted target position is then calibrated to the position of the camera's film plane. Referring to FIG. 4, an abstract illustration of the target assembly 30, the vector between markers 34, 35 defines the x axis and the vector between markers 35, 36 defines the y axis of the target's absolute coordinate space. The three markers 34-36 in combination substantially define a right triangle 44 which, with proper target mounting, lies parallel to the film plane. The fourth marker 37 defines the z axis of the absolute coordinate space, extending from the hypotenuse 46 of the triangle 44 in a substantially normal direction 47 to the film plane. The fifth marker 38 also projects along the z axis, from the plane of the right triangle 44.
In the target's grid structure, the metal struts 32 are preferably covered with a black anodized material to minimize stray light reflection. It is not necessary that the grid structure be absolutely rigid along all possible deformation axes. The kinematic motion imparted to the target as the camera moves may affect the accuracy of the system's 3D reconstruction, but this is mitigated both by the relative rigidity of the target's base (the right triangle 44 formed by markers 34-36) and by the demagnification effect of the system; i.e., since the motion capture system cameras are located several feet from the target, small flexure deformations in marker position do not give rise to appreciable error.
One very important point about the target is the question of scale size. In the best mode embodiment of the target assembly 30 the x axis dimension, i.e., the length of the vector from marker 34 to marker 35, was chosen to be 3.75 inches. The markers 34-38, which are substantially spheres, have a nominal diameter of one inch. These choices were dictated by the size discrimination algorithm (described below), the desired accuracy of the system (determined by the viewer resolution at a typical distance to the camera 24) and the relative size of the target assembly 30 and the camera 24.
The present camera tracking system software incorporates the absolute coordinates of the target assembly markers 34-38 in a single data structure, which is preferably a 4×5 matrix. If the marker coordinates of a given target assembly deviate from the nominal, they can be recalibrated and the new marker coordinate data inserted into the software in such a way that the software function is not affected. If, for example, the coordinates of marker 36 were determined by measurement to be (0.97, 2.02, 0.01) rather than the specified (1, 2, 0), then this fact is accommodated by replacing row three of that matrix by the entry [0.97 2.02 0.01 1.00].
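By way of illustration, the marker data structure and the row-replacement recalibration just described might be sketched as follows in Python with NumPy; the names are illustrative, with one row per marker as in Table I.

    import numpy as np

    # Nominal homogeneous marker coordinates from Table I, one row per
    # marker (34, 35, 36, 37, 38); the patent calls this structure its
    # 4 x 5 marker matrix.
    target = np.array([
        [0.0, 0.0, 0.0, 1.0],  # marker 34 (coordinate origin)
        [1.0, 0.0, 0.0, 1.0],  # marker 35
        [1.0, 2.0, 0.0, 1.0],  # marker 36
        [0.5, 1.0, 3.0, 1.0],  # marker 37
        [0.0, 2.0, 3.0, 1.0],  # marker 38
    ])

    def recalibrate(target, row, measured_xyz):
        # replace one marker's nominal (x, y, z) with measured values,
        # leaving the homogeneous w = 1 entry intact
        updated = target.copy()
        updated[row, :3] = measured_xyz
        return updated

    # marker 36 measured at (0.97, 2.02, 0.01) instead of (1, 2, 0):
    target = recalibrate(target, 2, [0.97, 2.02, 0.01])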
Motion Capture System
The present invention uses a known type motion capture system, such as the MULTI-TRAX™ motion capture system manufactured by Adaptive Optics Associates, Inc., Cambridge, Mass. The MULTI-TRAX™ system provides real time 2D and 3D multi-object tracking using up to seven video cameras (referred to as "viewer cameras") which are positioned around the target. Each camera provides pulsed infrared LED illumination of the target's passive reflective markers 34-38 and records 2D plane geometric projections of the reflected light patterns. The camera's video processor organizes the camera pixel data and forwards it to the system's main signal processor, which determines the X1,Y1 coordinates of each recorded 2D marker projection and calculates a result set of 3D coordinates for each marker. Each 3D result set has three unknowns (x, y, z), requiring at least two 2D projections (4 knowns and 3 unknowns) and, therefore, at least two viewer cameras.
In the present invention, however, two viewer cameras are insufficient to ensure an unoccluded view of the markers; at least three viewers are required. In addition, obstructions on the film set 20 (furniture, position of actors, etc.) may block one viewer camera's sightline to the target, and the possibility of target positional ambiguity (the inability of a viewer camera to discriminate between the positions of different markers in certain target orientations) makes four viewers preferable. Of course the complexity of a given film set scenery may require additional viewers, up to the system maximum of seven.
In the best mode embodiment of the present invention four viewer cameras are used. As shown illustratively in FIG. 1 and in schematic block diagram form in FIG. 6, four viewer cameras 48-51 are placed in a surround pattern around the film stage 20 and positioned, in terms of location and height, to achieve maximum visibility of the target assembly 30. The target assembly is mounted to the camera case in a location most visible to the viewer cameras. Typically this is the top surface of the case, away from the camera controls. For a given camera model, the mounted target position is then calibrated to the position of the camera's film plane.
As described briefly hereinbefore, the MULTI-TRAX™ system viewer cameras 48-51 are pulsed infrared optical cameras capable of providing real time 3D tracking of light reflective targets at indoor distances up to 30 meters and at sample rates of 50-350 Hz. The camera's infrared strobe provides a light intensity of up to 15 watts/m² at one meter. In the best mode embodiment the viewer cameras provide a 60 Hz sample rate, which is capable of processing 2D projections for up to 40 markers. The 2D projection data from the viewer cameras 48-51 is provided through an associated one of buss lines 52-55 to a related one of a like plurality of video processors 56-59.
The video processors 56-59 are serially interconnected through each processor's RS-422 communication port by a cable buss 60 to the camera tracking system's main signal processor 62. The maximum baud rate for the transfer of data between the video processors and the main signal processor is 500,000. The signal processor, which may be a commercially available, DOS based PC with an INTEL® PENTIUM® microprocessor, includes a standard full keyset keyboard and a standard SVGA monitor with a 256 color video driver. The system signal processor 62 may be connected directly to the end user animation system 63, which may be any one of a number of computer based animation systems, as known to those skilled in the art.
The system processor 62 and video processors 56-59, in a tracking system having a five marker target assembly and four viewer cameras, must be capable of analyzing all possible combinations of the five sets (for 5 markers) of 2D coordinates from each of the four viewer cameras, or 5^4 = 625 possible 3D film frame coordinate solution sets. Typically, however, as described in detail hereinafter with respect to the system's Tracking Algorithm, far fewer than the maximum possible 625 solutions are reviewed before a "best case" value set is chosen as the 3D result. Ideally the 3D solution set is provided at the rate of 90 fps, which is 150% of the typical recording film speed of 60 fps, and over 360% of the normal viewing speed of 24 fps.
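The 625 figure follows from choosing one of the five reported 2D marker points from each of the four viewer cameras; a short illustrative check of this count:

    from itertools import product

    # one candidate 3D solution set picks one of the five reported 2D
    # marker coordinates from each of the four viewer cameras
    candidates = list(product(range(5), repeat=4))
    print(len(candidates))  # 625, i.e. 5**4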
The Calibration Procedure
The camera tracking system of the present invention does not calculate the actual spatial position of the film plane of the camera 24 within the overall coordinate system of the film stage 20. Instead, it tracks the change in film plane position relative to an initial calibration coordinate reference obtained during the calibration process. The actual value of this calibration reference is the 3D coordinate value of marker 34 as viewed by each of the viewer cameras during the calibration process. This actual 3D value is assigned a relative value of x=0, y=0, z=0 from which all 3D solution sets are measured until the next calibration is performed. Typically, calibration is performed prior to filming any new scene which alters the film stage setting.
The calibration procedure establishes this reference coordinate for each viewer camera in terms of how that viewer camera "sees" the target. The actual location of a viewer camera on the stage 20 (the "spot location") is immaterial. What is important to the system are the differences in viewer camera locations, which will cause each viewer to have a different view of the target and a corresponding different absolute calibration coordinate solution set for marker 34. Whatever that actual value, it becomes the 0,0,0 calibration reference for that viewer. Each viewer's individual view is established by the "Calibration Projection Matrix" created for each viewer on the basis of (i) the Observed Data or "Obs", which is the 2D visual data received and formatted by each viewer/video processor pair, and (ii) the Object Set, which is the known coordinate positions of the target markers 34-38 on the target assembly. All subsequent data sets then provide data relative to the calibration data set, which is stored in the system computer.
The calibration process uses the target assembly as the calibration target. The target assembly is maintained in a steady position, and each viewer camera is adjusted as necessary to see at least four of the five target markers 34-38. Each video processor computes the 2D data from the pixel data provided by its associated viewer camera. The 2D data is correlated to each of the viewed markers by having the operator block each marker, one at a time, to make the association. The system processor 62 then calculates each of the viewer cameras' 2D projection matrices.
To demonstrate the operation of the calibration process, assume that the four markers seen by viewer camera 48 are 34, 35, 37, and 38. The viewer 48 and its video processor 56 produce four two dimensional vectors D1, D2, D3 and D4, which can be considered three dimensional coordinates with z=0. The 3D coordinates are then placed in homogeneous coordinates to provide a 4×4 matrix of Observed Data (Obs):
______________________________________
        | D1x    D1y    0   1 |
        | D2x    D2y    0   1 |
        | D3x    D3y    0   1 |
        | D4x    D4y    0   1 |
______________________________________
In homogeneous coordinates a third coordinate (W) is added to every coordinate pair (x, y) in a 2D system to become (x, y, W), or to every triple (x, y, z) in a 3D system to become (x, y, z, W). If W is nonzero we can "homogenize" the 3D (or 2D) point by dividing through by W, such that (x, y, z, W)=(x/W, y/W, z/W, 1), which is represented generally by the form (tx, ty, tz, tW), where: t=1/W
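By way of example, a minimal helper implementing the division by W just described:

    def homogenize(p):
        # (x, y, z, W) -> (x/W, y/W, z/W, 1); W must be nonzero
        x, y, z, w = p
        return (x / w, y / w, z / w, 1.0)

    print(homogenize((2.0, 4.0, 6.0, 2.0)))  # (1.0, 2.0, 3.0, 1.0)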
The Observed Data values in the above 4×4 matrix are obtained by multiplying the absolute coordinate values of the Object Set by the projection matrix (T). This can be expressed by the matrix equation as:
T×Object Set=Observed Data
The Object Set is known since the absolute coordinates of markers 34, 35, 37, and 38 within the target assembly are known. Therefore, in this example Object Set is equal to:
______________________________________
        | 0      0      0   1 |
        | 1      0      0   1 |
        | 0.5    1      3   1 |
        | 0      2      3   1 |
______________________________________
As described hereinbefore with respect to the target assembly, the Object Set matrix is invertible. If the inverse of the Object Set is denoted as I*Object Set, then the projection matrix T for viewer camera 48 is easily computed as
T=Obs×I*Object Set
A similar computation must be performed for every viewer camera. It is not necessary that each camera see the same set of markers. The output of the calibration procedure will be a projection matrix for each viewer camera. The projection matrices will all have the form:
______________________________________
        | x    x    0    0 |
        | x    x    0    0 |
        | x    x    0    0 |
        | x    x    0    1 |
______________________________________
where "x" denotes a typically nonzero entry.
This form of calibration has several distinct features. The primary feature is the fact that the system's target assembly itself defines the world coordinate system. Calibration serves to fix the cameras in the target's coordinate system, by means of defining each camera's Calibration Projection Matrix, rather than determining the cameras' coordinates themselves. Second, the absolute scale of the target is not used in this approach, because each projection matrix views the 34→35 vector as having length one, regardless of its actual length. The scaling between physical size (the actual length of the 34→35 vector) and the target world coordinate system is contained within the projection matrices. Third, any target can be used so long as it satisfies the target constraints as defined above (and provided that the target 4×5 definition matrix has been loaded into the calibration software).
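By way of illustration, the calibration computation for viewer camera 48 might be sketched as follows; the observed 2D values are illustrative placeholders, and the matrix layout (one marker per row) follows the Obs and Object Set matrices shown above.

    import numpy as np

    # Object Set: homogeneous coordinates of markers 34, 35, 37, 38
    object_set = np.array([
        [0.0, 0.0, 0.0, 1.0],
        [1.0, 0.0, 0.0, 1.0],
        [0.5, 1.0, 3.0, 1.0],
        [0.0, 2.0, 3.0, 1.0],
    ])

    # Obs: rows [Dix, Diy, 0, 1] reported by viewer 48's video
    # processor (the numeric values here are illustrative only)
    obs = np.array([
        [0.10, 0.20, 0.0, 1.0],
        [0.35, 0.22, 0.0, 1.0],
        [0.24, 0.55, 0.0, 1.0],
        [0.12, 0.60, 0.0, 1.0],
    ])

    # T x Object Set = Observed Data  =>  T = Obs x I*Object Set
    T = obs @ np.linalg.inv(object_set)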
Tracking Algorithm
The calibration procedure takes place with the target (and the motion picture camera to which it is mounted) stationary, and serves to fix the coordinate system. Once the user has designated that tracking is to begin, the track algorithm is entered. The track algorithm takes the reported two dimensional coordinate data from each of the viewer cameras and attempts to reconstruct the three dimensional positions of a subset of the five markers sufficient to calculate one or more of the film plane representations.
The track algorithm has to contend with a number of factors, including:
(a) the order of marker data presented by each viewer camera will be raster order, and not necessarily marker order (as was the case with calibration);
(b) any given viewer camera may not see a given marker because that marker is obscured. In particular, some viewer cameras may present no data at all for several frames, because, for example, one of the human actors is interposed between the viewer camera and the target;
(c) any given viewer camera may see two (or more) markers as a single marker, because the 2D projections of those several markers are the same;
(d) any given viewer camera may see a single marker as two (or more) markers. This can arise, for example, because one of the support struts of the target is interposed between the viewer and the marker, so that the viewer sees half of the marker, an obscured area, and then the other half of the marker, with the two halves appearing as two separate markers;
(e) there may be spurious markers recorded as a result of reflections, or there may be other data errors introduced at any point in the data acquisition chain up to the time at which the track software receives it.
It is also required that the tracking software not be so computationally intensive that it cannot keep up with the incoming data, at least to the extent that it can provide output 3D positional data at the film rate of 24 Hz (i.e., 24 fps). It is also extremely desirable that the tracker be robust, to the extent that it can (a) provide a quality indicator as to the calculated likelihood that the computed track is correct, and (b) in cases when the computed track is suspected or known to be incorrect (such as total loss of data), recover from such situations as quickly as practicable (i.e., as soon as data fidelity increases).
The camera tracking system of the present invention imposes the following restrictions in an attempt to meet these conditions:
(a) at least two viewer cameras must see at least three markers each at all times; and
(b) during the track acquisition period (the period of twenty-four frames after track is initiated), the velocity and acceleration of each marker must be bounded by an adjustable constant factor of the total displacement of that marker from its initial position.
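A rough sketch of how restriction (b) might be checked during track acquisition; the finite-difference derivative estimates and the default factor of 0.5 are assumptions for illustration, not values taken from the patent:

import numpy as np

def within_acquisition_bounds(p, p_prev, p_prev2, p_initial, factor=0.5):
    # Velocity and acceleration estimates (first and second differences
    # over the saved positions) must stay within an adjustable constant
    # factor of the marker's total displacement from its initial position.
    displacement = np.linalg.norm(p - p_initial)
    velocity = np.linalg.norm(p - p_prev)
    acceleration = np.linalg.norm(p - 2.0 * p_prev + p_prev2)
    bound = factor * displacement
    return velocity <= bound and acceleration <= bound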
The tracker algorithm operates in five basic steps:
(1) the 2D data for each viewer camera is compared against maximum and minimum size constraints (which may also be disabled individually or jointly). These constraints serve to eliminate certain obviously spurious data items, and also to reduce the number of split markers in the data stream. Although this can result in good data being discarded, the overall redundancy of the system is sufficient that this has not proven to be a problem;
(2) the 2 dimensional data for all viewer cameras is scanned and, based on how many markers each sees, the viewer cameras are ordered from "best" to "worst";
(3) using the projection matrices for the two "best" viewers and the marker data for those viewers, all possible 3D reconstructed positions are computed using a mathematical process known as photogrammetry. With this process it is possible, with two or more viewer cameras, to reconstruct the bundle of rays that produced the 2D image in each different camera. By calculating the 3D intersection point of rays from the different 2D images, a 3D model of the object that formed the images can be constructed. This is sometimes referred to as the "photogrammetric problem" (a simplified sketch of the ray-intersection idea follows this list). The 3D photogrammetric reconstruction may be performed by either of two basic prior art methods, known as the "direct linear transform" (DLT) and "the collinearity equations". The use of each method to solve the photogrammetric problem is described in detail in a Diploma Thesis entitled "Different Calibration Methods for a CCD Camera Based 3D Motion Analysis System--Implementation, Testing and Comparison", presented by Magnus Sjoberg at the Chalmers University of Technology, Goteborg, Sweden, May 1993, which is incorporated by reference herein in its entirety.
If the first camera saw K markers and the second saw J markers, there would be K×J possible reconstructions. For two viewer cameras the maximum number of possible reconstructed positions is therefore 5² = 25 3D result sets, instead of the maximum possible 625. Restriction (a) above ensures that this product gives a number of possible values large enough to contain the actual marker locations in almost all cases;
(4) the reconstructions are scored relative to each of the five desired marker positions. The initial score is computed based on the distance between each putative position and the known positions of the markers in the previous three frames (during the first four frames the absolute positions from calibration are used). A scoreboard algorithm with weights is used, so that once a 2D data point from one of the two viewer cameras is considered to be a likely candidate for a particular marker, it becomes a less likely candidate for the other markers. In general, for the K by J situation noted above, the ideal reconstruction will have at most one value chosen from each row and each column of the scoreboard;
(5) step (4) gives the first-pass assignments of the new 3D locations of the markers. From these it is possible to compute tentative values for the velocity and acceleration of each marker, since its positions (and velocities and accelerations) in the previous three frames are saved. Once this has been done, the scoreboarding of step (4) is repeated, this time using an aggregate nine dimensional distance as the input score (three components of position, three of velocity, and three of acceleration). Restriction (b) ensures that, during the initial frames of tracking when the track history is not yet stable, large velocities and/or accelerations do not cause instability in the track algorithm. In mathematical terms this amounts to bounding the contribution of transients until the steady state regime is entered.
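The patent defers the photogrammetric mathematics to the cited thesis. As a simplified illustration of the ray-intersection idea in step (3) (not the DLT or collinearity implementation itself), a 3D point can be reconstructed as the midpoint of the shortest segment between two back-projected rays, one per viewer camera:

import numpy as np

def triangulate(o1, d1, o2, d2):
    # Each ray is given by a camera origin o and a direction d; the
    # reconstructed point is the midpoint of the shortest segment
    # between the two rays.
    d1 = d1 / np.linalg.norm(d1)
    d2 = d2 / np.linalg.norm(d2)
    w0 = o1 - o2
    a, b, c = d1 @ d1, d1 @ d2, d2 @ d2
    d, e = d1 @ w0, d2 @ w0
    denom = a * c - b * b
    if abs(denom) < 1e-12:
        return None                    # rays are (nearly) parallel
    s = (b * e - c * d) / denom        # parameter along ray 1
    t = (a * e - b * d) / denom        # parameter along ray 2
    return ((o1 + s * d1) + (o2 + t * d2)) / 2.0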
Tracks are assigned one of four grades: "high quality", "low quality", "no quality", or "track lost". A high quality track is one in which all markers were deemed to be successfully reconstructed. A low quality track is one in which at least three, but fewer than all five, markers were deemed to be successfully reconstructed. A "no quality" track is one in which some reconstructions took place and no error in the algorithm occurred, but the new marker positions are sufficiently few, or sufficiently far from the old ones, that the output is not trusted. "Track lost" implies that either there was total data loss (restriction (a) above was violated) or an error occurred.
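The grade assignment itself reduces to a simple mapping. The sketch below mirrors the definitions above, though it omits the distance-based trust test for the "no quality" case:

def grade_track(markers_reconstructed, data_present=True, error=False):
    # Map the number of successfully reconstructed markers (out of five)
    # to one of the four track grades.
    if error or not data_present:
        return "track lost"
    if markers_reconstructed == 5:
        return "high quality"
    if markers_reconstructed >= 3:
        return "low quality"
    return "no quality"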
Note that if the result of step (5) for one particular camera pair was not of "high quality" or "low quality", then steps (2) through (5) are repeated for all other camera pairs until either one of these quality grades is achieved or all camera pairs are exhausted.
Given a high or low quality track, it is possible to generate any of the stated output formats easily. The current coordinates of the known markers can be used to infer the location of the 123 plane (the film plane). The normal vector is then simply the cross product of the 1→2 and 1→3 vectors. The "pitch" and "yaw" Euler angles can be computed from the normal using the procedure of reference 1, pp. 217-218. The "roll" angle is obtained by coordinate transforming the 123 plane to be parallel to the z axis and then computing the angle between the base 1→2 vector (which is just the x axis) and the current 1→2 vector. This angle is just the inverse cosine of the dot product of those two vectors normalized to unity. The rotation matrix or the quaternion representation can then be derived.
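A simplified sketch of this output computation; it folds the roll computation into a single dot product, omits the intermediate transform that brings the 123 plane parallel to the z axis, and assumes the marker arguments are numpy arrays holding the current 3D positions of markers 1, 2, and 3:

import numpy as np

def film_plane_normal_and_roll(m1, m2, m3, base_v12):
    # Normal of the 123 plane: cross product of the 1->2 and 1->3 vectors.
    v12, v13 = m2 - m1, m3 - m1
    n = np.cross(v12, v13)
    n = n / np.linalg.norm(n)
    # Roll: inverse cosine of the dot product of the unit base 1->2
    # vector (the x axis at calibration) and the unit current 1->2 vector.
    u = v12 / np.linalg.norm(v12)
    b = base_v12 / np.linalg.norm(base_v12)
    roll = np.arccos(np.clip(u @ b, -1.0, 1.0))
    return n, roll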
The output of the present camera tracking system is a realtime representation of the film plane of the recording camera 24. This realtime information may be delivered in a variety of formats to suit the end-user animation equipment to which the system is connected. This end-user equipment may be a digital media or digital studio software system, such as the GIGTIME™ model animation system from ELECTROGIG, Herengracht 214, 1016 BS Amsterdam, The Netherlands. Typically, the information would be delivered either (a) as the 3D coordinates of markers 34, 35, and 36, which define the plane of the right triangle 44 (FIG. 4) which is parallel to the film plane, or (b) as the Euler angles corresponding to the unit normal vector of this plane (the marker 37 (FIG. 4)). This data would be presented as three floating point numbers (e.g., in IEEE floating point format) at the desired input rate of the end-user equipment. This frame rate would usually be either 24 fps (the standard film rate) or the actual sample frame rate of the viewer cameras.
The camera tracking system of the present invention tracks the location and orientation of the image plane of either a cinematographic or a video camera. Whichever type of recording camera is used, the present invention provides a 3D coordinate data set identifying the spatial location of the recording camera image plane in each film frame or video frame.
Although the invention has been shown and described with respect to a best mode embodiment thereof, it should be understood by those skilled in the art that various changes, omissions, and additions may be made to the form and detail of the disclosed embodiment without departing from the spirit and scope of the invention as recited in the following claims.

Claims (18)

I claim:
1. Apparatus for providing, to user animation equipment, the three dimensional coordinate data identifying the location and orientation of the image plane of a recording camera which is visually recording subjects on a three dimensional film stage, comprising:
a target assembly, having a plurality of light reflective markers disposed in a geometric pattern thereon, said pattern including a target plane, said target assembly being adapted for mounting to the recording camera in a manner such as to position said target plane parallel to the film plane of the recording camera;
a motion capture system, having a plurality of viewer cameras and associated video processors, for recording the spatial location and orientation of said target assembly markers in each of a plurality of successive sample intervals with each of said plurality of viewer cameras; and for providing at an output thereof said recorded location and orientation as a two dimensional data set from each of said plurality of associated video processors in each said sample interval; and
signal processing means, responsive to each said two dimensional data set from each said associated video processor in each said sample interval, for reviewing at least two of said two dimensional data sets in each said sample interval and for reconstructing therefrom a result set of three dimensional coordinate data for each said sample interval, each said three dimensional data set being indicative of the location and orientation of the film plane of the recording camera in its related sample interval.
2. The apparatus of claim 1, wherein said plurality of viewer cameras are placed in position around the recording camera, each at a different sightline to the target assembly.
3. The apparatus of claim 1, wherein said target assembly comprises:
a support structure, having a different one of a plurality of structural members thereof arrayed in each of three substantially orthogonal geometric planes, said support structure further having a mounting member adapted to mount said structure to the recording camera; and
a plurality of light reflective markers arrayed in a known pattern among said plurality of structural members, at least one each of said plurality of markers being located in a different one of said three orthogonal planes, said plurality of markers being responsive to the pulsed light from the motion capture system viewer cameras to reflect a known three dimensional pattern of light.
4. The target assembly of claim 3, wherein said plurality of structural members comprise a light absorbent outer surface.
5. The target assembly of claim 3, wherein said light reflective markers are substantially spherical in shape.
6. The target assembly of claim 3, wherein said plurality of light reflective markers comprises at least three said markers.
7. The target assembly of claim 3, wherein said plurality of light reflective markers comprises at least five said markers.
8. The target assembly of claim 3, wherein said plurality of markers each have an exterior surface of SCOTCHLITE™ 8710 reflective transfer film.
9. The apparatus of claim 7, wherein said plurality of viewer cameras are positioned around the recording camera, at different sightlines to said target assembly.
10. The apparatus of claim 9, wherein each of said plurality of viewer cameras records 2D projections of the location and orientation of said target assembly in successive sample intervals, beginning from a calibrated reference position of the camera, the sample intervals having a periodicity at least equal to the recording intervals of the recording camera so as to associate each recording interval with at least one set of 2D projection data, the calibrated reference position of the camera being established with a calibration procedure performed prior to the recording session to provide a baseline position from which all subsequent camera motion is measured.
11. The apparatus of claim 1, wherein said signal processing means provides each said three dimensional data set using a direct linear transform mathematical process.
12. The apparatus of claim 1, wherein said signal processing means provides each said three dimensional data set using the collinearity equations mathematical process.
13. A target assembly, for use in a system of the type which uses a motion capture system, having a plurality of pulsed light viewer cameras for recording the spatial location and orientation of the image plane of a recording camera recording subjects on a three dimensional film stage, comprising:
a support structure, having a different one of a plurality of structural members thereof arrayed in each of three substantially orthogonal geometric planes, said support structure further having a mounting member adapted to mount said structure to the recording camera; and
a plurality of light reflective markers arrayed in a known pattern among said plurality of structural members, at least one each of said plurality of markers being located in a different one of said three orthogonal planes, said plurality of markers being responsive to the pulsed light from the motion capture system viewer cameras to reflect a known three dimensional pattern of light.
14. The target assembly of claim 13, wherein said plurality of structural members comprise a light absorbent outer surface.
15. The target assembly of claim 13, wherein said light reflective markers are substantially spherical in shape.
16. The target assembly of claim 13, wherein said plurality of light reflective markers comprises at least three said markers.
17. The target assembly of claim 13, wherein said plurality of light reflective markers comprises at least five said markers.
18. The target assembly of claim 13, wherein said plurality of markers each have an exterior surface of SCOTCHLITE™ 8710 reflective transfer film.
US08/661,201 1996-06-10 1996-06-10 Camera tracking system Expired - Lifetime US5889550A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
US08/661,201 US5889550A (en) 1996-06-10 1996-06-10 Camera tracking system

Publications (1)

Publication Number Publication Date
US5889550A true US5889550A (en) 1999-03-30

Family

ID=24652601

Family Applications (1)

Application Number Title Priority Date Filing Date
US08/661,201 Expired - Lifetime US5889550A (en) 1996-06-10 1996-06-10 Camera tracking system

Country Status (1)

Country Link
US (1) US5889550A (en)

Patent Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US4396945A (en) * 1981-08-19 1983-08-02 Solid Photography Inc. Method of sensing the position and orientation of elements in space
US4488173A (en) * 1981-08-19 1984-12-11 Robotic Vision Systems, Inc. Method of sensing the position and orientation of elements in space
US4789940A (en) * 1985-08-30 1988-12-06 Texas Instruments Incorporated Method and apparatus for filtering reflections from direct images for mobile robot navigation
US4691446A (en) * 1985-09-05 1987-09-08 Ferranti Plc Three-dimensional position measuring apparatus
US5227985A (en) * 1991-08-19 1993-07-13 University Of Maryland Computer vision system for position monitoring in three dimensions using non-coplanar light sources attached to a monitored object
US5396331A (en) * 1993-08-10 1995-03-07 Sanyo Machine Works, Ltd. Method for executing three-dimensional measurement utilizing correctively computing the absolute positions of CCD cameras when image data vary

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
Sjoberg, Magnus, "Different Calibration Methods for a CCD Camera Based 3D Motion Analysis System--Implementation, Testing and Comparison", Diploma Thesis, Chalmers University of Technology, Goteborg, May 1993, pp. 1-44. *

Cited By (111)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US6243491B1 (en) * 1996-12-31 2001-06-05 Lucent Technologies Inc. Methods and apparatus for controlling a video system with visually recognized props
US6438508B2 (en) * 1997-02-01 2002-08-20 Orad Hi-Tec Systems, Ltd. Virtual studio position sensing system
US6201579B1 (en) * 1997-02-01 2001-03-13 Orad Hi-Tech Systems Limited Virtual studio position sensing system
US6559884B1 (en) * 1997-09-12 2003-05-06 Orad Hi-Tec Systems, Ltd. Virtual studio position sensing system
US6278479B1 (en) * 1998-02-24 2001-08-21 Wilson, Hewitt & Associates, Inc. Dual reality system
US6498618B2 (en) * 1998-02-24 2002-12-24 Phillip C. Wilson Dual reality system
US20080128508A1 (en) * 1998-03-24 2008-06-05 Tsikos Constantine J Tunnel-type digital imaging system for use within retail shopping environments such as supermarkets
US6923374B2 (en) 1998-03-24 2005-08-02 Metrologic Instruments, Inc. Neutron-beam based scanning system having an automatic object identification and attribute information acquisition and linking mechanism integrated therein
US7673803B2 (en) 1998-03-24 2010-03-09 Metrologic Instruments, Inc. Planar laser illumination and imaging (PLIIM) based engine
US7832643B2 (en) 1998-03-24 2010-11-16 Metrologic Instruments, Inc. Hand-supported planar laser illumination and imaging (PLIIM) based systems with laser despeckling mechanisms integrated therein
US20080128507A1 (en) * 1998-03-24 2008-06-05 Tsikos Constantine J Tunnel-type digital imaging system for use within retail shopping environments such as supermarkets
US20080135621A1 (en) * 1998-03-24 2008-06-12 Tsikos Constantine J Hand-supportable planar laser illumination and imaging (PLIIM) based systems with laser despeckling mechanisms integrated therein
US20080128506A1 (en) * 1998-03-24 2008-06-05 Tsikos Constantine J Hand-supportable planar laser illumination and imaging (PLIIM) based systems with laser despeckling mechanisms integrated therein
US7584893B2 (en) 1998-03-24 2009-09-08 Metrologic Instruments, Inc. Tunnel-type digital imaging system for use within retail shopping environments such as supermarkets
US7581681B2 (en) 1998-03-24 2009-09-01 Metrologic Instruments, Inc. Tunnel-type digital imaging system for use within retail shopping environments such as supermarkets
US6301549B1 (en) * 1998-06-26 2001-10-09 Lucent Technologies, Inc. Three dimensional object boundary and motion determination device and method of operation thereof
US6466275B1 (en) * 1999-04-16 2002-10-15 Sportvision, Inc. Enhancing a video of an event at a remote location using data acquired at the event
US6471044B1 (en) 1999-04-30 2002-10-29 Siemens Electrocom, L.P. Hold and release singulator
US6401936B1 (en) 1999-04-30 2002-06-11 Siemens Electrocom, L.P. Divert apparatus for conveyor system
US7527205B2 (en) 1999-06-07 2009-05-05 Metrologic Instruments, Inc. Automated package dimensioning system
US6915954B2 (en) 1999-06-07 2005-07-12 Metrologic Instruments, Inc. Programmable data element queuing, handling, processing and linking device integrated into an object identification and attribute acquisition system
US6918541B2 (en) 1999-06-07 2005-07-19 Metrologic Instruments, Inc. Object identification and attribute information acquisition and linking computer system
US20070181685A1 (en) * 1999-06-07 2007-08-09 Metrologic Instruments, Inc. Automated package dimensioning subsystem
US20060086794A1 (en) * 1999-06-07 2006-04-27 Metrologic Instruments, Inc.. X-radiation scanning system having an automatic object identification and attribute information acquisition and linking mechanism integrated therein
US7303132B2 (en) 1999-06-07 2007-12-04 Meterologic Instruments, Inc. X-radiation scanning system having an automatic object identification and attribute information acquisition and linking mechanism integrated therein
US7104453B1 (en) 1999-06-07 2006-09-12 Metrologic Instruments, Inc. Unitary package identification and dimensioning system employing ladar-based scanning methods
US6415051B1 (en) 1999-06-24 2002-07-02 Geometrix, Inc. Generating 3-D models using a manually operated structured light source
US6529627B1 (en) 1999-06-24 2003-03-04 Geometrix, Inc. Generating 3D models by combining models from a video-based technique and data from a structured light technique
US20020052708A1 (en) * 2000-10-26 2002-05-02 Pollard Stephen B. Optimal image capture
US20090134221A1 (en) * 2000-11-24 2009-05-28 Xiaoxun Zhu Tunnel-type digital imaging-based system for use in automated self-checkout and cashier-assisted checkout operations in retail store environments
US7212228B2 (en) * 2002-01-16 2007-05-01 Advanced Telecommunications Research Institute International Automatic camera calibration method
US20030156189A1 (en) * 2002-01-16 2003-08-21 Akira Utsumi Automatic camera calibration method
US7471301B2 (en) * 2002-07-24 2008-12-30 Total Immersion Method and system enabling real time mixing of synthetic images and video images by a user
US20060074921A1 (en) * 2002-07-24 2006-04-06 Total Immersion Method and system enabling real time mixing of synthetic images and video images by a user
US20040176925A1 (en) * 2003-01-10 2004-09-09 Canon Kabushiki Kaisha Position/orientation measurement method, and position/orientation measurement apparatus
US7092109B2 (en) * 2003-01-10 2006-08-15 Canon Kabushiki Kaisha Position/orientation measurement method, and position/orientation measurement apparatus
US20050060899A1 (en) * 2003-09-23 2005-03-24 Snap-On Technologies, Inc. Invisible target illuminators for 3D camera-based alignment systems
WO2005033628A3 (en) * 2003-09-23 2008-01-17 Snap On Tech Inc Invisible target illuminators for 3d camera-based alignment systems
WO2005033628A2 (en) * 2003-09-23 2005-04-14 Snap-On Technologies, Inc. Invisible target illuminators for 3d camera-based alignment systems
US20050071105A1 (en) * 2003-09-30 2005-03-31 General Electric Company Method and system for calibrating relative fields of view of multiple cameras
US6873924B1 (en) * 2003-09-30 2005-03-29 General Electric Company Method and system for calibrating relative fields of view of multiple cameras
US20050128286A1 (en) * 2003-12-11 2005-06-16 Angus Richards VTV system
US7719563B2 (en) * 2003-12-11 2010-05-18 Angus Richards VTV system
US20050253870A1 (en) * 2004-05-14 2005-11-17 Canon Kabushiki Kaisha Marker placement information estimating method and information processing device
US7657065B2 (en) * 2004-05-14 2010-02-02 Canon Kabushiki Kaisha Marker placement information estimating method and information processing device
WO2006047610A2 (en) * 2004-10-27 2006-05-04 Cinital Method and apparatus for a virtual scene previewing system
WO2006047610A3 (en) * 2004-10-27 2007-03-01 Cinital Method and apparatus for a virtual scene previewing system
US20060165310A1 (en) * 2004-10-27 2006-07-27 Mack Newton E Method and apparatus for a virtual scene previewing system
US20060127852A1 (en) * 2004-12-14 2006-06-15 Huafeng Wen Image based orthodontic treatment viewing system
US20060127854A1 (en) * 2004-12-14 2006-06-15 Huafeng Wen Image based dentition record digitization
US20060127836A1 (en) * 2004-12-14 2006-06-15 Huafeng Wen Tooth movement tracking system
US20070141534A1 (en) * 2004-12-14 2007-06-21 Huafeng Wen Image-based orthodontic treatment viewing system
US7930834B2 (en) * 2005-09-28 2011-04-26 Hunter Engineering Company Method and apparatus for vehicle service system optical target assembly
US20100165332A1 (en) * 2005-09-28 2010-07-01 Hunter Engineering Company Method and Apparatus For Vehicle Service System Optical Target Assembly
US20070076090A1 (en) * 2005-10-04 2007-04-05 Alexander Eugene J Device for generating three dimensional surface models of moving objects
EP1941719A2 (en) * 2005-10-04 2008-07-09 Eugene J. Alexander System and method for calibrating a set of imaging devices and calculating 3d coordinates of detected features in a laboratory coordinate system
US8848035B2 (en) 2005-10-04 2014-09-30 Motion Analysis Corporation Device for generating three dimensional surface models of moving objects
US20070076096A1 (en) * 2005-10-04 2007-04-05 Alexander Eugene J System and method for calibrating a set of imaging devices and calculating 3D coordinates of detected features in a laboratory coordinate system
EP1941719A4 (en) * 2005-10-04 2010-12-22 Eugene J Alexander System and method for calibrating a set of imaging devices and calculating 3d coordinates of detected features in a laboratory coordinate system
US20070152157A1 (en) * 2005-11-04 2007-07-05 Raydon Corporation Simulation arena entity tracking system
US20070104361A1 (en) * 2005-11-10 2007-05-10 Alexander Eugene J Device and method for calibrating an imaging device for generating three dimensional surface models of moving objects
US8223208B2 (en) 2005-11-10 2012-07-17 Motion Analysis Corporation Device and method for calibrating an imaging device for generating three dimensional surface models of moving objects
US20070248283A1 (en) * 2006-04-21 2007-10-25 Mack Newton E Method and apparatus for a wide area virtual scene preview system
US8339402B2 (en) * 2006-07-16 2012-12-25 The Jim Henson Company System and method of producing an animated performance utilizing multiple cameras
US20130100141A1 (en) * 2006-07-16 2013-04-25 Jim Henson Company, Inc. System and method of producing an animated performance utilizing multiple cameras
US8633933B2 (en) * 2006-07-16 2014-01-21 The Jim Henson Company System and method of producing an animated performance utilizing multiple cameras
US20080012866A1 (en) * 2006-07-16 2008-01-17 The Jim Henson Company System and method of producing an animated performance utilizing multiple cameras
US20080130985A1 (en) * 2006-12-02 2008-06-05 Electronic And Telecommunications Research Institute Correlation extract method for generating 3d motion data, and motion capture system and method for easy composition of humanoid character on real background image using the same
US8340398B2 (en) 2006-12-02 2012-12-25 Electronics And Telecommunications Research Institute Correlation extract method for generating 3D motion data, and motion capture system and method for easy composition of humanoid character on real background image using the same
KR100825859B1 (en) 2007-02-15 2008-04-28 한국과학기술연구원 Indirect object pose estimation method with respect to user-wearable camera using multi-view camera system
US20100110184A1 (en) * 2007-03-22 2010-05-06 Takehiko Hanada Article monitoring system
EP2153168A1 (en) * 2007-05-21 2010-02-17 Snap-on Incorporated Method and apparatus for wheel alignment
US8401236B2 (en) 2007-05-21 2013-03-19 Snap-On Incorporated Method and apparatus for wheel alignment
EP2153168B1 (en) * 2007-05-21 2017-05-10 Snap-on Incorporated Method and apparatus for wheel alignment
US20110185584A1 (en) * 2007-05-21 2011-08-04 Snap-On Incorporated Method and apparatus for wheel alignment
EP2636989A1 (en) * 2007-05-21 2013-09-11 Snap-on Incorporated Method and apparatus for wheel alignment
US20080306708A1 (en) * 2007-06-05 2008-12-11 Raydon Corporation System and method for orientation and location calibration for image sensors
US20090039167A1 (en) * 2007-08-10 2009-02-12 Hand-Held Products, Inc. Indicia reading terminal having spatial measurement functionality
US7726575B2 (en) 2007-08-10 2010-06-01 Hand Held Products, Inc. Indicia reading terminal having spatial measurement functionality
US20090141019A1 (en) * 2007-12-03 2009-06-04 Nokia Corporation 4d real-world browsing with capability to recognize and access objects in real-time
US8125529B2 (en) 2009-02-09 2012-02-28 Trimble Navigation Limited Camera aiming using an electronic positioning system for the target
US20100201829A1 (en) * 2009-02-09 2010-08-12 Andrzej Skoskiewicz Camera aiming using an electronic positioning system for the target
US20100225757A1 (en) * 2009-03-04 2010-09-09 Hand Held Products, Inc. System and method for measuring irregular objects with a single camera
US8643717B2 (en) 2009-03-04 2014-02-04 Hand Held Products, Inc. System and method for measuring irregular objects with a single camera
US8405717B2 (en) 2009-03-27 2013-03-26 Electronics And Telecommunications Research Institute Apparatus and method for calibrating images between cameras
WO2010146289A3 (en) * 2009-06-16 2011-04-28 Centre National De La Recherche Scientifique - Cnrs - System and method for locating a photographic image
FR2946765A1 (en) * 2009-06-16 2010-12-17 Centre Nat Rech Scient SYSTEM AND METHOD FOR LOCATING PHOTOGRAPHIC IMAGE.
US20110096169A1 (en) * 2009-10-22 2011-04-28 Electronics And Telecommunications Research Institute Camera tracking system and method, and live video compositing system
WO2013040516A1 (en) 2011-09-14 2013-03-21 Motion Analysis Corporation Systems and methods for incorporating two dimensional images captured by a moving studio camera with actively controlled optics into a virtual three dimensional coordinate system
US10271036B2 (en) 2011-09-14 2019-04-23 Motion Analysis Corporation Systems and methods for incorporating two dimensional images captured by a moving studio camera with actively controlled optics into a virtual three dimensional coordinate system
US9175950B2 (en) 2011-12-21 2015-11-03 Rolls-Royce Plc Position measurement
WO2013092596A1 (en) * 2011-12-21 2013-06-27 Rolls-Royce Plc Measurement fixture for precision position measurement
US10657663B2 (en) 2012-07-30 2020-05-19 Sony Interactive Entertainment Europe Limited Localisation and mapping
US9679381B2 (en) 2012-07-30 2017-06-13 Sony Computer Entertainment Europe Limited Localisation and mapping
US9704244B2 (en) * 2012-07-30 2017-07-11 Sony Computer Entertainment Europe Limited Localisation and mapping
US9779509B2 (en) 2012-07-30 2017-10-03 Sony Interactive Entertainment Europe Limited Localisation and mapping
US9824450B2 (en) 2012-07-30 2017-11-21 Sony Interactive Entertainment Europe Limited Localisation and mapping
US20150187133A1 (en) * 2012-07-30 2015-07-02 Sony Computer Entertainment Europe Limited Localisation and mapping
US9507147B2 (en) * 2013-02-14 2016-11-29 Blackberry Limited Wearable display system with detached projector
US20140225915A1 (en) * 2013-02-14 2014-08-14 Research In Motion Limited Wearable display system with detached projector
WO2017043181A1 (en) * 2015-09-09 2017-03-16 ソニー株式会社 Sensor device, sensor system, and information-processing device
US10976343B2 (en) 2015-09-09 2021-04-13 Sony Corporation Sensor device, sensor system, and information processing device
WO2018017326A1 (en) * 2016-07-22 2018-01-25 Kimberly-Clark Worldwide, Inc. Positioning systems and methods for hand held devices for magnetic induction tomography
US10469758B2 (en) 2016-12-06 2019-11-05 Microsoft Technology Licensing, Llc Structured light 3D sensors with variable focal length lenses and illuminators
US10554881B2 (en) 2016-12-06 2020-02-04 Microsoft Technology Licensing, Llc Passive and active stereo vision 3D sensors with variable focal length lenses
US11039083B1 (en) * 2017-01-24 2021-06-15 Lucasfilm Entertainment Company Ltd. Facilitating motion capture camera placement
US10755432B2 (en) * 2017-09-27 2020-08-25 Boe Technology Group Co., Ltd. Indoor positioning system and indoor positioning method
US11457127B2 (en) * 2020-08-14 2022-09-27 Unity Technologies Sf Wearable article supporting performance capture equipment
US11403769B2 (en) * 2020-09-01 2022-08-02 Lucasfilm Entertainment Company Ltd. Break away mocap tracker
US20220342488A1 (en) * 2021-04-23 2022-10-27 Lucasfilm Entertainment Company Ltd. Light capture device
US11762481B2 (en) * 2021-04-23 2023-09-19 Lucasfilm Entertainment Company Ltd. Light capture device

Similar Documents

Publication Publication Date Title
US5889550A (en) Camera tracking system
US11490069B2 (en) Multi-dimensional data capture of an environment using plural devices
US8705799B2 (en) Tracking an object with multiple asynchronous cameras
US7684613B2 (en) Method and system for aligning three-dimensional shape data from photogrammetry data and three-dimensional measurement data using target locations and surface vectors
US20070035562A1 (en) Method and apparatus for image enhancement
US8970690B2 (en) Methods and systems for determining the pose of a camera with respect to at least one object of a real environment
JP4789745B2 (en) Image processing apparatus and method
Kannala et al. A generic camera model and calibration method for conventional, wide-angle, and fish-eye lenses
EP0498542A2 (en) Automated video imagery database generation using photogrammetry
JP4245963B2 (en) Method and system for calibrating multiple cameras using a calibration object
US7002551B2 (en) Optical see-through augmented reality modified-scale display
US7224386B2 (en) Self-calibration for a catadioptric camera
Neumann et al. Augmented reality tracking in natural environments
CN109416744A (en) Improved camera calibration system, target and process
US20050128196A1 (en) System and method for three dimensional modeling
RU2204149C2 (en) Method and facility for cartography of radiation sources
KR20150013709A (en) A system for mixing or compositing in real-time, computer generated 3d objects and a video feed from a film camera
JP2009017480A (en) Camera calibration device and program thereof
GB2288662A (en) Measuring rotational velocity of a spherical object
JP4052382B2 (en) Non-contact image measuring device
CN111882608A (en) Pose estimation method between augmented reality glasses tracking camera and human eyes
US6616347B1 (en) Camera with rotating optical displacement unit
EP0587328A2 (en) An image processing system
US20080252746A1 (en) Method and apparatus for a hybrid wide area tracking system
JP3452188B2 (en) Tracking method of feature points in 2D video

Legal Events

Date Code Title Description
AS Assignment

Owner name: ADAPTIVE OPTICS ASSOCIATES, INC., MASSACHUSETTS

Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNOR:REYNOLDS, MARK C.;REEL/FRAME:008148/0134

Effective date: 19960812

STCF Information on status: patent grant

Free format text: PATENTED CASE

AS Assignment

Owner name: PNC BANK, NATIONAL ASSOCIATION, PENNSYLVANIA

Free format text: SECURITY AGREEMENT;ASSIGNORS:METROLOGIC INSTRUMENTS, INC.;ADAPTIVE OPTICS ASSOCIATES, INC.;REEL/FRAME:011511/0142

Effective date: 20010108

FEPP Fee payment procedure

Free format text: PAYOR NUMBER ASSIGNED (ORIGINAL EVENT CODE: ASPN); ENTITY STATUS OF PATENT OWNER: LARGE ENTITY

FPAY Fee payment

Year of fee payment: 4

AS Assignment

Owner name: METROLOGIC INSTRUMENTS, INC., NEW JERSEY

Free format text: RELEASE OF SECURITY INTEREST;ASSIGNOR:PNC BANK, NATIONAL ASSOCIATION;REEL/FRAME:016026/0789

Effective date: 20041026

FPAY Fee payment

Year of fee payment: 8

FEPP Fee payment procedure

Free format text: PAYER NUMBER DE-ASSIGNED (ORIGINAL EVENT CODE: RMPN); ENTITY STATUS OF PATENT OWNER: LARGE ENTITY

Free format text: PAYOR NUMBER ASSIGNED (ORIGINAL EVENT CODE: ASPN); ENTITY STATUS OF PATENT OWNER: LARGE ENTITY

FPAY Fee payment

Year of fee payment: 12