US20040169656A1 - Method for motion simulation of an articulated figure using animation input - Google Patents


Info

Publication number
US20040169656A1
US20040169656A1 (application US 10/715,778)
Authority
US
United States
Prior art keywords
articulated
computer
scaling
animation
readable media
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Abandoned
Application number
US10/715,778
Inventor
Daniele David Piponi
Oliver James
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Warner Bros Entertainment Inc
Original Assignee
ESC Entertainment
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by ESC Entertainment filed Critical ESC Entertainment
Priority to US10/715,778 priority Critical patent/US20040169656A1/en
Assigned to ESC ENTERTAINMENT reassignment ESC ENTERTAINMENT ASSIGNMENT OF ASSIGNORS INTEREST (SEE DOCUMENT FOR DETAILS). Assignors: JAMES, OLIVER, PIPONI, DANIELE PAOLO DAVID
Publication of US20040169656A1 publication Critical patent/US20040169656A1/en
Assigned to WARNER BROS. ENTERTAINMENT INC. reassignment WARNER BROS. ENTERTAINMENT INC. ASSIGNMENT OF ASSIGNORS INTEREST (SEE DOCUMENT FOR DETAILS). Assignors: ESC ENTERTAINMENT
Legal status: Abandoned (current)

Classifications

    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06T - IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 15/00 - 3D [Three Dimensional] image rendering
    • G06T 15/10 - Geometric effects
    • G06T 15/20 - Perspective computation
    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06T - IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 13/00 - Animation
    • G06T 13/80 - 2D [Two Dimensional] animation, e.g. using sprites
    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06T - IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 17/00 - Three dimensional [3D] modelling, e.g. data description of 3D objects
    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06T - IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 7/00 - Image analysis
    • G06T 7/50 - Depth or shape recovery
    • G06T 7/55 - Depth or shape recovery from multiple images
    • H - ELECTRICITY
    • H04 - ELECTRIC COMMUNICATION TECHNIQUE
    • H04N - PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N 5/00 - Details of television systems
    • H04N 5/222 - Studio circuitry; Studio devices; Studio equipment
    • H04N 5/262 - Studio circuits, e.g. for mixing, switching-over, change of character of image, other special effects; Cameras specially adapted for the electronic generation of special effects
    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06T - IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 2200/00 - Indexing scheme for image data processing or generation, in general
    • G06T 2200/08 - Indexing scheme for image data processing or generation, in general involving all processing steps from image acquisition to 3D model generation
    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06T - IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 2210/00 - Indexing scheme for image generation or computer graphics
    • G06T 2210/61 - Scene description

Landscapes

  • Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • General Physics & Mathematics (AREA)
  • Geometry (AREA)
  • Computer Graphics (AREA)
  • Software Systems (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Computing Systems (AREA)
  • Multimedia (AREA)
  • Signal Processing (AREA)
  • Image Generation (AREA)
  • Image Processing (AREA)
  • Processing Or Creating Images (AREA)
  • Studio Devices (AREA)
  • Measuring And Recording Apparatus For Diagnosis (AREA)

Abstract

Motion of an articulated figure in response to an external force function and to a co-acting internal force function is simulated, for use in computer animation. The internal force function is defined by applying an inverse-dynamics method to solve for internal forces driving the articulated figure to conform to a predefined animation curve. The internal force function may be scaled in any desired way, and combined with the external force function as input for a forward-dynamics motion simulation. The resulting motion simulation models a combined effect of the external force function and the predefined animation curve on motion of the articulated figure.

Description

    CROSS-REFERENCE TO RELATED APPLICATION
  • This application claims priority pursuant to 35 U.S.C. §119(e) to U.S. Provisional Application No. 60/426,560, filed Nov. 15, 2002, which application is specifically incorporated herein, in its entirety, by reference.[0001]
  • BACKGROUND OF THE INVENTION
  • 1. Field of the Invention [0002]
  • The present invention relates to methods for animating digitally-modeled articulated figures using a computer. [0003]
  • 2. Description of Related Art [0004]
  • An aspect of computer-generated animation involves simulating the motion of an articulated figure in response to externally-applied forces. For example, it may be desirable to cause a character in a computer game or motion picture to react to events, such as a collision with an object or another character, in a realistic way. It is further desirable to achieve this realistic motion in an automatic or semi-automatic fashion, without requiring an animator to manually pose the articulated figure as if reacting to the externally-applied forces. Presently, various methods and tools exist to simulate the response of an articulated figure to external forces, using the equations of motion to calculate the motion of each articulated link of the figure. Once physical parameters of an articulated figure have been defined, the response of the articulated figure to a defined impulse or other applied force can be automatically simulated. The simulated response of the articulated figure may then be used as the basis for generating an animated depiction of the response, in a manner well understood in the art. [0005]
  • Articulated figures for which motion is simulated in this manner are sometimes referred to as “rag dolls,” and the response of the figure to the externally applied force indeed resembles the passive and involuntary movements of an inanimate doll. Sometimes this is the desired effect, as when the character is to be depicted as killed or knocked unconscious by a bullet or other blow. Many times, however, it is desirable to depict an animated character that continues to struggle or otherwise move as if under intelligent self-control, while also being moved involuntarily by an externally applied force. For example, it may be desirable to depict a character involuntarily recoiling backwards after landing a punch on an opponent, while continuing to fight. [0006]
  • Animated characters, of course, lack the intelligent self-control needed to generate voluntary muscle movement. Therefore, a character's voluntary movement is normally animated by a human animator, who manually poses the character's articulated “skeleton” through a sequence of key frames. In the alternative, or in addition, motion data is captured from an appropriate subject and transformed into a pose sequence for a modeled articulated figure using motion-capture methods. The manually-defined key frame poses of the figure then become the basis for a finished animation of the character, as known in the art. Animators may be assisted in this process by various tools that apply defined constraints and/or rules of behavior for the articulated figure to fill in between the key frames, to string predefined animation sequences together in ways that simulate voluntary behavior, and/or to ensure realistic posing in the key frames. Such tools, however, do not generally obviate the need for at least a degree of manual input from an animator, or collection of motion-capture data, to portray basic voluntary character movements. [0007]
  • Skilled animators and motion-capture actors are capable of manually defining a figure's voluntary movements to achieve a variety of desired effects in the final animation. Involuntary motion, however, poses a greater problem for the animator and actor. Many animators find it difficult to intuit realistic involuntary movements of an articulated figure, particularly when involuntary motion is combined with voluntary motion. In the case of motion-capture, it may be too difficult, too dangerous, or simply impossible to replicate forces and reactions that are desired for an animation sequence. Hence, it is often difficult to realistically animate characters in scenes that require both voluntary and involuntary motion of an articulated figure. Furthermore, any given manually-defined animation that includes both voluntary and involuntary movement, no matter how realistic, can only depict a single response. If it is desired to depict a response with a different involuntary motion component, another manual animation process is required, even if the voluntary component of the motion is unchanged. [0008]
  • The pose of an articulated figure is defined by a set of parameters that describe transforms between nodes of a hierarchical data structure, which is sometimes called a directed acyclic graph. Each node represents a segment of the articulated figure, for example, a forearm. For human characters, most of these transforms define rotations (often specified by Euler angles) at joints of the figure. At any moment in time, the pose of a character may be defined by a list of numbers defining its joint angles. A sequence of such poses defines the movement of the articulated figure, and is sometimes referred to as an animation curve. Rag doll motion simulation can be used to produce such an animation curve. So can manual key frame animation or motion capture. It may therefore seem that the combination of different motions may be simulated by combining corresponding animation curves. [0009]
  • Indeed, combination of animation curves is known in the art, as follows: two poses of the same figure from different animation sequences may be combined by interpolating between pose parameters at each frame. For example, if pose “A” has angles 45 degrees and 90 degrees for the left and right elbows and pose “B” has 15 degrees and 70 degrees for the elbows, then a pose half-way between would have an angle of 30 degrees for the left elbow and 80 degrees for the right elbow. Traditionally, animation sequences have been combined in this way by performing an interpolation for each frame (or at selected key frames) of the animation sequences. The prior art includes other, more sophisticated interpolation methods besides simple averaging, for example, weighted averaging. Regardless, interpolation often results in blended animation sequences that are not as realistic as desired, and sometimes even depicts movements that are obviously impossible. [0010]
  • It is desirable, therefore, to provide a method for defining a combination of differently-defined animation sequences for an articulated figure that achieves realistic results without requiring an animator or actor to manually account for involuntary motion. It is further desirable to provide a method whereby a given animation sequence depicting a predefined motion can be combined with different simulated involuntary responses, without requiring additional labor-intensive operations, such as manual definition of key frame poses or motion capture. [0011]
  • SUMMARY OF THE INVENTION
  • The present invention provides a method for animating articulated figures that overcomes the limitations of the prior art. The method may be used to combine any defined animation sequence of an articulated figure with modeled involuntary movement, governed by equations of motion, for the same figure. Applications for the new method may include animating action sequences for motion pictures or computer games, or any other animation application. The method may be used with standard animation methods and tools that provide character animation tied to an underlying articulated figure comprising a plurality of segments or nodes linked by defined transforms or joint parameters. Articulated figures of this type, and related animation methods, are well known in the art. Usually, articulated figures comprise data structures having the topology of a directed acyclic graph, although the invention is not limited to articulated figures of this type. As is known for articulated figures used in simulated motion, each segment of the articulated figure should additionally have associated parameters defining its mass and center of mass, and a defined angular velocity vector (which may be zero). [0012]
  • Essentially, a method according to the invention may be used to process two differently-defined animation drivers, to thereby obtain an animation sequence that blends both drivers. A first driver may comprise a defined animation sequence for the articulated figure. The defined animation sequence may be manually generated by any suitable method, or generated in any other desired manner, such as by motion capture. In an embodiment of the invention, the defined animation sequence is intended to depict voluntary movements of an animated character. The animation sequence defines a pose of the articulated figure at defined times of the sequence. The defined times may be selected at increments less than, equal to, or greater than the frame rate for a desired output animation. At each instant of the sequence, the pose defines a position and orientation of each segment of the articulated figure, according to methods known in the art. Accordingly, as the increment of time is known, the sequence defines an angular velocity and a translational velocity for each segment, for each increment of the animation sequence, except for an initial state. Initial velocities for the articulated figure may be separately defined. Position, orientation, and the rate of change of these quantities may be collectively expressed as Q(t), wherein t represents time. Normally these quantities are defined with respect to a frame of reference for the articulated figure. [0013]
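As a rough sketch of how Q(t) and its per-increment rates of change might be represented (the array layout and names here are illustrative assumptions, not part of the disclosure):

```python
import numpy as np

def pose_rates(q_samples, dt):
    """Finite-difference rates of change for a sampled pose sequence Q(t).

    q_samples : array of shape (num_increments, num_pose_params), one row
                per time increment of the animation sequence
    dt        : time between successive poses
    Returns an array of shape (num_increments - 1, num_pose_params). As the
    text notes, the sequence itself defines no rate for the initial state;
    initial velocities may be supplied separately.
    """
    q_samples = np.asarray(q_samples, dtype=float)
    return (q_samples[1:] - q_samples[:-1]) / dt
```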
  • A second driver may comprise data defining one or more external forces and/or impulses to be applied to the animated character during the sequence, collectively represented herein by G(t). As the notation suggests, G(t) may vary as a function of time. For example, such forces or impulses may include a time-invariant gravitational force acting on the center of mass of each segment of the figure, and/or an impulse of time-varying magnitude applied at a defined location of the articulated figure. The external forces G(t) may include both translational forces and torques, defined with respect to the articulated figure. An initial position and orientation of a segment of the articulated figure may also be defined with respect to an external reference system, such as the scene where the action is to occur. [0014]
  • The first animation driver is received as input for an inverse-dynamics solution process. The inverse-dynamics process calculates a solution F(t), representing time-varying internal torques acting on each segment of the articulated figure that will result in Q(t). These internal torques may be thought of as exerted by “muscles” of the articulated figure on its respective body segments. In an embodiment of the invention, externally-applied forces, such as gravity, are not generally considered. This maintains computational simplicity. In other embodiments, the influence of parameters such as the force of gravity and joint friction may be considered. However, for many animation applications, there may be little benefit, and considerably more computational complexity, associated with including such additional parameters in the solution of F(t). Various inverse-dynamic solution methods are known for articulated figures, and any suitable method may be used. [0015]
  • In an embodiment of the invention, a discrete time increment variable Δt is defined prior to the inverse-dynamics solution step described above. This time variable represents a forward-looking interval over which F(t) is linearly defined, for example, F(t) may be assumed constant over the interval. Thus, in an embodiment of the invention, a value for F(t) may be determined from a difference between a pose P(t) determined from a motion simulation and a desired joint orientation Q(t+Δt) at a time Δt in the future. In an alternative embodiment, an approximate value for F(t) over this interval may be directly determined from Q(t+Δt)−Q(t), independently of P(t). [0016]
  • In either case, torque may be determined from the relation τ = L̇, where τ represents torque and L̇ represents the time derivative of angular momentum. In turn, L̇ may be approximated from
  • L̇ ≈ I (ω(t + Δt) − ω(t)) / Δt,  [0017]
  • wherein I represents the moment of inertia of the respective segment, and ω is the associated angular velocity. The smaller the value of Δt, the more accurate the approximation for F(t) will be. Generally, Q(t) is defined at discrete time increments (although it may be interpolated between its minimum increments), and in such cases Δt should be defined as equal to, or greater than, the minimum time increment of Q(t). [0018]
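A minimal sketch of this approximation for a single segment, assuming a 3x3 inertia tensor and angular velocities sampled Δt apart (names are illustrative):

```python
import numpy as np

def approximate_torque(inertia, omega_now, omega_next, dt):
    """Approximate tau = dL/dt as I (omega(t + dt) - omega(t)) / dt.

    inertia    : (3, 3) inertia tensor I of the segment
    omega_now  : (3,) angular velocity omega(t)
    omega_next : (3,) angular velocity omega(t + dt)
    """
    d_omega = np.asarray(omega_next) - np.asarray(omega_now)
    return np.asarray(inertia) @ (d_omega / dt)
```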
  • A forward-dynamics solution process may then proceed, starting from an initial state Q0 of Q(t), at an initial time t=0. Initially, a force F(t) for the time interval from t=0 to t=Δt may be determined from Q(t+Δt)−Q(t). In an embodiment of the invention, the external force function G(t) and F(t) for the first interval may then be provided as input to any suitable motion simulation process for an articulated figure, calculating an initial pose P(nΔt), where n=1. This initial pose may then be compared with Q((n+1)Δt), i.e., the desired pose a time Δt in the future, to compute the internal force function F(t) for the interval from Δt to 2Δt. The sum F(t)+G(t) is then provided to the motion simulator for the next interval (n=2) to compute an output pose for this next interval. The foregoing process is repeated until the pose sequence P(t) for the articulated figure is defined over the entire period of interest. The output P(t) may then be used to develop a finished animation according to methods known in the art. [0019]
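The feedback loop just described might be sketched as follows. The `inverse_dynamics` and `simulate_step` callables stand in for whatever inverse-dynamics solver and forward-dynamics ("rag doll") simulator are used; their names and signatures are assumptions for illustration, and the scale factor s anticipates the scaling option described below.

```python
def simulate_with_animation_input(q, g, initial_state, dt, num_steps,
                                  inverse_dynamics, simulate_step, s=1.0):
    """Blend a target animation curve Q(t) with external forces G(t).

    q                : callable, q(t) -> desired pose Q(t)
    g                : callable, g(t) -> external force values G(t)
    initial_state    : initial state Q0 (pose plus velocities)
    inverse_dynamics : callable (state, target_pose, dt) -> internal forces
                       F driving `state` toward `target_pose` over dt
    simulate_step    : callable (state, total_force, dt) -> next state
    s                : scale factor applied to the internal force function
    Returns the simulated pose sequence P(t) as a list of states.
    """
    state = initial_state
    poses = [state]
    for n in range(1, num_steps + 1):
        t = (n - 1) * dt
        # Drive the current simulated pose toward the desired pose one
        # look-ahead interval in the future: F for the interval [t, t + dt].
        f = inverse_dynamics(state, q(n * dt), dt)
        # One forward-dynamics step under the combined force sF(t) + G(t).
        state = simulate_step(state, s * f + g(t), dt)
        poses.append(state)
    return poses
```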
  • In an alternative embodiment, the internal torque function F(t) may be calculated ahead of time, based on Q(t+Δt)−Q(t) independently of P(t). The sum of the internal and external forces F(t)+G(t) may then be provided as input to any suitable motion simulation process for an articulated figure. The motion simulation thereby computes a resulting time-varying pose P(t) for the articulated figure, including its time-varying position and orientation with respect to an external frame of reference, over the entire time period of interest. [0020]
  • As an alternative to driving the forward-dynamics solution process with the sum F(t)+G(t) according to either of the foregoing embodiments, a modified sum defined as sF(t)+G(t), wherein s is a user-selected scale factor, may be used instead. Selecting a value for s less than one will result in a P(t) in which movement corresponding to F(t) is diminished. Setting s greater than one will exaggerate these movements. By adjusting the value of s, a user may adjust the prominence of the separately-defined animation sequence Q(t) in the output sequence P(t), depending on the intended effect. Optionally, s may be defined as a time-varying function. [0021]
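Continuing the hypothetical sketch above, varying the prominence of Q(t) in the output is then a one-argument change (all inputs here are assumed to be defined as in that sketch):

```python
# Same Q(t), G(t), and solvers; only s changes the character of the result.
damped      = simulate_with_animation_input(q, g, q0, dt, steps, inv_dyn, step, s=0.5)
faithful    = simulate_with_animation_input(q, g, q0, dt, steps, inv_dyn, step, s=1.0)
exaggerated = simulate_with_animation_input(q, g, q0, dt, steps, inv_dyn, step, s=2.0)
```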
  • Advantageously, the output P(t) is a function of the separately determined inputs F(t) and G(t). Any particular animation sequence Q(t) may be combined with any number of separate external force functions G(t), as desired. It should be possible, therefore, to reduce the need to perform labor-intensive operations for defining Q(t), such as manual key frame animation or motion capture, using a method according to the invention. At the same time, quite varied yet realistic animation results may be realized by combining the same internal force function F(t) with different external force functions G(t). This may readily be accomplished in real time, for use in computer games. [0022]
  • Thus, a character animated according to the invention may be provided with a state of motion that depends in part on the character's voluntary movements. No longer will the character behave like a passive “rag doll” under the influence of external forces. For example, if a character pulls its limbs in towards its center of mass while tumbling, its overall angular velocity will increase. In other words, it will tumble faster. Such effects may create an impression that a character is alive and able to influence its motion in a realistic way by voluntary movement, thereby increasing the interest and appeal of action sequences and other animations. [0023]
  • A more complete understanding of the method according to the present invention will be afforded to those skilled in the art, as well as a realization of additional advantages and objects thereof, by a consideration of the following detailed description of the preferred embodiment. Reference will be made to the appended sheets of drawings, which will first be described briefly.[0024]
  • BRIEF DESCRIPTION OF THE DRAWINGS
  • FIGS. 1A and 1B are diagrams showing exemplary successive poses of an articulated figure for use in computer animation. [0025]
  • FIG. 2 is a flow chart showing exemplary steps for determining movements of an articulated figure, according to an embodiment of the invention. [0026]
  • FIG. 3 is a block diagram showing an exemplary system for performing a method according to the invention. [0027]
  • DETAILED DESCRIPTION OF THE PREFERRED EMBODIMENTS
  • The present invention provides a method and system for determining movement of an articulated figure, such as may be used for computer-generated animation, that overcomes the limitations of the prior art. In the detailed description that follows, like element numerals are used to denote like elements appearing in one or more of the figures. [0028]
  • FIG. 1A shows an exemplary articulated figure 100 for use in computer animation. FIG. 1B shows the same figure 100 in a different pose, as it might appear at a later time in an animation sequence. Various forms of articulated figures are known in the art. It should be appreciated that the 3-segment, 2-jointed figure 100 shown in FIG. 1A, although sufficient for illustrating the methods of the invention, is much simpler than a typical figure used for animating humans and the like. Figure 100 comprises “n” body segments 101, 102, 103, connected by “n−1” joints 104, 105. A mass and center of mass 106, 107, 108 are associated with each body segment. Joints 104, 105 may be defined with various constraints. For example, joint 104 may be defined as a ball-and-socket joint, while joint 105 may be defined as a pin joint, permitting rotation around a single axis. The connectivity of the joints of an articulated figure having “n” joints and “m” segments may be expressed by a joint connectivity matrix, here denoted J. Articulated figures for animating humanoid or other character movement, and methods for mathematically expressing and calculating poses of such articulated figures, are well known in the art. [0029]
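A minimal data-structure sketch for a figure of this kind is given below; the field and class names are illustrative assumptions, not the application's own definitions.

```python
from dataclasses import dataclass, field
import numpy as np

@dataclass
class Segment:
    name: str
    mass: float
    center_of_mass: np.ndarray                # (3,) in the segment's frame
    inertia: np.ndarray                       # (3, 3) rotational inertia tensor
    angular_velocity: np.ndarray = field(
        default_factory=lambda: np.zeros(3))  # may initially be zero

@dataclass
class Joint:
    parent: int                               # index of the parent segment
    child: int                                # index of the child segment
    joint_type: str = "ball"                  # e.g. "ball" (3 DOF) or "pin" (1 DOF)

@dataclass
class ArticulatedFigure:
    segments: list                            # n segments
    joints: list                              # n - 1 joints in a tree-structured figure
```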
  • For example, three Euler angles, sometimes called joint angles, may be used to define the orientation of an attached segment. In the alternative, nine Euler parameters (direction cosines) may be used to define the orientation. When nine parameters are used, they may be expressed as a 3×3 matrix, herein denoted R. Each segment may be described in its own coordinate space, with relationships between coordinate spaces defined by any intervening Euler angles and segment geometry. A database or table of Euler angles, Euler parameters, or other values which define the position of an articulated figure at successive instants of time is sometimes referred to as an animation curve. An animation curve that is defined by manual key frame animation, by capturing the motion of an actor or other subject, or in any other suitable manner, is herein referred to as Q(t). [0030]
  • For purely manual, intuitive development of animation curves, it is not necessary to include a mass or center of mass for the articulated segments. Articulated figures with associated mass and center-of-mass parameters are known, however, for simulating the motion of an articulated figure in reaction to externally-applied forces. Software tools are commercially available to perform motion simulation of this type, sometimes referred to as “rag doll” simulation. For example, a rag doll simulator is available, as more fully described at the Internet address www.havok.com. Simulators of this type calculate the reaction of an articulated figure having a defined mass and joint parameters, using Newton's equations of motion. It should be appreciated, therefore, that implementation of the invention does not require any further definition of the animation subject's articulated figure than would be required for a conventional motion simulation. [0031]
  • In the present invention, mass and center-of-mass parameters are used differently than in conventional motion simulation, for the solution of an inverse-dynamics problem: determining internal torques (or, if relevant, translational forces) by which an articulated figure can be made to respond as defined by an animation curve. For each segment of the articulated figure, a rotational inertia tensor I may be defined such that each segment's angular momentum is given by L = Iω, wherein ω is the segment's angular velocity. In an inverse-dynamics solution step of the invention, a matrix of inertia tensors I for each segment may be used to solve for internal torques τ acting at each joint of the articulated figure, so as to result in an animation curve that is equivalent, or closely corresponds, to Q(t). For example, referring to FIG. 1A, a first torque 109 and a second torque 110 may be computed that, when applied over a defined interval of time in the absence of externally applied forces, will move figure 100 from the pose shown in FIG. 1A to that shown in FIG. 1B. [0032]
  • As noted above, for many or most articulated figures, a solution for τ is sufficient to define an animation curve, and definition of translational forces exerted between joints is not required. Most articulated figures modeled after natural beings do not contain joints that permit appreciable translation between jointed segments. Any internal translational forces exerted by the muscles therefore do not perform appreciable work, and may usually be ignored. In the unusual case where the articulated figure includes one or more joints that permit appreciable translation between adjacent segments, non-zero translational forces may also be considered. [0033]
  • The foregoing principles may be applied to determine movement of an articulated figure, including both “voluntary” and simulated involuntary movement, according to the invention. FIG. 2 shows exemplary steps of a method 200 for determining such movement. At step 202, a pose sequence Q(t) for the articulated figure is accessed, such as by reading from a computer database or memory. The pose sequence may be in any suitable form, such as a standard animation curve. Generally this animation curve should depict the figure's voluntary motions (e.g., jumping, running, kicking) and should not attempt to depict involuntary motion from externally-applied forces. Methods for defining such animation curves are well understood in the art. An initial rotational velocity for the segments, Q̇0, may also be defined. Note that segment velocities subsequent to the initial velocity may be determined from the initial velocities, the positions of the segments at future times as defined by Q(t), and the topology of the articulated figure, such as described by a connectivity matrix J. [0034]
  • With respect to step 202, an amount of time between poses of the animation curve may be defined as a “look-ahead” interval Δt. The look-ahead interval may be user-defined, or may be determined from other parameters of the desired animation, such as the frame rate. A difference Q(t+Δt)−Q(t) may be used to define a rotational velocity ω of each segment of the articulated figure, at successive instants of the animation curve. In particular, in each pose of the sequence, an orientation of each joint “i” of the figure may be defined by [0035]
  • O_i = R_j^T R_i,  (Eq. 1)
  • where O_i represents the orientation of the ith joint, and R_j and R_i are the orientations of its adjoining segments. An incremental change ΔO_i corresponding to the look-ahead interval, divided by Δt, may represent a segment velocity ω. This, in turn, may be used to solve for the torque τ at each joint, using the basic relationship [0036]
  • L̇ = τ,  (Eq. 2)
  • where L̇ is the time derivative of angular momentum L, defined as Iω. [0037]
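A sketch of Eq. 1 and the finite-difference joint velocity follows, assuming each orientation is a 3x3 rotation matrix and using SciPy's rotation utilities to convert the incremental rotation into an angular-velocity vector:

```python
import numpy as np
from scipy.spatial.transform import Rotation

def joint_orientation(r_parent, r_child):
    """Eq. 1: O_i = R_j^T R_i, the orientation of joint i relative to its parent."""
    return np.asarray(r_parent).T @ np.asarray(r_child)

def joint_angular_velocity(o_now, o_future, dt):
    """Approximate omega from the incremental change in joint orientation
    accumulated over the look-ahead interval dt."""
    delta = np.asarray(o_now).T @ np.asarray(o_future)   # relative rotation over dt
    return Rotation.from_matrix(delta).as_rotvec() / dt  # axis * angle / dt
```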
  • At step 204, this calculation may proceed. Analytically, the determination of the time derivative of Iω is not trivial, but the literature contains methods for an efficient solution. The problem may be characterized as the solution of [0038]
  • ΔR ≅ J^T M^-1 τ,  (Eq. 3)
  • for τ, where ΔR is a matrix of changes in joint orientation for a given interval, M^-1 is a diagonal matrix of inverse inertia tensors for each segment of the articulated figure, and J and τ are as previously described. Similar problems and solution methods are described, for example, in Linear-Time Dynamics using Lagrange Multipliers, by David Baraff, Siggraph 1996 (© 1996 ACM-0-89791-746-4/96/008), and in particular, section 7.3 therein, by which a solution for τ may be obtained in linear time. A collection of values for τ defined over the intervals of interest may comprise an internal force function F(t), representing the internal torques that will drive the articulated figure to conform to the desired animation curve. As a whole, the algorithm for calculating the internal force function need only be concerned with the joint orientations, and need not be concerned with the overall position or velocity of the segments except as used to calculate the joint orientations. [0039]
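As a dense stand-in for the linear-time method cited above, Eq. 3 can be sketched as an ordinary least-squares solve; flattening J and M^-1 into plain matrices ignores the block structure a production solver would exploit, and the shapes here are assumptions.

```python
import numpy as np

def solve_internal_torques(delta_r, j_matrix, m_inverse):
    """Solve delta_r ~= J^T M^-1 tau for tau in a least-squares sense.

    delta_r   : stacked changes in joint orientation over one interval
    j_matrix  : connectivity matrix J, shaped so that J^T M^-1 maps
                torques to joint-orientation changes
    m_inverse : matrix of inverse inertia tensors (block diagonal in practice)
    """
    a = np.asarray(j_matrix).T @ np.asarray(m_inverse)
    tau, *_ = np.linalg.lstsq(a, np.asarray(delta_r), rcond=None)
    return tau
```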
  • A target animation curve for defining the internal force function F(t) may be defined in various ways. In an embodiment of the invention, the target animation curve is defined in a stepwise fashion, contemporaneously with the calculation of P(t) during motion simulation, as outlined in step 214. During forward-dynamics solution process 214, F(t) for a first time interval from t=0 to t=Δt may be determined from Q(t+Δt)−Q(t). The external force function G(t) and F(t) for the first interval may then be provided as input to any suitable motion simulation process for an articulated figure, calculating an initial pose P(nΔt), where n=1. This initial pose may then be compared with Q((n+1)Δt), i.e., the desired pose a time Δt in the future, to compute the internal force function F(t) for the next interval. For example, for n=2, the next interval is Δt to 2Δt. The sum F(t)+G(t) for the next interval may then be calculated as described for step 214 to compute a corresponding output pose. The foregoing may be repeated until the pose sequence P(t) for the articulated figure is defined over the entire period of interest. Optionally, if a particular value of F(t) exceeds a specified maximum value for a muscular torque at any particular joint, it may be limited to the specified maximum value. Such constraints may be derived from biometric data, and may be used to prevent unrealistic character movement, if desired. [0040]
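The optional torque limit at the end of this step can be a simple clamp, sketched here with an assumed per-joint maximum:

```python
import numpy as np

def clamp_torques(tau, tau_max):
    """Limit each computed joint torque to a specified maximum magnitude.

    tau     : array of joint torques for one interval
    tau_max : positive scalar or per-joint array of limits, e.g. derived
              from biometric data to prevent unrealistic movement
    """
    limit = np.asarray(tau_max)
    return np.clip(np.asarray(tau), -limit, limit)
```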
  • In an alternative embodiment, the internal force function may be calculated directly from Q(t), independently of the calculation of P(t) in step 214. In such a case, F(t) may be calculated from Q(t+Δt)−Q(t) for each successive interval. The sum of the internal and external forces F(t)+G(t) may then be provided as input to any suitable motion simulation process for an articulated figure. Note that this alternative method should generally result in a different output P(t) for a given Q(t) and G(t), as compared to the first embodiment described above. Because there is no feedback between F(t) and P(t) according to this second embodiment, the predetermined component of the animated movement may “drift” from its original appearance as dictated by Q(t). Whether or not drift is desirable may depend on the intended effect. [0041]
  • The invention is not limited to a particular solution method. In more general terms, step 204 may be described as determining those internal forces of the defined articulated figure that will drive the figure according to its predefined animation curve. A variety of different approaches may be suitable for solving this problem. Generally, external forces, including the force of gravity, may be ignored at this step, for computational simplicity. It should be appreciated that the object of step 204, unlike many prior-art applications of inverse dynamics, is not generally concerned with an accurate solution of a robotics or biomechanical problem. Instead, a principal object of step 204 is to transform an arbitrarily-determined animation curve into a form that may readily be combined with other input to a motion-simulation function. Hence, the force function F(t), whether defining tensors τ and/or other data, should be configured for this purpose. [0042]
  • At step 206, data G(t) may be accessed, such as from a computer database or memory. G(t) may be defined in an earlier step, and should define one or more external forces and/or impulses to be applied to the articulated figure during a desired animation sequence. Data G(t) may include values defining both translational forces and torques, defined with respect to the articulated figure. Such forces may include, for example, a force of gravity, forces from interactions with flowing fluids, and impulse forces from collisions with other objects. Such forces may be scripted, and/or defined using a computational method, such as collision detection. Motion of the articulated figure is to be simulated starting from some defined initial position and state. Therefore, an initial position and orientation of a segment of the articulated figure may also be defined with respect to an external reference system, such as the scene where the action is to occur. The initial pose of the figure may be defined by Q(t). [0043]
  • At optional step 208, a scale factor s may be determined. The scale factor may be used to scale the component of simulated motion that is caused by the internal force function F(t). The scale factor may be constant, or a time-varying function s(t). For example, a constant value less than one may be used to reduce the amount of voluntary character movement during an action sequence; a constant value greater than one may be used to exaggerate this component of motion; and a time-decaying value may be used to gradually reduce the component of voluntary character movement. A user interface, such as a slider, may be used to receive direct user input. At step 210, if a scale factor is entered, the solution F(t) from step 204 may be scaled by the selected amount. For example, each tensor τ comprising F(t) may be scaled by s. If no scale factor is specified, F(t) may be left unscaled. [0044]
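For example, a time-decaying scale factor might be sketched as an exponential falloff; the time constant here is an arbitrary illustration, not a disclosed value.

```python
import math

def decaying_scale(t, s0=1.0, time_constant=0.5):
    """Time-decaying scale factor s(t) = s0 * exp(-t / time_constant),
    gradually suppressing the voluntary-movement component F(t)."""
    return s0 * math.exp(-t / time_constant)
```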
  • At step 212, the force functions previously defined may be summed. If no scale factor is to be used, the resulting sum may be represented by F(t)+G(t). If a scale factor is to be applied, the resulting sum may be represented by sF(t)+G(t). In either case, the sum should be expressed in a form suitable for providing input to a motion simulation program for articulated figures. It should be appreciated that in some cases, the force functions may be separately provided to a motion simulation algorithm, and implicitly summed through iterative motion simulations. For example, an intermediate simulated result may be calculated using G(t) as input, and then the intermediate result and F(t) may be provided as input to a second iteration of the motion simulation to obtain a final result. The final result may be the same as if the sum F(t)+G(t) had been provided to the motion simulation algorithm. Such techniques merely perform the same summation implicitly, and should be considered as within the scope of the invention. [0045]
  • At step 214, the sum computed at step 212 is used as input to a motion simulation algorithm, which determines a simulated motion P(t) for the articulated figure using a forward-dynamics solution process. Essentially, the equations of motion are used to define the position of each segment of the articulated figure at each desired time increment. Such calculation methods are straightforward and need not be described here. For example, a rag doll motion simulation algorithm may be configured to operate on the sum of F(t)+G(t). Any suitable motion simulation method known in the art for articulated figures may be used. The resulting output P(t) may provide the basis for a character animation, as known in the art. As previously described, the output P(t) for each time increment may also be compared with the predetermined animation curve Q(t) over a look-ahead interval Δt to calculate F(t) during step 214, thereby driving the output animation towards a desired pose. [0046]
  • The methods of the invention may be performed using any suitable computer system. FIG. 3 shows a block diagram of a computer system 300, comprising a computer 302 connected to a memory 306 containing instructions for performing steps of the invention, and to a database 304 containing previously-defined data, such as an input animation curve Q(t) and/or an external force function G(t). Computer 302 may be configured to read removable media 308, on which instructions for performing the steps of the invention may be encoded. Such instructions may be read and transferred to memory 306 for execution by computer 302. The instructions may be written in any suitable programming language and encoded for performing steps of a method according to the invention, as described above. [0047]
  • Having thus described preferred embodiments of a method for animating an articulated figure, it should be apparent to those skilled in the art that certain advantages of the within system have been achieved. It should also be appreciated that various modifications, adaptations, and alternative embodiments thereof may be made within the scope and spirit of the present invention. The invention is defined by the appended claims. [0048]

Claims (19)

What is claimed is:
1. A method for determining movements of an articulated figure for use in computer-generated animation, the method comprising:
accessing a pose sequence Q(t), wherein Q(t) comprises position values associated with segments of an articulated figure at sequential times of the pose sequence;
calculating an inverse-dynamics solution F(t), wherein F(t) comprises calculated torque values for the segments during sequential forward-looking intervals Δt, such as would result in movements of the articulated figure corresponding to Q(t);
accessing force data G(t), wherein G(t) comprises external force values for simulating a response of the articulated figure; and
simulating a dynamic response of the articulated figure in reaction to a sum of F(t) and G(t), thereby defining a simulated pose sequence P(t).
2. The method of claim 1, further comprising setting Δt equal to a user-determinable value, prior to the calculating step.
3. The method of claim 1, further comprising scaling F(t) by a scale factor s, whereby the simulating step defines P(t) by a simulated dynamic response of the articulated figure in reaction to a sum of F(t) scaled by s and G(t).
4. The method of claim 3, further comprising receiving user input defining a value of s, prior to the scaling step.
5. The method of claim 3, wherein the scaling step further comprises scaling F(t) by s, wherein s is less than one.
6. The method of claim 3, wherein the scaling step further comprises scaling F(t) by s, wherein s is greater than one.
7. The method of claim 3, wherein the scaling step further comprises scaling F(t) by s, wherein s comprises a time-dependent function.
8. The method of claim 1, further comprising calculating G(t) using P(t) as input to determine collision events between the articulated figure and other simulated objects, whereby impulse values for G(t) are determined.
9. The method of claim 1, wherein the calculating step and the simulating step are performed concurrently.
10. The method of claim 1, wherein the simulating step is performed after the calculating step has completed by defining F(t) over an animation sequence.
11. A computer-readable media encoded with instructions for determining movements of an articulated figure for use in computer-generated animation, the instructions comprising:
accessing a pose sequence Q(t), wherein Q(t) comprises position values associated with segments of an articulated figure at sequential times of the pose sequence;
calculating an inverse-dynamics solution F(t), wherein F(t) comprises calculated torque values for the segments during sequential forward-looking intervals Δt, such as would result in movements of the articulated figure corresponding to Q(t);
accessing force data G(t), wherein G(t) comprises external force values for simulating a response of the articulated figure; and
providing a sum of F(t) and G(t) suitable for input in simulating a dynamic response of the articulated figure using a forward-dynamics motion simulation to determine a simulated pose sequence P(t).
12. The computer-readable media of claim 11, wherein the instructions further comprise setting Δt equal to a user-determinable value, prior to the calculating step.
13. The computer-readable media of claim 11, wherein the instructions further comprise scaling F(t) by a scale factor s, whereby the providing step provides a sum of F(t) scaled by s and G(t).
14. The computer-readable media of claim 13, wherein the instructions further comprise receiving user input defining a value of s, prior to the scaling step.
15. The computer-readable media of claim 13, wherein the instructions further comprise scaling F(t) by s, wherein s is less than one.
16. The computer-readable media of claim 13, wherein the instructions further comprise scaling F(t) by s, wherein s is greater than one.
17. The computer-readable media of claim 13, wherein the instructions further comprise scaling F(t) by s, wherein s comprises a time-dependent function.
18. The computer-readable media of claim 11, wherein the instructions further comprise calculating G(t) using P(t) as input to determine collision events between the articulated figure and other simulated objects, whereby impulse values for G(t) are determined.
The computer-readable media of claim 11, wherein the instructions further comprise performing the calculating step and the simulating step concurrently.
19. The computer-readable media of claim 11, wherein the instructions further comprise performing the simulating step after the calculating step has completed by defining F(t) over an animation sequence.
US10/715,778 2002-11-15 2003-11-17 Method for motion simulation of an articulated figure using animation input Abandoned US20040169656A1 (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
US10/715,778 US20040169656A1 (en) 2002-11-15 2003-11-17 Method for motion simulation of an articulated figure using animation input

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
US42656002P 2002-11-15 2002-11-15
US10/715,778 US20040169656A1 (en) 2002-11-15 2003-11-17 Method for motion simulation of an articulated figure using animation input

Publications (1)

Publication Number Publication Date
US20040169656A1 true US20040169656A1 (en) 2004-09-02

Family

ID=32326376

Family Applications (6)

Application Number Title Priority Date Filing Date
US10/715,778 Abandoned US20040169656A1 (en) 2002-11-15 2003-11-17 Method for motion simulation of an articulated figure using animation input
US10/715,777 Active 2025-11-14 US7536047B2 (en) 2002-11-15 2003-11-17 Method for digitally rendering skin or like materials
US10/715,869 Expired - Lifetime US6990230B2 (en) 2002-11-15 2003-11-17 Reverse-rendering method for digital modeling
US10/715,870 Active 2025-01-05 US7079137B2 (en) 2002-11-15 2003-11-17 Method for digitally rendering an object using measured BRDF data
US10/715,775 Expired - Lifetime US6983082B2 (en) 2002-11-15 2003-11-17 Reality-based light environment for digital imaging in motion pictures
US12/403,185 Active 2025-12-20 US8515157B2 (en) 2002-11-15 2009-03-12 Method for digitally rendering skin or like materials

Family Applications After (5)

Application Number Title Priority Date Filing Date
US10/715,777 Active 2025-11-14 US7536047B2 (en) 2002-11-15 2003-11-17 Method for digitally rendering skin or like materials
US10/715,869 Expired - Lifetime US6990230B2 (en) 2002-11-15 2003-11-17 Reverse-rendering method for digital modeling
US10/715,870 Active 2025-01-05 US7079137B2 (en) 2002-11-15 2003-11-17 Method for digitally rendering an object using measured BRDF data
US10/715,775 Expired - Lifetime US6983082B2 (en) 2002-11-15 2003-11-17 Reality-based light environment for digital imaging in motion pictures
US12/403,185 Active 2025-12-20 US8515157B2 (en) 2002-11-15 2009-03-12 Method for digitally rendering skin or like materials

Country Status (5)

Country Link
US (6) US20040169656A1 (en)
EP (3) EP1565872A4 (en)
JP (3) JP2006507585A (en)
AU (3) AU2003295582A1 (en)
WO (3) WO2004047008A1 (en)

Cited By (12)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20050096889A1 (en) * 2003-10-29 2005-05-05 Snecma Moteurs Moving a virtual articulated object in a virtual environment while avoiding collisions between the articulated object and the environment
US20050096890A1 (en) * 2003-10-29 2005-05-05 Snecma Moteurs Moving a virtual articulated object in a virtual environment while avoiding internal collisions between the articulated elements of the articulated object
WO2006102599A2 (en) * 2005-03-23 2006-09-28 Electronic Arts Inc. Computer simulation of body dynamics including a solver that solves in linear time for a set of constraints
WO2006133228A2 (en) * 2005-06-06 2006-12-14 Electronic Arts Inc. Adaptive contact based skeleton for animation of characters in video games
US20060286522A1 (en) * 2005-06-17 2006-12-21 Victor Ng-Thow-Hing System and method for activation-driven muscle deformations for existing character motion
US7403202B1 (en) * 2005-07-12 2008-07-22 Electronic Arts, Inc. Computer animation of simulated characters using combinations of motion-capture data and external force modelling or other physics models
US20090259450A1 (en) * 2008-04-15 2009-10-15 Cleary Paul William physics-based simulation
US20130033486A1 (en) * 2011-08-05 2013-02-07 Mccartney Jeffrey Computer System For Animating 3D Models Using Offset Transforms
US8860731B1 (en) * 2009-12-21 2014-10-14 Lucasfilm Entertainment Company Ltd. Refining animation
US20180204366A1 (en) * 2005-04-19 2018-07-19 Digitalfish, Inc. Techniques and Workflows for Computer Graphics Animation System
US10229530B2 (en) 2016-01-19 2019-03-12 Canon Kabushiki Kaisha Image processing device and method therefor
CN110930483A (en) * 2019-11-20 2020-03-27 腾讯科技(深圳)有限公司 Role control method, model training method and related device

Families Citing this family (140)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
EP1074943A3 (en) * 1999-08-06 2004-03-24 Canon Kabushiki Kaisha Image processing method and apparatus
US7342489B1 (en) 2001-09-06 2008-03-11 Siemens Schweiz Ag Surveillance system control unit
US7239345B1 (en) * 2001-10-12 2007-07-03 Worldscape, Inc. Camera arrangements with backlighting detection and methods of using same
GB2393887B (en) * 2002-10-04 2005-10-26 Criterion Software Ltd Three-dimensional computer graphics
JP3962676B2 (en) * 2002-11-29 2007-08-22 キヤノン株式会社 Image processing method and apparatus
US7050078B2 (en) * 2002-12-19 2006-05-23 Accenture Global Services Gmbh Arbitrary object tracking augmented reality applications
US7714858B2 (en) * 2003-04-18 2010-05-11 Hewlett-Packard Development Company, L.P. Distributed rendering of interactive soft shadows
US7787692B2 (en) * 2003-09-25 2010-08-31 Fujifilm Corporation Image processing apparatus, image processing method, shape diagnostic apparatus, shape diagnostic method and program
GB2410639A (en) * 2004-01-30 2005-08-03 Hewlett Packard Development Co Viewfinder alteration for panoramic imaging
JP4692956B2 (en) * 2004-11-22 2011-06-01 株式会社ソニー・コンピュータエンタテインメント Drawing processing apparatus and drawing processing method
KR100609145B1 (en) * 2004-12-20 2006-08-08 한국전자통신연구원 Rendering Apparatus and Method for real-time global illumination in real light environment
EP1686531B1 (en) * 2005-01-27 2018-04-25 QUALCOMM Incorporated A method, a software product and an electronic device for generating an image composition
US8606383B2 (en) * 2005-01-31 2013-12-10 The Invention Science Fund I, Llc Audio sharing
US20060170956A1 (en) 2005-01-31 2006-08-03 Jung Edward K Shared image devices
US9124729B2 (en) 2005-01-31 2015-09-01 The Invention Science Fund I, Llc Shared image device synchronization or designation
US20060221197A1 (en) * 2005-03-30 2006-10-05 Jung Edward K Image transformation estimator of an imaging device
US8902320B2 (en) 2005-01-31 2014-12-02 The Invention Science Fund I, Llc Shared image device synchronization or designation
US9910341B2 (en) 2005-01-31 2018-03-06 The Invention Science Fund I, Llc Shared image device designation
US9489717B2 (en) 2005-01-31 2016-11-08 Invention Science Fund I, Llc Shared image device
US9082456B2 (en) 2005-01-31 2015-07-14 The Invention Science Fund I Llc Shared image device designation
US8223845B1 (en) 2005-03-16 2012-07-17 Apple Inc. Multithread processing of video frames
US7710423B2 (en) * 2005-03-21 2010-05-04 Microsoft Corproation Automatic layout of items along an embedded one-manifold path
KR101199498B1 (en) 2005-03-31 2012-11-09 삼성전자주식회사 Apparatus for encoding or generation of multi-view video by using a camera parameter, and a method thereof, and a recording medium having a program to implement thereof
US8345252B2 (en) * 2005-04-25 2013-01-01 X-Rite, Inc. Method and system for enhanced formulation and visualization rendering
US9001215B2 (en) 2005-06-02 2015-04-07 The Invention Science Fund I, Llc Estimating shared image device operational capabilities or resources
US9451200B2 (en) 2005-06-02 2016-09-20 Invention Science Fund I, Llc Storage access technique for captured data
US9819490B2 (en) 2005-05-04 2017-11-14 Invention Science Fund I, Llc Regional proximity for shared image device(s)
US9191611B2 (en) 2005-06-02 2015-11-17 Invention Science Fund I, Llc Conditional alteration of a saved image
US9967424B2 (en) 2005-06-02 2018-05-08 Invention Science Fund I, Llc Data storage usage protocol
US8964054B2 (en) 2006-08-18 2015-02-24 The Invention Science Fund I, Llc Capturing selected image objects
US9621749B2 (en) 2005-06-02 2017-04-11 Invention Science Fund I, Llc Capturing selected image objects
US10003762B2 (en) 2005-04-26 2018-06-19 Invention Science Fund I, Llc Shared image devices
US8681225B2 (en) 2005-06-02 2014-03-25 Royce A. Levien Storage access technique for captured data
US9093121B2 (en) 2006-02-28 2015-07-28 The Invention Science Fund I, Llc Data management of an audio data stream
US9942511B2 (en) 2005-10-31 2018-04-10 Invention Science Fund I, Llc Preservation/degradation of video/audio aspects of a data stream
US9167195B2 (en) 2005-10-31 2015-10-20 Invention Science Fund I, Llc Preservation/degradation of video/audio aspects of a data stream
US9076208B2 (en) * 2006-02-28 2015-07-07 The Invention Science Fund I, Llc Imagery processing
WO2006122320A2 (en) * 2005-05-12 2006-11-16 Tenebraex Corporation Improved methods of creating a virtual window
US20070120980A1 (en) 2005-10-31 2007-05-31 Searete Llc, A Limited Liability Corporation Of The State Of Delaware Preservation/degradation of video/audio aspects of a data stream
US20070273711A1 (en) * 2005-11-17 2007-11-29 Maffei Kenneth C 3D graphics system and method
WO2007070088A1 (en) * 2005-12-16 2007-06-21 Thomson Licensing Method, apparatus and system for providing reproducible digital imagery products
US7589724B1 (en) 2006-02-15 2009-09-15 Adobe Systems, Incorporated Successive-convolution-compositing technique for rendering soft shadows
US7623137B1 (en) * 2006-02-15 2009-11-24 Adobe Systems, Incorporated Successive-convolution-compositing technique for rendering translucent surfaces
US7411688B1 (en) 2006-03-17 2008-08-12 Arius3D Inc. Method and system for laser intensity calibration in a three-dimensional multi-color laser scanning system
US20070242141A1 (en) * 2006-04-14 2007-10-18 Sony Corporation And Sony Electronics Inc. Adjustable neutral density filter system for dynamic range compression from scene to imaging sensor
US8633927B2 (en) 2006-07-25 2014-01-21 Nvidia Corporation Re-render acceleration of frame with lighting change
US8115774B2 (en) * 2006-07-28 2012-02-14 Sony Computer Entertainment America Llc Application of selective regions of a normal map based on joint position in a three-dimensional model
US8446509B2 (en) * 2006-08-09 2013-05-21 Tenebraex Corporation Methods of creating a virtual window
GB0616685D0 (en) * 2006-08-23 2006-10-04 Warwick Warp Ltd Retrospective shading approximation from 2D and 3D imagery
US8739137B2 (en) 2006-10-19 2014-05-27 Purdue Research Foundation Automatic derivative method for a computer programming language
US8281299B2 (en) 2006-11-10 2012-10-02 Purdue Research Foundation Map-closure: a general purpose mechanism for nonstandard interpretation
US8094182B2 (en) * 2006-11-16 2012-01-10 Imove, Inc. Distributed video sensor panoramic imaging system
CA2669001C (en) * 2006-11-20 2015-10-20 Thomson Licensing Method and system for modeling light
JP4808600B2 (en) * 2006-11-22 2011-11-02 デジタルファッション株式会社 Rendering program, rendering apparatus, and rendering method
JP4842242B2 (en) * 2006-12-02 2011-12-21 韓國電子通信研究院 Method and apparatus for real-time expression of skin wrinkles during character animation
US9767599B2 (en) * 2006-12-29 2017-09-19 X-Rite Inc. Surface appearance simulation
US20080178087A1 (en) * 2007-01-19 2008-07-24 Microsoft Corporation In-Scene Editing of Image Sequences
EP2122577B1 (en) * 2007-01-24 2018-03-07 Swiftfoot Graphics Ab Method, display adapter and computer program product for improved graphics performance by using a replaceable culling program
KR100967701B1 (en) * 2007-02-26 2010-07-07 한국외국어대학교 연구산학협력단 Reconstructing three dimensional oil paintings
US20080320126A1 (en) * 2007-06-25 2008-12-25 Microsoft Corporation Environment sensing for interactive entertainment
US7929142B2 (en) * 2007-09-25 2011-04-19 Microsoft Corporation Photodiode-based bi-directional reflectance distribution function (BRDF) measurement
US20090079758A1 (en) * 2007-09-25 2009-03-26 Max-Planck-Gesellschaft Zur Forderung Per Wissenschaften E.V. Method and device for generating shadow maps
US8310481B2 (en) * 2007-10-12 2012-11-13 Edward Ernest Bailey Computer aided design method for enhancement of local refinement through T-splines
US8159490B2 (en) * 2007-10-16 2012-04-17 Dreamworks Animation Llc Shading of translucent objects
WO2009064504A1 (en) * 2007-11-16 2009-05-22 Tenebraex Corporation Systems and methods of creating a virtual window
US8791984B2 (en) * 2007-11-16 2014-07-29 Scallop Imaging, Llc Digital security camera
US20090290033A1 (en) * 2007-11-16 2009-11-26 Tenebraex Corporation Systems and methods of creating a virtual window
KR100901270B1 (en) * 2007-12-15 2009-06-09 Electronics and Telecommunications Research Institute System and method for rendering surface materials
US20090172756A1 (en) * 2007-12-31 2009-07-02 Motorola, Inc. Lighting analysis and recommender system for video telephony
US8509569B2 (en) * 2008-02-11 2013-08-13 Apple Inc. Optimization of image processing using multiple processing units
US8243071B2 (en) 2008-02-29 2012-08-14 Microsoft Corporation Modeling and rendering of heterogeneous translucent materials using the diffusion equation
US9098647B2 (en) 2008-03-10 2015-08-04 Apple Inc. Dynamic viewing of a three dimensional space
US8350850B2 (en) * 2008-03-31 2013-01-08 Microsoft Corporation Using photo collections for three dimensional modeling
IL190539A (en) * 2008-03-31 2015-01-29 Rafael Advanced Defense Sys Methods for transferring points of interest between images with non-parallel viewing directions
US7937245B2 (en) * 2008-04-02 2011-05-03 Dreamworks Animation Llc Rendering of subsurface scattering effects in translucent objects
US8239822B2 (en) * 2008-04-18 2012-08-07 Microsoft Corp. Symbolic forward and reverse differentiation
US8238651B2 (en) * 2008-06-05 2012-08-07 Microsoft Corporation Image-guided abstraction of building facades
US9619917B2 (en) 2008-10-03 2017-04-11 Apple Inc. Depth of field for a camera in a media-editing application
US8791951B2 (en) * 2008-12-01 2014-07-29 Electronics And Telecommunications Research Institute Image synthesis apparatus and method supporting measured materials properties
US9098926B2 (en) * 2009-02-06 2015-08-04 The Hong Kong University Of Science And Technology Generating three-dimensional façade models from images
WO2010088840A1 (en) 2009-02-06 2010-08-12 The Hong Kong University Of Science And Technology Generating three-dimensional models from images
US9098945B2 (en) * 2009-05-01 2015-08-04 Microsoft Technology Licensing, Llc Modeling anisotropic surface reflectance with microfacet synthesis
US8797336B2 (en) * 2009-06-30 2014-08-05 Apple Inc. Multi-platform image processing framework
US20110069148A1 (en) * 2009-09-22 2011-03-24 Tenebraex Corporation Systems and methods for correcting images in a multi-sensor system
CN101872491B (en) * 2010-05-21 2011-12-28 Tsinghua University Free-viewpoint relighting method and system based on photometric stereo
KR101633377B1 (en) * 2010-06-18 2016-07-08 Samsung Electronics Co., Ltd. Method and Apparatus for Processing Frames Obtained by Multi-Exposure
US8928659B2 (en) * 2010-06-23 2015-01-06 Microsoft Corporation Telepresence systems with viewer perspective adjustment
CN103155004B (en) * 2010-09-01 2016-05-18 Musco Corporation Apparatus, system and method for demonstrating a lighting scheme by image rendering
KR101194364B1 (en) * 2011-07-14 2012-10-25 Gwangju Institute of Science and Technology Appearance material design and manufacturing method and system
DE102011079380A1 (en) * 2011-07-19 2013-01-24 Siemens Aktiengesellschaft Method, computer program and system for computer-aided evaluation of image data sets
US9250966B2 (en) * 2011-08-11 2016-02-02 Otoy, Inc. Crowd-sourced video rendering system
JP6049327B2 (en) 2011-08-11 2016-12-21 Canon Inc. Image processing apparatus and control method thereof
JP2013127774A (en) * 2011-11-16 2013-06-27 Canon Inc Image processing device, image processing method, and program
US9183654B2 (en) * 2012-03-02 2015-11-10 Sean Geggie Live editing and integrated control of image-based lighting of 3D models
CN103327221B (en) * 2012-03-20 2016-12-14 Altek Corporation Image pickup device and image preview system and image preview method thereof
GB2500405B (en) 2012-03-20 2014-04-16 Lightmap Ltd Point and click lighting for image based lighting surfaces
TWI520604B (en) * 2012-03-20 2016-02-01 華晶科技股份有限公司 Image pickup device and image preview system and image preview method thereof
US8416240B1 (en) * 2012-04-02 2013-04-09 Google Inc. Determining 3D model information from stored images
CN109584343B (en) * 2012-08-03 2023-07-25 DreamWorks Animation LLC Time dependencies in a dependency graph
US9589000B2 (en) 2012-08-30 2017-03-07 Atheer, Inc. Method and apparatus for content association and history tracking in virtual and augmented reality
US20140078144A1 (en) * 2012-09-14 2014-03-20 Squee, Inc. Systems and methods for avatar creation
US9947132B2 (en) * 2013-03-15 2018-04-17 Nvidia Corporation Material representation data structure and method of representing a material for digital image synthesis
EP2984521A4 (en) 2013-04-11 2016-10-19 De Roer Carlo Van System and method for producing virtual light source movement in motion pictures and other media
JPWO2015045501A1 (en) * 2013-09-27 2017-03-09 Hitachi Automotive Systems, Ltd. External environment recognition device
CA2931205A1 (en) 2013-11-22 2015-05-28 Sonify Biosciences, Llc Skin cancer treatment using low intensity ultrasound
US9509905B2 (en) * 2013-12-17 2016-11-29 Google Inc. Extraction and representation of three-dimensional (3D) and bidirectional reflectance distribution function (BRDF) parameters from lighted image sequences
US9600904B2 (en) 2013-12-30 2017-03-21 Samsung Electronics Co., Ltd. Illuminating a virtual environment with camera light data
US9648699B2 (en) 2014-03-03 2017-05-09 LiveLocation, Inc. Automatic control of location-registered lighting according to a live reference lighting environment
JP6410451B2 (en) * 2014-03-31 2018-10-24 Canon Inc. Information processing apparatus, measurement system, information processing method, and program
US10169909B2 (en) * 2014-08-07 2019-01-01 Pixar Generating a volumetric projection for an object
US9767620B2 (en) 2014-11-26 2017-09-19 Restoration Robotics, Inc. Gesture-based editing of 3D models for hair transplantation applications
US10133830B2 (en) * 2015-01-30 2018-11-20 Hover Inc. Scaling in a multi-dimensional building model
FR3034233B1 (en) * 2015-03-25 2018-08-10 Morpho Method of correcting an image of at least one object presented remotely in front of an imager and lit by a lighting system, and shooting system for implementing said method
US11432046B1 (en) 2015-06-12 2022-08-30 Veepio Holdings, Llc Interactive, personalized objects in content creator's media with e-commerce link associated therewith
WO2017075452A1 (en) * 2015-10-29 2017-05-04 True Image Interactive, Inc Systems and methods for machine-generated avatars
WO2017217296A1 (en) * 2016-06-16 2017-12-21 Sony Interactive Entertainment Inc. Image processing device
US10489968B1 (en) 2016-09-14 2019-11-26 Musco Corporation Apparatus, method, and system for three-dimensional (3D) visualization of light for evaluation of playability, glare, and gaps
US10594995B2 (en) * 2016-12-13 2020-03-17 Buf Canada Inc. Image capture and display on a dome for chroma keying
EP3336801A1 (en) * 2016-12-19 2018-06-20 Thomson Licensing Method and apparatus for constructing lighting environment representations of 3d scenes
EP3351899B1 (en) * 2017-01-24 2020-06-17 Leica Geosystems AG Method and device for inpainting of colourised three-dimensional point clouds
JP6859763B2 (en) * 2017-03-10 2021-04-14 Ricoh Co., Ltd. Program and information processing device
US11004173B2 (en) 2017-03-13 2021-05-11 Mediatek Inc. Method for processing projection-based frame that includes at least one projection face packed in 360-degree virtual reality projection layout
US11057643B2 (en) * 2017-03-13 2021-07-06 Mediatek Inc. Method and apparatus for generating and encoding projection-based frame that includes at least one padding region and at least one projection face packed in 360-degree virtual reality projection layout
WO2018184528A1 (en) 2017-04-05 2018-10-11 Mediatek Inc. Method and apparatus for processing projection-based frame with at least one projection face generated using non-uniform mapping
US10181199B2 (en) * 2017-05-08 2019-01-15 Adobe Systems Incorporated Material capture using imaging
KR102149180B1 (en) * 2017-07-07 2020-08-28 Electronics and Telecommunications Research Institute Method for synthesizing virtual content for augmented reality and apparatus using the same
CN111034191A 2017-08-18 2020-04-17 MediaTek Inc. Method and apparatus for reducing artifacts in projection-based frames
KR102107706B1 (en) * 2017-10-31 2020-05-07 SK Telecom Co., Ltd. Method and apparatus for processing image
CN109147023A (en) * 2018-07-27 2019-01-04 Beijing Microlive Vision Technology Co., Ltd. Face-based three-dimensional special effect generation method and apparatus, and electronic device
JP7328651B2 (en) * 2018-08-01 2023-08-17 Toshiba Lighting & Technology Corporation Generation device, generation method and generation program
CN109587557B (en) * 2019-01-11 2022-03-08 BOE Technology Group Co., Ltd. Data transmission method and apparatus, and display device
US10986308B2 (en) 2019-03-20 2021-04-20 Adobe Inc. Intelligent video reframing
US10949646B2 (en) 2019-04-30 2021-03-16 Samsung Electronics Co., Ltd. Performing an iterative bundle adjustment for an imaging device
EP3764249A1 (en) * 2019-07-08 2021-01-13 Dmitri Goloubentsev A streaming compiler for automatic adjoint differentiation
GB2586060B (en) * 2019-08-01 2022-09-21 Sony Interactive Entertainment Inc Surface characterisation apparatus and system
US11461968B2 (en) * 2020-01-30 2022-10-04 Unity Technologies Sf Method of inferring microdetail on skin animation
US11620765B2 (en) * 2020-07-02 2023-04-04 Unity Technologies Sf Automatic detection of a calibration object for modifying image parameters
CN111815768B (en) * 2020-09-14 2020-12-18 Tencent Technology (Shenzhen) Co., Ltd. Three-dimensional face reconstruction method and device
CN113034662B (en) * 2021-03-29 2023-03-31 NetEase (Hangzhou) Network Co., Ltd. Virtual scene rendering method and apparatus, storage medium and electronic device
KR102555166B1 (en) * 2022-10-04 2023-07-12 Inha University Research and Business Foundation Method and System for Facial Texture Synthesis with Skin Microelement Structure

Family Cites Families (59)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US4645459A (en) * 1982-07-30 1987-02-24 Honeywell Inc. Computer generated synthesized imagery
JPH0743765B2 1987-10-20 1995-05-15 Fuji Photo Film Co., Ltd. Radiation image processing method and apparatus
CA1316591C (en) * 1987-10-20 1993-04-20 Kazuhiro Hishinuma Method and apparatus for radiation image processing and x-ray image processing
US5739820A (en) * 1992-11-19 1998-04-14 Apple Computer Inc. Method and apparatus for specular reflection shading of computer graphic images
US5490240A (en) * 1993-07-09 1996-02-06 Silicon Graphics, Inc. System and method of generating interactive computer graphic images incorporating three dimensional textures
JPH0773339A (en) * 1993-09-03 1995-03-17 Sharp Corp Light source luminance calculator
US5546475A (en) * 1994-04-29 1996-08-13 International Business Machines Corporation Produce recognition system
US5600763A (en) * 1994-07-21 1997-02-04 Apple Computer, Inc. Error-bounded antialiased rendering of complex scenes
US5745759A (en) * 1994-10-14 1998-04-28 Qnx Software Systems, Ltd. Window kernel
JP3554616B2 (en) * 1994-12-13 2004-08-18 Fujitsu Ltd. Drawing method and apparatus using radiosity method
GB9501832D0 (en) * 1995-01-31 1995-03-22 Videologic Ltd Texturing and shading of 3-d images
IL113496A (en) * 1995-04-25 1999-09-22 Cognitens Ltd Apparatus and method for recreating and manipulating a 3d object based on a 2d projection thereof
US5966134A (en) * 1996-06-28 1999-10-12 Softimage Simulating cel animation and shading
GB9616262D0 (en) * 1996-08-02 1996-09-11 Philips Electronics Nv Post-processing generation of focus/defocus effects for computer graphics images
US5748792A (en) * 1996-08-13 1998-05-05 Polaroid Corporation Large kernel filtering using a fixed-size block processor
JP3750830B2 (en) * 1996-08-30 2006-03-01 Sony Corp. Color correction apparatus in imaging apparatus
US6078332A (en) * 1997-01-28 2000-06-20 Silicon Graphics, Inc. Real-time lighting method using 3D texture mapping
US6052124A (en) * 1997-02-03 2000-04-18 Yissum Research Development Company System and method for directly estimating three-dimensional structure of objects in a scene and camera motion from three two-dimensional views of the scene
US5894309A (en) * 1997-02-27 1999-04-13 Mitsubishi Electric Information Technology Center America, Inc. System for modifying lighting in photographs
US6310644B1 (en) * 1997-03-26 2001-10-30 3Dm Devices Inc. Camera theodolite system
US6097394A (en) * 1997-04-28 2000-08-01 Board Of Trustees, Leland Stanford, Jr. University Method and system for light field rendering
JPH11175762A (en) * 1997-12-08 1999-07-02 Katsushi Ikeuchi Light environment measuring apparatus, and apparatus and method for shading a virtual image using the same
JP3688879B2 (en) * 1998-01-30 2005-08-31 Toshiba Corp. Image recognition apparatus, image recognition method, and recording medium therefor
US6272231B1 (en) * 1998-11-06 2001-08-07 Eyematic Interfaces, Inc. Wavelet-based facial motion capture for avatar animation
US6333749B1 (en) * 1998-04-17 2001-12-25 Adobe Systems, Inc. Method and apparatus for image assisted modeling of three-dimensional scenes
US6137491A (en) * 1998-06-05 2000-10-24 Microsoft Corporation Method and apparatus for reconstructing geometry using geometrically constrained structure from motion with points on planes
US6271855B1 (en) * 1998-06-18 2001-08-07 Microsoft Corporation Interactive construction of 3D models from panoramic images employing hard and soft constraint characterization and decomposing techniques
US6373496B1 (en) * 1998-08-12 2002-04-16 S3 Graphics Co., Ltd. Apparatus and method for texture mapping
US6342887B1 (en) * 1998-11-18 2002-01-29 Earl Robert Munroe Method and apparatus for reproducing lighting effects in computer animated objects
US6278460B1 (en) * 1998-12-15 2001-08-21 Point Cloud, Inc. Creating a three-dimensional model from two-dimensional images
CA2259882A1 (en) * 1999-01-22 2000-07-22 I.S.G. Technologies, Inc. Interactive sculpting for volumetric exploration and feature extraction
US6313842B1 (en) * 1999-03-03 2001-11-06 Discreet Logic Inc. Generating image data
US6496597B1 (en) * 1999-03-03 2002-12-17 Autodesk Canada Inc. Generating image data
US6400848B1 (en) * 1999-03-30 2002-06-04 Eastman Kodak Company Method for modifying the perspective of a digital image
US6483514B1 (en) * 1999-04-15 2002-11-19 Pixar Animation Studios Motion blurring implicit surfaces
JP4001435B2 (en) * 1999-04-19 2007-10-31 Bandai Namco Games Inc. Game device, image data creation tool, and information storage medium
US6297834B1 (en) 1999-06-10 2001-10-02 Hewlett-Packard Company Direction-dependent texture maps in a graphics system
US6504538B1 (en) * 1999-07-01 2003-01-07 Microsoft Corporation Method and system for generating light values for a set of vertices
JP3486575B2 (en) * 1999-08-31 2004-01-13 Canon Inc. Mixed reality presentation apparatus and method, and storage medium
US6373487B1 (en) * 1999-09-17 2002-04-16 Hewlett-Packard Company Methods and apparatus for constructing a 3D model of a scene from calibrated images of the scene
FR2799022B1 (fr) * 1999-09-29 2002-02-01 Oreal Makeup assistance device and assembly consisting of such a device and a device for delivering a product having a predetermined BRDF, selected by the makeup assistance device
US6694064B1 (en) * 1999-11-19 2004-02-17 Positive Systems, Inc. Digital aerial image mosaic method and apparatus
US20020122589A1 (en) * 1999-11-29 2002-09-05 Donald M. Reiman Constructing profiles to compensate for non-linearities in image capture
WO2001048697A1 (en) * 1999-12-23 2001-07-05 Intel Corporation Methods of hierarchical static scene simplification and polygon budgeting for 3d models
US6515674B1 (en) * 2000-03-17 2003-02-04 Hewlett-Packard Company Apparatus for and of rendering 3d objects with parametric texture maps
US6750873B1 (en) * 2000-06-27 2004-06-15 International Business Machines Corporation High quality texture reconstruction from multiple scans
EP1323013A2 (en) * 2000-08-24 2003-07-02 Immersive Technologies LLC Computerized image system
JP2002152719A (en) * 2000-08-29 2002-05-24 Usc Corp Monitor method and monitor device utilizing curved surface image
US6765573B2 (en) * 2000-10-26 2004-07-20 Square Enix Co., Ltd. Surface shading using stored texture map based on bidirectional reflectance distribution function
JP3406965B2 (en) * 2000-11-24 2003-05-19 Canon Inc. Mixed reality presentation device and control method thereof
JP3572025B2 (en) * 2001-03-07 2004-09-29 Canon Inc. Image reproducing apparatus, image processing apparatus and their methods
US6639594B2 (en) * 2001-06-03 2003-10-28 Microsoft Corporation View-dependent image synthesis
US7106325B2 (en) * 2001-08-03 2006-09-12 Hewlett-Packard Development Company, L.P. System and method for rendering digital images having surface reflectance properties
US6961058B2 (en) * 2001-08-10 2005-11-01 Microsoft Corporation Macrostructure modeling with microstructure reflectance slices
JP4443083B2 2001-10-09 2010-03-31 Bandai Namco Games Inc. Image generation system and information storage medium
US7221809B2 (en) * 2001-12-17 2007-05-22 Genex Technologies, Inc. Face recognition system and method
US7009608B2 (en) * 2002-06-06 2006-03-07 Nvidia Corporation System and method of using multiple representations per object in computer graphics
US7075534B2 (en) * 2002-06-21 2006-07-11 Forrester Hardenbergh Cole Method and system for automatically generating factored approximations for arbitrary bidirectional reflectance distribution functions
JP3972784B2 (en) * 2002-09-30 2007-09-05 Sony Corp. Image processing apparatus and method

Patent Citations (39)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US6044180A (en) * 1990-04-20 2000-03-28 Nec Corporation Method and apparatus for rapid scanning of color images
US5623428A (en) * 1990-12-25 1997-04-22 Shukyohojin, Kongo Zen Sohonzan Shorinji Method for developing computer animation
US5586224A (en) * 1990-12-25 1996-12-17 Shukyohojin, Kongo Zen Sohonzan Shorinji Robot or numerical control programming method
US5625577A (en) * 1990-12-25 1997-04-29 Shukyohojin, Kongo Zen Sohonzan Shorinji Computer-implemented motion analysis method using dynamics
US5526254A (en) * 1992-06-05 1996-06-11 Fujitsu Limited Simulation method and apparatus for manipulator apparatus, simulation and control method and apparatus for manipulator apparatus, and control method and apparatus for manipulator apparatus
US5499306A (en) * 1993-03-08 1996-03-12 Nippondenso Co., Ltd. Position-and-attitude recognition method and apparatus by use of image pickup means
US5403238A (en) * 1993-08-19 1995-04-04 The Walt Disney Company Amusement park attraction
US5982389A (en) * 1996-06-17 1999-11-09 Microsoft Corporation Generating optimized motion transitions for computer animated objects
US20030208335A1 (en) * 1996-07-03 2003-11-06 Hitachi, Ltd. Method, apparatus and system for recognizing actions
US6941239B2 (en) * 1996-07-03 2005-09-06 Hitachi, Ltd. Method, apparatus and system for recognizing actions
US6571193B1 (en) * 1996-07-03 2003-05-27 Hitachi, Ltd. Method, apparatus and system for recognizing actions
US6104412A (en) * 1996-08-21 2000-08-15 Nippon Telegraph And Telephone Corporation Method for generating animations of a multi-articulated structure, recording medium having recorded thereon the same and animation generating apparatus using the same
US6246420B1 (en) * 1996-10-11 2001-06-12 Matsushita Electric Industrial Co., Ltd. Movement data connecting method and apparatus therefor
US6088042A (en) * 1997-03-31 2000-07-11 Katrix, Inc. Interactive motion data animation system
US6057859A (en) * 1997-03-31 2000-05-02 Katrix, Inc. Limb coordination system for interactive computer animation of articulated characters with blended motion data
US6184899B1 (en) * 1997-03-31 2001-02-06 Treyarch Invention, L.L.C. Articulated figure animation using virtual actuators to simulate solutions for differential equations to display more realistic movements
US6124864A (en) * 1997-04-07 2000-09-26 Synapix, Inc. Adaptive modeling and segmentation of visual image streams
US6160907A (en) * 1997-04-07 2000-12-12 Synapix, Inc. Iterative three-dimensional process for creating finished media content
US6519360B1 (en) * 1997-09-17 2003-02-11 Minolta Co., Ltd. Image processing apparatus for comparing images based on color feature information and computer program product in a memory
US6166744A (en) * 1997-11-26 2000-12-26 Pathfinder Systems, Inc. System for combining virtual images with real-world scenes
US6148113A (en) * 1998-02-03 2000-11-14 Micrografx, Inc. System for stimulating the depth of field of an image in two dimensional space and method of operation
US5974168A (en) * 1998-04-16 1999-10-26 International Business Machines Corporation Acquiring bump maps from curved objects
US6628830B1 (en) * 1998-06-24 2003-09-30 Canon Kabushiki Kaisha Image processing method and apparatus and storage medium
US6628298B1 (en) * 1998-07-17 2003-09-30 The Regents Of The University Of California Apparatus and method for rendering synthetic objects into real scenes using measurements of scene illumination
US6657637B1 (en) * 1998-07-30 2003-12-02 Matsushita Electric Industrial Co., Ltd. Moving image combining apparatus combining computer graphic image and at least one video sequence composed of a plurality of video frames
US6362822B1 (en) * 1999-03-12 2002-03-26 Terminal Reality, Inc. Lighting and shadowing methods and arrangements for use in computer graphic simulations
US6552731B1 (en) * 1999-04-16 2003-04-22 Avid Technology, Inc. Multi-tone representation of a digital image on a digital nonlinear editing system
US6571024B1 (en) * 1999-06-18 2003-05-27 Sarnoff Corporation Method and apparatus for multi-view three dimensional estimation
US6750866B1 (en) * 2000-04-21 2004-06-15 Realistic Dynamics, Inc. Method and system for dynamically filtering the motion of articulated bodies
US6564108B1 (en) * 2000-06-07 2003-05-13 The Delfin Project, Inc. Method and system of auxiliary illumination for enhancing a scene during a multimedia presentation
US20030139849A1 (en) * 2000-11-17 2003-07-24 Yoshihiro Kuroki Device and method for controlling motion of legged mobile robot, and motion unit generating method for legged mobile robot
US6961640B2 (en) * 2000-11-17 2005-11-01 Sony Corporation Motion control for a legged robot
US20030012448A1 (en) * 2001-04-30 2003-01-16 Ronny Kimmel System and method for image enhancement, dynamic range compensation and illumination correction
US20020186314A1 (en) * 2001-06-08 2002-12-12 University Of Southern California Realistic scene illumination reproduction
US6538396B1 (en) * 2001-09-24 2003-03-25 Ultimatte Corporation Automatic foreground lighting effects in a composited scene
US20030103057A1 (en) * 2001-12-03 2003-06-05 Eric Graves Method and apparatus for color correction
US20030202120A1 (en) * 2002-04-05 2003-10-30 Mack Newton Eliot Virtual lighting system
US20040017372A1 (en) * 2002-07-18 2004-01-29 Park Min Je Motion reconstruction method from inter-frame feature correspondences of a singular video stream using a motion library
US7099494B2 (en) * 2002-07-18 2006-08-29 Korea Advanced Institute Of Science And Technology Motion reconstruction method from inter-frame feature correspondences of a singular video stream using a motion library

Cited By (24)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US7457733B2 (en) * 2003-10-29 2008-11-25 Snecma Moving a virtual articulated object in a virtual environment while avoiding collisions between the articulated object and the environment
US20050096889A1 (en) * 2003-10-29 2005-05-05 Snecma Moteurs Moving a virtual articulated object in a virtual environment while avoiding collisions between the articulated object and the environment
US20050096890A1 (en) * 2003-10-29 2005-05-05 Snecma Moteurs Moving a virtual articulated object in a virtual environment while avoiding internal collisions between the articulated elements of the articulated object
US7403880B2 (en) * 2003-10-29 2008-07-22 Snecma Moving a virtual articulated object in a virtual environment while avoiding internal collisions between the articulated elements of the articulated object
WO2006102599A2 (en) * 2005-03-23 2006-09-28 Electronic Arts Inc. Computer simulation of body dynamics including a solver that solves in linear time for a set of constraints
WO2006102599A3 (en) * 2005-03-23 2007-12-27 Electronic Arts Inc Computer simulation of body dynamics including a solver that solves in linear time for a set of constraints
US7505883B2 (en) 2005-03-23 2009-03-17 Electronic Arts Inc. Computer simulation of body dynamics including a solver that solves in linear time for a set of constraints
US20180204366A1 (en) * 2005-04-19 2018-07-19 Digitalfish, Inc. Techniques and Workflows for Computer Graphics Animation System
US10546405B2 (en) * 2005-04-19 2020-01-28 Digitalfish, Inc. Techniques and workflows for computer graphics animation system
WO2006133228A3 (en) * 2005-06-06 2008-08-14 Electronic Arts Inc Adaptive contact based skeleton for animation of characters in video games
WO2006133228A2 (en) * 2005-06-06 2006-12-14 Electronic Arts Inc. Adaptive contact based skeleton for animation of characters in video games
US7573477B2 (en) * 2005-06-17 2009-08-11 Honda Motor Co., Ltd. System and method for activation-driven muscle deformations for existing character motion
US20060286522A1 (en) * 2005-06-17 2006-12-21 Victor Ng-Thow-Hing System and method for activation-driven muscle deformations for existing character motion
US7403202B1 (en) * 2005-07-12 2008-07-22 Electronic Arts, Inc. Computer animation of simulated characters using combinations of motion-capture data and external force modelling or other physics models
US20090259450A1 (en) * 2008-04-15 2009-10-15 Cleary Paul William Physics-based simulation
AU2009201433B2 (en) * 2008-04-15 2013-11-21 Electronics And Telecommunications Research Institute Improved physics-based simulation
KR101687786B1 (en) * 2008-04-15 2016-12-20 한국전자통신연구원 Method for improved physics-based simulation
KR20090109511A (en) * 2008-04-15 2009-10-20 한국전자통신연구원 Method for improved physics-based simulation
US8860731B1 (en) * 2009-12-21 2014-10-14 Lucasfilm Entertainment Company Ltd. Refining animation
US8913065B2 (en) * 2011-08-05 2014-12-16 Jeffrey McCartney Computer system for animating 3D models using offset transforms
US20130033486A1 (en) * 2011-08-05 2013-02-07 Mccartney Jeffrey Computer System For Animating 3D Models Using Offset Transforms
US10229530B2 (en) 2016-01-19 2019-03-12 Canon Kabushiki Kaisha Image processing device and method therefor
US10529125B2 (en) 2016-01-19 2020-01-07 Canon Kabushiki Kaisha Image processing device and method therefor
CN110930483A (en) * 2019-11-20 2020-03-27 Tencent Technology (Shenzhen) Co., Ltd. Character control method, model training method and related apparatus

Also Published As

Publication number Publication date
AU2003295582A1 (en) 2004-06-15
US6990230B2 (en) 2006-01-24
EP1573653A4 (en) 2007-03-14
EP1565872A4 (en) 2007-03-07
US6983082B2 (en) 2006-01-03
JP2006507585A (en) 2006-03-02
AU2003295586B2 (en) 2009-05-07
WO2004047008A1 (en) 2004-06-03
US7079137B2 (en) 2006-07-18
US8515157B2 (en) 2013-08-20
EP1573653B1 (en) 2013-07-10
AU2003295586A1 (en) 2004-06-15
EP1573653A2 (en) 2005-09-14
JP4276178B2 (en) 2009-06-10
EP1566052A4 (en) 2007-02-07
US20040150642A1 (en) 2004-08-05
AU2003298666A1 (en) 2004-06-15
US20040146197A1 (en) 2004-07-29
JP2006506742A (en) 2006-02-23
WO2004047426A3 (en) 2004-07-15
WO2004047009A3 (en) 2004-07-08
US20040150643A1 (en) 2004-08-05
JP4220470B2 (en) 2009-02-04
US20040150641A1 (en) 2004-08-05
JP2006506745A (en) 2006-02-23
EP1565872A1 (en) 2005-08-24
US7536047B2 (en) 2009-05-19
WO2004047009A2 (en) 2004-06-03
EP1566052A2 (en) 2005-08-24
US20090174713A1 (en) 2009-07-09
WO2004047426A2 (en) 2004-06-03

Similar Documents

Publication Publication Date Title
US20040169656A1 (en) Method for motion simulation of an articulated figure using animation input
US10297066B2 (en) Animating a virtual object in a virtual world
Meyer et al. Interactive animation of cloth‐like objects in virtual reality
US8358310B2 (en) Musculo-skeletal shape skinning
US5623428A (en) Method for developing computer animation
US7515155B2 (en) Statistical dynamic modeling method and apparatus
US20100156935A1 (en) Method and apparatus for deforming shape of three dimensional human body model
Al Borno et al. Robust Physics‐based Motion Retargeting with Realistic Body Shapes
CN115018963B (en) Humanoid agent posture generation method based on physics simulation
Yin et al. Data-driven interactive balancing behaviors
Kenwright A lightweight rigid-body verlet simulator for real-time environments
Baek et al. Motion evaluation for VR-based motion training
Marsland et al. Physics-based animation of a trotting horse in a virtual environment
Thalmann Physical, behavioral, and sensor-based animation
Huang et al. Interactive human motion control using a closed-form of direct and inverse dynamics
Bar-Lev et al. Virtual marionettes: a system and paradigm for real-time 3D animation
JP4361878B2 (en) Statistical mechanical modeling method and apparatus
Westenhofer et al. Using kinematic clones to control the dynamic simulation of articulated figures
Vyas et al. Real-time Physics Based Character Control carrying Load
Hwang et al. Modularized Predictive Coding-Based Online Motion Synthesis Combining Environmental Constraints and Motion-Capture Data
Doyle Real-Time Physics-Based Goalkeeper Animations
Gijzel Quality of posture of a humanoid character in a reactive stepping animation obtained using inverse kinematics.
CA2043902A1 (en) Method for developing computer animation
Vosinakis et al. Design and implementation of synthetic humans for virtual environments and simulation systems
KR100629431B1 (en) Mass point location correction method and human-like articulated body animation method using it

Legal Events

Date Code Title Description
AS Assignment

Owner name: ESC ENTERTAINMENT, CALIFORNIA

Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNORS:PIPONI, DANIELE PAOLO DAVID;JAMES, OLIVER;REEL/FRAME:015306/0028;SIGNING DATES FROM 20040315 TO 20040423

AS Assignment

Owner name: WARNER BROS. ENTERTAINMENT INC., CALIFORNIA

Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNOR:ESC ENTERTAINMENT;REEL/FRAME:016895/0645

Effective date: 20050812

STCB Information on status: application discontinuation

Free format text: ABANDONED -- FAILURE TO RESPOND TO AN OFFICE ACTION