US20030137516A1 - Three dimensional animation system and method - Google Patents
- Publication number: US20030137516A1 (application US10/357,672)
- Authority: US (United States)
- Prior art keywords
- behavior
- movement
- animated character
- joint
- data
- Prior art date
- Legal status: Abandoned (the legal status is an assumption and is not a legal conclusion; Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed)
Classifications
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T13/00—Animation
- G06T13/20—3D [Three Dimensional] animation
- G06T13/40—3D [Three Dimensional] animation of characters, e.g. humans, animals or virtual beings
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T2200/00—Indexing scheme for image data processing or generation, in general
- G06T2200/24—Indexing scheme for image data processing or generation, in general involving graphical user interfaces [GUIs]
Definitions
- This invention relates generally to a system and method for animating a computer image on a computer display and, in particular, to a system and method for generating realistic three-dimensional animation of an object over a low-bandwidth communications network.
- Some techniques and systems, such as those used to generate three dimensional animation for a movie, are very high-end and expensive.
- These high-end three dimensional animations may be viewed by a consumer on a movie screen, for example, but cannot be interacted with in any way: the consumer may view the three dimensional animations, which tell a story such as in a movie, but cannot interact with the animations in any manner.
- These high-end animation systems are very useful for a movie, but cannot be readily used by the general public due to the cost of the system.
- The slow downloading time in turn leads to the consumer waiting a long period of time before viewing the animation.
- This long wait before the animation is not acceptable, since people become bored during the waiting period, cancel the animation and thus never see the animation displayed. It is desirable, however, to provide a three dimensional animation system in which the animation data may be downloaded rapidly over a very slow communications link, and it is to this end that the present invention is directed.
- The invention provides a three dimensional animation system and method in which the animation data may be downloaded over a relatively slow communications link, such as the Internet or Web and a modem, to a local computer in a relatively short amount of time.
- The local computer may then execute a downloaded software application to animate the object.
- The user of the local computer may interact with the animated object (i.e., change its behaviors or actions).
- The main portion of the system may reside as a plurality of software applications on a web server, and a client computer may access the web server to download the animations.
- The system may generate an initial animation download package containing the data about the actual object (e.g., the persistent data) and a few basic actions of the object (e.g., behavior data). For example, each downloaded object may have an idle behavior associated with it which is executed any time that the object is not executing another behavior. Then, as additional actions, behaviors or sound tracks for the object are required, the system may stream the behavior data down to the client computer before the action is required so that the behaviors are asynchronously downloaded (i.e., the behaviors for the three dimensional animated object do not need to be downloaded at the same time as the three dimensional animated object itself). In this manner, the client computer may more quickly begin the animation while other, not yet needed, actions or behaviors are being downloaded.
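As a minimal sketch of this asynchronous scheme (the patent gives no code; the class and function names here are hypothetical), a character can begin animating with only its persistent data and idle behavior, falling back to the idle behavior until a background download of a requested behavior completes:

```python
import threading

class AnimatedCharacter:
    """Sketch: persistent data plus an always-present idle behavior;
    additional behaviors are downloaded asynchronously."""
    def __init__(self, persistent_data, idle_behavior):
        self.persistent_data = persistent_data      # model, textures, morphlinks
        self.behaviors = {"idle": idle_behavior}    # idle is downloaded first
        self._lock = threading.Lock()

    def current_behavior(self, name):
        # Fall back to the idle behavior until the requested one arrives.
        with self._lock:
            return self.behaviors.get(name, self.behaviors["idle"])

    def prefetch(self, name, fetch):
        # Download a behavior in the background; playback continues meanwhile.
        def worker():
            data = fetch(name)
            with self._lock:
                self.behaviors[name] = data
        t = threading.Thread(target=worker, daemon=True)
        t.start()
        return t

def fake_fetch(name):          # stands in for an HTTP download from the server
    return f"<{name} data>"

c = AnimatedCharacter("model+morphlinks", "<idle data>")
assert c.current_behavior("wave") == "<idle data>"   # not yet downloaded
t = c.prefetch("wave", fake_fetch)
t.join()
assert c.current_behavior("wave") == "<wave data>"   # now available
```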
- The object may cross a bounding box, which causes an action that will be needed shortly to be downloaded to the client computer so that the action is available when needed.
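The bounding-box trigger can be sketched as follows (assumed 2D coordinates and hypothetical names; the patent does not specify the geometry):

```python
def crosses(box, pos):
    """True if a position lies inside an axis-aligned trigger box."""
    x0, y0, x1, y1 = box
    return x0 <= pos[0] <= x1 and y0 <= pos[1] <= y1

# Trigger zone placed just before where the "open_door" behavior is needed.
trigger = (8.0, 0.0, 10.0, 5.0)
requested = []
for pos in [(2, 1), (6, 1), (9, 1)]:          # character moving to the right
    if crosses(trigger, pos) and "open_door" not in requested:
        requested.append("open_door")          # start the download here

assert requested == ["open_door"]              # fetched once, ahead of need
```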
- The one or more behavior or action files for a particular object may contain information about the movements of the joints in the object which correspond to a particular behavior, and any other data necessary to execute the behavior.
- A head nod behavior may involve the movement of the neck joint.
- A sound track for the object, such as saying “Hello”, may involve the movement of the various pieces of the lips and a sound track synchronized to the movement of the lips.
- The total size of each behavior downloaded to the client computer is also relatively small, so that the download time of the behavior, over a relatively slow communications link, is not excessive.
- The initial object downloaded to the client computer may include an object tree containing data about each portion of the object. For example, a person object would include a leg object.
- Each piece of skin on the object (e.g., each polygon) has a contribution chart which lists each joint in the object, such as the knee or ankle, and the contribution that the movement of each joint makes to the movement of the particular polygon.
- A polygon near the knee joint would probably be mostly influenced by knee movement, while a polygon midway between the knee and ankle would be influenced by the movement of both the ankle and the knee.
- The client computer may easily determine the movement of each particular polygon based on the model of the object and the movement of the joints.
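The contribution chart can be sketched as a weighted sum (a minimal illustration; the chart layout and names are hypothetical, not taken from the patent):

```python
# Hypothetical contribution chart: polygon -> [(joint, weight), ...]
chart = {
    "poly_near_knee": [("knee", 1.0)],                  # fully knee-driven
    "poly_mid_shin":  [("knee", 0.5), ("ankle", 0.5)],  # split influence
}

def polygon_motion(poly, joint_motion):
    """Move a polygon by the weighted sum of its influencing joints."""
    dx = dy = dz = 0.0
    for joint, w in chart[poly]:
        jx, jy, jz = joint_motion.get(joint, (0.0, 0.0, 0.0))
        dx, dy, dz = dx + w * jx, dy + w * jy, dz + w * jz
    return (dx, dy, dz)

motion = {"knee": (1.0, 0.0, 0.0), "ankle": (0.0, 2.0, 0.0)}
assert polygon_motion("poly_near_knee", motion) == (1.0, 0.0, 0.0)
assert polygon_motion("poly_mid_shin", motion) == (0.5, 1.0, 0.0)
```

A behavior file thus only needs to carry the `motion` dictionary for the joints; the per-polygon work happens on the client.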
- The downloaded behavior file may contain data about the movement of the twelve joints, whereas the behavior may cause the 6000 polygons on the model to move.
- The system also permits a downloaded behavior to be streamed to the player application residing on the user's computer.
- The behavior may start playing on the player application before the entire behavior is downloaded. For example, if a behavior is five minutes long, the user of the player application is not likely to wait five minutes for the behavior to be downloaded. Therefore, in accordance with the invention, the system downloads a predetermined amount of the behavior (e.g., a two second portion of the behavior data) at a time so that the player may start executing the first predetermined portion of the behavior data while the second and subsequent portions of the behavior data are downloaded.
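A minimal sketch of this chunked playback (hypothetical class; the patent only specifies that execution begins after a predetermined portion arrives):

```python
from collections import deque

class BehaviorStream:
    """Playback starts as soon as the first fixed-size portion arrives,
    while later portions continue downloading in the background."""
    def __init__(self):
        self.buffer = deque()
        self.playing = False

    def on_chunk_downloaded(self, chunk):
        self.buffer.append(chunk)
        if not self.playing:              # first portion -> start playing
            self.playing = True

    def next_portion(self):
        return self.buffer.popleft() if self.buffer else None

s = BehaviorStream()
assert not s.playing
s.on_chunk_downloaded("seconds 0-2")      # playback may begin now
assert s.playing
s.on_chunk_downloaded("seconds 2-4")      # arrives while 0-2 is playing
assert s.next_portion() == "seconds 0-2"
assert s.next_portion() == "seconds 2-4"
```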
- The system may also permit the user to interact with the animated objects in various ways using actions and scripts which include one or more actions or behaviors. For each interaction with the object, there may be an action or behavior associated with that interaction. For example, when the user clicks on an animated door being displayed to the user, a behavior to open the door and show the door slowly opening with a creaking soundtrack may be downloaded to the client computer and executed. On the client computer, the user may see the door slowly open and hear the door creak as it opens. As another example, when the user drags a cursor over an object, such as a gray knife, a behavior to turn the knife red will be downloaded and executed so that the knife turns red when the user places the cursor over it. When the user moves the cursor off of the knife, the knife returns to its idle behavior, which is having a gray color.
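The interaction-to-behavior association can be sketched as a simple lookup table (event names and behavior names here are illustrative, not from the patent):

```python
# Hypothetical mapping from (object, user event) to the behavior to run.
interactions = {
    ("door", "click"):      "open_with_creak",
    ("knife", "mouseover"): "turn_red",
    ("knife", "mouseout"):  "idle",        # idle behavior: gray color
}

def behavior_for(obj, event):
    """Pick the behavior for an interaction; default to idle."""
    return interactions.get((obj, event), "idle")

assert behavior_for("door", "click") == "open_with_creak"
assert behavior_for("knife", "mouseover") == "turn_red"
assert behavior_for("door", "mouseover") == "idle"   # no entry -> idle
```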
- The invention may use a spherical environmental map.
- The pixels resulting from a particular lighting condition are determined for a half-sphere, and then the corresponding lighting for an object is determined based on the pixels in the half-sphere.
- The lighting model for the half-sphere may be downloaded to the client computer so that, for each pixel of the object, the client computer may look up the corresponding point on the half-sphere and apply the pixel value of that portion of the half-sphere to the object.
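A sketch of the half-sphere lookup, assuming the common convention of indexing the precomputed map by the surface normal's X and Y components (the patent does not specify the exact mapping):

```python
def lookup_lighting(normal, sphere_map):
    """Map a front-facing surface normal to a precomputed lighting value,
    instead of computing lighting in real time."""
    nx, ny, nz = normal                      # assume nz >= 0 (front half)
    h, w = len(sphere_map), len(sphere_map[0])
    u = int((nx + 1.0) / 2.0 * (w - 1))      # x component -> column
    v = int((ny + 1.0) / 2.0 * (h - 1))      # y component -> row
    return sphere_map[v][u]

# 3x3 "half-sphere" of precomputed brightness values (hypothetical).
sphere_map = [[0.1, 0.2, 0.3],
              [0.4, 0.9, 0.5],
              [0.6, 0.7, 0.8]]
assert lookup_lighting((0.0, 0.0, 1.0), sphere_map) == 0.9    # facing viewer
assert lookup_lighting((-1.0, -1.0, 0.0), sphere_map) == 0.1  # corner
```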
- The system does not attempt to calculate, in real-time, the lighting for an object.
- The system may also provide a character cache in the client computer that permits the character data to be stored in the client computer so that it does not need to be constantly refreshed.
- Otherwise, the character may be stored in the cache of the browser application, which is periodically flushed.
- Thus, the invention provides a system for animating a character on a computer comprising a first computer and a second computer connected to the first computer.
- The first computer may store one or more pieces of data associated with a particular animated character.
- The pieces of data may include a persistent data file containing one or more of a geometric model of the animated character, a texture associated with the animated character and a sound associated with the animated character.
- The pieces of data may further comprise one or more behavior files, wherein each behavior file contains data about a particular behavior of the animated character and each behavior specifies the movement of the model.
- The second computer may initially download the persistent data file from the first computer in order to begin the animation of the animated character on the second computer, and then asynchronously download a behavior file from the first computer just prior to the execution of the behavior of the animated character by the second computer.
- FIG. 1 is a block diagram illustrating the three dimensional animation system in accordance with the invention.
- FIG. 2 is a flowchart illustrating a method for downloading three dimensional character files in accordance with the invention.
- FIG. 3 is a diagram illustrating an example of an object hierarchy in accordance with the invention.
- FIGS. 4A and 4B are diagrams illustrating an example of a three dimensional object in accordance with the invention.
- FIG. 5 is a diagram illustrating an articulation of a joint.
- FIG. 6 is a diagram illustrating a morphlink in accordance with the invention.
- FIG. 7 is a diagram illustrating an example of the area of influence of a joint in accordance with the invention.
- FIGS. 8A and 8B are diagrams illustrating an example of the bones within a model in accordance with the invention.
- FIGS. 9 and 10 are diagrams illustrating an example of a model with the bones and polygons in accordance with the invention.
- FIG. 11 is a diagram illustrating an example of a rendered, unlighted model in accordance with the invention.
- FIG. 12 is a diagram illustrating an example of a rendered, lighted model in accordance with the invention.
- FIG. 13 is a diagram illustrating an example of an environmental map lighting model in accordance with the invention.
- FIG. 14 is a diagram illustrating an example of the rendered model shown in FIGS. 11 and 12.
- FIGS. 15 and 16 are diagrams illustrating the bones in a leg model in accordance with the invention.
- FIG. 17 is a diagram illustrating the leg of FIGS. 15 and 16 showing the area of influence for each joint.
- FIG. 18 is a diagram illustrating an example of a behavior in accordance with the invention.
- FIGS. 19A, 19B and 19C are diagrams illustrating another example of a behavior in accordance with the invention.
- FIG. 20 is a flowchart illustrating a method for streaming behaviors in accordance with the invention.
- FIG. 21 is a diagram illustrating the details of the streaming behaviors in accordance with the invention.
- FIG. 22 is a diagram illustrating more details of the streaming behavior shown in FIG. 21.
- FIG. 23 is a diagram illustrating a portion of a streamed behavior in accordance with the invention.
- FIGS. 24 and 25 are flowcharts illustrating the operation of the player in accordance with the invention.
- The invention is particularly applicable to a Web-based three dimensional animation system and method, and it is in this context that the invention will be described. It will be appreciated, however, that the system and method in accordance with the invention have greater utility, such as in other types of three dimensional animation systems including stand-alone computer systems and the like.
- FIG. 1 is a block diagram illustrating a three dimensional animation system 40 in accordance with the invention.
- The system 40 may include a character creator 42, a server 44 and one or more client computers (CLIENT #1-CLIENT #N) 46.
- The client computers may be connected to the server by a communications network 48 that may include various communication or computer networks, such as the Internet, the World Wide Web (the Web), a local area network, a wide area network or other similar communications networks which connect computer systems together.
- The creator 42 may be used to generate a three dimensional animated character as described below.
- The creator 42 may be a software application being executed by a computer system.
- The creator 42 may be stored on the server 44 or may be executed by a separate computer system as shown in FIG. 1.
- The creator 42 may generate one or more web files, as described below, which may be downloaded to the server 44 so that each client computer may then download the web files from the server.
- Each client computer may then interpret the downloaded web files and generate a three dimensional animation based on the web files as described in more detail below. Now, each portion of the system 40 will be described in more detail.
- The creator 42 may accept various information from either user input or other external files in order to generate a three dimensional realistic object or character which may be animated using the system.
- The creator may receive three dimensional models 50 which may be, for example, wire frame models of an object generated by a third party modeling system. The generation of a three dimensional wire frame model from a three dimensional object is well known and therefore will not be described here.
- The creator 42 may also receive texture map information 52 which may be used to paint a texture onto the polygons of the three dimensional object.
- The texture may provide, for example, a realistic flesh color and texture for the skin of a human character or a realistic set of teeth.
- The creator 42 may also receive a sound file 54 so that a sound track may be incorporated into a behavior file in accordance with the invention.
- The sound file may be generated by a third party system which receives a sound and generates a digital representation of the sound which may be incorporated into a file.
- The creator may also receive a behavior file 56.
- The behavior file may be combined with any sound file to generate a behavior for a three dimensional animation in accordance with the invention.
- The behavior file may be generated by the user using a separate software module of the creator 42.
- The creator 42 may combine the three dimensional model information, the texture information, the sound file and the behavior file into one or more web files which are stored on the server.
- The creator 42 may generate more than one file for each animated object.
- The creator may generate a file containing persistent data 58 and then one or more files containing behavior data 60.
- The persistent data file may be initially downloaded to the client computer to begin the animation of the object and may include one or more of the three dimensional object (including joints and polygons), any textures, any morphlinks as described below, and an idle behavior for the three dimensional object.
- The idle behavior may be the action or movement of the animated object when no other behavior is being executed.
- The idle behavior for an animated monster may be that the monster breathes, which causes his chest to expand and contract.
- The persistent data file may also include the morphlink data associated with each polygon in the model in accordance with the invention.
- The model may include one or more joints connected together by one or more bones, and a skin of polygons which covers the bones and joints.
- The morphlink data permits the client computer to easily determine the movement of the polygons based on the movement of the joints of the object.
- Without the morphlinks, the movement of each polygon must be independently calculated, which is very slow.
- The user of the creator defines the morphlinks so that the movement of a particular polygon based on the movement of the joints is determined by the user.
- The movement of each polygon relative to the movement of the joints of the model is thus defined in the persistent data.
- When a behavior of the three dimensional animation is executed, the behavior may contain only information about the movement of each joint in the model, and the client computer, based on the morphlinks, may determine the movement of each polygon on the model.
- The size of the behavior file downloaded to each client computer in accordance with the invention is therefore reduced, which speeds up the download of each behavior.
- The speed with which the three dimensional animation may be animated is also increased, since the client computer does not need to calculate the movement of each polygon on the three dimensional object or character each time a behavior occurs.
- The persistent data file 58 may also be compressed to further reduce the size of the persistent data.
- A persistent storage file may be approximately 10-200 Kb depending on the complexity of the three dimensional animation, whereas a typical animation file may be approximately 1 Mb.
- The one or more behavior files 60 may each contain a data structure which contains data specifying the movement of each joint in the three dimensional object or character during a behavior, and any sound file which is associated with the particular behavior.
- Each different behavior of the three dimensional animation, such as smiling, talking about a particular subject, sitting, etc., may be contained in a separate behavior file.
- Each behavior file may be downloaded to the client computer only when the behavior is required. For example, a behavior to pick up an object from the ground for a game player may only be downloaded when the game player nears an object which may be picked up.
- A behavior to say “Good-bye” is only downloaded to the client computer when the user clicks on a good-bye button on the user interface.
- The system may download the behavior files asynchronously with the persistent data file. Due to the morphlinks in the persistent data, the size of the behavior files, as described above, is very small. It should be realized, however, that each behavior file is associated only with a particular persistent data file (since the structure of the behavior and the persistent storage are tied together) and therefore a walking behavior for two separate models will be slightly different.
- The behavior files may be generated by the creator 42 in response to user input.
- The behavior files may be compressed in that the data for any joints which do not move during a predetermined time during the behavior is not downloaded to the client computer.
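This compression can be sketched as dropping, for each key period, any joint whose position did not change since the previous key period (a minimal illustration; the on-disk format is not specified by the patent):

```python
def compress_behavior(keyframes):
    """Keep the first keyframe whole; afterwards keep only joints
    whose positions changed since the previous key period."""
    compressed = [keyframes[0]]
    for prev, cur in zip(keyframes, keyframes[1:]):
        delta = {j: p for j, p in cur.items() if p != prev.get(j)}
        compressed.append(delta)
    return compressed

frames = [
    {"knee": (0, 0, 0), "ankle": (0, 0, 0)},
    {"knee": (1, 0, 0), "ankle": (0, 0, 0)},   # only the knee moved
    {"knee": (1, 0, 0), "ankle": (0, 1, 0)},   # only the ankle moved
]
out = compress_behavior(frames)
assert out[1] == {"knee": (1, 0, 0)}           # static ankle omitted
assert out[2] == {"ankle": (0, 1, 0)}          # static knee omitted
```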
- The behavior and persistent data files may be downloaded to the server 44 and stored in a character and behavior storage device 62, which may be a persistent storage device such as a hard disk drive, a tape drive or the like.
- The server 44 may also include a player store 64 which contains a player web file.
- The player web file is the software application which is first downloaded to the client computer so that the client computer may then interpret the persistent data file as well as the behavior files.
- The server 44 may first download the player software application to the client computer (if necessary) and then, based on user input into a Web user interface application 66, download the persistent data file so that the animation may begin. Then, as behaviors of the three dimensional animation are needed by the client computer, the server may download the appropriate behavior file to be executed by the player on the client computer.
- Each client computer 46 may include a CPU 68, a memory 70 containing one or more software applications to be executed by the CPU 68, a character cache 74 which may reside on a persistent storage device of the client computer, and a display 76.
- The character cache 74 may store the persistent data 58 when it is downloaded to the client computer the first time so that it may not be necessary to download the persistent data again when the particular client computer again wants to view the same animation.
- The system thus has its own character cache.
- The memory 70 may store a browser application 78 which permits the client computer to interact with the server 44 by specifying a uniform resource locator (URL) of the server in order to receive the web files stored on the server using a hypertext transfer protocol (HTTP).
- The memory may also store the player software application 64 to interpret the persistent data file and the behavior files and generate the animated object, a character file 82 generated from the persistent data file, a first behavior file 84 containing the idle behavior and a second behavior file 86 that may contain a behavior that will be executed soon.
- The character file and the current behavior files are both stored in the memory 70 while being executed by the player. As new behaviors are needed, those behaviors are downloaded to the client computer and one of the old behaviors may be deleted to make room for the new behavior.
- The number of behaviors stored in the client computer at any time depends on the amount of memory space available for the animation system.
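A bounded behavior store of this kind can be sketched with least-recently-used eviction (the patent only says an old behavior may be deleted; the LRU policy here is an assumption for illustration):

```python
from collections import OrderedDict

class BehaviorCache:
    """Keep only as many behaviors as memory allows; when a new one
    arrives over the limit, delete the least recently used old one."""
    def __init__(self, capacity):
        self.capacity = capacity
        self.store = OrderedDict()

    def add(self, name, data):
        if name in self.store:
            self.store.move_to_end(name)   # refresh recency
        self.store[name] = data
        if len(self.store) > self.capacity:
            self.store.popitem(last=False)  # evict the oldest entry

cache = BehaviorCache(capacity=2)
cache.add("idle", "...")
cache.add("wave", "...")
cache.add("jump", "...")                    # "idle" is evicted
assert list(cache.store) == ["wave", "jump"]
```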
- The animation generated by the player 64 based on the persistent data and the behavior files may be displayed on the display 76. Now, a method for downloading three dimensional character files in accordance with the invention will be described.
- FIG. 2 is a flowchart illustrating a method 100 for downloading three dimensional character files to a particular client computer in accordance with the invention.
- The downloading method permits the animation to begin rapidly since the downloading time is reduced due to the asynchronous download of the behavior files and the persistent data file.
- The server may determine if the player application has previously been downloaded to the client computer and may download the player software in step 104 if it has not already been downloaded. Once the player is downloaded to the client computer, the server may download the persistent data file (that may include the character file and an idle behavior file) to the client computer in step 106.
- The client computer may then create a character cache if one does not exist and store the character file and the idle behavior file in the character cache.
- The player application is executed by the client computer, and the player application may use the persistent data to animate the character and execute the idle behavior in step 108.
- All of the behaviors for an animated object do not need to be initially downloaded, so the download time for the persistent data is reduced and the animation may begin more quickly.
- The player application may determine if a new behavior for the animated object is needed in step 110, and continue to execute the idle behavior in step 112 if no new behavior is needed. If a new behavior is needed, then the server may download the new behavior in step 114 in response to a request by the player application. The player application may then determine if the new behavior has finished downloading in step 116 and continue to execute the prior behavior until the new behavior is downloaded.
- Once downloaded, the player may execute the new behavior in step 118 and return to step 108.
- The behaviors are downloaded to the client computer as they are needed so that the start time of the animation is reduced.
- The total download time for any behavior file is also short.
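The download method above can be sketched as a simple loop (a sketch only; the server interface and event list are hypothetical stand-ins for the flowchart's steps):

```python
def run_player(server):
    """Fetch persistent data once, then fetch each behavior only when
    it is first needed, mirroring the on-demand flow of FIG. 2."""
    downloads = ["persistent"]          # persistent data + idle behavior
    current = "idle"
    for needed in server["events"]:     # "is a new behavior needed?"
        if needed not in downloads:
            downloads.append(needed)    # download on demand
        current = needed                # execute the new behavior
    return downloads, current

server = {"persistent": "...", "events": ["wave", "wave", "bow"]}
downloads, current = run_player(server)
assert downloads == ["persistent", "wave", "bow"]   # each fetched once
assert current == "bow"
```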
- FIG. 3 is a diagram illustrating an example of an object hierarchy 130 in accordance with the invention.
- The object hierarchy may include a tree of objects organized underneath a root node 132.
- The objects used in the three dimensional animation system may include a texture object 134, a sound object 136, a geometry object 138 and a behavior object 140. Each of these objects may then include sub-objects as shown for the geometry and behavior objects.
- The geometry object 138 may include a polygon object 142 containing the polygons associated with a particular model, a morphlinks object 144 containing the morphlinks associated with each polygon in the model, a particle system object 146 for storing smaller objects such as rocket ship exhaust, and a camera object 148 for storing information about the camera position and angle with respect to the model.
- The geometry may further include additional information about the model as shown in FIG. 4B.
- The behavior object 140 may include a transform object 150 containing movement information for the joints of an object to transform the object, a texture track object 152 containing an animated texture of a model, a sound object 154 containing a sound track associated with a behavior, and a script object 156 containing a sequence of behaviors combined together to form a new behavior.
- A script may be used for interacting with the animated character and may include a behavior for each response that the character makes to the user in response to user input.
- Each animated object may include one or more of the above objects.
- FIGS. 4A and 4B are diagrams illustrating an example of a character object and a three dimensional object associated with that object in accordance with the invention.
- FIG. 4A is a diagram illustrating a geometry object 160 for a human character in which the object models the joints of the character.
- The object shown includes a head, a neck, two elbows, two wrists, hips, two knees and two ankle joints.
- The movement of a character is specified by the movement of the joints of the character, which may then be turned into movement of the polygons on the model based on the morphlinks.
- A three dimensional object tree 162, as shown in FIG. 4B, which models this human character, has a similar structure.
- The object tree 162 may include a WORLD node connected to a CAMERA node and a BODY node.
- The BODY node may further include various objects modeling the various joints in the body.
- The BODY node may include a HIP JOINT node, a LFT KNEE object and a RT KNEE object connected to the HIP JOINT node, and a LFT ANKLE and RT ANKLE node connected to the appropriate knee nodes.
- Each object in the object tree that is connected to an object above it in the tree inherits the attributes of the object above it. For example, any movement of the LFT KNEE object may cause the LFT ANKLE object to inherit the same movement.
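This inheritance can be sketched as each node accumulating its parent's motion (a minimal illustration using translations only; real joint transforms would also include rotation):

```python
class Node:
    """Object-tree node: a child inherits its parent's movement."""
    def __init__(self, name, parent=None):
        self.name, self.parent = name, parent
        self.local = (0.0, 0.0, 0.0)           # this joint's own translation

    def world(self):
        px, py, pz = self.parent.world() if self.parent else (0.0, 0.0, 0.0)
        lx, ly, lz = self.local
        return (px + lx, py + ly, pz + lz)

hip = Node("HIP JOINT")
knee = Node("LFT KNEE", parent=hip)
ankle = Node("LFT ANKLE", parent=knee)

knee.local = (1.0, 0.0, 0.0)                   # move the left knee
assert ankle.world() == (1.0, 0.0, 0.0)        # ankle inherits the movement
assert hip.world() == (0.0, 0.0, 0.0)          # parent is unaffected
```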
- FIG. 5 is a diagram illustrating an example of the articulation of joints of a character.
- An arm 170 of a three dimensional character is shown for illustration purposes.
- The arm may include a shoulder joint 172, an elbow joint 174, a wrist joint 176, an upper arm 178, a lower arm 180 and a hand 182.
- Each joint has six degrees of freedom since each joint may move in a positive or negative X direction, a positive or negative Y direction and/or a positive or negative Z direction.
- During a key period, each joint may move in each of the six directions.
- The key period may preferably be 1/10th of a second, and the player may interpolate the movement of the joint in each direction in between the key periods to ensure that the motion of the joint is smooth.
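The interpolation between key periods can be sketched with simple linear blending (the patent does not specify the interpolation method; linear is assumed here):

```python
def interpolate(p0, p1, t):
    """Blend one joint's position between two key periods.
    t = 0.0 gives the first key, t = 1.0 the next key."""
    return tuple(a + (b - a) * t for a, b in zip(p0, p1))

key0 = (0.0, 0.0, 0.0)    # joint position at t = 0.0 s
key1 = (1.0, 2.0, 0.0)    # joint position at t = 0.1 s (one key period later)

assert interpolate(key0, key1, 0.0) == (0.0, 0.0, 0.0)
assert interpolate(key0, key1, 0.5) == (0.5, 1.0, 0.0)   # halfway: t = 0.05 s
assert interpolate(key0, key1, 1.0) == (1.0, 2.0, 0.0)
```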
- The key period may be set by the user.
- When a joint moves, the part of the body near that joint may also move.
- The synchronization of the movement of the body part, and the polygons which make up the “skin” of the body part, with the joints of the character is accomplished by the morphlinks in accordance with the invention, which will now be described in more detail.
- FIG. 6 is a diagram illustrating a morphlink in accordance with the invention for the upper arm 178 shown in FIG. 5.
- The polygons covering the upper arm may be influenced by both the shoulder joint 172 and the elbow joint 174.
- The movement of a polygon 184 on the outside of the upper arm, based on the movement of the shoulder joint and the elbow joint, will be described. If the movement of the polygon 184 is influenced only by the shoulder joint's movement, the polygon 184 may move to position x1. If the movement of the polygon 184 is influenced only by the movement of the elbow joint, the polygon may move to position x2.
- The polygon 184 should be influenced by both the shoulder joint and the elbow joint so that the polygon 184 moves to position x3 when the influence of both joints is used.
- The relationship of each polygon on the character to the joints in the character may be stored in the morphlink data, which is stored with the persistent data file.
- The morphlink data permits a behavior file to specify only the movement of the joints in the character (a small amount of data); the player application on the client computer may then determine the movement of each polygon of the character based on the morphlink data.
- The influence of each joint on particular polygons of the three dimensional object is controlled by the user during the creation of the three dimensional character so that, once the character is created, each polygon on the character has a fixed movement relationship with respect to the joints of the character.
- Now, an example of the area of influence of a joint will be provided.
- FIG. 7 is a diagram illustrating an example of the area of influence of a joint in accordance with the invention.
- A joint 190 may have an inner area of influence 192 and an outer area of influence 194.
- A body part 196 surrounding the joint 190 is shown by the dotted lines.
- For polygons within the inner area of influence 192, the joint 190 contributes 100% of its movement to the movement of those polygons.
- The movements of polygons A and B have a 100% contribution from the joint 190 so that, for example, if the joint 190 moves 1″ in the positive X direction, then polygons A and B also move 1″ in the positive X direction.
- If the joint 190 also moves 1″ in the Y direction, polygons A and B also move 1″ in the Y direction in addition to the movement in the X direction.
- Between the inner and outer areas of influence, the amount of influence on a particular polygon decreases as the polygon is located farther away from the joint.
- At the edge of the inner area of influence, the contribution is still 100%, while the contribution for a polygon at the outside edge of the outer area of influence is 0%.
- For a polygon midway between the inner and outer areas of influence, the joint contributes 50% to the movement of the polygon so that if the joint moves 1″ in the positive X direction, polygon C moves ½″ in the positive X direction.
- Outside the outer area of influence 194, the joint 190 contributes 0% movement so that the movement of the joint does not affect those polygons.
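The falloff between the inner and outer areas of influence can be sketched as a function of a polygon's distance from the joint (linear falloff is assumed; the patent only gives the 100%, 50% and 0% points):

```python
def contribution(distance, inner, outer):
    """Joint contribution to a polygon: 100% inside the inner radius,
    0% beyond the outer radius, linear falloff in between."""
    if distance <= inner:
        return 1.0
    if distance >= outer:
        return 0.0
    return (outer - distance) / (outer - inner)

inner, outer = 1.0, 3.0
assert contribution(0.5, inner, outer) == 1.0    # like polygons A and B
assert contribution(2.0, inner, outer) == 0.5    # like polygon C: 1" -> 1/2"
assert contribution(3.5, inner, outer) == 0.0    # outside the outer area
```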
- FIGS. 8A and 8B are diagrams illustrating an example of the bones within a model in accordance with the invention.
- The bones in the model are generated using a user interface 200 in the creator application.
- The user interface 200 may include an object window 202 which lists the objects, such as the geometry and materials (textures), associated with the particular three dimensional character being generated.
- The user interface 200 may also include a geometry window 204 which lists all of the geometry associated with the particular three dimensional character, such as the neck, the left shoulder (ShouldL), the left upper leg (UpLegL) and the like.
- The geometry is listed in object order so that the associations of objects with other objects and the attribution of characteristics (such as the association of the upper left arm with the left shoulder) may be viewed by glancing at the geometry window.
- The geometry window also permits the user to make certain portions of the character invisible, if desired.
- The user interface 200 may also include a three dimensional character viewing window 206 which shows the user the current view of the character. In this example, only the one or more bones of the character 208 and one or more joints 210 are shown, by clicking the appropriate locations in the geometry window.
- The bones and joints of the three dimensional model may be generated by the creator application or by a well known third party piece of software.
- FIGS. 9 and 10 are diagrams illustrating an example of a model with the bones and polygons in accordance with the invention and an example of a character displayed with only the polygons, respectively.
- The user may create the three dimensional animated character using the user interface 200 of the creator.
- Here, the user has selected to have the bones 208 and joints 210 shown in window 206 with a polygon skin 212 placed over the bones and joints.
- The polygon skin may be generated by the creator application or by a third party piece of software.
- The polygon skin 212 forms the actual surface seen by a person when viewing the three dimensional animated character, along with any texture mapped onto these polygons.
- FIG. 10 shows the character with just the polygons 212 being displayed to the user in the user interface 200.
- FIG. 10 accurately depicts what the three dimensional animated character may look like, with the exception of a texture being placed on the polygons.
- a three dimensional animated character with a texture placed on the polygons will now be described with reference to FIGS. 11 and 12.
- FIG. 11 is a diagram illustrating an example of a rendered, unlighted character 220 in accordance with the invention within the viewing window 206 while FIG. 12 is a diagram illustrating an example of a rendered, lighted character 230 in accordance with the invention.
- the three dimensional character has been covered with textured polygons so that, for example, the character has long hair, eyebrows, a nose, a mouth with lips and eyes.
- the textures may be generated by a third party piece of software and then positioned onto the animated character by the user using the creator application.
- the entire unlighted character 220 is shown in FIG. 14.
- the same character has a chrome texture and has been lighted with a particular lighting model.
- a material window 232 may be displayed which permits the user to select a material/texture.
- the chrome material was selected.
- the material window may include a palette 234 of materials which may cover the polygons, such as a body material, a bottom teeth material, a chrome material, a hair material and a top teeth material.
- the lighting may be applied to the character by specifying a lighting model for the character. An example of a lighting model will now be described with reference to FIG. 13.
- FIG. 13 is a diagram illustrating an example of an environmental map lighting model 240 in accordance with the invention which may be applied to a three dimensional character such as the one shown in FIG. 12.
- the environmental map 240 is generated by having the selected lighting choice illuminate a surface of a sphere.
- the particular lighting model causes the sphere's surface to have a particular appearance made up of individual pixels having different intensities.
- the appearance of the character at a particular location is the same as the appearance of the sphere at the same location.
- the left side of the sphere has a bright spot 242 and the character, shown in FIG. 12, also has a bright spot 244 along the left side of the character.
- the lighting model for the character is generated by looking up the appearance of the lighting model on a particular location on a sphere and then mapping the sphere's lighting at the particular location onto the polygons of the animated character at the same particular location. In accordance with the invention, it is not necessary to calculate the lighting for each pixel of the character. In a typical three dimensional animation system, the lighting model may be transferred onto the character by calculating the lighting of each pixel on the character which is a slow process. Now, an example of a morphlink associated with the leg of a character will be described.
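The sphere-lookup shading described above might be sketched as follows. The mapping from a surface normal to a texel of the pre-lit sphere image is an assumption (the patent states only that the character's appearance at a location matches the sphere's appearance at the same location), and all names are illustrative.

```python
# Illustrative sketch of environment-map shading: instead of computing
# lighting per pixel, the shade for a surface point is read from a pre-lit
# sphere image. The normal-to-texel mapping below (x/y components of the
# normal mapped into image coordinates) is an assumption.

def sphere_map_shade(normal, sphere_image):
    """normal: unit vector (nx, ny, nz); sphere_image: 2-D list of
    pre-computed intensities for the lit sphere."""
    nx, ny, _ = normal
    h = len(sphere_image)
    w = len(sphere_image[0])
    # map the normal's x/y components from [-1, 1] into image coordinates
    u = min(w - 1, int((nx + 1.0) * 0.5 * (w - 1) + 0.5))
    v = min(h - 1, int((ny + 1.0) * 0.5 * (h - 1) + 0.5))
    return sphere_image[v][u]
```

The per-pixel work is thus reduced to a table lookup, which is the speed advantage over computing a lighting equation for each pixel of the character.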
- FIGS. 15 and 16 are diagrams illustrating an example of the bones in a leg and the morphlink associated with the leg in accordance with the invention.
- FIG. 16 shows the display window 206 of the creator with a bottom half 250 of a character.
- a hip joint 252 , a knee joint 254 and an ankle joint 256 are shown inside of the character.
- Each pixel of each polygon forming the “skin” of the character may then have its motion determined by the creator of the character by setting the morphlinks that associate a particular pixel's movement with the contributions from each joint in the character.
- FIG. 16 illustrates the user interface 200 with the main window 202 , the geometry window 204 and the display window 206 .
- the user interface may also include a morphlink window 258 which contains a list of each polygon in the character along with the contributions of each joint's movement to the movement of that polygon.
- the right leg has moved upwards and a polygon 260 has been selected.
- the morphlink window 258 may then highlight the selected polygon (Body Geo Vert 188 in this example) along with the contribution from the knee joint which is 77.62% in this example. Thus if the knee moves 10′′ in the positive X direction, the selected polygon moves 7.762′′ in the positive X direction.
- each polygon may be influenced by one or more joints and the contributions of each joint are added together to determine the movement of that polygon.
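The additive rule above might be sketched as follows; the function and field names are hypothetical, since the patent describes the morphlink data only as a list of per-joint contribution percentages.

```python
# Illustrative sketch of morphlink evaluation: a polygon's movement is the
# sum of each influencing joint's movement scaled by that joint's
# contribution fraction, as in the 77.62% knee example above.

def polygon_movement(morphlinks, joint_movements):
    """morphlinks: {joint_name: contribution fraction for this polygon}
    joint_movements: {joint_name: (dx, dy, dz) movement of the joint}"""
    total = [0.0, 0.0, 0.0]
    for joint, weight in morphlinks.items():
        dx, dy, dz = joint_movements.get(joint, (0.0, 0.0, 0.0))
        total[0] += weight * dx
        total[1] += weight * dy
        total[2] += weight * dz
    return tuple(total)
```

With a 77.62% knee contribution and a 10′′ knee movement in X, the polygon moves 7.762′′ in X, matching the example above.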
- the morphlink permits a downloaded behavior file to specify only the movement of each joint and then the player on the client computer may determine the movement of each polygon based on the movement of the joints and the morphlinks. Now, the areas of influence of a joint will be described in more detail using the character shown in FIGS. 15 and 16.
- FIG. 17 is a diagram illustrating the character 250 of FIGS. 15 and 16 showing the area of influence for each joint.
- the areas of influence for each joint may include an inner region 270 and an outer region 272 .
- the user of the creator may adjust these inner and outer regions which adjusts the influence of that joint.
- the areas of influence for the hip joint 252 , the knee joint 254 and the ankle joint 256 are shown. The details about the areas of influence are described above and will not be described here. Now, an example of a behavior will be described.
- FIG. 18 is a diagram illustrating an example of a behavior 280 in accordance with the invention which may be downloaded to a client computer.
- the structure of the behavior may follow the structure of the objects within a character.
- the behavior may include a root node 282 , a dance hips node 284 specifying the movement of the hips of the character during the dance behavior, a dance-rt knee and dance-lft knee nodes 286 , 288 which specify the movement of the knees during the dance behavior, etc.
- the objects in the behavior map to the objects in the character since the objects in the behavior specify the movement of the objects in the character.
- the behavior may also specify a sound track associated with the behavior.
- the behavior may be broken down into one or more key periods so that the movement of each object during the behavior may change at each new key period and the system may interpolate between the key periods.
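The patent states that the system interpolates between key periods but does not name the method; linear interpolation is assumed in this illustrative sketch, and all names are hypothetical.

```python
# Illustrative sketch of key-period playback: a joint's position is stored
# only at key periods, and the player interpolates (linearly, as assumed
# here) between the surrounding keys for in-between frames.

def interpolate_position(keys, t):
    """keys: sorted list of (time, (x, y, z)) key periods; t: playback time."""
    if t <= keys[0][0]:
        return keys[0][1]
    if t >= keys[-1][0]:
        return keys[-1][1]
    for (t0, p0), (t1, p1) in zip(keys, keys[1:]):
        if t0 <= t <= t1:
            f = (t - t0) / (t1 - t0)   # fraction of the way between keys
            return tuple(a + f * (b - a) for a, b in zip(p0, p1))
```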
- the behavior may also be compressed in that, if an object, such as the head, is not moving or the object is not changing its movement during the behavior, the object for the head may be left out of the behavior file since the player will assume that any joint not in a behavior file will keep doing the same thing that it was doing before.
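The compression rule described above can be sketched as follows (an illustrative sketch with hypothetical names; the actual file format is not specified at this level of detail).

```python
# Illustrative sketch of behavior compression: joints whose movement never
# changes during the behavior are simply omitted from the behavior file,
# and the player keeps an omitted joint doing what it was doing before.

def compress_behavior(tracks):
    """tracks: {joint_name: list of (time, position) keys}.
    Drop any track whose position never changes."""
    return {joint: keys
            for joint, keys in tracks.items()
            if any(pos != keys[0][1] for _, pos in keys)}
```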
- an example of an object within the behavior will be described in more detail.
- the dance-hips object 284 will be described in more detail, although each object in the behavior may have a similar data structure even if each object moves in a different manner.
- the object 284 may specify the movement of the object in the three dimensions (X, Y, Z). As shown, the movement in each dimension may change during each key period. Between the key periods, the player may use interpolation to ensure a smooth movement. Another example of a behavior will now be described.
- FIGS. 19A, 19B and 19 C are diagrams illustrating another example of a behavior in accordance with the invention.
- the behavior is having a character's mouth form the syllable “Ah” and a corresponding sound track being played to simulate the character speaking the syllable “Ah”.
- FIG. 19A shows a structure 290 for the behavior which may include one or more time frames 292 to ensure synchronization of the character with the behavior at each key period. Following each time frame is the data about the movement of each object in the character during that key period including the position change of each object, the orientation change of each object and the scale change of each object.
- FIG. 19B illustrates a behavior object 294 for the “Ah” behavior including links to each object (joint) in the mouth that is moved during the behavior.
- FIG. 19C illustrates a sound track 296 associated with the “Ah” behavior.
- a behavior may also include a plurality of behaviors which may be selected based on various factors including a user's actions.
- the invention permits a class of behaviors to be downloaded to the player at one time. Now, a method for streaming behaviors in accordance with the invention will be described.
- the animation system and in particular the player application may start playing a behavior while the behavior is still being downloaded to the player application which may be known as behavior streaming.
- FIG. 20 is a flowchart illustrating a method 300 for streaming behaviors in accordance with the invention.
- the user may select a behavior and/or start the execution of a behavior by, for example, clicking on a button or an icon on a Web page.
- the player application determines if the behavior is a streaming behavior since each behavior associated with an animated character may be identified as a streaming behavior or not.
- a behavior may be identified as a streaming behavior when the behavior requires more than a predetermined download time, such as when the behavior includes sound data. If the behavior is not a streaming behavior, the method is completed.
- the player application may download the behavior objects, as described below, in step 306 and download the first predetermined-sized chunk of the behavior data in step 308 .
- the chunk of behavior data may be sufficient data for ten seconds of the behavior.
- the rest of the behavior data may be downloaded asynchronously as the player application is executing the previously downloaded chunk of the behavior data. For a behavior that lasts a total of five minutes, the streaming behavior begins playing after only ten seconds and the rest of the behavior data may be downloaded as the behavior is being executed.
- the chunk being downloaded to the player application may always be a predetermined number of seconds (ten seconds in a preferred embodiment) ahead of the currently playing portion of the behavior.
- the predetermined chunk of behavior data downloaded before starting the behavior may be ten seconds of behavior time.
- the downloading of the rest of the streaming behavior data may also be ten seconds ahead of the currently playing behavior data.
- the ten second time takes into account that the Internet sometimes suffers congestion and is therefore delayed in delivering the behavior data.
- if the player application started playing the streaming behavior as soon as it was downloaded, the user may experience interruptions in the animation.
- the ten second interval provides a buffer of data so that the system has some ability to compensate for when the Internet is congested.
- the behavior data must be compressed so that behavior data may be downloaded from the Internet at least as fast as the player application plays the behavior data. For example, if the user uses a modem to download the behavior data, the behavior data is highly compressed so that it requires less than one second to download one second worth of behavior data.
- the player application may execute the behavior in step 310 . While the initial chunk of behavior is being executed, the player application may determine if the downloaded behavior data is more than a predetermined number of seconds (twenty seconds in a preferred embodiment) ahead of the behavior data currently being played in step 312 . If the downloaded behavior data is more than twenty seconds ahead of the currently played behavior data, the player application may stop further downloads in step 314 until the new behavior data is less than a predetermined time (ten seconds in a preferred embodiment) ahead of the currently playing behavior data. Then, in step 316 , the player application may download the next chunk of behavior data.
- in step 318 , the player application may determine if there is more behavior data to download and either loop back to step 308 to download a next chunk of behavior data or complete the method if there is no more behavior data to be downloaded.
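The pause-and-resume logic of steps 312 - 316 can be sketched as a watermark check. The 10- and 20-second thresholds are taken from the preferred embodiment described above; the function decomposition itself is an illustrative assumption.

```python
# Illustrative sketch of the streaming buffer control in FIG. 20:
# downloading pauses when the buffered data runs more than HIGH_WATER
# seconds ahead of playback and resumes once it falls below LOW_WATER.

LOW_WATER = 10.0    # seconds ahead at which downloading resumes
HIGH_WATER = 20.0   # seconds ahead at which downloading pauses

def should_download(buffered_until, play_position, paused):
    """Return (download_now, paused) given how far ahead the buffer is."""
    ahead = buffered_until - play_position
    if paused:
        # stay paused until the buffer drops back under the low-water mark
        paused = ahead >= LOW_WATER
    elif ahead > HIGH_WATER:
        paused = True
    return (not paused, paused)
```

The hysteresis between the two thresholds is what absorbs Internet congestion without repeatedly stopping and restarting the download.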
- FIG. 21 is a diagram illustrating a stream file 330 in accordance with the invention.
- the creator described above may generate special files for streaming behaviors known as stream files.
- the stream files 330 may include behavior objects 332 and one or more chunks of key frame and sound data 334 .
- the behavior objects may include all of the data that describes the components of the behavior, such as the geometry and the movement of each joint in the animated character during the behavior.
- the behavior objects do not contain any key frames of behavior data, such as sound data.
- Each of the one or more chunks of behavior data 334 may contain a predetermined amount of behavior data (such as two seconds of behavior data in a preferred embodiment).
- the first chunk of behavior data may contain all of the keyframes from all of the behavior tracks which occur during the first predetermined time interval along with the sound data that plays during the first predetermined time interval of the behavior.
- Each chunk of behavior data starts at the same time as the corresponding time in the behavior so that the initial chunk starts at the start of the behavior.
- FIG. 22 is a diagram illustrating an example of the structure of each chunk of behavior data 334 .
- Each chunk 334 may be divided into one or more tracks 336 wherein each track contains data about a particular portion of the animated character.
- the chunk also includes the sound data 344 for that particular portion of the behavior data.
- FIG. 23 is a diagram illustrating more details of the chunk 334 of behavior data.
- the chunk of behavior data includes the tracks 338 , 340 and a timestamp 350 .
- the timestamp may indicate the time of the chunk within the behavior, such as that the chunk covers time t to time t+2 seconds of the behavior.
- Each track 338 , 340 may include an identifier 352 that identifies the behavior object with which the particular track is associated, a length of data field 354 (len) indicating the length of the data in the track and a data field 356 containing the keyframe and behavior data for that track.
- Each chunk 334 may end with a track identification of zero (id-0) indicating the end of the particular chunk of behavior data. Now, the operation of the player will be described.
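A reader for the chunk layout of FIGS. 22 - 23 might look as follows. The field widths (a 32-bit float timestamp, 16-bit identifier and length fields) are assumptions: the patent gives the field order (timestamp, then id/len/data tracks, then a zero identifier) but not byte sizes.

```python
# Illustrative parser for a chunk of behavior data: a timestamp, then a
# sequence of tracks each carrying an object identifier, a data length and
# that many bytes of keyframe/sound data, terminated by a track id of zero.
import struct

def parse_chunk(buf):
    """buf: bytes laid out as <timestamp:f32><id:u16><len:u16><data>...<id=0>."""
    (timestamp,) = struct.unpack_from("<f", buf, 0)
    offset, tracks = 4, []
    while True:
        (track_id,) = struct.unpack_from("<H", buf, offset)
        offset += 2
        if track_id == 0:        # id of zero marks the end of the chunk
            break
        (length,) = struct.unpack_from("<H", buf, offset)
        offset += 2
        tracks.append((track_id, buf[offset:offset + length]))
        offset += length
    return timestamp, tracks
```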
- FIG. 24 is a flowchart illustrating the operation 400 of the player for each frame of images being displayed to the user of the client computer.
- the player may read any user inputs and react according to those user inputs.
- the player may determine any geometry changes in the character in step 404 based on any currently executing behaviors.
- the player may generate the polygons of the character in step 406 based on the above determined changes in the geometry and the morphlinks as described above.
- the data about the polygons of the character may be scan converted in step 408 so that the character and the polygons may be displayed on a display screen.
- the character may be transferred to the display memory, such as by using a well known BLTBLK routine, so that it may be displayed to the user. Now, the details about determining the geometry changes will be described.
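The per-frame operation of FIG. 24 can be sketched as the following loop. All the stage methods are hypothetical stubs standing in for the steps named above; only the ordering of the stages comes from the patent.

```python
# Illustrative sketch of the per-frame player loop of FIG. 24: read user
# input, apply behavior-driven geometry changes, regenerate the polygons
# via the morphlinks, scan-convert, and copy the result to display memory.

def render_frame(player):
    player.read_user_inputs()                   # step 402: react to input
    changes = player.geometry_changes()         # step 404: from behaviors
    polys = player.generate_polygons(changes)   # step 406: via morphlinks
    frame = player.scan_convert(polys)          # step 408: rasterize
    player.blit_to_display(frame)               # transfer to display memory
    return frame
```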
- FIG. 25 is a flowchart illustrating a method 420 for determining geometry changes in accordance with the invention.
- the player may call any update scripts which may include commands about the current behavior being executed. If there are no other behaviors being executed, the player may execute the idle behavior.
- the player may determine if the update scripts contain any new behaviors and request the download of the new behavior in step 426 if a new behavior is needed.
- the player determines the geometry changes for the character based on the currently executing behavior.
Abstract
Description
- This invention relates generally to a system and method for animating a computer image on a computer display and in particular to a system and method for generating realistic three-dimensional animation of an object over a low bandwidth communications network.
- There are many different techniques for generating an animation of a three dimensional object on a computer display. Originally, the animated figures looked very much like stick figures or block figures since the animation was not very good. In particular, the user would see a block representing the arm move relative to a block representing the forearm. The problem was that there was no skin covering the blocks so that the figure looked unrealistic and not very lifelike. More recently, the figures for animation have improved so that a skin may cover the bones of the figure to provide a more realistic animated figure.
- Some techniques and systems, such as those used to generate three dimensional animation for a movie, are very high-end and expensive. In addition, these high-end three dimensional animations may be viewed by a consumer on a movie screen, for example, but cannot be interacted with in any way. In particular, the consumer may view the three dimensional animations which tell a story such as in a movie but the consumer cannot interact with the animations in any manner. These high-end animation systems are very useful for a movie, but cannot be readily used by the general public due to the costs of the system.
- Other animation systems, such as dedicated game playing systems or personal computers executing a software application, permit the user to interact with the animations. These systems, however, require a large amount of memory for storing the animation data and a fairly state-of-the-art processor or graphics coprocessor in order to produce realistic three dimensional animation. The problem with a dedicated game playing system is that it cannot be used for other computing related tasks and is therefore relatively expensive due to the limited functions that it performs. The problem with most personal computers is that the personal computer is not optimized to produce the animations and therefore usually requires an expensive graphics coprocessor and a sound board. In the above conventional animation systems, the user may interact with the animation during the game play, but the entire game with the graphics and animation is stored on a cartridge or on a hard disk or CD of the personal computer.
- Recently, a number of animation systems have been introduced which harness the Internet or the World Wide Web (the Web) to communicate the animation data to the user. In particular, the user may use a personal computer which is executing a browser software application. The user may direct the browser application to a particular uniform resource locator (URL) of a web site which then may download the animation data from the web site. The problem is that the web site typically downloads the entire animation data so that the amount of animation data downloaded is large. An example of this type of animation system uses the virtual reality markup language (VRML) protocol. For a user with a slow communications link, such as the Internet or the Web and a modem, the large amount of animation data leads to a very slow download. The slow downloading time in turn leads to the consumer waiting a long period of time before viewing the animation. This long period of waiting before the animation appears is not acceptable since people become bored during the waiting period, cancel the animation and thus never see the animation displayed. It is desirable, however, to provide a three dimensional animation system in which the animation data may be downloaded rapidly over a very slow communications link and it is to this end that the present invention is directed.
- The invention provides a three dimensional animation system and method in which the animation data may be downloaded over a relatively slow communications link, such as the Internet or Web and a modem, to a local computer in a relatively short amount of time. The local computer may then execute a downloaded software application to animate the object. The user of the local computer may interact with the animated object (i.e., change its behaviors or actions). In a preferred embodiment, the main portion of the system may reside as a plurality of software applications on a web server and a client computer may access the web server to download the animations. To accomplish the shorter download time, the system may generate an initial animation download package containing the data about the actual object (e.g., the persistent data) and a few basic actions of the object (e.g., behavior data). For example, each downloaded object may have an idle behavior associated with it which is executed any time that the object is not executing another behavior. Then, as additional actions or behaviors or sound tracks for the object are required, the system may stream the behavior data down to the client computer before the action is required so that the behaviors are asynchronously downloaded (i.e., the behaviors for the three dimensional animated object do not need to be downloaded at the same time as the three dimensional animated object is downloaded). In this manner, the client computer may more quickly begin the animation while other yet unneeded actions or behaviors are being downloaded. For example, as the animated object moves through a landscape, the object may cross a bounding box which causes an action that will be needed shortly to be downloaded to the client computer so that the action is available when needed.
The one or more behavior or action files for a particular object may contain information about the movements of the joints in the object which correspond to a particular behavior and any other data necessary to execute the behavior. For example, a head nod behavior may involve the movement of the neck joint. As another example, a sound track for the object, such as saying “Hello”, may involve the movement of the various pieces of the lips and a sound track synchronized to the movement of the lips.
- In accordance with the invention, the total size of each behavior downloaded to the client computer is also relatively small in size so that the download time of the behavior, over a relatively slow communications link, is not too slow. To reduce the size of each behavior, the initial object downloaded to the client computer may include an object tree containing data about each portion of the object. For example, a person would include a leg object. Then, each piece of skin on the object (e.g., each polygon) may include a contribution chart which lists each joint in the object, such as the knee or ankle, and the contributions that the movement of each joint makes to movement of the particular polygon. For example, a polygon near the knee joint would probably be mostly influenced by knee movement while a polygon midway between the knee and ankle would be influenced by the movement of both the ankle and the knee. Thus, when a behavior commands the knee of the object to move, the client computer may easily determine the movement for each particular polygon based on the model of the object and the movement of the joints. Thus, for any downloaded behavior, only the movements of the joints in the object need to be specified in the downloaded behavior file since the movement of each piece of skin on the model may be determined by the model based on the movement of the joints. Thus, if the model has twelve joints and 6000 polygons, the downloaded behavior file may contain data about the movement of the twelve joints whereas the behavior may cause the 6000 polygons on the model to move.
- In accordance with the invention, the system also permits a downloaded behavior to be streamed to the player application residing on the user's computer. In particular, the behavior may start playing on the player application before the entire behavior is downloaded. For example, if a behavior is five minutes long, the user of the player application is not likely to wait 5 minutes for the behavior to be downloaded. Therefore, in accordance with the invention the system downloads a predetermined amount of the behavior (e.g., a two second portion of the behavior data) at a time so that the player application may start executing the first predetermined portion of the behavior data while the second and subsequent portions of the behavior data are downloaded. Thus, a long behavior being downloaded to the player application does not prevent the animated character's behavior from being started.
- The system may also permit the user to interact with the animated objects in various different ways using actions and scripts which include one or more actions or behaviors. For each interaction with the object, there may be an action or behavior associated with that interaction. For example, when the user clicks on an animated door being displayed to the user, a behavior to open the door and show the door slowly opening with a creaking soundtrack may be downloaded to the client computer and executed. On the client computer, the user may see the door slowly open and hear the door creak as it opens. As another example, when the user drags a cursor over an object, such as a gray knife, a behavior to turn the knife red will be downloaded and executed so that the knife turns red when the user places the cursor over it. When the user moves the cursor off of the knife, the knife returns to its idle behavior which is having a gray color.
- To shade the animated object using a lighting model, the invention may use a spherical environmental map. In particular, the pixels resulting from a particular lighting condition are determined for a half-sphere and then the corresponding lighting for an object is determined based on the pixels in the half-sphere. The lighting model for the half-sphere may be downloaded to the client computer so that, for each pixel of the object, the client computer may look up the corresponding point on the half-sphere and apply the pixel value on that portion of the half-sphere to the object. Thus, in accordance with the invention, the system does not attempt to calculate, in real-time, the lighting for an object. The system may also provide a character cache in the client computer that permits the character data to be stored in the client computer so that it does not need to be constantly refreshed. In contrast, in a conventional Web based 3-D system, such as VRML, the character may be stored in the cache of the browser application which is periodically flushed. Thus, in accordance with the invention, a system for animating a character on a computer is provided, comprising a first computer and a second computer connected to the first computer. The first computer may store one or more pieces of data associated with a particular animated character. The pieces of data may include a persistent data file containing one or more of a geometric model of the animated character, a texture associated with the animated character and a sound associated with the animated character. The pieces of data may further comprise one or more behavior files wherein each behavior file contains data about a particular behavior of the animated character and each behavior specifies the movement of the model.
The second computer may initially download the persistent data file from the first computer in order to begin the animation of the animated character on the second computer and then asynchronously download a behavior file from the first computer just prior to the execution of the behavior of the animated character by the second computer.
- FIG. 1 is a block diagram illustrating the three dimensional animation system in accordance with the invention;
- FIG. 2 is a flowchart illustrating a method for downloading three dimensional character files in accordance with the invention;
- FIG. 3 is a diagram illustrating an example of an object hierarchy in accordance with the invention;
- FIGS. 4A and 4B are diagrams illustrating an example of a three dimensional object in accordance with the invention;
- FIG. 5 is a diagram illustrating an articulation of a joint;
- FIG. 6 is a diagram illustrating a morphlink in accordance with the invention;
- FIG. 7 is a diagram illustrating an example of the area of influence of a joint in accordance with the invention;
- FIGS. 8A and 8B are diagrams illustrating an example of the bones within a model in accordance with the invention;
- FIGS. 9 and 10 are diagrams illustrating an example of a model with the bones and polygons in accordance with the invention;
- FIG. 11 is a diagram illustrating an example of a rendered, unlighted model in accordance with the invention;
- FIG. 12 is a diagram illustrating an example of a rendered, lighted model in accordance with the invention;
- FIG. 13 is a diagram illustrating an example of an environmental map lighting model in accordance with the invention;
- FIG. 14 is a diagram illustrating an example of the rendered model shown in FIGS. 11 and 12;
- FIGS. 15 and 16 are diagrams illustrating the bones in a leg model in accordance with the invention;
- FIG. 17 is a diagram illustrating the leg of FIGS. 15 and 16 showing the area of influence for each joint;
- FIG. 18 is a diagram illustrating an example of a behavior in accordance with the invention;
- FIGS. 19A, 19B and 19C are diagrams illustrating another example of a behavior in accordance with the invention;
- FIG. 20 is a flowchart illustrating a method for streaming behaviors in accordance with the invention;
- FIG. 21 is a diagram illustrating the details of the streaming behaviors in accordance with the invention;
- FIG. 22 is a diagram illustrating more details of the streaming behavior shown in FIG. 21;
- FIG. 23 is a diagram illustrating a portion of a streamed behavior in accordance with the invention; and
- FIGS. 24 and 25 are flowcharts illustrating the operation of the player in accordance with the invention.
- The invention is particularly applicable to a Web-based three dimensional animation system and method and it is in this context that the invention will be described. It will be appreciated, however, that the system and method in accordance with the invention has greater utility, such as to other types of three dimensional animation systems including stand-alone computer systems and the like.
- FIG. 1 is a block diagram illustrating a three
dimensional animation system 40 in accordance with the invention. The system 40 may include a character creator 42, a server 44 and one or more client computers (CLIENT #1-CLIENT #N) 46. In this example, the client computers may be connected to the server by a communications network 48 that may include various communication or computer networks such as the Internet, the World Wide Web (the Web), a local area network, a wide area network or other similar communications networks which connect computer systems together. The creator 42 may be used to generate a three dimensional animated character as described below. In a preferred embodiment, the creator 42 may be a software application being executed by a computer system. The creator 42 may be stored on the server 44 or may be executed by a separate computer system as shown in FIG. 1. The creator 42 may generate one or more web files, as described below, which may be downloaded to the server 44 so that each client computer may then download the web files from the server. Each client computer may then interpret the downloaded web files and generate a three dimensional animation based on the web files as described in more detail below. Now, each portion of the system 40 will be described in more detail. - The
creator 42 may accept various information from either user input or other external files in order to generate a three dimensional realistic object or character which may be animated using the system. For example, the creator may receive three dimensional models 50 which may be, for example, wire frame models of an object generated by a third party modeling system. The generation of a three dimensional wire frame model from a three dimensional object is well known and therefore will not be described here. The creator 42 may also receive texture map information 52 which may be used to place a texture over the polygons painted onto the three dimensional object. The texture may provide, for example, a realistic flesh color and texture for the skin of a human character or a realistic set of teeth. The creator 42 may also receive a sound file 54 so that a sound track may be incorporated into a behavior file in accordance with the invention. The sound file may be generated by a third party system which receives a sound and generates a digital representation of the sound which may be incorporated into a file. The creator may also receive a behavior file 56. The behavior file may be combined with any sound file to generate a behavior for a three dimensional animation in accordance with the invention. The behavior file may be generated by the user using a separate software module of the creator 42. The creator 42 may combine the three dimensional model information, the texture information, the sound file and the behavior file into one or more web files which are stored on the server. - In accordance with the invention, the
creator 42 may generate more than one file for each animated object. In particular, the creator may generate a file containing persistent data 58 and then one or more files containing behavior data 60. The persistent data file may be initially downloaded to the client computer to begin the animation of the object and may include one or more of the three dimensional object (including joints and polygons), any textures, any morphlinks as described below, and an idle behavior for the three dimensional object. The idle behavior may be the action or movement of the animated object when no other behavior is being executed. For example, the idle behavior for an animated monster may be that the monster breathes, which causes his chest to expand and contract. - The persistent data file may also include the morphlink data associated with each polygon in the model in accordance with the invention. The model may include one or more joints connected together by one or more bones and a skin of polygons which cover the bones and joints. For each polygon on the model, the morphlink data permits the client computer to easily determine the movement of the polygon based on the movement of the joints of the object. In a conventional animation system, the movement of each polygon must be independently calculated, which is very slow. When a model is generated, the user of the creator defines the morphlinks so that the movement of a particular polygon based on the movement of the joints is determined by the user. Thus, the movement of each polygon relative to movement of the joints of the model is defined in the persistent data. Therefore, when a behavior of the three dimensional animation is executed, the behavior may contain only information about the movement of each joint in the model and the client computer, based on the morphlinks, may determine the movement of each polygon on the model.
Thus, the size of the behavior file downloaded to each client computer in accordance with the invention is reduced, which shortens the download time of each behavior. In addition, the speed with which the three dimensional animation may be animated is increased since the client computer does not need to calculate the movement of each polygon on the three dimensional object or character each time a behavior occurs. The persistent data file 58 may also be compressed to further reduce the size of the persistent data. As an example, due to the compression and the structure of the persistent storage file in accordance with the invention, a persistent storage file may be approximately 10-200 Kb depending on the complexity of the three dimensional animation whereas a typical animation file may be approximately 1 Mb.
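The morphlink idea described above can be sketched in a few lines of code. This is an illustrative sketch only, not the patented implementation: the names (`MORPHLINKS`, `move_polygons`), the polygon and joint labels, and the particular weights are all hypothetical, and the exact on-disk format is not specified by the text.

```python
# Persistent data (downloaded once per character): for each polygon,
# a fixed set of contribution weights from the joints that influence it.
MORPHLINKS = {
    "upper_arm_outer": {"shoulder": 0.6, "elbow": 0.4},
    "hand_back":       {"wrist": 1.0},
}

def move_polygons(joint_moves, morphlinks):
    """Expand a behavior frame (joint movements only) into per-polygon
    movements by summing each joint's weighted contribution per axis."""
    result = {}
    for poly, weights in morphlinks.items():
        result[poly] = tuple(
            sum(w * joint_moves.get(joint, (0, 0, 0))[axis]
                for joint, w in weights.items())
            for axis in range(3)
        )
    return result

# A behavior frame needs to list only the joints that moved this key
# period, which is why behavior files stay small.
frame = {"shoulder": (1.0, 0.0, 0.0), "elbow": (0.0, 1.0, 0.0)}
moved = move_polygons(frame, MORPHLINKS)
```

The key property is that the behavior file's size grows with the number of joints, not the (much larger) number of polygons; the per-polygon work is done once on the client from the weights in the persistent data.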
- The one or more behavior files 60 may each contain a data structure which contains data specifying the movement of each joint in the three dimensional object or character during a behavior and any sound file which is associated with the particular behavior. In accordance with the invention, each different behavior of the three dimensional animation, such as smiling, talking about a particular subject, sitting, etc., may be contained in a separate behavior file. In accordance with the invention, each behavior file may be downloaded to the client computer only when the behavior is required. For example, a behavior to pick up an object from the ground for a game player may only be downloaded when the game player nears an object which may be picked up. As another example, a behavior to say “Good-bye” is only downloaded to the client computer when the user clicks on a good-bye button on the user interface. Thus, the system may download the behavior files asynchronously with the persistent data file. Due to the morphlinks in the persistent data, the size of the behavior files, as described above, is very small. It should be realized, however, that each behavior file is associated only with a particular persistent data file (since the structure of the behavior and the persistent storage are tied together) and therefore a walking behavior for two separate models will be slightly different. The behavior files may be generated by the
creator 42 in response to user input. The behavior files may also be compressed in that the data for any joint which does not move during a predetermined time during the behavior is not downloaded to the client computer. - Once the behavior and persistent data files are generated, they may be downloaded to the
server 44 and stored in a character and behavior storage device 62 which may be a persistent storage device such as a hard disk drive, a tape drive or the like. The server 44 may also include a player store 64 which contains a player web file. The player web file is the software application which is first downloaded to the client computer so that the client computer may then interpret the persistent data file as well as the behavior files. Thus, the server 44 may first download the player software application to the client computer (if necessary) and then, based on user input into a Web user interface application 66, download the persistent data file so that the animation may begin. Then, as behaviors of the three dimensional animation are needed by the client computer, the server may download the appropriate behavior file to be executed by the player on the client computer. - Each
client computer 46 may include a CPU 68, a memory 70 containing one or more software applications to be executed by the CPU 68, a character cache 74 which may reside on a persistent storage device of the client computer and a display 76. The character cache 74 may store the persistent data 58 when it is downloaded to the client computer the first time so that it may not be necessary to download the persistent data again when the particular client computer again wants to view the same animation. Unlike conventional animation systems in which the character data is stored in the cache of the browser application so that the character data is periodically flushed, the system has its own character cache. - The
memory 70 may store a browser application 78 which permits the client computer to interact with the server 44 by specifying a uniform resource locator (URL) of the server in order to receive the web files stored on the server using a hypertext transfer protocol (HTTP). The memory may also store the player software application 64 to interpret the persistent data file and the behavior files and generate the animated object, a character file 82 generated from the persistent data file, a first behavior file 84 containing the idle behavior and a second behavior file 86 that may contain a behavior that will be executed soon. Thus, the character file and the current behavior files are both stored in the memory 70 while being executed by the player. As new behaviors are needed, those behaviors are downloaded to the client computer and one of the old behaviors may be deleted to make room for the new behavior. The number of behaviors stored in the client computer at any time depends on the amount of memory space available for the animation system. The animation generated by the player 64 based on the persistent data and the behavior files may be displayed on the display 76. Now, a method for downloading three dimensional character files in accordance with the invention will be described. - FIG. 2 is a flowchart illustrating a
method 100 for downloading three dimensional character files to a particular client computer in accordance with the invention. The downloading method permits the animation to begin rapidly since the downloading time is reduced due to the asynchronous download of the behavior files and the persistent data file. In step 102, the server may determine if the player application has previously been downloaded to the client computer and may download the player software in step 104 if it has not already been downloaded. Once the player is downloaded to the client computer, the server may download the persistent data file (that may include the character file and an idle behavior file) to the client computer in step 106. The client computer may then create a character cache if one does not exist and store the character file and the idle behavior file in the character cache. Next, the player application is executed by the client computer and the player application may use the persistent data to animate the character and execute the idle behavior in step 108. In accordance with the invention, all of the behaviors for an animated object do not need to be initially downloaded so that the download time for the persistent data is reduced and the animation may begin more quickly. Next, the player application may determine if a new behavior for the animated object is needed in step 110 and continue to execute the idle behavior in step 112 if no new behavior is needed. If a new behavior is needed, then the server may download the new behavior in step 114 in response to a request by the player application. The player application may then determine if the new behavior has finished downloading in step 116 and continue to execute the prior behavior until the new behavior is downloaded. If the new behavior is downloaded, then the player may execute the new behavior in step 118 and return to step 108.
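The control flow of this method can be sketched as follows. This is a minimal illustration of the FIG. 2 steps, not the actual player: the class and method names are hypothetical, and `fetch` stands in for whatever HTTP download mechanism the player uses.

```python
class Player:
    """Sketch of the FIG. 2 download flow (hypothetical names)."""

    def __init__(self, fetch):
        self.fetch = fetch      # callable: file name -> file contents
        self.cache = {}         # stands in for the character cache
        self.current = None     # behavior currently being executed

    def start(self):
        # Steps 106-108: download the persistent data (character plus
        # idle behavior) once, then begin animating with the idle behavior.
        if "persistent" not in self.cache:
            self.cache["persistent"] = self.fetch("persistent")
        self.current = "idle"

    def need_behavior(self, name):
        # Steps 110-118: fetch a new behavior only when it is needed;
        # the prior behavior keeps playing until the download completes
        # (modeled here as a blocking call for brevity).
        if name not in self.cache:
            self.cache[name] = self.fetch(name)
        self.current = name

player = Player(fetch=lambda name: b"...file bytes...")
player.start()
player.need_behavior("wave")
```

In a real client the fetch would be asynchronous, with the idle or prior behavior looping until the new file arrives; the sketch only shows the on-demand caching structure.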
In accordance with the invention, the behaviors are downloaded to the client computer as they are needed so that the start time of the animation is reduced. In addition, since the size of the behavior files is small due to the morphlinks, the total download time for any behavior file is also short. Now, an example of the object hierarchy in the three dimensional animation system will be described. - FIG. 3 is a diagram illustrating an example of an object hierarchy 130 in accordance with the invention. As shown, the object hierarchy may include a tree of objects organized underneath a
root node 132. The objects used in the three dimensional animation system may include a texture object 134, a sound object 136, a geometry object 138 and a behavior object 140. Each of these objects may then include sub-objects as shown for the geometry and behavior objects. In particular, the geometry object 138 may include a polygon object 142 containing the polygons associated with a particular model, a morphlinks object 144 containing the morphlinks associated with each polygon in the model, a particle system object 146 for storing smaller objects such as rocket ship exhaust and a camera object 148 for storing information about the camera position and angle with respect to the model. The geometry may further include additional information about the model as shown in FIG. 4B. - The
behavior object 140 may include a transform object 150 containing movement information for the joints of an object to transform the object, a texture track object 152 containing an animated texture of a model, a sound object 154 containing a sound track associated with a behavior and a script object 156 containing a sequence of behaviors combined together to form a new behavior. For example, a script may be for interacting with the animated character and may include a behavior for each response that the character makes to the user in response to user input. Each animated object may include one or more of the above objects. Now, an example of the object for a particular character/model will be described. - FIGS. 4A and 4B are diagrams illustrating an example of a character object and a three dimensional object associated with that object in accordance with the invention. FIG. 4A is a diagram illustrating a
geometry object 160 for a human character in which the object models the joints of the character. For example, the object shown includes a head, a neck, two elbows, two wrists, hips, two knees and two ankle joints. As described above, the movement of a character is specified by the movement of the joints of the character which may then be turned into movement of the polygons on the model based on the morphlinks. A three dimensional object tree 162, as shown in FIG. 4B, which models this human character has a similar structure. In particular, the object tree 162 may include a WORLD node connected to a CAMERA node and a BODY node. The BODY node may further include various objects modeling the various joints in the body. As shown, the BODY node may include a HIP JOINT node, a LFT KNEE object and a RT KNEE object connected to the HIP JOINT node and a LFT ANKLE and RT ANKLE node connected to the appropriate knee nodes. Each object in the object tree that is connected to an object above it in the tree inherits the attributes of the object above it. For example, any movement of the LFT KNEE object may cause the LFT ANKLE object to inherit the same movement. This inheritance of the knee movement by the ankle provides a good model of the human body. Similarly, if the HIP JOINT object moves, both knee objects and both ankle objects inherit the movement of the HIP JOINT object. Therefore, once the movement of the HIP JOINT object is specified in a behavior, the movements of the knees and ankles caused by the HIP JOINT movement do not need to be specified. Now, the articulation of a joint in accordance with the invention will be described. - FIG. 5 is a diagram illustrating an example of the articulation of joints of a character. In particular, an
arm 170 of a three dimensional character is shown for illustration purposes. The arm may include a shoulder joint 172, an elbow joint 174, a wrist joint 176, an upper arm 178, a lower arm 180 and a hand 182. Similar to a human arm, each joint has six degrees of freedom since each joint may move in a positive or negative X direction, a positive or negative Y direction and/or a positive or negative Z direction. For each predetermined period of time referred to as a key period, each joint may move in each of the six directions. The key period may preferably be 1/10th of a second and the player may interpolate the movement of the joint in each direction in between the key periods to ensure that the motion of the joint is smooth. The key period may be set by the user. As each joint moves, the part of the body near that joint may also move. In a three dimensional animated character in accordance with the invention, the synchronization of the movement of the body part and the polygons which make up the “skin” of the body part with the joints of the character is accomplished by the morphlinks in accordance with the invention that will now be described in more detail. - FIG. 6 is a diagram illustrating a morphlink in accordance with the invention for the
upper arm 178 shown in FIG. 5. In this example, the polygons covering the upper arm may be influenced by both the shoulder joint 172 and the elbow joint 174. For this example, the movement of a polygon 184 on the outside of the upper arm based on the movement of the shoulder joint and the elbow joint will be described. If the movement of the polygon 184 is influenced only by the shoulder joint's movement, the polygon 184 may move to position x1. If the movement of the polygon 184 is influenced only by the movement of the elbow joint, the polygon may move to position x2. To realistically model the movement of the polygon, however, neither of the above positions is accurate and would be perceived by the user viewing the animated character as an aberration. Therefore, the polygon 184 should be influenced by both the shoulder joint and the elbow joint so that the polygon 184 moves to position x3 when the influence of both joints is used. In accordance with the invention, the relationship of each polygon on the character to the joints in the character may be stored in the morphlink data which is stored with the persistent data file. The morphlink data permits a behavior file to specify only the movement of the joints in the character (a small amount of data) and then the player application on the client computer may determine the movement of each polygon of the character based on the morphlink data. The actual influence areas of each joint on particular polygons on the three dimensional object are controlled by the user during the creation of the three dimensional character so that once the character is created, each polygon on the character has a fixed movement relationship with respect to the joints of the character. Now, an example of the area of influence of a joint will be provided. - FIG. 7 is a diagram illustrating an example of the area of influence of a joint in accordance with the invention. In this example, a joint 190 may have an inner area of
influence 192 and an outer area of influence 194. A body part 196 surrounding the joint 190 is shown by the dotted lines. For polygons on the body part within the inner area of influence 192, the joint 190 contributes 100% of its movement to the movement of those polygons. In this example, the movement of polygons A and B has a 100% contribution from the joint 190 so that, for example, if joint 190 moves 1″ in the positive X direction, then polygons A and B also move 1″ in the positive X direction. Then, if another joint moves 1″ in the Y direction and also contributes 100% to the movement of polygons A and B, polygons A and B also move 1″ in the Y direction in addition to the movement in the X direction. For polygons in the outer area of influence 194, the amount of influence on the particular polygon decreases as the polygon is located farther away from the joint. Thus, at the periphery between the inner and outer areas of influence, the contribution is still 100% while the contribution for a polygon at the outside edge of the outer area of influence is 0%. Thus, for polygon C in this example, the joint contributes 50% to the movement of polygon C so that if the joint moves 1″ in the positive X direction, polygon C moves 1/2″ in the positive X direction. For polygons D and E in this example, the joint 190 contributes 0% movement so that the movement of the joint does not affect those polygons. Thus, for each polygon on a three dimensional animated character, the contributions of each joint in the model are set by the user using the creator user interface and then stored in the morphlinks by the system in accordance with the invention. Now, an example of the process for creating a three dimensional animated character in accordance with the invention using the creator will be described. - FIGS. 8A and 8B are diagrams illustrating an example of the bones within a model in accordance with the invention. The bones in the model are generated using a
user interface 200 in the creator application. The user interface 200 may include an object window 202 which lists the objects, such as the geometry and materials (textures) associated with the particular three dimensional character being generated. The user interface 200 may also include a geometry window 204 which lists all of the geometry associated with the particular three dimensional character such as the neck, the left shoulder (ShouldL), the left upper leg (UpLegL) and the like. To help the person creating the character, the geometry is listed in object order so that the associations of objects with other objects and the attributions of the characteristics (such as the association of the upper left arm with the left shoulder) may be viewed by glancing at the geometry window. The geometry window also permits the user to make certain portions of the character invisible, if desired. The user interface 200 may also include a three dimensional character viewing window 206 which shows the user the current view of the character. In this example, only one or more bones of the character 208 and one or more joints 210 are shown by clicking the appropriate locations in the geometry window. The bones and joints of the three dimensional model may be generated by the creator application or by a well known third party piece of software. Once the bones of the character have been laid out and set by the user, the user may place a “skin” of polygons over the bones as will now be described with reference to FIG. 9. - FIGS. 9 and 10 are diagrams illustrating an example of a model with the bones and polygons in accordance with the invention and an example of a character displayed with only the polygons, respectively. Once again, the user may create the three dimensional animated character using the
user interface 200 of the creator. In this figure, the user has selected to have the bones 208 and joints 210 shown in window 206 with a polygon skin 212 placed over the bones and joints. The polygon skin may be generated by the creator application or by a third party piece of software. The polygon skin 212 forms the actual surface seen by a person when viewing the three dimensional animated character along with any texture mapped onto these polygons. FIG. 10 shows the character with just the polygons 212 being displayed to the user in the user interface 200. The character shown in FIG. 10 accurately depicts what the three dimensional animated character may look like with the exception of a texture being placed on the polygons. A three dimensional animated character with a texture placed on the polygons will now be described with reference to FIGS. 11 and 12. - FIG. 11 is a diagram illustrating an example of a rendered,
unlighted character 220 in accordance with the invention within the viewing window 206 while FIG. 12 is a diagram illustrating an example of a rendered, lighted character 230 in accordance with the invention. As shown in FIG. 11, the three dimensional character has been covered with textured polygons so that, for example, the character has long hair, eyebrows, a nose, a mouth with lips and eyes. The textures may be generated by a third party piece of software and then positioned onto the animated character by the user using the creator application. The entire unlighted character 220 is shown in FIG. 14. - In FIG. 12, the same character has a chrome texture and has been lighted with a particular lighting model. To texture the character with the chrome surface, a material window 232 may be displayed which permits the user to select a material/texture. In the example shown, the chrome material was selected. The material window may include a palette 234 of materials which may cover the polygons, such as a body material, a bottom teeth material, a chrome material, a hair material and a top teeth material. The lighting may be applied to the character by specifying a lighting model for the character. An example of a lighting model will now be described with reference to FIG. 13.
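The environment-map lighting technique described with reference to FIG. 13 amounts to a table lookup into a pre-lit sphere image rather than a per-pixel lighting computation. A common way to index such a sphere map is by the surface normal; the patent describes the lookup only in terms of matching locations on the sphere and character, so the normal-based indexing and the function below are an illustrative assumption.

```python
def sphere_map_uv(normal):
    """Map a unit surface normal to (u, v) coordinates in a pre-lit
    sphere image.  Rendering then reads the pixel intensity at (u, v)
    instead of computing lighting for that pixel."""
    nx, ny, _nz = normal
    # Project the normal's X and Y onto the [0, 1] image range.
    return (nx * 0.5 + 0.5, ny * 0.5 + 0.5)

# A surface facing the viewer samples the center of the sphere image;
# a surface facing left samples the image's left edge, so a bright spot
# painted on the sphere's left side appears on left-facing polygons.
center = sphere_map_uv((0.0, 0.0, 1.0))
left_edge = sphere_map_uv((-1.0, 0.0, 0.0))
```

Because the sphere image already encodes the chosen lighting, the cost per rendered pixel is one texture read, which is the speed advantage the text describes over per-pixel lighting.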
- FIG. 13 is a diagram illustrating an example of an environmental
map lighting model 240 in accordance with the invention which may be applied to a three dimensional character such as the one shown in FIG. 12. The environmental map 240 is generated by having the selected lighting choice illuminate the surface of a sphere. Thus, as shown in FIG. 13, the particular lighting model causes the sphere's surface to have a particular appearance made up of individual pixels having different intensities. To transfer the lighting model onto a three dimensional character, the appearance of the character at a particular location is the same as the appearance of the sphere at the same location. For example, the left side of the sphere has a bright spot 242 and the character, shown in FIG. 12, also has a bright spot 244 along the left side of the character. Similarly, the sphere has a light shadow 246 near the top of the sphere and the character 230 has a corresponding light shadow 248. Thus, in accordance with the invention, the lighting model for the character is generated by looking up the appearance of the lighting model at a particular location on a sphere and then mapping the sphere's lighting at that particular location onto the polygons of the animated character at the same location. In accordance with the invention, it is not necessary to calculate the lighting for each pixel of the character. In a typical three dimensional animation system, the lighting model may be transferred onto the character by calculating the lighting of each pixel on the character, which is a slow process. Now, an example of a morphlink associated with the leg of a character will be described. - FIGS. 15 and 16 are diagrams illustrating an example of the bones in a leg and the morphlink associated with the leg in accordance with the invention. In particular, FIG. 15 shows the
display window 206 of the creator with a bottom half 250 of a character. In this example, a hip joint 252, a knee joint 254 and an ankle joint 256 are shown inside of the character. Each pixel of each polygon forming the “skin” of the character may then have its motion determined by the creator of the character by setting the morphlinks that associate a particular pixel's movement with the contributions from each joint in the character. - FIG. 16 illustrates the
user interface 200 with the main window 202, the geometry window 204 and the display window 206. When viewing the morphlinks in the character, the user interface may also include a morphlink window 258 which contains a list of each polygon in the character along with the contributions of each joint's movement to the movement of that polygon. In this example, the right leg has moved upwards and a polygon 260 has been selected. The morphlink window 258 may then highlight the selected polygon (Body Geo Vert 188 in this example) along with the contribution from the knee joint which is 77.62% in this example. Thus, if the knee moves 10″ in the positive X direction, the selected polygon moves 7.762″ in the positive X direction. As described above, each polygon may be influenced by one or more joints and the contributions of each joint are added together to determine the movement of that polygon. The morphlink permits a downloaded behavior file to specify only the movement of each joint and then the player on the client computer may determine the movement of each polygon based on the movement of the joints and the morphlinks. Now, the areas of influence of a joint will be described in more detail using the character shown in FIGS. 15 and 16. - FIG. 17 is a diagram illustrating the
character 250 of FIGS. 15 and 16 showing the area of influence for each joint. The areas of influence for each joint may include an inner region 270 and an outer region 272. During the creation of a character, the user of the creator may adjust these inner and outer regions, which adjusts the influence of that joint. In this example, the areas of influence for the hip joint 252, the knee joint 254 and the ankle joint 256 are shown. The details about the areas of influence are described above and will not be described here. Now, an example of a behavior will be described. - FIG. 18 is a diagram illustrating an example of a
behavior 280 in accordance with the invention which may be downloaded to a client computer. As shown, the structure of the behavior may follow the structure of the objects within a character. In particular, the behavior may include a root node 282, a dance-hips node 284 specifying the movement of the hips of the character during the dance behavior, and dance-rt knee and dance-lft knee nodes. - For purposes of illustration, the dance-hips object 284 will be described in more detail, although each object in the behavior may have a similar data structure even if each object moves in a different manner. Thus, for each key period (shown in FIG. 18 as 0.1 second intervals), the object 284 may specify the movement of the object in the three dimensions (X, Y, Z). As shown, the movement in each dimension may change during each key period. Between the key periods, the player may use interpolation to ensure a smooth movement. Another example of a behavior will now be described. - FIGS. 19A, 19B and 19C are diagrams illustrating another example of a behavior in accordance with the invention. In this example, the behavior is having a character's mouth form the syllable “Ah” and a corresponding sound track being played to simulate the character speaking the syllable “Ah”. FIG. 19A shows a structure 290 for the behavior which may include one or
more time frames 292 to ensure synchronization of the character with the behavior at each key period. Following each time frame is the data about the movement of each object in the character during that key period, including the position change of each object, the orientation change of each object and the scale change of each object. FIG. 19B illustrates a behavior object 294 for the “Ah” behavior including links to each object (joint) in the mouth that is moved during the behavior. FIG. 19C illustrates a sound track 296 associated with the “Ah” behavior. In accordance with the invention, a behavior may also include a plurality of behaviors which may be selected based on various factors including a user's actions. Thus, the invention permits a class of behaviors to be downloaded to the player at one time. Now, a method for streaming behaviors in accordance with the invention will be described. - In accordance with the invention, it is desirable to begin the animation of the animated character, including its behaviors, as soon as possible so that the user of the player application may begin viewing the animation as soon as possible. When a large behavior file needs to be downloaded to the player application, it is possible with conventional animation systems that the animation of the animated character is delayed, which is not acceptable. For example, if a behavior contains a sound track, the sound data may be large enough that the behavior takes as long to download as it does to play. On a modem that transmits 28,800 bits per second, even compressed sound data requires 13,000 bits per second, and the behavior key frames require the rest of the bandwidth. It is not reasonable to make a user wait five minutes for a five minute long behavior to download before it is played.
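The player-side geometry mechanisms described above (joint keyframes sampled at fixed key periods, interpolation between keys, and morphlink weights mapping joint movement onto polygons) can be sketched in Python. The data layout and function names below are illustrative assumptions, not taken from the patent:

```python
# Sketch of the geometry pipeline described above: joint keyframes at
# fixed key periods, linear interpolation between keys, and morphlink
# weights turning joint movement into polygon movement.  The data
# layout and names are illustrative.

KEY_INTERVAL = 0.1  # seconds between key periods, as in the FIG. 18 example

def joint_movement(keys, t):
    """keys: per-key-period (x, y, z) movements; linearly interpolate at t."""
    i = int(t / KEY_INTERVAL)
    if i >= len(keys) - 1:
        return keys[-1]
    frac = (t - i * KEY_INTERVAL) / KEY_INTERVAL
    a, b = keys[i], keys[i + 1]
    return tuple(av + frac * (bv - av) for av, bv in zip(a, b))

def apply_morphlinks(joint_moves, morphlinks):
    """morphlinks: {vertex: [(joint, weight), ...]}; each vertex moves by
    the weighted sum of the movements of the joints that influence it."""
    moved = {}
    for vertex, links in morphlinks.items():
        totals = [0.0, 0.0, 0.0]
        for joint, weight in links:
            for axis, value in enumerate(joint_moves[joint]):
                totals[axis] += weight * value
        moved[vertex] = tuple(totals)
    return moved

# The example from the text: the knee contributes 77.62% to the selected
# polygon, so a 10-inch knee movement in +X moves it 7.762 inches.
moved = apply_morphlinks(
    {"knee": (10.0, 0.0, 0.0)},
    {"Body Geo Vert 188": [("knee", 0.7762)]},
)
```

A vertex influenced by several joints would simply list several (joint, weight) pairs, and the contributions add, as described in the text.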
In accordance with the invention, the animation system, and in particular the player application, may start playing a behavior while the behavior is still being downloaded to the player application, which may be known as behavior streaming.
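As a sketch of this idea, a long behavior can be cut into fixed-length chunks so that playback begins after the first chunk while the remaining chunks download in the background. The ten-second chunk length matches the preferred embodiment described in the text; the function name is an illustrative assumption:

```python
# Sketch of behavior streaming: split a behavior into fixed-length
# chunks so playback can begin after the first chunk while the rest
# download in the background.  The ten-second chunk length follows the
# preferred embodiment in the text; names are illustrative.

def plan_chunks(total_seconds, chunk_seconds=10.0):
    """Return (start_time, length) pairs covering the whole behavior."""
    chunks, start = [], 0.0
    while start < total_seconds:
        length = min(chunk_seconds, total_seconds - start)
        chunks.append((start, length))
        start += length
    return chunks

# A five-minute (300 s) behavior becomes thirty ten-second chunks, and
# the user waits only for the first one before the behavior starts.
chunks = plan_chunks(300.0)
```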
- FIG. 20 is a flowchart illustrating a method 300 for streaming behaviors in accordance with the invention. In
step 302, the user may select a behavior and/or start the execution of a behavior by, for example, clicking on a button or an icon on a Web page. In step 304, the player application determines if the behavior is a streaming behavior, since each behavior associated with an animated character may be identified as a streaming behavior or not. A behavior may be identified as a streaming behavior when the behavior requires more than a predetermined download time, such as when the behavior includes sound data. If the behavior is not a streaming behavior, the method is completed. If the behavior is a streaming behavior, the player application may download the behavior objects, as described below, in step 306 and downloads the first predetermined-size chunk of the behavior data in step 308. In a preferred embodiment, the chunk of behavior data may be sufficient data for ten seconds of the behavior. The rest of the behavior data may be downloaded asynchronously as the player application is executing the previously downloaded chunk of the behavior data. For a behavior that lasts a total of five minutes, the streaming behavior begins playing after only ten seconds and the rest of the behavior data may be downloaded as the behavior is being executed. In accordance with the invention, the chunk being downloaded to the player application may always be a predetermined number of seconds (ten seconds in a preferred embodiment) ahead of the currently playing portion of the behavior. - In a preferred embodiment, the predetermined chunk of behavior data downloaded before starting the behavior may be ten seconds of behavior time. The downloading of the rest of the streaming behavior data may also be ten seconds ahead of the currently playing behavior data. The ten second time takes into account that the Internet sometimes suffers congestion and therefore is delayed in delivering the behavior data.
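The read-ahead policy in FIG. 20 (keep the download roughly ten seconds ahead of playback, pausing when it runs more than twenty seconds ahead) amounts to simple hysteresis. A sketch, with an illustrative boolean interface that is not from the patent:

```python
# Sketch of the FIG. 20 read-ahead policy: downloading pauses once the
# buffered data is more than `hi` seconds ahead of playback and resumes
# once the lead falls under `lo` seconds.  Names and the boolean
# interface are illustrative assumptions.

def download_allowed(downloaded_until, playhead, was_downloading,
                     hi=20.0, lo=10.0):
    ahead = downloaded_until - playhead
    if was_downloading:
        return ahead <= hi    # keep fetching until too far ahead
    return ahead < lo         # stay paused until the buffer drains

# With 25 s buffered and playback at 0 s, downloads pause; once playback
# reaches 16 s (9 s of lead remaining), downloads resume.
```

The two thresholds keep the downloader from oscillating: it does not resume the moment it dips under twenty seconds of lead, but only once the lead is back under ten.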
If the player application started playing the streaming behavior as soon as it was downloaded, the user might experience interruptions in the animation. Thus, the ten second interval provides a buffer of data so that the system has some ability to compensate for when the Internet is congested. For the streaming of the behavior data to work, the behavior data must be compressed so that the behavior data may be downloaded from the Internet at least as fast as the player application plays the behavior data. For example, if the user uses a modem to download the behavior data, the behavior data is highly compressed so that it requires less than one second to download one second's worth of behavior data.
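The compression requirement stated here (a second of behavior data must download in under a second) reduces to a bandwidth comparison. A sketch using the modem figures from the text; the function name is illustrative:

```python
# Sketch of the streaming feasibility condition described above: one
# second of behavior data must download in less than one second, i.e.
# the behavior's data rate must not exceed the link rate.

def can_stream(behavior_bits_per_second, link_bits_per_second):
    return behavior_bits_per_second <= link_bits_per_second

# The modem example from the text: on a 28,800 bit/s link, 13,000 bit/s
# of compressed sound leaves 15,800 bit/s for the behavior keyframes.
SOUND_BPS, LINK_BPS = 13_000, 28_800
keyframe_budget = LINK_BPS - SOUND_BPS
```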
- Returning to the flowchart, after the initial chunk of behavior data is downloaded, the player application may execute the behavior in
step 310. While the initial chunk of behavior is being executed, the player application may determine if the downloaded behavior data is more than a predetermined number of seconds (twenty seconds in a preferred embodiment) ahead of the behavior data currently being played in step 312. If the downloaded behavior data is more than twenty seconds ahead of the currently played behavior data, the player application may stop further downloads in step 314 until the new behavior data is less than a predetermined time (ten seconds in a preferred embodiment) ahead of the currently playing behavior data. Then, in step 316, the player application may download the next chunk of behavior data. In step 318, the player application may determine if there is more behavior data to download and either loop back to step 308 to download a next chunk of behavior data or complete the method if there is no more behavior data to be downloaded. Now, the streaming behavior files in accordance with the invention will be described. - FIG. 21 is a diagram illustrating a
stream file 330 in accordance with the invention. In particular, the creator described above may generate special files for streaming behaviors known as stream files. The stream files 330 may include behavior objects 332 and one or more chunks of key frame and sound data 334. The behavior objects may include all of the data that describes the components of the behavior, such as the geometry and the movement of each joint in the animated character during the behavior. The behavior objects do not contain any key frames of behavior data, such as sound data. Each of the one or more chunks of behavior data 334 may contain a predetermined amount of behavior data (such as two seconds of behavior data in a preferred embodiment). For example, the first chunk of behavior data may contain all of the keyframes from all of the behavior tracks which occur during the first predetermined time interval, along with the sound data that plays during the first predetermined time interval of the behavior. Each chunk of behavior data starts at the same time as the corresponding time in the behavior so that the initial chunk starts at the start of the behavior. Now, an example of the structure for each chunk of behavior data will be described. - FIG. 22 is a diagram illustrating an example of the structure of each chunk of
behavior data 334. Each chunk 334 may be divided into one or more tracks 336 wherein each track contains data about a particular portion of the animated character. In this example, there is a head track 338, a torso track 340 and a knee track 342 as shown in FIG. 22, which are respectively associated with the head, torso and knee of the animated character geometry. As shown, the chunk also includes the sound data 344 for that particular portion of the behavior data. - FIG. 23 is a diagram illustrating more details of the
chunk 334 of behavior data. In particular, the chunk of behavior data includes the tracks 336 as well as a timestamp 350. The timestamp may indicate the time of the chunk within the behavior, such as that the chunk covers time t to time t+2 seconds of the behavior. Each track may include a track identifier 352 that identifies the behavior object with which the particular track is associated, a length of data field 354 (len) indicating the length of the data in the track and a data field 356 containing the keyframe and behavior data for that track. Each chunk 334 may end with a track identification of zero (id=0) indicating the end of the particular chunk of behavior data. Now, the operation of the player will be described. - FIGS. 24 and 25 are flowcharts illustrating the operation of the player in accordance with the invention. FIG. 24 is a flowchart illustrating the
operation 400 of the player for each frame of images being displayed to the user of the client computer. In step 402, the player may read any user inputs and react according to those user inputs. Next, the player may determine any geometry changes in the character in step 404 based on any currently executing behaviors. Next, the player may generate the polygons of the character in step 406 based on the above determined changes in the geometry and the morphlinks as described above. Once the polygons are generated, the data about the polygons of the character may be scan converted in step 408 so that the character and the polygons may be displayed on a display screen. In step 410, the character may be transferred to the display memory, such as by using a well known BLTBLK routine, so that it may be displayed to the user. Now, the details about determining the geometry changes will be described. - FIG. 25 is a flowchart illustrating a
method 420 for determining geometry changes in accordance with the invention. In step 422, the player may call any update scripts which may include commands about the current behavior being executed. If there are no other behaviors being executed, the player may execute the idle behavior. In step 424, the player may determine if the update scripts contain any new behaviors and request the download of the new behavior in step 426 if a new behavior is needed. Next, in step 428, the player determines the geometry changes for the character based on the currently executing behavior. - While the foregoing has been with reference to a particular embodiment of the invention, it will be appreciated by those skilled in the art that changes in this embodiment may be made without departing from the principles and spirit of the invention, the scope of which is defined by the appended claims.
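The chunk layout of FIG. 23 (a timestamp, then repeated track records of identifier, length and data, terminated by a zero track identifier) could be serialized and parsed as in the following sketch. The 32-bit little-endian field widths and the numeric track identifiers are assumptions for illustration; the patent does not fix a byte-level encoding:

```python
import struct

# Sketch of one chunk of behavior data laid out as in FIG. 23: a
# timestamp, then repeated (track id, data length, data) records,
# terminated by a track id of zero.  The 32-bit little-endian field
# widths are an assumption, not specified by the patent.

def build_chunk(timestamp, tracks):
    out = struct.pack("<I", timestamp)               # time of chunk in behavior
    for track_id, data in tracks:
        out += struct.pack("<II", track_id, len(data)) + data
    return out + struct.pack("<I", 0)                # id=0 ends the chunk

def parse_chunk(buf):
    timestamp, = struct.unpack_from("<I", buf, 0)
    offset, tracks = 4, []
    while True:
        track_id, = struct.unpack_from("<I", buf, offset)
        offset += 4
        if track_id == 0:                            # end-of-chunk marker
            break
        length, = struct.unpack_from("<I", buf, offset)  # the len field
        offset += 4
        data = buf[offset:offset + length]           # keyframe/sound data
        offset += length
        tracks.append((track_id, data))
    return timestamp, tracks

# Round trip with two illustrative tracks (ids chosen arbitrarily).
ts, parsed = parse_chunk(build_chunk(2, [(338, b"head"), (340, b"torso")]))
```

A self-delimiting layout like this lets the player consume each chunk as it arrives without knowing the track count in advance, which is what the streaming method requires.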
Claims (64)
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US10/357,672 US20030137516A1 (en) | 1999-06-11 | 2003-02-03 | Three dimensional animation system and method |
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US09/330,681 US6559845B1 (en) | 1999-06-11 | 1999-06-11 | Three dimensional animation system and method |
US10/357,672 US20030137516A1 (en) | 1999-06-11 | 2003-02-03 | Three dimensional animation system and method |
Related Parent Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US09/330,681 Continuation US6559845B1 (en) | 1999-06-11 | 1999-06-11 | Three dimensional animation system and method |
Publications (1)
Publication Number | Publication Date |
---|---|
US20030137516A1 true US20030137516A1 (en) | 2003-07-24 |
Family
ID=23290838
Family Applications (2)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US09/330,681 Expired - Lifetime US6559845B1 (en) | 1999-06-11 | 1999-06-11 | Three dimensional animation system and method |
US10/357,672 Abandoned US20030137516A1 (en) | 1999-06-11 | 2003-02-03 | Three dimensional animation system and method |
Family Applications Before (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US09/330,681 Expired - Lifetime US6559845B1 (en) | 1999-06-11 | 1999-06-11 | Three dimensional animation system and method |
Country Status (13)
Country | Link |
---|---|
US (2) | US6559845B1 (en) |
EP (2) | EP1059614A2 (en) |
JP (3) | JP3566909B2 (en) |
KR (1) | KR100496718B1 (en) |
CN (1) | CN100342368C (en) |
AU (2) | AU783271B2 (en) |
BR (1) | BR0009614A (en) |
CA (1) | CA2303548C (en) |
MX (1) | MXPA00003211A (en) |
NZ (1) | NZ503625A (en) |
SG (1) | SG101932A1 (en) |
TW (1) | TW462032B (en) |
WO (1) | WO2000077742A1 (en) |
Cited By (13)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US20040257338A1 (en) * | 2003-04-28 | 2004-12-23 | Snecma Moteurs | Graphical interface system |
US20050104886A1 (en) * | 2003-11-14 | 2005-05-19 | Sumita Rao | System and method for sequencing media objects |
US20050190188A1 (en) * | 2004-01-30 | 2005-09-01 | Ntt Docomo, Inc. | Portable communication terminal and program |
US20060029913A1 (en) * | 2004-08-06 | 2006-02-09 | John Alfieri | Alphabet based choreography method and system |
US20060109274A1 (en) * | 2004-10-28 | 2006-05-25 | Accelerated Pictures, Llc | Client/server-based animation software, systems and methods |
US20080028312A1 (en) * | 2006-07-28 | 2008-01-31 | Accelerated Pictures, Inc. | Scene organization in computer-assisted filmmaking |
US20080024615A1 (en) * | 2006-07-28 | 2008-01-31 | Accelerated Pictures, Inc. | Camera control |
US7827034B1 (en) * | 2002-11-27 | 2010-11-02 | Totalsynch, Llc | Text-derived speech animation tool |
US20120021827A1 (en) * | 2010-02-25 | 2012-01-26 | Valve Corporation | Multi-dimensional video game world data recorder |
US20120028706A1 (en) * | 2010-02-24 | 2012-02-02 | Valve Corporation | Compositing multiple scene shots into a video game clip |
US20120028707A1 (en) * | 2010-02-24 | 2012-02-02 | Valve Corporation | Game animations with multi-dimensional video game data |
CN102934145A (en) * | 2010-05-10 | 2013-02-13 | 史克威尔·艾尼克斯有限公司 | Image processing device, image processing method, and image processing program |
CN105321195A (en) * | 2015-07-02 | 2016-02-10 | 苏州蜗牛数字科技股份有限公司 | Making method for whip system in 3D game |
Families Citing this family (49)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US6947044B1 (en) * | 1999-05-21 | 2005-09-20 | Kulas Charles J | Creation and playback of computer-generated productions using script-controlled rendering engines |
US6559845B1 (en) * | 1999-06-11 | 2003-05-06 | Pulse Entertainment | Three dimensional animation system and method |
KR20000037456A (en) * | 2000-04-25 | 2000-07-05 | 박기현 | A system and method for character animation through computer network |
US7159008B1 (en) | 2000-06-30 | 2007-01-02 | Immersion Corporation | Chat interface with haptic feedback functionality |
US20020059624A1 (en) * | 2000-08-03 | 2002-05-16 | Kazuhiro Machida | Server based broadcast system, apparatus and method and recording medium and software program relating to this system |
US20040056878A1 (en) * | 2001-01-30 | 2004-03-25 | Lau Johnny Sya Chung | Digital assistants |
ITMI20010538A1 (en) * | 2001-03-14 | 2002-09-14 | Phoenix Tools S R L | SYSTEM FOR CREATING THE VISUALIZATION AND MANAGEMENT OF THREE-DIMENSIONAL OBJECTS ON WEB PAGES AND RELATED METHOD |
US20030067466A1 (en) * | 2001-10-09 | 2003-04-10 | Eastman Kodak Company | Method for using an animated insert which enhances the value of images in infoimaging |
KR100443553B1 (en) * | 2002-12-06 | 2004-08-09 | 한국전자통신연구원 | Method and system for controlling virtual character and recording medium |
JP2004295541A (en) * | 2003-03-27 | 2004-10-21 | Victor Co Of Japan Ltd | Image formation program and image reproduction program |
US7818658B2 (en) * | 2003-12-09 | 2010-10-19 | Yi-Chih Chen | Multimedia presentation system |
US7548243B2 (en) * | 2004-03-26 | 2009-06-16 | Pixar | Dynamic scene descriptor method and apparatus |
US7330185B2 (en) | 2004-05-10 | 2008-02-12 | Pixar | Techniques for processing complex scenes |
US7532212B2 (en) | 2004-05-10 | 2009-05-12 | Pixar | Techniques for rendering complex scenes |
US20050248573A1 (en) * | 2004-05-10 | 2005-11-10 | Pixar | Storing intra-model dependency information |
US7064761B2 (en) * | 2004-05-10 | 2006-06-20 | Pixar | Techniques for animating complex scenes |
US7683904B2 (en) * | 2004-05-17 | 2010-03-23 | Pixar | Manual component asset change isolation methods and apparatus |
CN100410923C (en) * | 2005-04-20 | 2008-08-13 | 文化传信科技(澳门)有限公司 | Multimedia transmitting method and system |
US20070291776A1 (en) * | 2005-07-28 | 2007-12-20 | Dilithium Networks, Inc. | Method and apparatus for billing for media during communications in channel-based media telecommunication protocols |
US20070291106A1 (en) * | 2005-07-28 | 2007-12-20 | Dilithium Networks, Inc. | Method and apparatus for providing interactive media during communication in channel-based media telecommunication protocols |
US9883028B2 (en) * | 2005-07-28 | 2018-01-30 | Onmobile Global Limited | Method and apparatus for providing interactive media during communication in channel-based media telecommunication protocols |
US7460730B2 (en) * | 2005-08-04 | 2008-12-02 | Microsoft Corporation | Video registration and image sequence stitching |
US8134552B2 (en) * | 2005-09-23 | 2012-03-13 | Samsung Electronics Co., Ltd. | Method, apparatus, and medium for efficiently rendering 3D object |
KR100727034B1 (en) * | 2005-12-09 | 2007-06-12 | 한국전자통신연구원 | Method for representing and animating 2d humanoid character in 3d space |
US7856125B2 (en) * | 2006-01-31 | 2010-12-21 | University Of Southern California | 3D face reconstruction from 2D images |
TWI322392B (en) * | 2006-12-14 | 2010-03-21 | Inst Information Industry | Apparatus, method, application program, and computer readable medium thereof capable of pre-storing data for generating self-shadow of a 3d object |
KR100868475B1 (en) * | 2007-02-16 | 2008-11-12 | 한국전자통신연구원 | Method for creating, editing, and reproducing multi-object audio contents files for object-based audio service, and method for creating audio presets |
US8315652B2 (en) * | 2007-05-18 | 2012-11-20 | Immersion Corporation | Haptically enabled messaging |
JP4519883B2 (en) * | 2007-06-01 | 2010-08-04 | 株式会社コナミデジタルエンタテインメント | Character display device, character display method, and program |
WO2009067560A1 (en) * | 2007-11-20 | 2009-05-28 | Big Stage Entertainment, Inc. | Systems and methods for generating 3d head models and for using the same |
CN101188025B (en) * | 2007-11-30 | 2010-08-11 | 电子科技大学 | A high-efficiency real time group animation system |
US8106910B2 (en) * | 2008-03-28 | 2012-01-31 | Vldimir Pugach | Method for correct reproduction of moving spatial images on a flat screen |
US8373704B1 (en) * | 2008-08-25 | 2013-02-12 | Adobe Systems Incorporated | Systems and methods for facilitating object movement using object component relationship markers |
US8683429B2 (en) | 2008-08-25 | 2014-03-25 | Adobe Systems Incorporated | Systems and methods for runtime control of hierarchical objects |
US20100088624A1 (en) * | 2008-10-03 | 2010-04-08 | The Provost, Fellows And Scholars Of The College Of The Holy And Undivided Trinity Of Queen Elizabe | Animation tool |
US20110293144A1 (en) * | 2009-02-02 | 2011-12-01 | Agency For Science, Technology And Research | Method and System for Rendering an Entertainment Animation |
JP5582135B2 (en) * | 2009-02-18 | 2014-09-03 | 日本電気株式会社 | OPERATION OBJECT CONTROL DEVICE, OPERATION OBJECT CONTROL SYSTEM, OPERATION OBJECT CONTROL METHOD, AND PROGRAM |
US8624898B1 (en) | 2009-03-09 | 2014-01-07 | Pixar | Typed dependency graphs |
JP5375696B2 (en) * | 2010-03-19 | 2013-12-25 | ブラザー工業株式会社 | Distribution system, terminal device, distribution method, and program |
CN102447688A (en) * | 2010-10-15 | 2012-05-09 | 盛绩信息技术(上海)有限公司 | Webpage game resource accelerator and acceleration method |
CN102609553B (en) * | 2011-01-25 | 2013-12-18 | 上海创翼动漫科技有限公司 | Method for drawing computer-assisted animation modeling |
US20130120378A1 (en) * | 2011-11-15 | 2013-05-16 | Trimble Navigation Limited | Progressively providing software components for browser-based 3d modeling |
NL2008437C2 (en) * | 2012-01-19 | 2013-07-22 | Clinical Graphics B V | Process to generate a computer-accessible medium comprising information on the functioning of a joint. |
JP5756969B2 (en) * | 2012-02-25 | 2015-07-29 | 株式会社クロダアンドパートナーズ | Method, system, server device, terminal device, and program for distributing data constituting three-dimensional figure |
CN102663795B (en) * | 2012-04-06 | 2014-11-19 | 谌方琦 | 2.5D character animation realization method based on webpage and system thereof |
WO2013186593A1 (en) * | 2012-06-14 | 2013-12-19 | Nokia Corporation | Audio capture apparatus |
KR101615371B1 (en) * | 2014-12-31 | 2016-05-12 | 에이알모드커뮤니케이션(주) | 3D Animation production methods |
US9786032B2 (en) * | 2015-07-28 | 2017-10-10 | Google Inc. | System for parametric generation of custom scalable animated characters on the web |
CN106201559A (en) * | 2016-08-24 | 2016-12-07 | 合肥凌翔信息科技有限公司 | A kind of graphical programming software |
Citations (79)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US3174851A (en) * | 1961-12-01 | 1965-03-23 | William J Buehler | Nickel-base alloys |
US3351463A (en) * | 1965-08-20 | 1967-11-07 | Alexander G Rozner | High strength nickel-base alloys |
US3753700A (en) * | 1970-07-02 | 1973-08-21 | Raychem Corp | Heat recoverable alloy |
US4655771A (en) * | 1982-04-30 | 1987-04-07 | Shepherd Patents S.A. | Prosthesis comprising an expansible or contractile tubular body |
US4850980A (en) * | 1987-12-04 | 1989-07-25 | Fisher Scientific Company | I.V. pump cassette |
US4865516A (en) * | 1987-05-13 | 1989-09-12 | Focke & Co., Gmbh | Rotatable elevating carrier for a palletizer |
US4994069A (en) * | 1988-11-02 | 1991-02-19 | Target Therapeutics | Vaso-occlusion coil and method |
US5151105A (en) * | 1991-10-07 | 1992-09-29 | Kwan Gett Clifford | Collapsible vessel sleeve implant |
US5176661A (en) * | 1988-09-06 | 1993-01-05 | Advanced Cardiovascular Systems, Inc. | Composite vascular catheter |
US5203772A (en) * | 1989-01-09 | 1993-04-20 | Pilot Cardiovascular Systems, Inc. | Steerable medical device |
US5234437A (en) * | 1991-12-12 | 1993-08-10 | Target Therapeutics, Inc. | Detachable pusher-vasoocclusion coil assembly with threaded coupling |
US5250071A (en) * | 1992-09-22 | 1993-10-05 | Target Therapeutics, Inc. | Detachable embolic coil assembly using interlocking clasps and method of use |
US5261916A (en) * | 1991-12-12 | 1993-11-16 | Target Therapeutics | Detachable pusher-vasoocclusive coil assembly with interlocking ball and keyway coupling |
US5290552A (en) * | 1988-05-02 | 1994-03-01 | Matrix Pharmaceutical, Inc./Project Hear | Surgical adhesive material |
US5304195A (en) * | 1991-12-12 | 1994-04-19 | Target Therapeutics, Inc. | Detachable pusher-vasoocclusive coil assembly with interlocking coupling |
US5304194A (en) * | 1991-10-02 | 1994-04-19 | Target Therapeutics | Vasoocclusion coil with attached fibrous element(s) |
US5312415A (en) * | 1992-09-22 | 1994-05-17 | Target Therapeutics, Inc. | Assembly for placement of embolic coils using frictional placement |
US5350397A (en) * | 1992-11-13 | 1994-09-27 | Target Therapeutics, Inc. | Axially detachable embolic coil assembly |
US5363861A (en) * | 1991-11-08 | 1994-11-15 | Ep Technologies, Inc. | Electrode tip assembly with variable resistance to bending |
US5405379A (en) * | 1990-07-26 | 1995-04-11 | Lane; Rodney J. | Self expanding vascular endoprosthesis for aneurysms |
US5423829A (en) * | 1993-11-03 | 1995-06-13 | Target Therapeutics, Inc. | Electrolytically severable joint for endovascular embolic devices |
US5562619A (en) * | 1993-08-19 | 1996-10-08 | Boston Scientific Corporation | Deflectable catheter |
US5622836A (en) * | 1989-05-24 | 1997-04-22 | The University Of Sydney | Monoclonal antibodies which recognize malignant cells from bladder carcinomas |
US5624461A (en) * | 1995-06-06 | 1997-04-29 | Target Therapeutics, Inc. | Three dimensional in-filling vaso-occlusive coils |
US5639277A (en) * | 1995-04-28 | 1997-06-17 | Target Therapeutics, Inc. | Embolic coils with offset helical and twisted helical shapes |
US5649940A (en) * | 1994-09-28 | 1997-07-22 | Innovasive Devices, Inc. | Suture tensioning device |
US5690666A (en) * | 1992-11-18 | 1997-11-25 | Target Therapeutics, Inc. | Ultrasoft embolism coils and process for using them |
US5696892A (en) * | 1992-07-10 | 1997-12-09 | The Walt Disney Company | Method and apparatus for providing animation in a three-dimensional computer generated virtual world using a succession of textures derived from temporally related source images |
US5713917A (en) * | 1995-10-30 | 1998-02-03 | Leonhardt; Howard J. | Apparatus and method for engrafting a blood vessel |
US5792154A (en) * | 1996-04-10 | 1998-08-11 | Target Therapeutics, Inc. | Soft-ended fibered micro vaso-occlusive devices |
US5826597A (en) * | 1997-09-10 | 1998-10-27 | Chou; Chi-Hsiung | Hair band made with two differently colored pieces |
US5843118A (en) * | 1995-12-04 | 1998-12-01 | Target Therapeutics, Inc. | Fibered micro vaso-occlusive devices |
US5870683A (en) * | 1996-09-18 | 1999-02-09 | Nokia Mobile Phones Limited | Mobile station having method and apparatus for displaying user-selectable animation sequence |
US5880731A (en) * | 1995-12-14 | 1999-03-09 | Microsoft Corporation | Use of avatars with automatic gesturing and bounded interaction in on-line chat session |
US5889951A (en) * | 1996-05-13 | 1999-03-30 | Viewpoint Corporation | Systems, methods, and computer program products for accessing, leasing, relocating, constructing and modifying internet sites within a multi-dimensional virtual reality environment |
US5893647A (en) * | 1996-03-15 | 1999-04-13 | Isel Co., Ltd. | Bearing retainer for a sliding mechanism for use in a machine tool |
US5909218A (en) * | 1996-04-25 | 1999-06-01 | Matsushita Electric Industrial Co., Ltd. | Transmitter-receiver of three-dimensional skeleton structure motions and method thereof |
US5912675A (en) * | 1996-12-19 | 1999-06-15 | Avid Technology, Inc. | System and method using bounding volumes for assigning vertices of envelopes to skeleton elements in an animation system |
US5925038A (en) * | 1996-01-19 | 1999-07-20 | Ep Technologies, Inc. | Expandable-collapsible electrode structures for capacitive coupling to tissue |
US5936633A (en) * | 1996-07-23 | 1999-08-10 | International Business Machines Corporation | Rendering method and apparatus, and method and apparatus for smoothing intensity-value |
US5974238A (en) * | 1996-08-07 | 1999-10-26 | Compaq Computer Corporation | Automatic data synchronization between a handheld and a host computer using pseudo cache including tags and logical data elements |
US5983190A (en) * | 1997-05-19 | 1999-11-09 | Microsoft Corporation | Client server animation system for managing interactive user interface characters |
US6049341A (en) * | 1997-10-20 | 2000-04-11 | Microsoft Corporation | Edge cycle collision detection in graphics environment |
US6081278A (en) * | 1998-06-11 | 2000-06-27 | Chen; Shenchang Eric | Animation object having multiple resolution format |
US6088731A (en) * | 1998-04-24 | 2000-07-11 | Associative Computing, Inc. | Intelligent assistant for use with a local computer and with the internet |
US6121981A (en) * | 1997-05-19 | 2000-09-19 | Microsoft Corporation | Method and system for generating arbitrary-shaped animation in the user interface of a computer |
US6188788B1 (en) * | 1997-12-09 | 2001-02-13 | Texas Instruments Incorporated | Automatic color saturation control in video decoder using recursive algorithm |
US6201948B1 (en) * | 1996-05-22 | 2001-03-13 | Netsage Corporation | Agent based instruction system and method |
US6208357B1 (en) * | 1998-04-14 | 2001-03-27 | Avid Technology, Inc. | Method and apparatus for creating and animating characters having associated behavior |
US6215498B1 (en) * | 1998-09-10 | 2001-04-10 | Lionhearth Technologies, Inc. | Virtual command post |
US6216044B1 (en) * | 1993-03-16 | 2001-04-10 | Ep Technologies, Inc. | Medical device with three dimensional collapsible basket structure |
US6221066B1 (en) * | 1999-03-09 | 2001-04-24 | Micrus Corporation | Shape memory segmented detachable coil |
US6244610B1 (en) * | 1996-10-28 | 2001-06-12 | Klaus Kramer-Massow | Two wheeled vehicle, especially a bicycle |
US6249293B1 (en) * | 1994-09-05 | 2001-06-19 | Fujitsu Limited | Virtual world animation using status and response for interference and time schedule |
US6277126B1 (en) * | 1998-10-05 | 2001-08-21 | Cordis Neurovascular Inc. | Heated vascular occlusion coil development system |
US6280457B1 (en) * | 1999-06-04 | 2001-08-28 | Scimed Life Systems, Inc. | Polymer covered vaso-occlusive devices and methods of producing such devices |
US6285380B1 (en) * | 1994-08-02 | 2001-09-04 | New York University | Method and system for scripting interactive animated actors |
US6287318B1 (en) * | 1998-02-13 | 2001-09-11 | Target Therapeutics, Inc. | Vaso-occlusive device with attached polymeric materials |
US20010020943A1 (en) * | 2000-02-17 | 2001-09-13 | Toshiki Hijiri | Animation data compression apparatus, animation data compression method, network server, and program storage media |
US6349301B1 (en) * | 1998-02-24 | 2002-02-19 | Microsoft Corporation | Virtual environment bystander updating in client server architecture |
US6371972B1 (en) * | 1998-02-18 | 2002-04-16 | Target Therapeutics, Inc. | Vaso-occlusive member assembly with multiple detaching points |
US6377263B1 (en) * | 1997-07-07 | 2002-04-23 | Aesthetic Solutions | Intelligent software components for virtual worlds |
US6390371B1 (en) * | 1998-02-13 | 2002-05-21 | Micron Technology, Inc. | Method and system for displaying information uniformly on tethered and remote input devices |
US6397080B1 (en) * | 1998-06-05 | 2002-05-28 | Telefonaktiebolaget Lm Ericsson | Method and a device for use in a virtual environment |
US6409721B1 (en) * | 1998-02-19 | 2002-06-25 | Target Therapeutics, Inc. | Process for forming an occlusion in a body cavity |
US6414684B1 (en) * | 1996-04-25 | 2002-07-02 | Matsushita Electric Industrial Co., Ltd. | Method for communicating and generating computer graphics animation data, and recording media |
US6416541B2 (en) * | 1998-07-24 | 2002-07-09 | Micrus Corporation | Intravascular flow modifier and reinforcement device |
US6423085B1 (en) * | 1998-01-27 | 2002-07-23 | The Regents Of The University Of California | Biodegradable polymer coils for intraluminal implants |
US6425914B1 (en) * | 1997-08-29 | 2002-07-30 | Target Therapeutics, Inc. | Fast-detaching electrically insulated implant |
US6458127B1 (en) * | 1999-11-22 | 2002-10-01 | Csaba Truckai | Polymer embolic elements with metallic coatings for occlusion of vascular malformations |
US6468266B1 (en) * | 1997-08-29 | 2002-10-22 | Scimed Life Systems, Inc. | Fast detaching electrically isolated implant |
US20020154124A1 (en) * | 2001-02-22 | 2002-10-24 | Han Sang-Yong | System and method of enhanced computer user interaction |
US6475227B2 (en) * | 1997-12-24 | 2002-11-05 | Scimed Life Systems, Inc. | Vaso-occlusion apparatus having a mechanically expandable detachment joint and a method for using the apparatus |
US6478773B1 (en) * | 1998-12-21 | 2002-11-12 | Micrus Corporation | Apparatus for deployment of micro-coil using a catheter |
US6484684B2 (en) * | 2000-05-17 | 2002-11-26 | Man Nutzfahrzeuge Ag | Crankshaft rotational support arrangement for an internal combustion engine |
US6500149B2 (en) * | 1998-08-31 | 2002-12-31 | Deepak Gandhi | Apparatus for deployment of micro-coil using a catheter |
US6512520B1 (en) * | 1997-07-31 | 2003-01-28 | Matsushita Electric Industrial Co., Ltd. | Apparatus for and method of transmitting and receiving data streams representing 3-dimensional virtual space |
US6559845B1 (en) * | 1999-06-11 | 2003-05-06 | Pulse Entertainment | Three dimensional animation system and method |
US6922576B2 (en) * | 1998-06-19 | 2005-07-26 | Becton, Dickinson And Company | Micro optical sensor device |
Family Cites Families (5)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
KR920022121A (en) * | 1991-05-08 | 1992-12-19 | 소오 유끼 | Method for creating animation |
JP3361844B2 (en) * | 1992-12-22 | 2003-01-07 | 松下電器産業株式会社 | Image editing device |
IL119928A (en) * | 1996-12-29 | 2001-01-11 | Univ Ramot | Model-based view extrapolation for interactive virtual reality systems |
JPH10320589A (en) * | 1997-05-22 | 1998-12-04 | Matsushita Electric Ind Co Ltd | Three-dimensional graphics display device |
JP3338382B2 (en) | 1997-07-31 | 2002-10-28 | 松下電器産業株式会社 | Apparatus and method for transmitting and receiving a data stream representing a three-dimensional virtual space |
1999
- 1999-06-11 US US09/330,681 patent/US6559845B1/en not_active Expired - Lifetime
2000
- 2000-03-27 NZ NZ503625A patent/NZ503625A/en unknown
- 2000-03-28 SG SG200001755A patent/SG101932A1/en unknown
- 2000-03-29 BR BR0009614-8A patent/BR0009614A/en not_active IP Right Cessation
- 2000-03-30 TW TW089105957A patent/TW462032B/en not_active IP Right Cessation
- 2000-03-30 CA CA002303548A patent/CA2303548C/en not_active Expired - Lifetime
- 2000-03-31 MX MXPA00003211A patent/MXPA00003211A/en active IP Right Grant
- 2000-03-31 KR KR10-2000-0016830A patent/KR100496718B1/en not_active IP Right Cessation
- 2000-03-31 AU AU25186/00A patent/AU783271B2/en not_active Ceased
- 2000-03-31 JP JP2000136435A patent/JP3566909B2/en not_active Expired - Lifetime
- 2000-03-31 CN CNB001053442A patent/CN100342368C/en not_active Expired - Lifetime
- 2000-04-25 EP EP00108781A patent/EP1059614A2/en not_active Withdrawn
- 2000-06-09 EP EP00942714A patent/EP1221142A4/en not_active Withdrawn
- 2000-06-09 AU AU57299/00A patent/AU5729900A/en not_active Abandoned
- 2000-06-09 WO PCT/US2000/015817 patent/WO2000077742A1/en active Application Filing
2003
- 2003-02-03 US US10/357,672 patent/US20030137516A1/en not_active Abandoned
- 2003-10-07 JP JP2003348692A patent/JP2004013918A/en active Pending
2010
- 2010-03-19 JP JP2010064859A patent/JP4994470B2/en not_active Expired - Lifetime
Patent Citations (87)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US3174851A (en) * | 1961-12-01 | 1965-03-23 | William J Buehler | Nickel-base alloys |
US3351463A (en) * | 1965-08-20 | 1967-11-07 | Alexander G Rozner | High strength nickel-base alloys |
US3753700A (en) * | 1970-07-02 | 1973-08-21 | Raychem Corp | Heat recoverable alloy |
US4655771A (en) * | 1982-04-30 | 1987-04-07 | Shepherd Patents S.A. | Prosthesis comprising an expansible or contractile tubular body |
US4954126A (en) * | 1982-04-30 | 1990-09-04 | Shepherd Patents S.A. | Prosthesis comprising an expansible or contractile tubular body |
US4655771B1 (en) * | 1982-04-30 | 1996-09-10 | Medinvent Ams Sa | Prosthesis comprising an expansible or contractile tubular body |
US4954126B1 (en) * | 1982-04-30 | 1996-05-28 | Ams Med Invent S A | Prosthesis comprising an expansible or contractile tubular body |
US4865516A (en) * | 1987-05-13 | 1989-09-12 | Focke & Co., Gmbh | Rotatable elevating carrier for a palletizer |
US4850980A (en) * | 1987-12-04 | 1989-07-25 | Fisher Scientific Company | I.V. pump cassette |
US5290552A (en) * | 1988-05-02 | 1994-03-01 | Matrix Pharmaceutical, Inc./Project Hear | Surgical adhesive material |
US5176661A (en) * | 1988-09-06 | 1993-01-05 | Advanced Cardiovascular Systems, Inc. | Composite vascular catheter |
US4994069A (en) * | 1988-11-02 | 1991-02-19 | Target Therapeutics | Vaso-occlusion coil and method |
US5203772A (en) * | 1989-01-09 | 1993-04-20 | Pilot Cardiovascular Systems, Inc. | Steerable medical device |
US5622836A (en) * | 1989-05-24 | 1997-04-22 | The University Of Sydney | Monoclonal antibodies which recognize malignant cells from bladder carcinomas |
US5405379A (en) * | 1990-07-26 | 1995-04-11 | Lane; Rodney J. | Self expanding vascular endoprosthesis for aneurysms |
US5304194A (en) * | 1991-10-02 | 1994-04-19 | Target Therapeutics | Vasoocclusion coil with attached fibrous element(s) |
US5151105A (en) * | 1991-10-07 | 1992-09-29 | Kwan Gett Clifford | Collapsible vessel sleeve implant |
US5363861A (en) * | 1991-11-08 | 1994-11-15 | Ep Technologies, Inc. | Electrode tip assembly with variable resistance to bending |
US5304195A (en) * | 1991-12-12 | 1994-04-19 | Target Therapeutics, Inc. | Detachable pusher-vasoocclusive coil assembly with interlocking coupling |
US5234437A (en) * | 1991-12-12 | 1993-08-10 | Target Therapeutics, Inc. | Detachable pusher-vasoocclusion coil assembly with threaded coupling |
US5261916A (en) * | 1991-12-12 | 1993-11-16 | Target Therapeutics | Detachable pusher-vasoocclusive coil assembly with interlocking ball and keyway coupling |
US5696892A (en) * | 1992-07-10 | 1997-12-09 | The Walt Disney Company | Method and apparatus for providing animation in a three-dimensional computer generated virtual world using a succession of textures derived from temporally related source images |
US5250071A (en) * | 1992-09-22 | 1993-10-05 | Target Therapeutics, Inc. | Detachable embolic coil assembly using interlocking clasps and method of use |
US5312415A (en) * | 1992-09-22 | 1994-05-17 | Target Therapeutics, Inc. | Assembly for placement of embolic coils using frictional placement |
US5350397A (en) * | 1992-11-13 | 1994-09-27 | Target Therapeutics, Inc. | Axially detachable embolic coil assembly |
US5690666A (en) * | 1992-11-18 | 1997-11-25 | Target Therapeutics, Inc. | Ultrasoft embolism coils and process for using them |
US5718711A (en) * | 1992-11-18 | 1998-02-17 | Target Therapeutics, Inc. | Ultrasoft embolism devices and process for using them |
US6216044B1 (en) * | 1993-03-16 | 2001-04-10 | Ep Technologies, Inc. | Medical device with three dimensional collapsible basket structure |
US5562619A (en) * | 1993-08-19 | 1996-10-08 | Boston Scientific Corporation | Deflectable catheter |
US5423829A (en) * | 1993-11-03 | 1995-06-13 | Target Therapeutics, Inc. | Electrolytically severable joint for endovascular embolic devices |
US6285380B1 (en) * | 1994-08-02 | 2001-09-04 | New York University | Method and system for scripting interactive animated actors |
US6249293B1 (en) * | 1994-09-05 | 2001-06-19 | Fujitsu Limited | Virtual world animation using status and response for interference and time schedule |
US5649940A (en) * | 1994-09-28 | 1997-07-22 | Innovasive Devices, Inc. | Suture tensioning device |
US5639277A (en) * | 1995-04-28 | 1997-06-17 | Target Therapeutics, Inc. | Embolic coils with offset helical and twisted helical shapes |
US5624461A (en) * | 1995-06-06 | 1997-04-29 | Target Therapeutics, Inc. | Three dimensional in-filling vaso-occlusive coils |
US5713917A (en) * | 1995-10-30 | 1998-02-03 | Leonhardt; Howard J. | Apparatus and method for engrafting a blood vessel |
US5843118A (en) * | 1995-12-04 | 1998-12-01 | Target Therapeutics, Inc. | Fibered micro vaso-occlusive devices |
US5880731A (en) * | 1995-12-14 | 1999-03-09 | Microsoft Corporation | Use of avatars with automatic gesturing and bounded interaction in on-line chat session |
US5925038A (en) * | 1996-01-19 | 1999-07-20 | Ep Technologies, Inc. | Expandable-collapsible electrode structures for capacitive coupling to tissue |
US5893647A (en) * | 1996-03-15 | 1999-04-13 | Isel Co., Ltd. | Bearing retainer for a sliding mechanism for use in a machine tool |
US5792154A (en) * | 1996-04-10 | 1998-08-11 | Target Therapeutics, Inc. | Soft-ended fibered micro vaso-occlusive devices |
US5909218A (en) * | 1996-04-25 | 1999-06-01 | Matsushita Electric Industrial Co., Ltd. | Transmitter-receiver of three-dimensional skeleton structure motions and method thereof |
US6388670B2 (en) * | 1996-04-25 | 2002-05-14 | Matsushita Electric Industrial Co., Ltd. | Transmitter-receiver of three-dimensional skeleton structure motions and method thereof |
US20010007452A1 (en) * | 1996-04-25 | 2001-07-12 | Matsushita Electric Industrial Co., Ltd. | Transmitter-receiver of three-dimensional skeleton structure motions and method thereof |
US6414684B1 (en) * | 1996-04-25 | 2002-07-02 | Matsushita Electric Industrial Co., Ltd. | Method for communicating and generating computer graphics animation data, and recording media |
US6222560B1 (en) * | 1996-04-25 | 2001-04-24 | Matsushita Electric Industrial Co., Ltd. | Transmitter-receiver of three-dimensional skeleton structure motions and method thereof |
US5889951A (en) * | 1996-05-13 | 1999-03-30 | Viewpoint Corporation | Systems, methods, and computer program products for accessing, leasing, relocating, constructing and modifying internet sites within a multi-dimensional virtual reality environment |
US6201948B1 (en) * | 1996-05-22 | 2001-03-13 | Netsage Corporation | Agent based instruction system and method |
US5936633A (en) * | 1996-07-23 | 1999-08-10 | International Business Machines Corporation | Rendering method and apparatus, and method and apparatus for smoothing intensity-value |
US5974238A (en) * | 1996-08-07 | 1999-10-26 | Compaq Computer Corporation | Automatic data synchronization between a handheld and a host computer using pseudo cache including tags and logical data elements |
US5870683A (en) * | 1996-09-18 | 1999-02-09 | Nokia Mobile Phones Limited | Mobile station having method and apparatus for displaying user-selectable animation sequence |
US6244610B1 (en) * | 1996-10-28 | 2001-06-12 | Klaus Kramer-Massow | Two wheeled vehicle, especially a bicycle |
US5912675A (en) * | 1996-12-19 | 1999-06-15 | Avid Technology, Inc. | System and method using bounding volumes for assigning vertices of envelopes to skeleton elements in an animation system |
US5983190A (en) * | 1997-05-19 | 1999-11-09 | Microsoft Corporation | Client server animation system for managing interactive user interface characters |
US6121981A (en) * | 1997-05-19 | 2000-09-19 | Microsoft Corporation | Method and system for generating arbitrary-shaped animation in the user interface of a computer |
US6369821B2 (en) * | 1997-05-19 | 2002-04-09 | Microsoft Corporation | Method and system for synchronizing scripted animations |
US6377263B1 (en) * | 1997-07-07 | 2002-04-23 | Aesthetic Solutions | Intelligent software components for virtual worlds |
US6512520B1 (en) * | 1997-07-31 | 2003-01-28 | Matsushita Electric Industrial Co., Ltd. | Apparatus for and method of transmitting and receiving data streams representing 3-dimensional virtual space |
US6425914B1 (en) * | 1997-08-29 | 2002-07-30 | Target Therapeutics, Inc. | Fast-detaching electrically insulated implant |
US6468266B1 (en) * | 1997-08-29 | 2002-10-22 | Scimed Life Systems, Inc. | Fast detaching electrically isolated implant |
US5826597A (en) * | 1997-09-10 | 1998-10-27 | Chou; Chi-Hsiung | Hair band made with two differently colored pieces |
US6049341A (en) * | 1997-10-20 | 2000-04-11 | Microsoft Corporation | Edge cycle collision detection in graphics environment |
US6188788B1 (en) * | 1997-12-09 | 2001-02-13 | Texas Instruments Incorporated | Automatic color saturation control in video decoder using recursive algorithm |
US6475227B2 (en) * | 1997-12-24 | 2002-11-05 | Scimed Life Systems, Inc. | Vaso-occlusion apparatus having a mechanically expandable detachment joint and a method for using the apparatus |
US6423085B1 (en) * | 1998-01-27 | 2002-07-23 | The Regents Of The University Of California | Biodegradable polymer coils for intraluminal implants |
US6390371B1 (en) * | 1998-02-13 | 2002-05-21 | Micron Technology, Inc. | Method and system for displaying information uniformly on tethered and remote input devices |
US6287318B1 (en) * | 1998-02-13 | 2001-09-11 | Target Therapeutics, Inc. | Vaso-occlusive device with attached polymeric materials |
US6371972B1 (en) * | 1998-02-18 | 2002-04-16 | Target Therapeutics, Inc. | Vaso-occlusive member assembly with multiple detaching points |
US6409721B1 (en) * | 1998-02-19 | 2002-06-25 | Target Therapeutics, Inc. | Process for forming an occlusion in a body cavity |
US6349301B1 (en) * | 1998-02-24 | 2002-02-19 | Microsoft Corporation | Virtual environment bystander updating in client server architecture |
US6208357B1 (en) * | 1998-04-14 | 2001-03-27 | Avid Technology, Inc. | Method and apparatus for creating and animating characters having associated behavior |
US6088731A (en) * | 1998-04-24 | 2000-07-11 | Associative Computing, Inc. | Intelligent assistant for use with a local computer and with the internet |
US6397080B1 (en) * | 1998-06-05 | 2002-05-28 | Telefonaktiebolaget Lm Ericsson | Method and a device for use in a virtual environment |
US6081278A (en) * | 1998-06-11 | 2000-06-27 | Chen; Shenchang Eric | Animation object having multiple resolution format |
US6922576B2 (en) * | 1998-06-19 | 2005-07-26 | Becton, Dickinson And Company | Micro optical sensor device |
US6416541B2 (en) * | 1998-07-24 | 2002-07-09 | Micrus Corporation | Intravascular flow modifier and reinforcement device |
US6500149B2 (en) * | 1998-08-31 | 2002-12-31 | Deepak Gandhi | Apparatus for deployment of micro-coil using a catheter |
US6215498B1 (en) * | 1998-09-10 | 2001-04-10 | Lionhearth Technologies, Inc. | Virtual command post |
US6277126B1 (en) * | 1998-10-05 | 2001-08-21 | Cordis Neurovascular Inc. | Heated vascular occlusion coil development system |
US6478773B1 (en) * | 1998-12-21 | 2002-11-12 | Micrus Corporation | Apparatus for deployment of micro-coil using a catheter |
US6221066B1 (en) * | 1999-03-09 | 2001-04-24 | Micrus Corporation | Shape memory segmented detachable coil |
US6280457B1 (en) * | 1999-06-04 | 2001-08-28 | Scimed Life Systems, Inc. | Polymer covered vaso-occlusive devices and methods of producing such devices |
US6559845B1 (en) * | 1999-06-11 | 2003-05-06 | Pulse Entertainment | Three dimensional animation system and method |
US6458127B1 (en) * | 1999-11-22 | 2002-10-01 | Csaba Truckai | Polymer embolic elements with metallic coatings for occlusion of vascular malformations |
US20010020943A1 (en) * | 2000-02-17 | 2001-09-13 | Toshiki Hijiri | Animation data compression apparatus, animation data compression method, network server, and program storage media |
US6484684B2 (en) * | 2000-05-17 | 2002-11-26 | Man Nutzfahrzeuge Ag | Crankshaft rotational support arrangement for an internal combustion engine |
US20020154124A1 (en) * | 2001-02-22 | 2002-10-24 | Han Sang-Yong | System and method of enhanced computer user interaction |
Cited By (19)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US7827034B1 (en) * | 2002-11-27 | 2010-11-02 | Totalsynch, Llc | Text-derived speech animation tool |
US7437684B2 (en) * | 2003-04-28 | 2008-10-14 | Snecma | Graphical interface system for manipulating a virtual dummy |
US20040257338A1 (en) * | 2003-04-28 | 2004-12-23 | Snecma Moteurs | Graphical interface system |
US20050104886A1 (en) * | 2003-11-14 | 2005-05-19 | Sumita Rao | System and method for sequencing media objects |
US20100073382A1 (en) * | 2003-11-14 | 2010-03-25 | Kyocera Wireless Corp. | System and method for sequencing media objects |
US7593015B2 (en) * | 2003-11-14 | 2009-09-22 | Kyocera Wireless Corp. | System and method for sequencing media objects |
US20050190188A1 (en) * | 2004-01-30 | 2005-09-01 | Ntt Docomo, Inc. | Portable communication terminal and program |
US20060029913A1 (en) * | 2004-08-06 | 2006-02-09 | John Alfieri | Alphabet based choreography method and system |
US20060109274A1 (en) * | 2004-10-28 | 2006-05-25 | Accelerated Pictures, Llc | Client/server-based animation software, systems and methods |
US7433760B2 (en) | 2004-10-28 | 2008-10-07 | Accelerated Pictures, Inc. | Camera and animation controller, systems and methods |
US20080024615A1 (en) * | 2006-07-28 | 2008-01-31 | Accelerated Pictures, Inc. | Camera control |
US20080028312A1 (en) * | 2006-07-28 | 2008-01-31 | Accelerated Pictures, Inc. | Scene organization in computer-assisted filmmaking |
US7880770B2 (en) | 2006-07-28 | 2011-02-01 | Accelerated Pictures, Inc. | Camera control |
US20120028706A1 (en) * | 2010-02-24 | 2012-02-02 | Valve Corporation | Compositing multiple scene shots into a video game clip |
US20120028707A1 (en) * | 2010-02-24 | 2012-02-02 | Valve Corporation | Game animations with multi-dimensional video game data |
US9381429B2 (en) * | 2010-02-24 | 2016-07-05 | Valve Corporation | Compositing multiple scene shots into a video game clip |
US20120021827A1 (en) * | 2010-02-25 | 2012-01-26 | Valve Corporation | Multi-dimensional video game world data recorder |
CN102934145A (en) * | 2010-05-10 | 2013-02-13 | 史克威尔·艾尼克斯有限公司 | Image processing device, image processing method, and image processing program |
CN105321195A (en) * | 2015-07-02 | 2016-02-10 | 苏州蜗牛数字科技股份有限公司 | Making method for whip system in 3D game |
Also Published As
Publication number | Publication date |
---|---|
TW462032B (en) | 2001-11-01 |
BR0009614A (en) | 2002-07-23 |
MXPA00003211A (en) | 2002-03-08 |
SG101932A1 (en) | 2004-02-27 |
JP2001043399A (en) | 2001-02-16 |
EP1059614A2 (en) | 2000-12-13 |
AU783271B2 (en) | 2005-10-06 |
AU2518600A (en) | 2000-12-14 |
CN100342368C (en) | 2007-10-10 |
JP4994470B2 (en) | 2012-08-08 |
JP2010160813A (en) | 2010-07-22 |
NZ503625A (en) | 2002-04-26 |
KR20010006939A (en) | 2001-01-26 |
CA2303548A1 (en) | 2000-12-11 |
WO2000077742A1 (en) | 2000-12-21 |
CN1277392A (en) | 2000-12-20 |
EP1221142A4 (en) | 2005-11-16 |
AU5729900A (en) | 2001-01-02 |
CA2303548C (en) | 2006-06-27 |
KR100496718B1 (en) | 2005-06-23 |
JP3566909B2 (en) | 2004-09-15 |
JP2004013918A (en) | 2004-01-15 |
US6559845B1 (en) | 2003-05-06 |
EP1221142A1 (en) | 2002-07-10 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
US6559845B1 (en) | Three dimensional animation system and method | |
Perlin et al. | Improv: A system for scripting interactive actors in virtual worlds | |
US8717359B2 (en) | Script control for camera positioning in a scene generated by a computer rendering engine | |
Gillies et al. | Comparing and evaluating real time character engines for virtual environments | |
US20110016004A1 (en) | Interactive character system | |
US20090091563A1 (en) | Character animation framework | |
JP2000512039A (en) | Programmable computer graphic objects | |
CN109035373A (en) | The generation of three-dimensional special efficacy program file packet and three-dimensional special efficacy generation method and device | |
Egges et al. | Presence and interaction in mixed reality environments | |
Thalmann et al. | Digital actors for interactive television | |
Neff et al. | Layered performance animation with correlation maps | |
Egges et al. | An interactive mixed reality framework for virtual humans | |
US11527032B1 (en) | Systems and methods to generate and utilize content styles for animation | |
Nedel | Simulating virtual humans | |
Fries et al. | A tool for designing MPEG-4 compliant expressions and animations on VRML cartoon-faces | |
FR2874724A1 (en) | Three dimensional avatar e.g. humanoid, temporal animation process for e.g. game, involves interpolating intermediate posture of avatar between initial posture and final posture taking into account of difference relative to initial posture | |
Magnenat-Thalmann | Presence and interaction in mixed reality environments | |
Capin et al. | Virtual Human Representation and Communication in Networked Virtual Environments | |
Vacchi et al. | Neo euclide: A low-cost system for performance animation and puppetry | |
Zhou | Application of 3D facial animation techniques for Chinese opera | |
Ruttkay et al. | Cartoon talking heads |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
AS | Assignment | Owner name: PULSE ENTERTAINMENT, INC., CALIFORNIA; Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNORS:HARVILL, YOUNG;BEAN, RICHARD;REEL/FRAME:013750/0812; Effective date: 19990609 |
STCB | Information on status: application discontinuation | Free format text: ABANDONED -- FAILURE TO RESPOND TO AN OFFICE ACTION |
AS | Assignment | Owner name: LAASTRA TELECOM GMBH LLC, DELAWARE; Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNOR:PULSE ENTERTAINMENT, INC.;REEL/FRAME:023803/0086; Effective date: 20091215 |
AS | Assignment | Owner name: HANGER SOLUTIONS, LLC, GEORGIA; Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNOR:INTELLECTUAL VENTURES ASSETS 158 LLC;REEL/FRAME:051486/0425; Effective date: 20191206 |
AS | Assignment | Owner name: INTELLECTUAL VENTURES ASSETS 158 LLC, DELAWARE; Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNOR:CALLAHAN CELLULAR L.L.C.;REEL/FRAME:051727/0155; Effective date: 20191126 |