US20120309544A1 - Method for Synchronizing Character Object Information by Classified Data Type - Google Patents

Method for Synchronizing Character Object Information by Classified Data Type

Info

Publication number
US20120309544A1
US20120309544A1
Authority
US
United States
Prior art keywords
character
data
motion
character information
user
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Abandoned
Application number
US13/578,606
Inventor
Byong Soo Kang
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
BLUESIDE Inc
Original Assignee
BLUESIDE Inc
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by BLUESIDE Inc filed Critical BLUESIDE Inc
Assigned to BLUESIDE INC. reassignment BLUESIDE INC. ASSIGNMENT OF ASSIGNORS INTEREST (SEE DOCUMENT FOR DETAILS). Assignors: KANG, BYONG SOO
Publication of US20120309544A1 publication Critical patent/US20120309544A1/en
Abandoned legal-status Critical Current

Classifications

    • G - PHYSICS
    • G07 - CHECKING-DEVICES
    • G07F - COIN-FREED OR LIKE APPARATUS
    • G07F17/00 - Coin-freed apparatus for hiring articles; Coin-freed facilities or services
    • G07F17/32 - Coin-freed apparatus for hiring articles; Coin-freed facilities or services for games, toys, sports, or amusements
    • G07F17/326 - Game play aspects of gaming systems
    • G07F17/3272 - Games involving multiple players
    • G07F17/3281 - Games involving multiple players wherein game attributes are transferred between players, e.g. points, weapons, avatars
    • A - HUMAN NECESSITIES
    • A63 - SPORTS; GAMES; AMUSEMENTS
    • A63F - CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
    • A63F13/00 - Video games, i.e. games using an electronically generated display having two or more dimensions
    • A63F13/30 - Interconnection arrangements between game servers and game devices; Interconnection arrangements between game devices; Interconnection arrangements between game servers
    • A63F13/35 - Details of game servers
    • A63F13/358 - Adapting the game course according to the network or server load, e.g. for reducing latency due to different connection speeds between clients
    • A - HUMAN NECESSITIES
    • A63 - SPORTS; GAMES; AMUSEMENTS
    • A63F - CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
    • A63F13/00 - Video games, i.e. games using an electronically generated display having two or more dimensions
    • A63F13/55 - Controlling game characters or game objects based on the game progress
    • A63F13/56 - Computing the motion of game characters with respect to other game characters, game objects or elements of the game scene, e.g. for simulating the behaviour of a group of virtual soldiers or for path finding
    • G - PHYSICS
    • G07 - CHECKING-DEVICES
    • G07F - COIN-FREED OR LIKE APPARATUS
    • G07F17/00 - Coin-freed apparatus for hiring articles; Coin-freed facilities or services
    • G07F17/32 - Coin-freed apparatus for hiring articles; Coin-freed facilities or services for games, toys, sports, or amusements
    • G - PHYSICS
    • G07 - CHECKING-DEVICES
    • G07F - COIN-FREED OR LIKE APPARATUS
    • G07F17/00 - Coin-freed apparatus for hiring articles; Coin-freed facilities or services
    • G07F17/32 - Coin-freed apparatus for hiring articles; Coin-freed facilities or services for games, toys, sports, or amusements
    • G07F17/3225 - Data transfer within a gaming system, e.g. data sent between gaming machines and users
    • G07F17/3227 - Configuring a gaming machine, e.g. downloading personal settings, selecting working parameters
    • A - HUMAN NECESSITIES
    • A63 - SPORTS; GAMES; AMUSEMENTS
    • A63F - CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
    • A63F2300/00 - Features of games using an electronically generated display having two or more dimensions, e.g. on a television screen, showing representations related to the game
    • A63F2300/50 - Features of games using an electronically generated display having two or more dimensions, e.g. on a television screen, showing representations related to the game characterized by details of game servers
    • A63F2300/53 - Features of games using an electronically generated display having two or more dimensions, e.g. on a television screen, showing representations related to the game characterized by details of game servers details of basic data processing
    • A63F2300/534 - Features of games using an electronically generated display having two or more dimensions, e.g. on a television screen, showing representations related to the game characterized by details of game servers details of basic data processing for network load management, e.g. bandwidth optimization, latency reduction

Definitions

  • the present invention relates generally to synchronization of character information in on-line games, and more particularly to a method of synchronizing character information that provides coincidence of the character information between game users by classifying character information into data types.
  • the present invention is directed to providing a method of synchronizing character information between game users by classifying the data into position data and motion data.
  • the present invention is also directed to providing a method of synchronizing character information between game users by data classification, in which character information is generated, according to its classified data type, only when a character's position changes or a specific motion is carried out.
  • the present invention is also directed to providing a method of synchronizing character information between game users by data classification, in which position data are generated at predetermined intervals and positional changes are represented by comparing the just-prior position data with the latest position data.
  • a method of synchronizing character information by data classification in an on-line game system may comprise the steps of: transferring character information including position data, which is generated from a positional shift of a first user's character, or motion data, which is generated from a motional change of the character, to a second user; receiving the character information by the second user; determining a data type of the character information; and processing the character information with reference to the data type.
  • Transferring the character information may comprise: immediately transferring first position data to the second user if the character's position changes by more than a predetermined distance from a stop state; and transferring position data, generated after a predetermined interval from the just-prior position data that has been transferred, to the second user if the character continues to shift by more than the predetermined distance from its former position.
  • Processing the character information may comprise: shifting the character in position with reference to the first position data if the character is in a stop state; and, if the character is shifting in position, shifting the character by a shift distance calculated by comparing the just-prior position data with the latest position data received after the predetermined interval from the just-prior position data.
  • the character may be shifted in position by a frame shift distance, obtained by dividing the shift distance by the number of frames per predetermined interval, where the number of frames per interval is the product of the predetermined interval and the number of frames per second of the animation representing the character.
  • the character's motions included in the motion data may be assigned motion numbers that differ from one another, and the motion data is transferred and received in the form of these motion numbers.
  • Transferring the character information may comprise immediately transferring the motion data to the second user if the character changes in motion, and the processing may comprise immediately changing the character's motion in correspondence with the received motion data.
  • the motion data may include still motion data, operable only if the character is in a stop state, and shift motion data, operable even if the character is shifting in position.
  • the motion data may include still motion data, operable only if the character is in a stop state, and shift motion data, operable even if the character is shifting in position; and the processing may comprise changing the character's motion after shifting the character in correspondence with a comparison between the just-prior position data and the latest position data if the still motion data is received within the predetermined interval.
  • This method of synchronizing character information by data classification has advantages as follows.
  • the character information is subdivided into position and motion data, reducing the capacity required for individual information.
  • the character information is generated only when a character's position changes or a motion is carried out, wasting less character information and reducing the processing load on the system.
  • the position data are generated at the predetermined intervals, the just-prior position data are compared with the latest position data, and the character's animation frames are arranged according to the calculated ratio, enabling natural and smooth character motion.
  • FIG. 1 is a flow chart showing a process of synchronizing character information between users by data classification according to an embodiment of the present invention.
  • FIG. 2 shows a status of a character for which position data is generated, according to an embodiment of the present invention.
  • FIG. 3 is a schematic diagram showing a feature for changing a character's position, according to an embodiment of the present invention.
  • FIG. 4 is a schematic diagram showing a feature for processing still motion data, when the still motion data is received while a character's position is changing, according to an embodiment of the present invention.
  • FIG. 5 is a flow chart showing a general process of synchronizing character information by data classification according to an embodiment of the present invention.
  • the present invention provides a method for synchronizing character information between game users so that each user plays an on-line game on a coincident screen, in which the character information is transferred from a user to other users whenever there is a change in the positions or motions of the characters assigned to be operable by that user in the on-line game.
  • Character information according to the present invention contains position data, generated in response to positional changes of characters, or motion data, generated in response to various motional changes of characters.
  • position data are generated if characters are moved intentionally by users or shifted passively by other objects, and motion data are generated when characters take specific motions.
  • Classifying character information into position and motion data in this way reduces the capacity required for character information. And since data are generated only when an actual shift or motion occurs, the load on the system operating the on-line game is lessened.
  • FIG. 1 shows a process of synchronizing character information between users by data classification according to an embodiment of the present invention.
  • With reference to FIG. 1, exemplifying steps for performing an embodiment of the present invention, a process of synchronizing character information between first and second users will be detailed.
  • character information can be generated when a first user shifts his character or operates it to take a specific motion, or even when the first user's character is changed in position or motion by another user's character or an object in the game.
  • the character information may be one of position and motion data, which is transferred to the second user as soon as it is generated.
  • This process is step S1, transferring the position and motion data of the first user's character to the second user.
  • the transferred character information is received by the second user, which is step S2, accepting the character information at the second user.
  • the character information of the first user's character, received as such, is examined to determine whether it is position data or motion data. This is step S3, finding out the data type of the character information.
  • the character information is then processed in accordance with the data type determined in step S3; this is step S4.
  • FIG. 2 shows a status of a character for which position data is generated, according to an embodiment of the present invention.
  • FIG. 3 schematically illustrates a feature for changing a character's position, according to an embodiment of the present invention.
  • character C operated by the first user is moving toward a specific position.
  • Character information generated during this operation is position data.
  • position data may include coordinate, direction, and time information.
  • position data is not generated while character C is stopped. But if the character's position is identified as having changed by more than a predetermined distance from the stop state, first position data P is generated at that moment.
  • the predetermined distance is a value preliminarily set and input into the system and first position data P is promptly transferred to the second user.
  • position data are generated at a predetermined interval after first position data P has been generated. That is, even after the first position data has been generated, new position data can be created after each predetermined interval whenever the current position of character C is found to have shifted by more than the predetermined distance.
  • first position data P is generated at the time when character C shifts by 0.1 m from its stop state, and then immediately transferred to the second user.
  • a position of character C on the second user's screen is changed, through step S4, according to the information in, for example, first position data P. Even after that, if character C continues to shift, the just-prior position data is compared with the latest position data and the position of character C is further changed by the displacement calculated from the comparison.
  • frames picturing the animation of character C can be arranged at definite ratios within the predetermined interval, so that motions are represented in a natural and smooth form.
  • a frame rate per second means the number of pictures displayable in one second, expressed in fps (frames per second). The higher the fps value, the more smoothly a motion can be represented.
  • For example, if the frame rate is 20 fps, the predetermined interval is 0.2 seconds, and character C shifts by 4 m within the predetermined interval, then 4 frames (20 fps × 0.2 s) are used for the shift of character C within 0.2 seconds. Each frame can then be arranged every 1 m (4 m ÷ 4 frames), so the 4 m shift of character C can be expressed very naturally within 0.2 seconds.
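The frame arithmetic in this example (20 fps × 0.2 s = 4 frames per interval; 4 m ÷ 4 frames = 1 m per frame) can be sketched as a small helper. This is an illustrative reading of the patent's formula, not code from the disclosure; the function name and signature are assumptions.

```python
def frame_shift_distance(shift_m: float, fps: float, interval_s: float) -> float:
    """Per-frame shift distance: divide the shift that occurred over one
    interval by the number of animation frames in that interval, where the
    frame count is fps multiplied by the interval length."""
    frames_per_interval = round(fps * interval_s)
    return shift_m / frames_per_interval

# Figures from the example above: 20 fps, 0.2 s interval, 4 m shift.
assert round(20 * 0.2) == 4                       # 4 animation frames per interval
assert frame_shift_distance(4.0, 20, 0.2) == 1.0  # each frame advances 1 m
```

With the per-frame distance fixed, the receiving client can advance the character evenly on every rendered frame instead of snapping it to each arriving position update.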
  • character information between the first and second users can be synchronized by way of the steps of processing such generated position data.
  • FIG. 4 schematically illustrates a feature for processing still motion data, when the still motion data is received while changing a character's position, by an embodiment according to the present invention. Now an embodiment about the motion data will be described with reference to FIG. 4 .
  • the motion data is character information generated when character C, operated by the first user, performs a specific motion other than a shift. A variety of motions other than shifting are permissible to character C, and the character information for these is called motion data.
  • it could be inefficient, owing to the volume of the motion data, to transfer and receive the motion data as it is. For that reason, a motion number may be assigned to each motion of character C, corresponding to the motion data, and the motion data may then be transferred to the second user in the form of the motion number. Thus the amount of data is reduced and the motion data can be processed faster.
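The motion-number scheme just described might be sketched as follows. The particular motions and their numeric values are hypothetical; the patent says only that each motion gets a distinct number and that the number, rather than the full motion data, is transmitted.

```python
from enum import IntEnum

class Motion(IntEnum):
    # Hypothetical motion numbers; the patent assigns distinct numbers
    # to motions but does not specify the values.
    SIT = 1          # still motion: only valid while the character is stopped
    CONCENTRATE = 2  # still motion
    WAVE = 3         # shift motion: valid even while the character is moving

def encode(motion: Motion) -> int:
    # Transmit only the compact number instead of the full motion description.
    return int(motion)

def decode(number: int) -> Motion:
    # The receiver maps the number back to the locally known motion.
    return Motion(number)

assert decode(encode(Motion.SIT)) is Motion.SIT
```

Both clients hold the same motion table, so a single small integer on the wire suffices to reproduce the motion on the second user's screen.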
  • motion data may include still motion data, operable only when character C is in a stop state, and shift motion data, operable even when character C is shifting in position.
  • if character C can perform a motion while shifting, the motion data may be classified as shift motion data. Otherwise, if character C cannot take the motion while shifting, such as sitting on the ground or concentrating energy, the motion data may be classified as still motion data.
  • This motion data is transferred to the second user at step S1 as soon as character information is generated by a motional change of character C. And in step S4, the character information received by the second user is immediately applied.
  • if still motion data is generated while character C is shifting in position, the still motion data is processed after the position data has been processed. From FIG. 5, it can be seen that when still motion data is received within the predetermined interval, that is, the term over which position data is generated, the still motion data is processed after the position data received after it has been processed.
  • in other words, the positional change of character C is processed first, by comparing the position data from before and after the generation of the still motion data, and the still motion data is processed afterwards.
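A receiver-side sketch of this ordering, under assumed names (`Character`, `pending_motion`, and the method names are illustrative, not from the patent): still motion data that arrives while the character is mid-shift is held until the positional change completes, then applied.

```python
class Character:
    """Minimal receiver-side state for one remote character."""

    def __init__(self):
        self.shifting = False
        self.current_motion = None
        self.pending_motion = None

    def on_still_motion(self, motion):
        if self.shifting:
            # Still motion received mid-shift: defer it until the
            # positional change has been fully processed.
            self.pending_motion = motion
        else:
            self.current_motion = motion

    def on_shift_complete(self):
        self.shifting = False
        if self.pending_motion is not None:
            self.current_motion = self.pending_motion
            self.pending_motion = None

c = Character()
c.shifting = True
c.on_still_motion("sit")        # deferred while the character is moving
assert c.current_motion is None
c.on_shift_complete()           # motion applied after the shift finishes
assert c.current_motion == "sit"
```

Deferring the motion this way keeps the two users' screens consistent: the second user sees the character finish moving before it sits, just as on the first user's screen.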
  • FIG. 5 shows a general process of synchronizing character information by data classification according to an embodiment of the present invention.
  • character information is generated in response to a positional change or motion of a first user's character. Then, the character information is transferred to a second user and the second user receives the character information. This process corresponds to transfer step S 1 and reception step S 2 .
  • a step for identifying a type of the character information is carried out to determine whether the character information is position data or motion data.
  • if the character information is position data, it is further examined to determine whether or not it is the first position data.
  • if the character information is not position data, it is regarded as motion data. This motion data is then examined to determine whether it is still motion data or shift motion data, which corresponds to determination step S3.
  • if the character information is identified as the first position data, the character is changed in position accordingly. If it is not the first position data, the character is changed in position using the varied distance calculated by comparing the just-prior position data with the current position data.
  • if the character information is identified as shift motion data, the character is immediately changed in motion. If the character information is identified as still motion data, the processing time varies depending on whether or not the character is shifting in position.
  • if the character information is still motion data and the character is stopped, the character is immediately changed in motion. But if the character is shifting in position, the character is changed in motion after the positional shift is completed. This corresponds to processing step S4.
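The decision flow just described (FIG. 5) can be sketched as one dispatch function covering steps S3 and S4. The dictionary field names (`type`, `kind`, `pos`, `motion`) are assumptions made for illustration; the patent defines only the classification logic, not a wire format.

```python
def process(info: dict, state: dict) -> dict:
    # Step S3: determine the data type; step S4: process accordingly.
    if info["type"] == "position":
        if state.get("last_pos") is None:
            state["pos"] = info["pos"]                 # first position data
        else:
            # Subsequent data: shift by the difference between the
            # just-prior and the latest position data.
            state["pos"] += info["pos"] - state["last_pos"]
        state["last_pos"] = info["pos"]
        state["shifting"] = True
    elif info["kind"] == "shift" or not state.get("shifting"):
        state["motion"] = info["motion"]               # apply immediately
    else:
        state["pending_motion"] = info["motion"]       # wait for shift to end
    return state

s = {}
process({"type": "position", "pos": 1.0}, s)   # first position data
process({"type": "position", "pos": 5.0}, s)   # data after the interval
assert s["pos"] == 5.0                         # moved by the 4.0 difference
```

Positions are simplified to one dimension here; a real client would carry coordinates, direction, and time as the patent describes.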
  • the character information is synchronized between the first and second users through the aforementioned steps, providing coincident screens showing the character.

Abstract

A method of synchronizing character information by data classification in an on-line game system is disclosed, which includes the steps of: transferring character information including position data, which is generated from a positional shift of a first user's character, or motion data, which is generated from a motional change of the character, to a second user; receiving the character information by the second user; determining a data type of the character information; and processing the character information with reference to the data type.

Description

    TECHNICAL FIELD
  • The present invention relates generally to synchronization of character information in on-line games, and more particularly to a method of synchronizing character information that provides coincidence of the character information between game users by classifying character information into data types.
  • RELATED ART
  • With the recent growth of the on-line game market, the number of users is steadily increasing. As gaming functions evolve over time, games are scaling up and becoming more diverse. With these trends, the number of objects appearing in a game is also increasing, and various ways of processing such objects are being developed.
  • Especially in on-line games, many users possess their own characters assigned to them, and each user can manage and operate his character as desired. During this, if a user changes his character's information, such as position or motion, the other users should be coincidentally provided with the changed character information through their game screens.
  • There has been a conventional manner of synchronizing character information between users, in which character information is continuously transferred from a user to other users regardless of the user's operations on the characters. Therefore, the amount of data can grow excessively when many characters are present.
  • Additionally, since character information can be transferred to other users even though the characters' positions or actions have not changed, the conventional manner has a problem of unnecessarily wasting information other than valid data.
  • DETAILED DESCRIPTION OF THE INVENTION
  • Technical Subjects
  • Accordingly, the present invention is directed to providing a method of synchronizing character information between game users by classifying the data into position data and motion data.
  • The present invention is also directed to providing a method of synchronizing character information between game users by data classification, in which character information is generated, according to its classified data type, only when a character's position changes or a specific motion is carried out.
  • The present invention is also directed to providing a method of synchronizing character information between game users by data classification, in which position data are generated at predetermined intervals and positional changes are represented by comparing the just-prior position data with the latest position data.
  • The subjects of the present invention are not restricted to the aforementioned, and other subjects can be understood by those skilled in the art through the following description.
  • Means for Solving the Subjects
  • In an embodiment, a method of synchronizing character information by data classification in an on-line game system may comprise the steps of: transferring character information including position data, which is generated from a positional shift of a first user's character, or motion data, which is generated from a motional change of the character, to a second user; receiving the character information by the second user; determining a data type of the character information; and processing the character information with reference to the data type.
  • Transferring the character information may comprise: immediately transferring first position data to the second user if the character's position changes by more than a predetermined distance from a stop state; and transferring position data, generated after a predetermined interval from the just-prior position data that has been transferred, to the second user if the character continues to shift by more than the predetermined distance from its former position.
  • Processing the character information may comprise: shifting the character in position with reference to the first position data if the character is in a stop state; and, if the character is shifting in position, shifting the character by a shift distance calculated by comparing the just-prior position data with the latest position data received after the predetermined interval from the just-prior position data.
  • The character may be shifted in position by a frame shift distance, obtained by dividing the shift distance by the number of frames per predetermined interval, where the number of frames per interval is the product of the predetermined interval and the number of frames per second of the animation representing the character.
  • The character's motions included in the motion data may be assigned motion numbers that differ from one another, and the motion data is transferred and received in the form of these motion numbers.
  • Transferring the character information may comprise immediately transferring the motion data to the second user if the character changes in motion, and the processing may comprise immediately changing the character's motion in correspondence with the received motion data.
  • The motion data may include still motion data, operable only if the character is in a stop state, and shift motion data, operable even if the character is shifting in position.
  • The motion data may include still motion data, operable only if the character is in a stop state, and shift motion data, operable even if the character is shifting in position; and the processing may comprise changing the character's motion after shifting the character in correspondence with a comparison between the just-prior position data and the latest position data if the still motion data is received within the predetermined interval.
  • Effects of the Invention
  • This method of synchronizing character information by data classification has advantages as follows.
  • First, the character information is subdivided into position and motion data, reducing the capacity required for individual information.
  • Second, the character information is generated only when a character's position changes or a motion is carried out, wasting less character information and reducing the processing load on the system.
  • Third, lightening the total operation load on the system enables more character information to be processed at a time, so that a large number of characters can be operated simultaneously.
  • Fourth, natural and smooth character motion is available because the position data are generated at the predetermined intervals, the just-prior position data are compared with the latest position data, and the character's animation frames are arranged according to the calculated ratio.
  • Effects of the present invention are not restricted to the aforementioned, and other effects not described here can be understood definitely by those skilled in the art through the following description.
  • BRIEF DESCRIPTION OF THE DRAWINGS
  • FIG. 1 is a flow chart showing a process of synchronizing character information between users by data classification according to an embodiment of the present invention.
  • FIG. 2 shows a status of a character for which position data is generated, according to an embodiment of the present invention.
  • FIG. 3 is a schematic diagram showing a feature for changing a character's position, according to an embodiment of the present invention.
  • FIG. 4 is a schematic diagram showing a feature for processing still motion data, when the still motion data is received while a character's position is changing, according to an embodiment of the present invention.
  • FIG. 5 is a flow chart showing a general process of synchronizing character information by data classification according to an embodiment of the present invention.
  • THE BEST MODE FOR EMBODYING THE INVENTION
  • Hereinafter, preferred embodiments for realizing the objects of the present invention will be described with reference to the accompanying drawings. In this description, the same elements or components are referred to by the same names, numerals, or signs, and duplicate explanations are omitted.
  • The present invention provides a method for synchronizing character information between game users so that each user plays an on-line game on a coincident screen, in which the character information is transferred from a user to other users whenever there is a change in the positions or motions of the characters assigned to be operable by that user in the on-line game.
  • Character information according to the present invention contains position data, generated in response to positional changes of characters, or motion data, generated in response to various motional changes of characters. In other words, position data are generated if characters are moved intentionally by users or shifted passively by other objects, and motion data are generated when characters take specific motions.
  • Classifying character information into position and motion data in this way reduces the capacity required for character information. And since data are generated only when an actual shift or motion occurs, the load on the system operating the on-line game is lessened.
  • The details about position and motion data will be explained later.
  • FIG. 1 shows a process of synchronizing character information between users by data classification according to an embodiment of the present invention.
  • With reference to FIG. 1, exemplifying steps for performing an embodiment of the present invention, a process of synchronizing character information between first and second users will be detailed.
  • First, character information can be generated when a first user shifts his character or operates it to take a specific motion, or even when the first user's character is changed in position or motion by another user's character or an object in the game.
  • During this, the character information may be one of position and motion data, which is transferred to the second user as soon as it is generated.
  • This process is step S1 to transfer the position and motion data of the first user's character to the second user.
  • The transferred character information is received at the second user, which is step S2 to accept the character information by the second user.
  • The character information of the first user's character, received as such, is examined to determine whether it is position data or motion data. This is step S3, finding out the data type of the character information.
  • If the character information is position data, it is processed as position data. Otherwise, if the character information is motion data, it is processed as motion data. This is step S4, processing the character information in accordance with the data type determined in step S3.
  • After the aforementioned steps, matters changed from the first user's character are applied and displayed on the second user's screen. Thus, a status of the first user's character is coincident between the first and second users' screens.
  • Now will be described a feature of transferring and processing the position and motion data.
  • FIG. 2 shows a status of a character for which position data is generated by an embodiment according to the present invention, and FIG. 3 schematically illustrates a feature for changing a character's position by an embodiment according to the present invention.
  • Referring to FIG. 2, character C operated by the first user is moving toward a specific position. The character information generated during this operation is position data. In general, position data may include information about coordinates, direction and time.
  • In detail, position data is not generated while character C is stopped. But if the character's position is identified as having changed by more than a predetermined distance from the stop position, first position data P is then generated. Here, the predetermined distance is a value preliminarily set in the system, and first position data P is promptly transferred to the second user.
  • If character C continues to change its position, further position data are generated at a predetermined interval after first position data P has been generated. That is, even after the first position data has been generated, new position data is created at each predetermined interval whenever the current position of character C is found to have shifted by more than the predetermined distance.
  • For instance, referring to FIG. 3, in the case that the predetermined distance is set to 0.1 m and the predetermined interval is set to 0.2 seconds, first position data P is generated at the moment character C shifts by 0.1 m from its stop state, and is then immediately transferred to the second user.
  • Even after that, if character C is identified as being shifted, new position data is generated every 0.2 seconds and transferred to the second user.
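  • The sender-side generation rule described above can be sketched as follows. This is an illustrative Python sketch assuming a two-dimensional position; the class and method names are not from the patent, and the thresholds reuse the 0.1 m / 0.2 s values from the example:

```python
import math

MIN_DISTANCE = 0.1  # predetermined distance (metres), set in the system
INTERVAL = 0.2      # predetermined interval (seconds)

class PositionReporter:
    """Generates position data only on an actual positional shift:
    first position data P when the character moves beyond MIN_DISTANCE
    from its stop position, then at most once per INTERVAL thereafter."""

    def __init__(self, stop_x, stop_y):
        self.last_sent = (stop_x, stop_y)  # position at the stop state
        self.last_time = None              # None until first data is sent

    def maybe_generate(self, x, y, now):
        """Return a position to transmit (step S1), or None if no data
        should be generated at this moment."""
        moved = math.dist(self.last_sent, (x, y))
        if moved < MIN_DISTANCE:
            return None  # no position data while effectively at rest
        if self.last_time is None or now - self.last_time >= INTERVAL:
            self.last_sent, self.last_time = (x, y), now
            return (x, y)  # transferred to the second user immediately
        return None
```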
  • Consequently, when the second user receives the transferred position data, the position of character C on the second user's screen is changed, through step S4, according to the information in, for example, first position data P. Even after that, if character C continues to shift, the just-prior position data is compared to the latest position data and the position of character C is further changed along the displacement calculated from the comparison.
  • During this, in processing the shifts of character C over the predetermined intervals, the frames picturing the animation of character C can be arranged at regular ratios within each predetermined interval so as to represent the motion in a smooth and natural form.
  • Here, the frame rate means the number of pictures displayable in one second, expressed in fps (frames per second) as the unit. The higher the fps value, the more smoothly a motion can be represented.
  • For example, referring to FIG. 3, assuming that a frame rate per second is 20 fps, a predetermined interval is 0.2 seconds, and character C is shifted by 4 m in the predetermined interval, 4 frames are used for the shift of character C within 0.2 seconds.
  • During this, as character C is shifted by 4 m in total, the frames can be arranged 1 m apart. Therefore, it is possible to express the 4 m shift of character C smoothly within 0.2 seconds.
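  • The frame arithmetic in the example above (20 fps, a 0.2-second interval, a 4 m shift, hence 4 frames spaced 1 m apart) can be checked with a short sketch. The function name is an illustrative assumption:

```python
def frame_positions(shift_distance, interval, fps, start=0.0):
    """Divide one interval's shift into evenly spaced frame positions.

    frames per interval = fps * interval; per-frame step = distance / frames.
    """
    frames = round(fps * interval)   # e.g. 20 fps * 0.2 s = 4 frames
    step = shift_distance / frames   # e.g. 4 m / 4 frames = 1 m per frame
    return [start + step * (i + 1) for i in range(frames)]
```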
  • As described above, character information between the first and second users can be synchronized by way of the steps of processing such generated position data.
  • FIG. 4 schematically illustrates a feature for processing still motion data, when the still motion data is received while changing a character's position, by an embodiment according to the present invention. Now an embodiment about the motion data will be described with reference to FIG. 4.
  • The motion data is character information generated when character C, operated by the first user, performs a specific motion other than a shift. A variety of motions other than shifting are permitted to character C, and the character information describing them is called motion data.
  • Because there are many kinds of motion, transferring and receiving the motion data as-is could be burdensome due to its volume. For that reason, a motion number may be assigned to each motion of character C, corresponding to its motion data, and the motion data may then be transferred and received by the second user in the form of the motion number. Thus, the amount of data can be reduced and the motion data processed faster.
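  • The motion-number scheme above can be sketched as a lookup table shared by both clients. The motion names and numbers here are illustrative assumptions, not values defined by the patent:

```python
# Hypothetical motion table: each motion the character can take is
# assigned a compact motion number agreed on by both users' clients,
# so only the number needs to be transferred (step S1).
MOTION_NUMBERS = {
    "brandish_weapon": 1,
    "sit_on_ground": 2,
    "concentrate_energy": 3,
}
NUMBER_TO_MOTION = {n: m for m, n in MOTION_NUMBERS.items()}

def encode_motion(motion_name):
    """Sender side: reduce the motion to its number before transfer."""
    return MOTION_NUMBERS[motion_name]

def decode_motion(motion_number):
    """Receiver side: recover the motion from the received number."""
    return NUMBER_TO_MOTION[motion_number]
```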
  • In the meantime, motion data may include still motion data, operable only when character C is in a stop state, and shift motion data, operable even when character C is shifting in position.
  • For instance, if it is possible for character C to brandish a weapon while shifting, the motion data may be classified as shift motion data. Otherwise, if a motion, such as sitting on the ground or concentrating energy, cannot be taken while shifting, the motion data may be classified as still motion data.
  • This motion data is transferred to the second user at step S1 as soon as character information is generated by a motional change of character C. And in step S4, the character information received by the second user is immediately applied thereto.
  • For instance, if the first user operates character C to brandish a weapon, this motion is displayed on the second user's screen. That is, both shift and still motion data are transferred to the second user as soon as new character information is generated by a motion of character C.
  • Meanwhile, if still motion data is generated while character C is shifting in position, the still motion data is processed after the position data has been processed. From FIG. 4, it can be seen that still motion data is received within the predetermined interval, i.e., the period over which position data is generated, and that the still motion data is processed after the position data received subsequent to it has been processed.
  • In other words, if still motion data is generated while the position of character C is changing, the still motion data is processed after the positional change of character C, obtained by comparing the position data from before and after the generation of the still motion data, has been processed.
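  • The deferral rule above can be sketched on the receiver side as a small queue: still motion data arriving mid-shift is held back and applied only after the positional change spanning its arrival has been processed. The class and method names are illustrative assumptions:

```python
class CharacterView:
    """Receiver-side sketch: still motion data received while the
    character is shifting is deferred until the shift is processed."""

    def __init__(self):
        self.shifting = False
        self.pending = []   # still motions deferred during a shift
        self.applied = []   # order in which updates reach the screen

    def on_position_data(self, position):
        self.shifting = True
        self.applied.append(("position", position))
        # The position data received after the still motion is processed
        # first; the deferred still motion follows it.
        while self.pending:
            self.applied.append(("still_motion", self.pending.pop(0)))
            self.shifting = False  # the still motion brings C to a stop

    def on_still_motion(self, motion_number):
        if self.shifting:
            self.pending.append(motion_number)  # defer: mid-shift
        else:
            self.applied.append(("still_motion", motion_number))
```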
  • Having explained the transfer and processing of position and motion data, a series of processing steps according to an embodiment of the present invention will now be described.
  • FIG. 5 shows a general process of synchronizing character information by data classification according to an embodiment of the present invention.
  • First, character information is generated in response to a positional change or motion of a first user's character. Then, the character information is transferred to a second user and the second user receives the character information. This process corresponds to transfer step S1 and reception step S2.
  • Subsequently, a step for identifying a type of the character information is carried out to determine whether the character information is position data or motion data.
  • If the character information is position data, it is further determined whether this position data is the first position data or not.
  • If the character information is not position data, it is regarded as motion data. It is then determined whether this motion data is still motion data or shift motion data, which corresponds to determination step S3.
  • Afterward, each piece of character information continues to be processed. If the character information is identified as the first position data, the character is changed in position accordingly. If the character information is not the first position data, the character is changed in position using the shift distance calculated by comparing the just-prior position data with the current position data.
  • In addition, if the character information is identified as shift motion data, the character is immediately changed in motion. If the character information is identified as still motion data, a processing time varies depending on whether the character is shifting or not in position.
  • In the case that the character information is still motion data, if the character is at a stop, the character is immediately changed in motion. But in this case of still motion data, if the character is shifting in position, the character is changed in motion after the positional shift is completed. This corresponds to processing step S4.
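  • The branching of steps S3 and S4 described above can be summarized as a single dispatch function. The message shape and the returned labels are illustrative assumptions; an actual client would update the character on screen rather than return a label:

```python
def classify_and_process(info, is_first_position, character_shifting):
    """Sketch of steps S3-S4: decide how received character information
    is applied on the second user's side."""
    kind = info.get("type")
    if kind == "position":
        if is_first_position:
            return "move_to_first_position"
        # Otherwise move along the displacement obtained by comparing
        # the just-prior position data with the current position data.
        return "move_by_compared_displacement"
    if kind == "shift_motion":
        return "apply_motion_immediately"
    if kind == "still_motion":
        if character_shifting:
            return "apply_motion_after_shift_completes"
        return "apply_motion_immediately"
    raise ValueError("unknown character information type")
```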
  • As a result, the character information is synchronized between the first and second users through the aforementioned steps, providing a consistent view of the character on both screens.
  • The foregoing is illustrative of exemplary embodiments and is not to be construed as limiting thereof. Although a few exemplary embodiments have been described, those skilled in the art will readily appreciate that many modifications are possible in exemplary embodiments without materially departing from the novel teachings and advantages. Accordingly, all such modifications are intended to be included within the scope of this invention as defined in the claims.

Claims (9)

1. A method of synchronizing character information by data classification in an on-line game system, the method comprising:
transferring character information including position data, which is generated from a positional shift of a first user's character, or motion data, which is generated from a motional change of the character, to a second user;
receiving the character information by the second user;
determining a data type of the character information; and
processing the character information with reference to the data type.
2. The method according to claim 1, wherein transferring the character information includes:
immediately transferring first position data to the second user if the character is shifted over a predetermined distance in position from a stop state; and
transferring position data, which is generated after a predetermined interval from just-prior position data that has been transferred, to the second user if the character is continuously shifted over the predetermined distance in position from a former state.
3. The method according to claim 2, wherein processing the character information includes:
changing the character in position with reference to the first position data if the character is in a stop state; and
changing the character in position by calculating a shift distance obtained from comparing the just-prior position data with the latest position data received after the predetermined interval from the just-prior position data if the character is shifting in position.
4. The method according to claim 3, wherein the character is shifted in position by a frame shift distance obtained by dividing the shift distance by the number of frames per predetermined interval, which is obtained by multiplying the predetermined interval by the number of frames per second of an animation representing the character.
5. The method according to claim 1, wherein the character's motions included in the motion data are provided with motion numbers different from each other, and the motion data is transferred and received in the form of the motion numbers.
6. The method according to claim 1, wherein transferring the character information includes immediately transferring the motion data to the second user if the character is changed in motion,
wherein processing the character information includes immediately changing the character in motion in correspondence with the received motion data.
7. The method according to claim 1, wherein the motion data includes still motion data operable only if the character is in a stop state, and shift motion data operable even if the character is shifting in position.
8. The method according to claim 3, wherein the motion data includes still motion data operable only if the character is in a stop state, and shift motion data operable even if the character is shifting in position,
wherein processing the character information includes changing the character in motion after shifting the character in correspondence with a comparison between the just-prior position data and the latest position data if the still motion data is received within the predetermined interval.
9. A program-embedded computer-readable recording medium operating the method as in one of claims 1-8.
US13/578,606 2010-02-12 2011-02-08 Method for Synchronizing Character Object Information by Classified Data Type Abandoned US20120309544A1 (en)

Applications Claiming Priority (3)

Application Number Priority Date Filing Date Title
KR1020100013526A KR101099519B1 (en) 2010-02-12 2010-02-12 Method for Synchronizing Character Object Information by Classified Data Type
KR10-2010-0013526 2010-02-12
PCT/KR2011/000791 WO2011099731A2 (en) 2010-02-12 2011-02-08 Method for synchronising character information according to data-type classification

Publications (1)

Publication Number Publication Date
US20120309544A1 true US20120309544A1 (en) 2012-12-06

Family

ID=44368261

Family Applications (1)

Application Number Title Priority Date Filing Date
US13/578,606 Abandoned US20120309544A1 (en) 2010-02-12 2011-02-08 Method for Synchronizing Character Object Information by Classified Data Type

Country Status (5)

Country Link
US (1) US20120309544A1 (en)
EP (1) EP2535860A4 (en)
JP (2) JP2013519427A (en)
KR (1) KR101099519B1 (en)
WO (1) WO2011099731A2 (en)

Families Citing this family (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
KR101528491B1 (en) * 2013-01-22 2015-06-15 주식회사 넥슨코리아 Method and system for synchronizing game object infomation between local client and remote client based on game sever
KR101488698B1 (en) * 2013-01-22 2015-02-05 주식회사 넥슨코리아 Method and system for synchronizing game object infomation between local client and remote client based on game sever
KR101488653B1 (en) * 2013-01-22 2015-02-05 주식회사 넥슨코리아 Method and system for synchronizing game object infomation between local client and remote client
KR101670170B1 (en) 2015-04-21 2016-10-27 양원준 User terminal and method for estimating location on multi-player online game using there

Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US6801930B1 (en) * 2000-02-26 2004-10-05 Quazal Technologies Inc. Method and apparatus for maintaining information about users sharing the cells partitioning a computer-generated environment
US20070060233A1 (en) * 2005-08-26 2007-03-15 Liccardo Anthony J Video Game System and Method with Trade Based Characters, Weapons and Fight Environments
US20070078003A1 (en) * 2005-10-04 2007-04-05 Nintendo Co., Ltd. Communication game program and communication game system
US20080005172A1 (en) * 2006-06-30 2008-01-03 Robert Gutmann Dead reckoning in a gaming environment
US8508469B1 (en) * 1995-12-01 2013-08-13 Immersion Corporation Networked applications including haptic feedback

Family Cites Families (13)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US5659691A (en) * 1993-09-23 1997-08-19 Virtual Universe Corporation Virtual reality network with selective distribution and updating of data to reduce bandwidth requirements
CA2180899A1 (en) * 1995-07-12 1997-01-13 Yasuaki Honda Synchronous updating of sub objects in a three dimensional virtual reality space sharing system and method therefore
US6826523B1 (en) * 2000-11-01 2004-11-30 Sony Computer Entertainment America Inc. Application development interface for multi-user applications executable over communication networks
JP3679350B2 (en) * 2001-05-28 2005-08-03 株式会社ナムコ Program, information storage medium and computer system
JP2004005044A (en) * 2002-05-30 2004-01-08 Sony Corp Information transmitter-receiver, information transmitting device and method, information receiving device and method, information processing device and method, information transmitting management device and method, information receiving management device and method, storage and program
KR20020073313A (en) * 2002-06-12 2002-09-23 조두금 Method and apparatus for producing avatar on terminal background screen and community communications method and system using the same, and method of performing games using avatar
JP4047874B2 (en) * 2005-06-28 2008-02-13 株式会社コナミデジタルエンタテインメント GAME SYSTEM, ITS CONTROL METHOD, GAME DEVICE, AND PROGRAM
JP2007206755A (en) * 2006-01-31 2007-08-16 Sony Computer Entertainment Inc Information processing system, information processor, information processing program, and computer-readable storage medium storing information processing program
JP5196729B2 (en) * 2006-04-11 2013-05-15 任天堂株式会社 Communication game system
JP2007301042A (en) * 2006-05-09 2007-11-22 Aruze Corp Server and game system
KR100841925B1 (en) * 2006-09-27 2008-06-30 한국전자통신연구원 System andd method for making digital character motion
JP2009070076A (en) * 2007-09-12 2009-04-02 Namco Bandai Games Inc Program, information storage medium, and image generation device
JP2009247555A (en) * 2008-04-04 2009-10-29 Namco Bandai Games Inc Image generating system, program, and information storage medium

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20190015741A1 (en) * 2017-07-14 2019-01-17 GungHo Online Entertainment, Inc. Server device, program, and method
US10537793B2 (en) * 2017-07-14 2020-01-21 GungHo Online Entertainment, Inc. Server device, program, and method

Also Published As

Publication number Publication date
WO2011099731A3 (en) 2011-12-15
JP2013519427A (en) 2013-05-30
EP2535860A2 (en) 2012-12-19
KR20110093466A (en) 2011-08-18
EP2535860A4 (en) 2014-12-17
JP2015180326A (en) 2015-10-15
KR101099519B1 (en) 2011-12-28
WO2011099731A2 (en) 2011-08-18

Legal Events

Date Code Title Description
AS Assignment

Owner name: BLUESIDE INC., KOREA, REPUBLIC OF

Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNOR:KANG, BYONG SOO;REEL/FRAME:028880/0095

Effective date: 20120809

STCB Information on status: application discontinuation

Free format text: ABANDONED -- FAILURE TO RESPOND TO AN OFFICE ACTION