US20110143632A1 - Figure interactive systems and methods - Google Patents
- Publication number
- US20110143632A1 (Application No. US 12/782,733)
- Authority
- US
- United States
- Prior art keywords
- interactive
- instruction set
- data corresponding
- scenario data
- interactive instruction
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Abandoned
Classifications
-
- A—HUMAN NECESSITIES
- A63—SPORTS; GAMES; AMUSEMENTS
- A63H—TOYS, e.g. TOPS, DOLLS, HOOPS OR BUILDING BLOCKS
- A63H3/00—Dolls
- A63H3/28—Arrangements of sound-producing means in dolls; Means in dolls for producing sounds
-
- A—HUMAN NECESSITIES
- A63—SPORTS; GAMES; AMUSEMENTS
- A63H—TOYS, e.g. TOPS, DOLLS, HOOPS OR BUILDING BLOCKS
- A63H2200/00—Computerized interactive toys, e.g. dolls
Definitions
- the disclosure relates generally to figure interactive systems and methods, and more particularly, to systems and methods that detect a plurality of figures and dynamically generate an interactive instruction set for the detected figures.
- Figures or dolls are popular items.
- In addition to static figures, electronic figures have been developed. Electronic figures can be manipulated by electronic signals to increase their applications.
- a figure device or an electronic figure supporting multiple instant communication software programs can connect to a personal computer, such that notifications can be performed when new messages or new email messages are received, or when the statuses of friends in the instant communication software become on-line.
- required functions of an electronic rabbit figure can be set via a computer, and a server can transmit related data, such as weather forecasts or headline news, to the electronic rabbit figure, so that the data is displayed via the electronic rabbit figure.
- An embodiment of a figure interactive system includes at least a base device.
- the base device includes a storage unit, a detecting unit, and a processing unit.
- the storage unit stores a content database.
- the detecting unit respectively detects identification data of at least a first figure and a second figure.
- the processing unit respectively retrieves scenario data corresponding to the first figure and the second figure from the content database, dynamically generates an interactive instruction set for the first figure and the second figure according to the scenario data corresponding to the first figure and the second figure, and enables the first figure and the second figure to interact with each other according to the interactive instruction set.
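As a concrete illustration of this retrieval step, the content database can be modeled as a simple mapping from a figure's identification data to its recorded scenario data. The following is only a minimal sketch under assumed names; the dictionary layout, fields, and `retrieve_scenario_data` function are illustrative, not the patent's actual format.

```python
# A minimal sketch of a content database keyed by figure identification
# data. The layout and the field names are illustrative assumptions only.
CONTENT_DB = {
    "F1": {"dialogues": ["Hello!"], "actions": ["swing"], "music": []},
    "F2": {"dialogues": ["Hi there!"], "actions": ["rotate"], "music": []},
}

def retrieve_scenario_data(figure_id, db=CONTENT_DB):
    """Return the scenario data recorded for one detected figure."""
    if figure_id not in db:
        raise KeyError(f"no scenario data for figure {figure_id!r}")
    return db[figure_id]
```

With such a mapping, the processing unit's "respective retrieval" for two detected figures reduces to two lookups keyed by their identification data.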
- a figure interactive system includes at least a first figure and a second figure, a base device, and an electronic device.
- the base device includes at least a detecting unit, and a first communication unit, wherein the detecting unit respectively detects identification data of the first figure and the second figure.
- the electronic device at least includes a second communication unit which can communicate with the first communication unit via a communication connection, a storage unit storing a content database, and a processing unit which respectively retrieves scenario data corresponding to the first figure and the second figure from the content database.
- the processing unit dynamically generates an interactive instruction set for the first figure and the second figure according to the scenario data corresponding to the first figure and the second figure, and transmits the interactive instruction set to the base device.
- the base device enables the first figure and the second figure to interact with each other according to the interactive instruction set.
- identification data of at least a first figure and a second figure is respectively detected.
- scenario data corresponding to the first figure and the second figure are respectively retrieved from a content database.
- the interactive instruction set for the first figure and the second figure is dynamically generated according to the scenario data corresponding to the first figure and the second figure, and the first figure and the second figure are enabled to interact with each other according to the interactive instruction set.
- the identification data of the first figure and the second figure can be transmitted to a server via a network.
- the server can respectively retrieve the scenario data corresponding to the first figure and the second figure according to the identification data of the first figure and the second figure, and transmit the scenario data corresponding to the first figure and the second figure to the base device as the content database via the network.
- renewed scenario data corresponding to the first figure and the second figure can also be received via the network, and stored to the content database.
- the first figure can be selected as the starting figure, and first interactive data for a specific topic is retrieved for the first figure from the scenario data corresponding to the first figure. Then, second interactive data corresponding to the first interactive data for the specific topic is retrieved for the second figure from the scenario data corresponding to the second figure. The first interactive data and the second interactive data can be added to the interactive instruction set.
- the selected figure, the specific topic, and/or the first interactive data can be randomly selected, or selected according to a specific order.
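The two selection options named above (random choice versus a specific order) might be sketched as follows. The function name, arguments, and data layout here are assumptions made for illustration only.

```python
import random

def choose_start(figures, topics, scenario_data, ordered=False, rng=None):
    """Pick the starting figure, the specific topic, and the first
    interactive data, either in a fixed order or at random, the two
    options the text allows."""
    rng = rng or random.Random()
    if ordered:
        figure, topic = figures[0], topics[0]
    else:
        figure, topic = rng.choice(figures), rng.choice(topics)
    # take the first interactive data recorded for that figure and topic
    first_data = scenario_data[figure][topic][0]
    return figure, topic, first_data
```

Passing a seeded `random.Random` makes the "random" path reproducible for testing, while `ordered=True` realizes the "specific order" option.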
- Figure interactive methods may take the form of a program code embodied in tangible media.
- When the program code is loaded into and executed by a machine, the machine becomes an apparatus for practicing the disclosed method.
- FIG. 1A is a schematic diagram illustrating an embodiment of a figure interactive system of the invention
- FIG. 1B is a schematic diagram illustrating another embodiment of a figure interactive system of the invention.
- FIG. 2 is a schematic diagram illustrating an embodiment of the structure of a server of the invention
- FIG. 3 is a schematic diagram illustrating an embodiment of the structure of a base device of the invention.
- FIG. 4 is a flowchart of an embodiment of a figure interactive method of the invention.
- FIG. 5 is a flowchart of an embodiment of a method for generating an interactive instruction set of the invention.
- FIG. 1A is a schematic diagram illustrating an embodiment of a figure interactive system of the invention.
- the structure of the figure interactive system comprises a server 1000 and a base device 2000.
- the base device 2000 can simultaneously detect identification data of a plurality of figures (such as F1 and F2), and connect to the server 1000 via a network 3000. It is noted that only two figures are disclosed in this embodiment; however, the invention is not limited thereto.
- FIG. 2 is a schematic diagram illustrating an embodiment of the structure of a server of the invention.
- the server 1000 may be a processor-based electronic device, such as a general-purpose computer, a personal computer, a notebook, or a workstation.
- the server 1000 at least comprises a scenario database 1100.
- the scenario database 1100 can comprise scenario data corresponding to a plurality of figures respectively. It is understood that, in some embodiments, the scenario data can comprise dialogues, images, sound effects, music, light signals, and/or actions of the figures, such as swinging, vibrating, rotating, beating and movements, among others.
- FIG. 3 is a schematic diagram illustrating an embodiment of the structure of a base device of the invention.
- the base device 2000 can comprise a detecting unit 2100, a storage unit 2200, and a processing unit 2300.
- the identification data of the figure can be detected by an RFID (Radio-Frequency Identification), an IR (Infrared) communication recognition system, a USB (Universal Serial Bus) wired/wireless communication recognition system, a 2-dimension/3-dimension barcode recognition system, recognition software and related communication interfaces, or other recognition systems/manners.
- when several figures are placed on or close to the base device 2000, the detecting unit 2100 can simultaneously detect the identification data of the figures.
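A sketch of this detection step, assuming an RFID-style reader whose read cycle returns raw tag payloads; the `FIG-` tag prefix and the function names are invented here for illustration, not any standard tag format.

```python
def detect_figures(read_cycle):
    """Simulate the detecting unit: one read cycle of the reader yields
    raw tag payloads; keep only well-formed figure identification data."""
    ids = []
    for payload in read_cycle():
        tag = payload.strip().upper()
        if tag.startswith("FIG-"):  # assumed tag format, illustrative only
            ids.append(tag)
    return ids
```

Because one read cycle can return several tags, placing two figures near the base device yields both identification data in a single call, matching the "simultaneous detection" described above.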
- the storage unit 2200 can at least comprise a content database 2210.
- the content database 2210 can store the scenario data (such as 2211 and 2212) corresponding to the respective figures. Similarly, in some embodiments, the scenario data can comprise dialogues, images, sound effects, music, light signals, and/or actions.
- the content database 2210 can further store an interactive instruction set 2220 corresponding to at least two figures. It is noted that the interactive instruction set 2220 can be dynamically generated according to the scenario data in the content database 2210. The generation and use of the interactive instruction set 2220 are discussed later.
- the processing unit 2300 performs the figure interactive method of the invention, which will be discussed further in the following paragraphs.
- FIG. 1B is a schematic diagram illustrating another embodiment of a figure interactive system of the invention.
- the figure interactive system comprises a server 1000 , a base device 2000 , and an electronic device 4000 .
- the base device 2000 can simultaneously detect the identification data of several figures, such as F1 and F2, and communicate with the electronic device 4000 via a communication connection.
- the electronic device 4000 couples to the server 1000 via a network 3000.
- the base device 2000 at least comprises the detecting unit 2100 in FIG. 3 and a first communication unit (not shown), and the electronic device 4000 at least comprises the storage unit 2200 and the processing unit 2300 in FIG. 3 and a second communication unit (not shown).
- the functions and features of the detecting unit 2100, the storage unit 2200, and the processing unit 2300 are similar to those disclosed in FIG. 3, and are omitted here.
- the electronic device 4000 may be a general purpose computer, a personal computer, a notebook, a netbook, a handheld computer, or a PDA (Personal Digital Assistant).
- the communication connection may be an RS232 connection, a USB communication connection, or an RFID communication connection.
- the first/second communication unit corresponding to the above communication connections may be an RS232 interface, a USB communication interface, or an RFID communication interface.
- FIG. 4 is a flowchart of an embodiment of a figure interactive method of the invention.
- the figure interactive method of the invention can enable multiple figures to interact with each other. It is understood that, in this embodiment, a first figure and a second figure are used for explanation, but the invention is not limited thereto.
- In step S4100, the identification data of the first figure and the second figure is respectively detected by the base device.
- when several figures are placed on or close to the base device, the detecting unit of the base device can simultaneously detect the identification data of the figures.
- In step S4200, the scenario data corresponding to the first figure and the second figure are respectively retrieved from a content database according to the identification data of the first figure and the second figure.
- the scenario data corresponding to the first figure and the second figure can be retrieved from the content database 2210 via the base device 2000 or the electronic device 4000.
- when the scenario data is retrieved via the electronic device 4000, the electronic device 4000 can transmit the scenario data corresponding to the first figure and the second figure to the base device 2000.
- In step S4300, the interactive instruction set for the first figure and the second figure is dynamically generated according to the scenario data corresponding to the first figure and the second figure, and in step S4400, the first figure and the second figure are enabled to interact with each other according to the interactive instruction set.
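The four steps S4100 through S4400 can be read as one pipeline. The sketch below wires them together through callbacks; all four interfaces (`detect`, `content_db`, `generate`, `perform`) are assumptions for illustration, not the patent's API.

```python
def figure_interactive_method(detect, content_db, generate, perform):
    """End-to-end sketch of the method: detect identification data
    (S4100), retrieve each figure's scenario data (S4200), dynamically
    generate the interactive instruction set (S4300), then enable the
    interaction (S4400)."""
    figure_ids = detect()                                    # S4100
    scenario = {fid: content_db[fid] for fid in figure_ids}  # S4200
    instruction_set = generate(scenario)                     # S4300
    return perform(instruction_set)                         # S4400
```

Swapping in different callbacks covers both embodiments: `content_db` backed by local storage for the stand-alone base device, or by data fetched from the server via the network.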
- the identification data of the first figure and the second figure can be transmitted to the server via the network from the base device or the electronic device before step S 4200 .
- the server can respectively retrieve the corresponding scenario data from the scenario database according to the identification data of the first figure and the second figure, and transmits the scenario data corresponding to the first figure and the second figure to the base device or the electronic device, such that the scenario data corresponding to the first figure and the second figure is stored into the content database.
- each figure may have at least a drive component (not shown).
- the drive component can receive the part of the interactive instruction set relating to the figure from the base device, and execute commands according to the actions in the received interactive instruction set, such that the figure and/or at least one component of the figure can be accordingly driven to perform an operation.
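A drive component of this kind might filter out and execute its own figure's part of the instruction set as follows; the step format and the actuator-callback interface are illustrative assumptions.

```python
def part_for_figure(instruction_set, figure_id):
    """Return only the steps that address the given figure, i.e. the
    part of the interactive instruction set its drive component gets."""
    return [step for step in instruction_set if step["figure"] == figure_id]

def drive(steps, actuators):
    """Execute each step's action through the matching actuator callback
    and report which actions were actually performed."""
    performed = []
    for step in steps:
        action = step.get("action")
        if action in actuators:
            actuators[action]()
            performed.append(action)
    return performed
```

Actions with no matching actuator (say, a sound effect on a figure without a speaker) are simply skipped, which keeps the same instruction set usable across differently equipped figures.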
- the base device or the figure may comprise a display unit (not shown in FIG. 3) for displaying the dialogues (e.g., texts), symbols, animations, colors, and/or images in the interactive instruction set.
- the base device or the figure may comprise a speaker (not shown in FIG. 3) to play the dialogues by voices, music, and/or sound effects in the interactive instruction set.
- when the base device or the electronic device detects and recognizes the figures, it can immediately transmit the identification data of the figures to the server; the server then searches for, receives, or generates the scenario data corresponding to the figures, dynamically generates the interactive instruction set according to the scenario data, and transmits the interactive instruction set back to the base device, such that the first figure and the second figure can interact with each other according to the interactive instruction set.
- the base device or the electronic device can store the scenario data corresponding to the figures in advance.
- the scenario data corresponding to the figures can be directly retrieved from the content database of the storage unit in the base device or the electronic device, and the interactive instruction set can be dynamically generated according to the retrieved scenario data.
- the base device or the electronic device can periodically or randomly receive renewed scenario data corresponding to the figures via the network, and store the renewed scenario data to the content database.
- FIG. 5 is a flowchart of an embodiment of a method for generating an interactive instruction set of the invention.
- First, a specific topic is determined. It is understood that, in some embodiments, the specific topic may correspond to a different classification of scenario, such as a narrative, an emotional, or a combat scenario. In some embodiments, the specific topic can be determined randomly.
- Then, a figure is selected from the figures for interaction; the selected figure serves as the initial figure. Similarly, in some embodiments, the initial figure can be randomly selected from the figures.
- interactive data for the specific topic is retrieved for the initial figure from the scenario data corresponding to the initial figure. Similarly, in some embodiments, the interactive data for the initial figure can be randomly selected from the scenario data corresponding to the initial figure.
- the scenario data/interaction data can comprise dialogues, images, sound effects, and/or actions.
- In step S5400, interactive data for the specific topic, corresponding to the interactive data for the initial figure, is retrieved for another figure (called the associated figure) from the scenario data corresponding to the associated figure.
- the scenario data/interaction data can respectively define a tag. Relationships among the scenario data/interaction data can be established via the tags.
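A minimal sketch of such tag-based linking; the `tag` and `responds_to` field names are invented here for illustration, since the text only states that relationships among the data are established via tags.

```python
def linked_responses(entry, candidates):
    """Return the candidate interactive data whose declared tag links
    include the given entry's tag."""
    return [c for c in candidates if entry["tag"] in c.get("responds_to", ())]
```

In this scheme, retrieving the associated figure's reply in step S5400 amounts to scanning that figure's scenario data for entries linked to the tag of the data just added.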
- In step S5500, it is determined whether the generation of the interactive instruction set is complete. It is noted that the determination of whether the generation is complete may differ based on different requirements and applications.
- For example, the scenario data corresponding to the initial figure and/or the associated figure may define a terminal tag. When interactive data carrying the terminal tag is retrieved, the generation of the interactive instruction set is complete.
- In step S5600, interactive data for the specific topic, corresponding to the interactive data for the associated figure, is retrieved for the initial figure from the scenario data corresponding to the initial figure, and the procedure returns to step S5400.
- In step S5700, the interactive data corresponding to the initial figure and the associated figure is combined into the interactive instruction set.
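Putting these steps together, the alternating exchange of FIG. 5 might be sketched as below. The data layout (`tag`, `responds_to`, `terminal` fields) and the deterministic "pick the first matching entry" rule are simplifying assumptions; as noted above, the selection can also be random.

```python
def generate_instruction_set(initial, associated, scenario, topic, max_rounds=10):
    """Alternate between the initial and associated figure: each turn
    retrieves interactive data linked (via tags) to the previous entry,
    and an entry marked terminal completes the generation."""
    order = [initial, associated]
    instruction_set = []
    prev_tag = None
    for turn in range(max_rounds):
        entries = scenario[order[turn % 2]][topic]
        entry = next(
            (e for e in entries
             if prev_tag is None or prev_tag in e.get("responds_to", ())),
            None,
        )
        if entry is None:  # no linked data left for this figure
            break
        instruction_set.append({"figure": order[turn % 2], **entry})
        if entry.get("terminal"):  # terminal tag: generation is complete
            break
        prev_tag = entry["tag"]
    return instruction_set
```

With two figures whose greeting entries are chained by tags, the loop produces a short back-and-forth script ending at the entry that carries the terminal tag.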
- FIG. 5 is used to generate the interactive instruction set for a specific topic.
- the specific topic need not be determined in advance, and the interactive instruction set corresponding to the respective figures can be generated directly according to the respective scenario data.
- the figure interactive systems and methods can dynamically generate an interactive instruction set for multiple figures, and enable the figures to interact with each other according to the interactive instruction set.
- operating flexibility of the figures is increased.
- Figure interactive methods may take the form of a program code (i.e., executable instructions) embodied in tangible media, such as floppy diskettes, CD-ROMs, hard drives, or any other machine-readable storage medium, wherein, when the program code is loaded into and executed by a machine, such as a computer, the machine thereby becomes an apparatus for practicing the methods.
- When implemented on a general-purpose processor, the program code combines with the processor to provide a unique apparatus that operates analogously to the application of specific logic circuits.
Abstract
Figure interactive systems and methods are provided. The system includes at least a base device. The base device includes a storage unit, a detecting unit, and a processing unit. The storage unit stores a content database recording scenario data corresponding to a plurality of figures. The detecting unit respectively detects identification data of at least a first figure and a second figure. The processing unit respectively retrieves the scenario data of the first figure and the second figure from the content database, dynamically generates an interactive instruction set for the first figure and the second figure, and enables the first figure and the second figure to interact according to the interactive instruction set.
Description
- This application claims priority of Taiwan Patent Application No. 098142210, filed on Dec. 10, 2009, the entirety of which is incorporated by reference herein.
- 1. Field of the Invention
- The disclosure relates generally to figure interactive systems and methods, and more particularly, to systems and methods that detect a plurality of figures and dynamically generate an interactive instruction set for the detected figures.
- 2. Description of the Related Art
- Figures or dolls are popular items. In addition to static figures, electronic figures have been developed. Electronic figures can be manipulated by electronic signals to increase applications thereof.
- As an example, a figure device or an electronic figure supporting multiple instant communication software programs can connect to a personal computer, such that notifications can be performed when new messages or new email messages are received, or when the statuses of friends in the instant communication software become on-line. For example, required functions of an electronic rabbit figure can be set via a computer, and a server can transmit related data, such as weather forecasts or headline news, to the electronic rabbit figure, so that the data is displayed via the electronic rabbit figure.
- Generally, conventional electronic figures can only receive fixed messages, and perform related operations according to the received messages. Some electronic figures can perform related operations, such as music playback and dancing based on predefined programs. However, since these programs are fixed and burned into the electronic figures, operating flexibility of the electronic figures is limited, thus hindering popularity among users and development of the electronic figures. Accordingly, with limited variability, users often quickly lose interest in the electronic figures. Currently, there is no technology to automatically detect a plurality of figures and dynamically generate interactive content (not the fixed programs/operations in conventional electronic figures) in the field.
- Figure interactive systems and methods are provided.
- An embodiment of a figure interactive system includes at least a base device. The base device includes a storage unit, a detecting unit, and a processing unit. The storage unit stores a content database. The detecting unit respectively detects identification data of at least a first figure and a second figure. The processing unit respectively retrieves scenario data corresponding to the first figure and the second figure from the content database, dynamically generates an interactive instruction set for the first figure and the second figure according to the scenario data corresponding to the first figure and the second figure, and enables the first figure and the second figure to interact with each other according to the interactive instruction set.
- Another embodiment of a figure interactive system includes at least a first figure and a second figure, a base device, and an electronic device. The base device includes at least a detecting unit, and a first communication unit, wherein the detecting unit respectively detects identification data of the first figure and the second figure. The electronic device at least includes a second communication unit which can communicate with the first communication unit via a communication connection, a storage unit storing a content database, and a processing unit which respectively retrieves scenario data corresponding to the first figure and the second figure from the content database. Also, the processing unit dynamically generates an interactive instruction set for the first figure and the second figure according to the scenario data corresponding to the first figure and the second figure, and transmits the interactive instruction set to the base device. The base device enables the first figure and the second figure to interact with each other according to the interactive instruction set.
- In an embodiment of a figure interactive method, identification data of at least a first figure and a second figure is respectively detected. Then, scenario data corresponding to the first figure and the second figure are respectively retrieved from a content database. The interactive instruction set for the first figure and the second figure is dynamically generated according to the scenario data corresponding to the first figure and the second figure, and the first figure and the second figure are enabled to interact with each other according to the interactive instruction set.
- In some embodiments, the identification data of the first figure and the second figure can be transmitted to a server via a network. The server can respectively retrieve the scenario data corresponding to the first figure and the second figure according to the identification data of the first figure and the second figure, and transmit the scenario data corresponding to the first figure and the second figure to the base device as the content database via the network. In some embodiments, renewed scenario data corresponding to the first figure and the second figure can also be received via the network, and stored to the content database.
- In some embodiments, while the interactive instruction set is generated, the first figure can be selected as the starting figure, and first interactive data for a specific topic is retrieved for the first figure from the scenario data corresponding to the first figure. Then, second interactive data corresponding to the first interactive data for the specific topic is retrieved for the second figure from the scenario data corresponding to the second figure. The first interactive data and the second interactive data can be added to the interactive instruction set. In some embodiments, the selected figure, the specific topic, and/or the first interactive data can be randomly selected, or selected according to a specific order.
- Figure interactive methods may take the form of a program code embodied in tangible media. When the program code is loaded into and executed by a machine, the machine becomes an apparatus for practicing the disclosed method.
- The invention will become more fully understood by referring to the following detailed description with reference to the accompanying drawings, wherein:
- FIG. 1A is a schematic diagram illustrating an embodiment of a figure interactive system of the invention;
- FIG. 1B is a schematic diagram illustrating another embodiment of a figure interactive system of the invention;
- FIG. 2 is a schematic diagram illustrating an embodiment of the structure of a server of the invention;
- FIG. 3 is a schematic diagram illustrating an embodiment of the structure of a base device of the invention;
- FIG. 4 is a flowchart of an embodiment of a figure interactive method of the invention; and
- FIG. 5 is a flowchart of an embodiment of a method for generating an interactive instruction set of the invention.
- Figure interactive systems and methods are provided.
- FIG. 1A is a schematic diagram illustrating an embodiment of a figure interactive system of the invention.
- The structure of the figure interactive system comprises a server 1000 and a base device 2000. The base device 2000 can simultaneously detect identification data of a plurality of figures (such as F1 and F2), and connect to the server 1000 via a network 3000. It is noted that only two figures are disclosed in this embodiment; however, the invention is not limited thereto.
- FIG. 2 is a schematic diagram illustrating an embodiment of the structure of a server of the invention.
- The server 1000 may be a processor-based electronic device, such as a general-purpose computer, a personal computer, a notebook, or a workstation. The server 1000 at least comprises a scenario database 1100. The scenario database 1100 can comprise scenario data corresponding to a plurality of figures respectively. It is understood that, in some embodiments, the scenario data can comprise dialogues, images, sound effects, music, light signals, and/or actions of the figures, such as swinging, vibrating, rotating, beating and movements, among others.
- FIG. 3 is a schematic diagram illustrating an embodiment of the structure of a base device of the invention.
- The base device 2000 can comprise a detecting unit 2100, a storage unit 2200, and a processing unit 2300. It is understood that, in some embodiments, the identification data of the figure can be detected by an RFID (Radio-Frequency Identification) system, an IR (Infrared) communication recognition system, a USB (Universal Serial Bus) wired/wireless communication recognition system, a 2-dimension/3-dimension barcode recognition system, recognition software and related communication interfaces, or other recognition systems or manners. When several figures are placed on or close to the base device 2000, the detecting unit 2100 can simultaneously detect the identification data of the figures. The storage unit 2200 can at least comprise a content database 2210. The content database 2210 can store the scenario data (such as 2211 and 2212) corresponding to the respective figures. Similarly, in some embodiments, the scenario data can comprise dialogues, images, sound effects, music, light signals, and/or actions. The content database 2210 can further store an interactive instruction set 2220 corresponding to at least two figures. It is noted that the interactive instruction set 2220 can be dynamically generated according to the scenario data in the content database 2210. The generation and use of the interactive instruction set 2220 are discussed later. The processing unit 2300 performs the figure interactive method of the invention, which will be discussed further in the following paragraphs.
- FIG. 1B is a schematic diagram illustrating another embodiment of a figure interactive system of the invention. The figure interactive system comprises a server 1000, a base device 2000, and an electronic device 4000. The base device 2000 can simultaneously detect the identification data of several figures, such as F1 and F2, and communicate with the electronic device 4000 via a communication connection. The electronic device 4000 couples to the server 1000 via a network 3000. It is noted that, in this embodiment, the base device 2000 at least comprises the detecting unit 2100 in FIG. 3 and a first communication unit (not shown), and the electronic device 4000 at least comprises the storage unit 2200 and the processing unit 2300 in FIG. 3 and a second communication unit (not shown). The functions and features of the detecting unit 2100, the storage unit 2200, and the processing unit 2300 are similar to those disclosed in FIG. 3, and are omitted here. The electronic device 4000 may be a general-purpose computer, a personal computer, a notebook, a netbook, a handheld computer, or a PDA (Personal Digital Assistant). The communication connection may be an RS232 connection, a USB communication connection, or an RFID communication connection. The first/second communication unit corresponding to the above communication connections may be an RS232 interface, a USB communication interface, or an RFID communication interface.
- FIG. 4 is a flowchart of an embodiment of a figure interactive method of the invention. The figure interactive method of the invention can enable multiple figures to interact with each other. It is understood that, in this embodiment, a first figure and a second figure are used for explanation, but the invention is not limited thereto.
- In step S4100, the identification data of the first figure and the second figure is respectively detected by the base device. Similarly, when several figures are placed on or close to the base device, the detecting unit of the base device can simultaneously detect the identification data of the figures. In step S4200, the scenario data corresponding to the first figure and the second figure are respectively retrieved from a content database according to the identification data of the first figure and the second figure. For example, the scenario data corresponding to the first figure and the second figure can be retrieved from the content database 2210 via the base device 2000 or the electronic device 4000. When the scenario data is retrieved via the electronic device 4000, the electronic device 4000 can transmit the scenario data corresponding to the first figure and the second figure to the base device 2000. In step S4300, the interactive instruction set for the first figure and the second figure is dynamically generated according to the scenario data corresponding to the first figure and the second figure, and in step S4400, the first figure and the second figure are enabled to interact with each other according to the interactive instruction set.
- In other embodiments, the identification data of the first figure and the second figure can be transmitted to the server via the network from the base device or the electronic device before step S4200. After the identification data of the first figure and the second figure is received, the server can respectively retrieve the corresponding scenario data from the scenario database according to the identification data, and transmit the scenario data corresponding to the first figure and the second figure to the base device or the electronic device, such that the scenario data is stored into the content database.
- It is understood that, in some embodiments, each figure may have at least a drive component (not shown). The drive component can receive the part of the interactive instruction set relating to the figure from the base device, and execute drive commands according to the actions in the received interactive instruction set, such that the figure and/or at least one component of the figure can be accordingly driven to perform an operation. In some embodiments, the base device or the figure may comprise a display unit (not shown in
FIG. 3), for displaying the dialogues (e.g., texts), symbols, animations, colors, and/or images in the interactive instruction set. In some embodiments, the base device or the figure may comprise a speaker (not shown in FIG. 3), to play the dialogues by voices, music, and/or sound effects in the interactive instruction set. - It is noted that, in some embodiments, when the base device or the electronic device detects and recognizes the figures, the base device or the electronic device can immediately transmit the identification data of the figures to the server, and the server searches, receives, or generates the scenario data corresponding to the figures, dynamically generates the interactive instruction set according to the scenario data, and transmits the interactive instruction set back to the base device, such that the first figure and the second figure can interact with each other according to the interactive instruction set. In some embodiments, the base device or the electronic device can store the scenario data corresponding to the figures in advance. When the identification data of the figures is detected, the scenario data corresponding to the figures can be directly retrieved from the content database of the storage unit in the base device or the electronic device, and the interactive instruction set can be dynamically generated according to the retrieved scenario data. In some embodiments, the base device or the electronic device can periodically or randomly receive renewed scenario data corresponding to the figures via the network, and store the renewed scenario data to the content database.
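A figure-side dispatch of such an instruction set might look like the following sketch. The entry format and the component callbacks are hypothetical, chosen only to illustrate how action, display, and audio items could be routed to the drive component, display unit, and speaker described above.

```python
def dispatch(instruction_set, figure_id, drive, display, speaker):
    """Forward to this figure only the entries addressed to it: actions go to
    the drive component, text/visual items to the display unit, and audio
    items to the speaker. All field names here are illustrative assumptions."""
    for entry in instruction_set:
        if entry["figure"] != figure_id:
            continue  # this part of the instruction set relates to another figure
        kind = entry["kind"]
        if kind == "action":
            drive(entry["payload"])                        # e.g. wave an arm
        elif kind in ("dialogue", "symbol", "animation", "image"):
            display(entry["payload"])                      # texts, symbols, images
        elif kind in ("voice", "music", "sound_effect"):
            speaker(entry["payload"])                      # dialogues by voice, music
```

With callbacks that merely record their payloads, the dispatcher can be seen to skip entries addressed to another figure while routing the rest by kind.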
-
FIG. 5 is a flowchart of an embodiment of a method for generating an interactive instruction set of the invention. - In step S5100, a specific topic is determined. It is understood that, in some embodiments, the specific topic may be a classification of scenarios, such as a narrative, an emotional, or a combat scenario. In some embodiments, the specific topic can be determined randomly. In step S5200, a figure is selected from the figures for interaction. The selected figure may be an initial figure. Similarly, in some embodiments, the initial figure can be randomly selected from the figures. Then, in step S5300, interactive data for the specific topic is retrieved for the initial figure from the scenario data corresponding to the initial figure. Similarly, in some embodiments, the interactive data for the initial figure can be randomly selected from the scenario data corresponding to the initial figure. As described, in some embodiments, the scenario data/interactive data can comprise dialogues, images, sound effects, and/or actions. After the interactive data for the initial figure is determined, in step S5400, interactive data, corresponding to the interactive data for the initial figure, for the specific topic is retrieved for another figure (called an associated figure) from the scenario data corresponding to the associated figure. It is understood that, in some embodiments, the scenario data/interactive data can each define a tag, and relationships among the scenario data/interactive data can be established via the tags. In step S5500, it is determined whether the generation of the interactive instruction set is complete. It is noted that the determination of whether the generation of the interactive instruction set is complete may differ based on different requirements and applications. In some embodiments, the scenario data corresponding to the initial figure and/or the associated figure may define a terminal tag.
When the interactive data corresponding to the initial figure and/or the associated figure has the terminal tag, the generation of the interactive instruction set is complete. When the generation of the interactive instruction set is not complete (No in step S5500), in step S5600, interactive data, corresponding to the interactive data for the associated figure, for the specific topic is retrieved for the initial figure from the scenario data corresponding to the initial figure, and the procedure returns to step S5400. When the generation of the interactive instruction set is complete (Yes in step S5500), in step S5700, the interactive data corresponding to the initial figure and the associated figure are combined as the interactive instruction set.
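The alternating, tag-linked retrieval loop of steps S5200 through S5700 can be sketched as below. The entry layout (a topic, a list of tags, and a `next` link naming the tag the partner's reply must carry) is an illustrative assumption, as are the random selections and the two-figure restriction.

```python
import random


def generate_instruction_set(scenario, topic):
    """Alternate between two figures, chaining each figure's interactive data
    to the partner's via tags, until an entry with the terminal tag appears."""
    figures = list(scenario)                  # assumes exactly two figures
    initial = random.choice(figures)          # step S5200: pick an initial figure
    associated = next(f for f in figures if f != initial)

    # Step S5300: randomly pick the initial figure's interactive data for the topic.
    entry = random.choice([e for e in scenario[initial] if e["topic"] == topic])
    instruction_set = [(initial, entry)]

    current, partner = associated, initial
    while "terminal" not in entry.get("tags", ()):
        # Steps S5400/S5600: follow the tag link into the other figure's data.
        entry = next(e for e in scenario[current]
                     if e["topic"] == topic and entry["next"] in e["tags"])
        instruction_set.append((current, entry))
        current, partner = partner, current   # step S5500: not done yet, swap roles

    # Step S5700: the ordered entries are combined as the interactive instruction set.
    return instruction_set
```

The loop terminates as soon as a retrieved entry carries the terminal tag, so the combined instruction set always ends on a terminal entry.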
- It is noted that the embodiment of
FIG. 5 is used to generate the interactive instruction set for a specific topic. However, in some embodiments, the specific topic need not be determined in advance, and the interactive instruction set corresponding to the respective figures can be directly determined according to the respective scenario data. - Therefore, the figure interactive systems and methods can dynamically generate an interactive instruction set for multiple figures, and enable the figures to interact with each other according to the interactive instruction set. Thus, the operating flexibility of the figures is increased.
- Figure interactive methods, or certain aspects or portions thereof, may take the form of a program code (i.e., executable instructions) embodied in tangible media, such as floppy diskettes, CD-ROMS, hard drives, or any other machine-readable storage medium, wherein, when the program code is loaded into and executed by a machine, such as a computer, the machine thereby becomes an apparatus for practicing the methods. When implemented on a general-purpose processor, the program code combines with the processor to provide a unique apparatus that operates analogously to the application of specific logic circuits.
- While the invention has been described by way of example and in terms of preferred embodiment, it is to be understood that the invention is not limited thereto. Those who are skilled in this technology can still make various alterations and modifications without departing from the scope and spirit of this invention. Therefore, the scope of the present invention shall be defined and protected by the following claims and their equivalents.
Claims (20)
1. A figure interactive system, comprising:
at least a first figure and a second figure; and
a base device, comprising:
a storage unit storing a content database;
a detecting unit respectively detecting identification data of the first figure and the second figure; and
a processing unit respectively retrieving scenario data corresponding to the first figure and the second figure from the content database according to the identification data of the first figure and the second figure, dynamically generating an interactive instruction set for the first figure and the second figure according to the scenario data corresponding to the first figure and the second figure, and enabling the first figure and the second figure to interact with each other according to the interactive instruction set.
2. The system of claim 1 , wherein the processing unit further transmits the identification data of the first figure and the second figure to a server via a network, and the server respectively retrieves the scenario data corresponding to the first figure and the second figure according to the identification data of the first figure and the second figure, and transmits the scenario data corresponding to the first figure and the second figure to the content database of the storage unit via the network.
3. The system of claim 2 , wherein the processing unit further receives renewed scenario data corresponding to the first figure and the second figure from the server via the network, and stores the renewed scenario data to the content database.
4. The system of claim 1 , wherein the interactive instruction set further comprises an action drive command, and the first figure or the second figure respectively has at least a drive component for receiving the interactive instruction set from the base device, and driving the first figure or the second figure to perform an operation according to the action drive command in the interactive instruction set.
5. The system of claim 1 , wherein the base device further comprises a display unit for displaying dialogues or images in the interactive instruction set.
6. The system of claim 1 , wherein the base device further comprises a speaker for playing dialogues or sound effects in the interactive instruction set.
7. The system of claim 1, wherein, when the interactive instruction set is generated, the processing unit selects the first figure as a starting figure, retrieves first interactive data for a specific topic for the first figure from the scenario data corresponding to the first figure, retrieves second interactive data corresponding to the first interactive data for the specific topic for the second figure from the scenario data corresponding to the second figure, and combines the first interactive data and the second interactive data as the interactive instruction set.
8. A figure interactive system, comprising:
at least a first figure and a second figure;
a base device, comprising a detecting unit and a first communication unit, wherein the detecting unit respectively detects identification data of the first figure and the second figure; and
an electronic device, comprising a second communication unit for communicating with the first communication unit via a communication connection, a storage unit for storing a content database, and a processing unit for respectively retrieving scenario data corresponding to the first figure and the second figure from the content database, dynamically generating an interactive instruction set for the first figure and the second figure according to the scenario data corresponding to the first figure and the second figure, and transmitting the interactive instruction set to the base device,
wherein the base device enables the first figure and the second figure to interact with each other according to the interactive instruction set.
9. The system of claim 8 , wherein the processing unit transmits the identification data of the first figure and the second figure to a server via a network, and the server respectively retrieves the scenario data corresponding to the first figure and the second figure according to the identification data of the first figure and the second figure, and transmits the scenario data corresponding to the first figure and the second figure to the content database of the storage unit via the network.
10. The system of claim 9 , wherein the processing unit further receives renewed scenario data corresponding to the first figure and the second figure from the server via the network, and stores the renewed scenario data to the content database.
11. The system of claim 8 , wherein the interactive instruction set further comprises an action drive command, and the first figure or the second figure respectively has at least a drive component for receiving the interactive instruction set from the base device, and driving the first figure or the second figure to perform an operation according to the action drive command in the interactive instruction set.
12. The system of claim 8, wherein, when the interactive instruction set is generated, the processing unit selects the first figure as a starting figure, retrieves first interactive data for a specific topic for the first figure from the scenario data corresponding to the first figure, retrieves second interactive data corresponding to the first interactive data for the specific topic for the second figure from the scenario data corresponding to the second figure, and combines the first interactive data and the second interactive data as the interactive instruction set.
13. A figure interactive method, comprising:
respectively detecting identification data of a first figure and a second figure;
respectively retrieving scenario data corresponding to the first figure and the second figure from a content database according to the identification data of the first figure and the second figure;
dynamically generating an interactive instruction set for the first figure and the second figure according to the scenario data corresponding to the first figure and the second figure; and
enabling the first figure and the second figure to interact with each other according to the interactive instruction set.
14. The method of claim 13 , further comprising:
transmitting the identification data of the first figure and the second figure to a server via a network, wherein the server respectively retrieves the scenario data corresponding to the first figure and the second figure according to the identification data of the first figure and the second figure; and
receiving the scenario data corresponding to the first figure and the second figure via the network as the content database.
15. The method of claim 14 , further comprising:
receiving renewed scenario data corresponding to the first figure and the second figure from the server via the network; and
storing the renewed scenario data to the content database.
16. The method of claim 13 , further comprising:
respectively transmitting the interactive instruction set to at least a drive component of the first figure or the second figure; and
respectively driving the first figure or the second figure to perform an operation according to an action drive command in the interactive instruction set.
17. The method of claim 13 , further comprising displaying dialogues or images in the interactive instruction set via a display unit.
18. The method of claim 13 , further comprising playing dialogues or sound effects in the interactive instruction set via a speaker.
19. The method of claim 13, wherein, when the interactive instruction set is generated, the method further comprises the steps of:
selecting the first figure as a starting figure;
retrieving first interactive data for a specific topic for the first figure from the scenario data corresponding to the first figure;
retrieving second interactive data corresponding to the first interactive data for the specific topic for the second figure from the scenario data corresponding to the second figure; and
combining the first interactive data and the second interactive data as the interactive instruction set.
20. A machine-readable storage medium comprising a computer program, which, when executed, causes a device to perform a figure interactive method, and the method comprises:
respectively detecting identification data of a first figure and a second figure;
respectively retrieving scenario data corresponding to the first figure and the second figure from a content database according to the identification data of the first figure and the second figure;
dynamically generating an interactive instruction set for the first figure and the second figure according to the scenario data corresponding to the first figure and the second figure; and
enabling the first figure and the second figure to interact with each other according to the interactive instruction set.
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
TW098142210A TW201120670A (en) | 2009-12-10 | 2009-12-10 | Figure interaction systems and methods, and computer program products thereof |
TW98142210 | 2009-12-10 |
Publications (1)
Publication Number | Publication Date |
---|---|
US20110143632A1 true US20110143632A1 (en) | 2011-06-16 |
Family
ID=44143459
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US12/782,733 Abandoned US20110143632A1 (en) | 2009-12-10 | 2010-05-19 | Figure interactive systems and methods |
Country Status (2)
Country | Link |
---|---|
US (1) | US20110143632A1 (en) |
TW (1) | TW201120670A (en) |
Patent Citations (10)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US5636994A (en) * | 1995-11-09 | 1997-06-10 | Tong; Vincent M. K. | Interactive computer controlled doll |
US5752880A (en) * | 1995-11-20 | 1998-05-19 | Creator Ltd. | Interactive doll |
US6309275B1 (en) * | 1997-04-09 | 2001-10-30 | Peter Sui Lun Fong | Interactive talking dolls |
US20060009113A1 (en) * | 1997-04-09 | 2006-01-12 | Fong Peter S L | Interactive talking dolls |
US7068941B2 (en) * | 1997-04-09 | 2006-06-27 | Peter Sui Lun Fong | Interactive talking dolls |
US6352478B1 (en) * | 1997-08-18 | 2002-03-05 | Creator, Ltd. | Techniques and apparatus for entertainment sites, amusement parks and other information and/or entertainment dispensing sites |
US6110000A (en) * | 1998-02-10 | 2000-08-29 | T.L. Products Promoting Co. | Doll set with unidirectional infrared communication for simulating conversation |
US6089942A (en) * | 1998-04-09 | 2000-07-18 | Thinking Technology, Inc. | Interactive toys |
US20080160877A1 (en) * | 2005-04-26 | 2008-07-03 | Steven Lipman | Toys |
US20090275408A1 (en) * | 2008-03-12 | 2009-11-05 | Brown Stephen J | Programmable interactive talking device |
Non-Patent Citations (1)
Title |
---|
Copy of WO 01/012285 A1 (22 February 2001), Liu, Dexter, 69 pages. *
Cited By (2)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
EP2920749A1 (en) * | 2012-11-19 | 2015-09-23 | Nokia Technologies OY | Methods, apparatuses, and computer program products for synchronized conversation between co-located devices |
US10929336B2 (en) | 2012-11-19 | 2021-02-23 | Nokia Technologies Oy | Methods, apparatuses, and computer program products for synchronized conversation between co-located devices |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
AS | Assignment |
Owner name: INSTITUTE FOR INFORMATION INDUSTRY, TAIWAN Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNORS:LIN, SHENG-CHUN;CHANG, YU-CHUAN;HUNG, YU-SHIANG;REEL/FRAME:024425/0708 Effective date: 20100302 |
|
STCB | Information on status: application discontinuation |
Free format text: ABANDONED -- FAILURE TO RESPOND TO AN OFFICE ACTION |