US20100048090A1 - Robot and control method thereof - Google Patents

Robot and control method thereof

Info

Publication number
US20100048090A1
Authority
US
United States
Prior art keywords
key information
robot
story
audio data
recited
Prior art date
2008-08-22
Legal status
Abandoned
Application number
US12/432,685
Inventor
Chuan-Hong Wang
Hsiao-Chung Chou
Li-Zhang Huang
Current Assignee
Hon Hai Precision Industry Co Ltd
Original Assignee
Hon Hai Precision Industry Co Ltd
Priority date
2008-08-22
Filing date
2009-04-29
Publication date
2010-02-25
Application filed by Hon Hai Precision Industry Co Ltd filed Critical Hon Hai Precision Industry Co Ltd
Assigned to HON HAI PRECISION INDUSTRY CO., LTD. Assignment of assignors interest (see document for details). Assignors: CHOU, HSIAO-CHUNG; HUANG, LI-ZHANG; WANG, CHUAN-HONG
Publication of US20100048090A1 publication Critical patent/US20100048090A1/en

Classifications

    • A HUMAN NECESSITIES
    • A63 SPORTS; GAMES; AMUSEMENTS
    • A63H TOYS, e.g. TOPS, DOLLS, HOOPS OR BUILDING BLOCKS
    • A63H3/00 Dolls
    • A63H3/28 Arrangements of sound-producing means in dolls; Means in dolls for producing sounds
    • A63H2200/00 Computerized interactive toys, e.g. dolls


Abstract

The present invention relates to a robot and a control method adapted for the robot. The robot stores audio data of stories, first relationships among the audio data, time periods, and key information of the stories, second relationships between the key information and actions, and the actions. Each of the key information is assigned a time period indicating when the key information should be fetched. The method includes: a) telling a story; b) measuring time; c) fetching corresponding key information when an elapsed time reaches each time period of each of the key information; d) fetching an action according to the fetched key information; and e) performing the action.

Description

    BACKGROUND
  • 1. Technical Field
  • The disclosure relates to a robot and, more particularly, to a robot and a control method adapted for the robot.
  • 2. Description of the Related Art
  • There are many electronic toys that play audio books, and many entertainment robots that can perform various actions. What is needed, though, is a robot that can perform corresponding actions while telling a story.
  • BRIEF DESCRIPTION OF THE DRAWINGS
  • The components in the drawings are not necessarily drawn to scale, the emphasis instead being placed upon clearly illustrating the principles of the robot. Moreover, in the drawings, like reference numerals designate corresponding parts throughout the several views.
  • FIG. 1 is a block diagram of a hardware infrastructure of a robot in accordance with an exemplary embodiment.
  • FIG. 2 is an example of an information story table of the robot of FIG. 1.
  • FIG. 3 is an example of an action information table of the robot of FIG. 1.
  • FIG. 4 is a flowchart illustrating a control method of the robot of FIG. 1.
  • DETAILED DESCRIPTION
  • FIG. 1 is a block diagram of a hardware infrastructure of a robot in accordance with an exemplary embodiment. The robot 1 includes a storage unit 10, an input unit 20, a processing unit 30, a digital-to-analog (D/A) converter 40, a speaker 50, and at least one actuator 60. The storage unit 10 stores an audio database 11, an information story table 12, an action information table 13, and an action database 14. The audio database 11 stores a list of audio data of stories that can be played by the robot 1.
  • FIG. 2 is an example of the information story table 12 of the robot of FIG. 1. Each audio data includes a plurality of key information, and each of the key information is assigned a time period indicating when the key information should be fetched. The information story table 12 stores relationships among audio data, time periods, and key information. The information story table 12 includes an audio data column, a time period column, and a key information column. The audio data column records a plurality of audio data of stories. The time period column records a plurality of time periods. The key information column records a plurality of key information. The key information is selected from the group consisting of words, phrases, and a combination of words and phrases.
  • For a better understanding of the relationships among audio data, time periods, and key information, consider the following example. The audio data of story “S1” includes the key information “A2” (e.g., a key word “walk”) and “A4” (e.g., a key phrase “sit down”); the key word “walk” is assigned a first time period and the key phrase “sit down” is assigned a second time period. Accordingly, when the elapsed playing time of the audio data of story “S1” reaches the first time period, the key word “walk” should be fetched, and when it reaches the second time period, the key phrase “sit down” should be fetched.
  • FIG. 3 is an example of the action information table 13 of the robot of FIG. 1. Each of the key information of the audio data of stories is assigned one or more actions. The action information table 13 stores relationships between each key information and its associated actions, and includes a key information column and an action column. The key information column records a plurality of key information. The action column records a plurality of actions to be performed when the corresponding key information is fetched. For example, the key information “A1” may be the key word “bye-bye” assigned the actions “XA11,” “XA12,” and “XA13,” where the action XA11 is “extend and wave the left hand,” the action XA12 is “extend and wave the right hand,” and the action XA13 is “blow a kiss.” The action database 14 stores a list of actions that can be performed by the robot 1.
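  • For illustration only (the patent specifies no data format), the two tables described above can be pictured as plain lookup structures. The following Python sketch is a hypothetical rendering of FIGS. 2 and 3 using the “walk”/“sit down” and “bye-bye” entries from the examples; the variable names and the concrete time period values are assumptions, not taken from the patent.

        # Hypothetical sketch of the information story table 12 (FIG. 2)
        # and the action information table 13 (FIG. 3).

        # Story -> ordered (time period in seconds, key information) pairs;
        # each time period indicates when its key information is fetched.
        information_story_table = {
            "S1": [(12.0, "walk"),        # key information A2
                   (47.5, "sit down")],   # key information A4
        }

        # Key information -> one or more candidate actions, one of which
        # is chosen at random when the key information is fetched.
        action_information_table = {
            "bye-bye": ["extend and wave the left hand",    # X_A11
                        "extend and wave the right hand",   # X_A12
                        "blow a kiss"],                     # X_A13
            "walk": ["walk forward"],
            "sit down": ["sit down"],
        }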
  • The input unit 20 is configured for generating instructions for determining a story to be played in response to user input. The processing unit 30 further includes an audio fetching module 31, an audio outputting module 32, a timer 33, a relationship fetching module 34, and an action performing module 35. The audio fetching module 31 is configured for fetching audio data from the audio database 11 according to an instruction generated from the input unit 20 when a user inputs an action request. The audio outputting module 32 is configured for outputting the fetched audio data. The D/A converter 40 is configured for converting the audio data into analog data. The speaker 50 outputs the analog data, which in this embodiment is a story.
  • The timer 33 starts measuring time when the speaker 50 begins outputting the story. The relationship fetching module 34 is configured for fetching each of the key information of the audio data from the information story table 12 when the elapsed time of the timer 33 reaches the time period assigned to that key information, and for fetching a corresponding action associated with each of the fetched key information from the action information table 13. The action performing module 35 is configured for fetching the action defined in the action database 14 and controlling the actuator 60 to perform the action. The actuator 60 performs the action by moving parts of the robot 1. In this embodiment, the relationship fetching module 34 fetches the corresponding action associated with each of the fetched key information from the action information table 13 randomly. The action performing module 35 further judges whether all actions corresponding to the story have been performed; when they have, the robot 1 finishes telling the story. In other words, a user selects and inputs a story, and the robot 1 then tells the story while fetching and performing the actions associated with it. If the story has further key information and action associations, those actions are also performed during the course of the story.
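  • As a minimal sketch of how the timer 33, the relationship fetching module 34, and the random action selection might interact, assuming the table structures sketched above and an elapsed time measured in seconds (all names are hypothetical):

        import random

        def fetch_due_key_information(elapsed, pending):
            # Return the key information whose time periods the elapsed
            # time has reached. `pending` holds the not-yet-fetched
            # (time period, key information) pairs of the current story
            # and is pruned so each key information is fetched only once.
            due = [key for t, key in pending if elapsed >= t]
            pending[:] = [(t, key) for t, key in pending if elapsed < t]
            return due

        def fetch_action(key_information, action_information_table):
            # The embodiment selects one of the associated actions at random.
            return random.choice(action_information_table[key_information])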
  • FIG. 4 is a flowchart illustrating a control method of the robot of FIG. 1. In step S400, the audio fetching module 31 receives the instruction generated from the input unit 20 and fetches the audio data of a story from the audio database 11. In step S410, the audio outputting module 32 outputs the audio data, the D/A converter 40 converts the audio data into analog data, and the speaker 50 begins telling a story associated with the audio data. In step S420, the timer 33 starts measuring time. In step S430, the elapsed time of the timer 33 reaches each time period of each of the key information of the audio data in the information story table 12. In step S440, the relationship fetching module 34 fetches the corresponding key information according to each time period from the information story table 12. In step S450, the relationship fetching module 34 fetches the action according to the fetched key information from the action information table 13 randomly. In step S460, the action performing module 35 controls the actuator 60 to perform the action.
  • In step S470, the action performing module 35 judges whether all actions corresponding to the story have been performed. If not, the procedure returns to step S420; that is, when the elapsed time of the timer 33 reaches the next time period of the next key information of the audio data, the action performing module 35 controls the actuator 60 to perform the next action according to that key information. In step S480, once all the key information of the story has been fetched and all of the associated actions have been performed, the robot 1 finishes telling the story.
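  • Steps S400 through S480 can then be approximated as a single polling loop. This is a sketch under the same assumptions: play_audio and perform are stand-ins for the hardware-specific audio output path (modules 32, 40, and 50) and actuator control (module 35 and actuator 60), and the helper functions are the ones sketched above.

        import time

        def tell_story(story, audio_database, information_story_table,
                       action_information_table, play_audio, perform,
                       poll_interval=0.1):
            # Sketch of steps S400-S480: play a story and perform the
            # action assigned to each key information when its time
            # period is reached.
            play_audio(audio_database[story])                   # S400-S410
            start = time.monotonic()                            # S420: timer starts
            pending = list(information_story_table[story])
            while pending:                                      # S470: actions remain
                elapsed = time.monotonic() - start              # S430
                for key in fetch_due_key_information(elapsed, pending):   # S440
                    perform(fetch_action(key, action_information_table))  # S450-S460
                time.sleep(poll_interval)
            # S480: all key information fetched, all actions performed.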
  • It is understood that the invention may be embodied in other forms without departing from the spirit thereof. Thus, the present examples and embodiments are to be considered in all respects as illustrative and not restrictive, and the invention is not to be limited to the details given herein.

Claims (16)

1. A robot, comprising:
a storage unit, configured for storing audio data of stories, first relationships among the audio data, time periods, and key information of the stories, second relationships between the key information and actions, and the actions, wherein each of the key information is assigned a time period indicating when the key information should be fetched;
a speaker, configured for telling a story associated with the audio data;
a timer, configured for measuring time when the speaker begins telling the story;
a relationship fetching module, configured for fetching corresponding key information when an elapsed time of the timer reaches each time period of each of the key information from the storage unit, and fetching an action according to the fetched key information from the storage unit; and
an actuator, configured for performing the action.
2. The robot as recited in claim 1, further comprising an input unit, configured for generating instructions for determining the story to be told in response to user input.
3. The robot as recited in claim 2, further comprising an audio fetching module, configured for fetching the audio data from the storage unit according to an instruction from the input unit.
4. The robot as recited in claim 3, further comprising an audio outputting module, configured for controlling the output of the audio data.
5. The robot as recited in claim 4, further comprising a digital-to-analog converter, configured for converting the audio data into analog data as the story.
6. The robot as recited in claim 1, further comprising an action performing module, configured for controlling the actuator to perform the action and judging whether all actions corresponding to the story are performed.
7. The robot as recited in claim 6, wherein when the action performing module has performed all actions associated with the key information of the audio data, the speaker finishes telling the story.
8. The robot as recited in claim 1, wherein the key information is selected from the group consisting of words, phrases, and a combination of words and phrases.
9. The robot as recited in claim 1, wherein the relationship fetching module fetches the action according to the fetched key information randomly.
10. A control method for a robot, the robot storing audio data of stories, first relationships among the audio data, time periods, and key information of the stories, second relationships between the key information and actions, and the actions, wherein each of the key information is assigned a time period indicating when the key information should be fetched, the method comprising:
telling a story;
measuring time;
fetching corresponding key information when an elapsed time reaches each time period of each of the key information;
fetching an action according to the fetched key information; and
performing the action.
11. The control method as recited in claim 10, further comprising:
receiving an instruction and fetching audio data of the story.
12. The control method as recited in claim 11, further comprising:
converting the audio data into analog data as the story.
13. The control method as recited in claim 10, further comprising:
fetching the action according to the fetched key information randomly.
14. The control method as recited in claim 10, further comprising:
judging whether all actions corresponding to the story are performed.
15. The control method as recited in claim 14, further comprising:
finishing telling the story, after all actions associated with each of the key information of the audio data are performed.
16. The control method as recited in claim 10, wherein the key information is selected from the group consisting of words, phrases, and a combination of words and phrases.
US12/432,685 2008-08-22 2009-04-29 Robot and control method thereof Abandoned US20100048090A1 (en)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
CN200810304158A CN101653660A (en) 2008-08-22 2008-08-22 Bionic device for automatically performing actions while storytelling and method thereof
CN200810304158.2 2008-08-22

Publications (1)

Publication Number Publication Date
US20100048090A1 (en) 2010-02-25

Family

ID=41696807

Family Applications (1)

Application Number Title Priority Date Filing Date
US12/432,685 Abandoned US20100048090A1 (en) 2008-08-22 2009-04-29 Robot and control method thereof

Country Status (2)

Country Link
US (1) US20100048090A1 (en)
CN (1) CN101653660A (en)


Families Citing this family (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN107346107A (en) * 2016-05-04 2017-11-14 深圳光启合众科技有限公司 Diversified motion control method and system and the robot with the system
CN106128485A (en) * 2016-05-31 2016-11-16 深圳市贝美互动科技有限公司 Intelligent toy and the control method of broadcasting song thereof
CN109460548B (en) * 2018-09-30 2022-03-15 北京光年无限科技有限公司 Intelligent robot-oriented story data processing method and system


Patent Citations (14)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US4775352A (en) * 1986-02-07 1988-10-04 Lawrence T. Jones Talking doll with animated features
US4805328A (en) * 1986-09-29 1989-02-21 Marantz Company Talking doll
US4884972A (en) * 1986-11-26 1989-12-05 Bright Star Technology, Inc. Speech synchronized animation
US6572431B1 (en) * 1996-04-05 2003-06-03 Shalong Maa Computer-controlled talking figure toy with animated features
US6898759B1 (en) * 1997-12-02 2005-05-24 Yamaha Corporation System of generating motion picture responsive to music
US6442450B1 (en) * 1999-01-20 2002-08-27 Sony Corporation Robot device and motion control method
US7063591B2 (en) * 1999-12-29 2006-06-20 Sony Corporation Edit device, edit method, and recorded medium
US7478047B2 (en) * 2000-11-03 2009-01-13 Zoesis, Inc. Interactive character system
US6661418B1 (en) * 2001-01-22 2003-12-09 Digital Animations Limited Character animation system
US20020120362A1 (en) * 2001-02-27 2002-08-29 Corinna E. Lathan Robotic apparatus and wireless communication system
US7439699B1 (en) * 2005-04-26 2008-10-21 Dreamation, Inc. Animatronics systems and methods
US20070142965A1 (en) * 2005-12-19 2007-06-21 Chyi-Yeu Lin Robotic system for synchronously reproducing facial expression and speech and related method thereof
US20080255702A1 (en) * 2007-04-13 2008-10-16 National Taiwan University Of Science & Technology Robotic system and method for controlling the same
US20090029771A1 (en) * 2007-07-25 2009-01-29 Mega Brands International, S.A.R.L. Interactive story builder

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US11113986B2 (en) * 2017-11-30 2021-09-07 Beijing Xiaomi Mobile Software Co., Ltd. Story machine, control method and control device therefor, storage medium and story machine player system

Also Published As

Publication number Publication date
CN101653660A (en) 2010-02-24


Legal Events

Date Code Title Description
AS Assignment

Owner name: HON HAI PRECISION INDUSTRY CO., LTD., TAIWAN

Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNORS:WANG, CHUAN-HONG;CHOU, HSIAO-CHUNG;HUANG, LI-ZHANG;REEL/FRAME:022616/0960

Effective date: 20090227

STCB Information on status: application discontinuation

Free format text: ABANDONED -- FAILURE TO RESPOND TO AN OFFICE ACTION