US20100210332A1 - Computer-readable storage medium having stored therein drawing processing program, and information processing apparatus - Google Patents


Info

Publication number
US20100210332A1
Authority
US
United States
Prior art keywords
sound
processing
computer
related processing
stored
Prior art date
Legal status
Abandoned
Application number
US12/646,306
Inventor
Daiji IMAI
Current Assignee
Nintendo Co Ltd
Original Assignee
Nintendo Co Ltd
Priority date
Filing date
Publication date
Application filed by Nintendo Co Ltd filed Critical Nintendo Co Ltd
Assigned to NINTENDO CO., LTD. reassignment NINTENDO CO., LTD. ASSIGNMENT OF ASSIGNORS INTEREST (SEE DOCUMENT FOR DETAILS). Assignors: IMAI, DAIJI
Publication of US20100210332A1 publication Critical patent/US20100210332A1/en

Classifications

    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06F: ELECTRIC DIGITAL DATA PROCESSING
    • G06F 3/00: Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F 3/01: Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F 3/03: Arrangements for converting the position or the displacement of a member into a coded form
    • G06F 3/033: Pointing devices displaced or positioned by the user, e.g. mice, trackballs, pens or joysticks; Accessories therefor
    • G06F 3/038: Control and interface arrangements therefor, e.g. drivers or device-embedded control circuitry
    • A63F 13/10
    • A: HUMAN NECESSITIES
    • A63: SPORTS; GAMES; AMUSEMENTS
    • A63F: CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
    • A63F 13/00: Video games, i.e. games using an electronically generated display having two or more dimensions
    • A63F 13/40: Processing input control signals of video game devices, e.g. signals generated by the player or derived from the environment
    • A63F 13/42: Processing input control signals of video game devices by mapping the input signals into game commands, e.g. mapping the displacement of a stylus on a touch screen to the steering angle of a virtual vehicle
    • A63F 13/424: Processing input control signals of video game devices by mapping the input signals into game commands, involving acoustic input signals, e.g. by using the results of pitch or rhythm extraction or voice recognition
    • A: HUMAN NECESSITIES
    • A63: SPORTS; GAMES; AMUSEMENTS
    • A63F: CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
    • A63F 13/00: Video games, i.e. games using an electronically generated display having two or more dimensions
    • A63F 13/40: Processing input control signals of video game devices, e.g. signals generated by the player or derived from the environment
    • A63F 13/42: Processing input control signals of video game devices by mapping the input signals into game commands, e.g. mapping the displacement of a stylus on a touch screen to the steering angle of a virtual vehicle
    • A63F 13/426: Processing input control signals of video game devices by mapping the input signals into game commands, involving on-screen location information, e.g. screen coordinates of an area at which the player is aiming with a light gun
    • A: HUMAN NECESSITIES
    • A63: SPORTS; GAMES; AMUSEMENTS
    • A63F: CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
    • A63F 13/00: Video games, i.e. games using an electronically generated display having two or more dimensions
    • A63F 13/45: Controlling the progress of the video game
    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06F: ELECTRIC DIGITAL DATA PROCESSING
    • G06F 3/00: Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F 3/01: Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F 3/048: Interaction techniques based on graphical user interfaces [GUI]
    • G06F 3/0487: Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser
    • G06F 3/0488: Interaction techniques based on graphical user interfaces [GUI] using a touch-screen or digitiser, e.g. input of commands through traced gestures
    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06T: IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 11/00: 2D [Two Dimensional] image generation
    • G06T 11/60: Editing figures and text; Combining figures or text
    • A: HUMAN NECESSITIES
    • A63: SPORTS; GAMES; AMUSEMENTS
    • A63F: CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
    • A63F 2300/00: Features of games using an electronically generated display having two or more dimensions, e.g. on a television screen, showing representations related to the game
    • A63F 2300/10: Features characterized by input arrangements for converting player-generated signals into game device control signals
    • A63F 2300/1068: Input arrangements being specially adapted to detect the point of contact of the player on a surface, e.g. floor mat, touch pad
    • A: HUMAN NECESSITIES
    • A63: SPORTS; GAMES; AMUSEMENTS
    • A63F: CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
    • A63F 2300/00: Features of games using an electronically generated display having two or more dimensions, e.g. on a television screen, showing representations related to the game
    • A63F 2300/60: Methods for processing data by generating or executing the game program
    • A63F 2300/6063: Methods for processing data by generating or executing the game program for sound processing
    • A63F 2300/6072: Methods for sound processing of an input signal, e.g. pitch and rhythm extraction, voice recognition
    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06F: ELECTRIC DIGITAL DATA PROCESSING
    • G06F 2203/00: Indexing scheme relating to G06F3/00 - G06F3/048
    • G06F 2203/038: Indexing scheme relating to G06F3/038
    • G06F 2203/0381: Multimodal input, i.e. interface arrangements enabling the user to issue commands by simultaneous use of input devices of different nature, e.g. voice plus gesture on digitizer

Definitions

  • the present invention relates to a computer-readable storage medium having stored therein a drawing processing program which allows a player to draw a desired picture, and more particularly, to a computer-readable storage medium having stored therein a drawing processing program which allows a player to draw a desired picture by using a pointing device such as a touch panel.
  • there is known an apparatus for editing an image by a user operating a touch pen to perform an input to a touch panel (for example, Japanese Laid-Open Patent Publication No. 2003-191567 and Japanese Laid-Open Patent Publication No. 2006-129257).
  • Such an apparatus is capable of, by using the touch pen, editing (e.g., drawing graffiti on) an image obtained by shooting an object (user) itself.
  • the thickness or the line type of a pen can also be selected.
  • an apparatus as described above has the following problem.
  • drawing is performed while an input is being performed by the touch pen with respect to the touch panel. Therefore, the user can perform an operation as if the user were drawing on paper with a pen.
  • such an operation is commonplace, and therefore, does not give sufficient freshness to the user.
  • an object of the present invention is to provide a drawing processing program and an information processing apparatus which enable drawing to be performed through a nonconventional and novel way of operation.
  • a first aspect is a computer-readable storage medium having stored therein a drawing processing program which is executed by a computer of an information processing apparatus in which a pointing device for designating a position on a display screen, and sound input means can be used, the drawing processing program causing the computer to function as designated position detection means (S 22 ), sound detection means (S 51 ), and drawing-related processing execution means (S 56 ).
  • the designated position detection means continuously obtains a designated position on the display screen, based on a designation performed by the pointing device.
  • the pointing device is a touch panel
  • a touch operation with respect to the touch panel corresponds to a designation operation.
  • when the pointing device is an operation device which includes shooting means for shooting a shooting target and which is capable of designating any position on a screen based on shooting information obtained by the shooting means, pressing down a predetermined button provided to the operation device while the position is being designated corresponds to a designation operation.
  • the sound detection means detects that a sound which satisfies a predetermined condition is inputted to the sound input means.
  • a determination using the predetermined condition may be a determination using a predetermined threshold value. That is, the predetermined condition may be such that when a sound having a magnitude of certain level is inputted, it is determined that a sound is inputted, or may be such that specified sound determination means described later determines that a predetermined sound is inputted.
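The threshold-style determination described above can be sketched as follows. This is a hypothetical illustration, not the patent's implementation: the RMS volume measure and the threshold value are assumptions chosen for clarity.

```python
def sound_satisfies_condition(samples, threshold=0.1):
    """Return True when the input sound reaches a set volume level.

    samples: amplitude values in [-1.0, 1.0] from the sound input means.
    threshold: illustrative magnitude above which drawing is enabled.
    """
    if not samples:
        return False
    # Root-mean-square amplitude as a simple volume measure.
    rms = (sum(s * s for s in samples) / len(samples)) ** 0.5
    return rms >= threshold
```

Drawing-related processing would then run only on frames for which this check returns True.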
  • the drawing-related processing execution means executes, while the sound detection means detects that the sound which satisfies the predetermined condition is inputted, predetermined drawing-related processing on a position based on the designated position obtained by the designated position detection means.
  • the drawing-related processing includes processing of drawing a line (straight line or curved line), a dot, or an image formed by a collection of the line or the dot on the display screen, and in addition, includes processing of working upon (editing) an image or the like which has been already drawn, and processing of erasing an image or the like which has been already displayed.
  • a painting program providing a novel way of operation can be provided.
  • the drawing-related processing execution means changes a content of the drawing-related processing to be executed, based on the sound detected by the sound detection means, and in accordance with a characteristic of the sound.
  • the characteristic of the sound is, for example, a volume, a frequency, a tone, or the like.
  • the drawing-related processing execution means sequentially changes a content of the drawing-related processing to be executed, in a coordinated manner with changes in chronological order in the characteristic of the sound repeatedly detected by the sound detection means.
  • according to the third aspect, since the content of the drawing-related processing is changed in real time in accordance with a change in the inputted sound, a novel way of enjoyment can be provided to the player. Further, while drawing is performed based on the position detected by the designated position detection means, the content of the drawing-related processing is changed in accordance with an input from the sound input means, which is means other than the designated position detection means. Therefore, the player can change the content of the drawing-related processing while continuing to designate a position for drawing.
  • sound analysis means executes the predetermined drawing-related processing only when a volume of the inputted sound is equal to or larger than a predetermined threshold value.
  • since the drawing-related processing is executed only when the volume of the inputted sound is larger than a certain degree, a novel way of enjoyment can be provided to the player.
  • the drawing-related processing execution means executes, as the drawing-related processing, processing of drawing a line which connects, in chronological order, the positions based on the designated positions sequentially obtained by the designated position detection means.
  • since a handwritten image can be drawn on the display screen through a novel way of operation, a novel way of enjoyment can be provided to the player.
  • the sound analysis means changes at least one of a thickness of the line and a density of a color in which the line is drawn, in accordance with a volume of the inputted sound.
  • since the thickness or the density of the line to be drawn is changed in accordance with the volume of the inputted sound, a novel way of enjoyment can be provided to the player.
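A minimal sketch of this pen mode follows, assuming each sampled touch position has a volume measurement taken at the same time; the scaling rule and `max_thickness` are illustrative choices, not taken from the patent.

```python
def build_pen_stroke(positions, volumes, max_thickness=8):
    """Connect sequentially obtained designated positions into line
    segments, pairing each segment with a thickness derived from the
    sound volume measured at that moment (louder -> thicker)."""
    stroke = []
    prev = None
    for pos, vol in zip(positions, volumes):
        thickness = max(1, round(vol * max_thickness))
        if prev is not None:
            # Each segment links consecutive positions in chronological order.
            stroke.append((prev, pos, thickness))
        prev = pos
    return stroke
```

A renderer would draw each `(start, end, thickness)` tuple as one segment of the handwritten line; color density could be varied by the same rule.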
  • the drawing-related processing execution means executes, as the drawing-related processing, processing of drawing one or more dots in a drawing range which is a predetermined range including therein the position based on the designated position.
  • a feeling of performing drawing by using a spray, together with a novel way of operation, can be provided to the player.
  • the sound analysis means changes at least one of a size of the drawing range and a number of the dots to be drawn in the drawing range, in accordance with a volume of the inputted sound.
  • the drawing-related processing execution means draws the dots such that the area density of the dots nearer to the position based on the designated position is higher, and the area density of the dots farther from that position is lower.
  • the drawing-related processing execution means draws the dots at random positions in the drawing area.
  • various types of drawing can be performed in accordance with the volume of the inputted sound.
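The spray behavior above (a volume-dependent drawing range, random dot placement, and higher density near the center) might be sketched like this; the base radius, base count, and the squared radial sample used to bias dots toward the center are all illustrative assumptions.

```python
import math
import random

def spray_dots(center, volume, base_radius=10.0, base_count=20):
    """Scatter dots around the designated position, spray-style.

    The drawing range (radius) and the number of dots both grow with
    the input volume; squaring the radial sample concentrates dots
    near the center, giving a higher area density there.
    """
    radius = base_radius * volume
    count = max(1, int(base_count * volume))
    dots = []
    for _ in range(count):
        r = radius * random.random() ** 2      # bias toward the center
        theta = random.uniform(0.0, 2.0 * math.pi)
        dots.append((center[0] + r * math.cos(theta),
                     center[1] + r * math.sin(theta)))
    return dots
```

Calling this once per frame while the sound condition holds produces a spray line that widens and thickens as the player's voice gets louder.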
  • the drawing-related processing execution means executes, as the drawing-related processing, processing of moving the dots drawn on the display screen in a predetermined direction, based on the position based on the designated position, and the sound input detected by the sound detection means.
  • the drawing-related processing execution means includes movement content calculation means for calculating: a direction of a line connecting each of the dots displayed on the display screen, with a reference point which is the position based on the designated position; and a distance from the reference point to each of the dots displayed on the display screen.
  • the drawing-related processing execution means moves the dots displayed on the screen, based on the direction and the distance calculated by the movement content calculation means.
  • dots which have been already drawn can be moved by using a sound input, and thereby a novel way of enjoyment can be provided to the player.
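The blow-off movement described above (compute, for each displayed dot, the direction and distance from a reference point at the designated position, then move the dot accordingly) can be sketched as follows; the `strength` parameter and the `1/(1+distance)` falloff, which makes nearer dots travel farther, are illustrative assumptions.

```python
import math

def blow_off(dots, reference, volume, strength=30.0):
    """Move each drawn dot away from the reference point (the
    designated position), scaled by the input sound volume."""
    moved = []
    for (x, y) in dots:
        dx, dy = x - reference[0], y - reference[1]
        dist = math.hypot(dx, dy)
        if dist == 0:
            moved.append((x, y))       # dot sits on the reference point
            continue
        # Unit direction along the line from the reference point to the dot.
        ux, uy = dx / dist, dy / dist
        step = strength * volume / (1.0 + dist)
        moved.append((x + ux * step, y + uy * step))
    return moved
```

Repeating this each frame while the player blows into the microphone scatters previously drawn dots outward from the stylus position.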
  • the drawing processing program further causes the computer to function as sound effect reproduction means (S 60 ) for causing predetermined sound output means to output a predetermined sound effect when the drawing-related processing execution means is executing the predetermined drawing-related processing.
  • the sound effect reproduction means changes a volume at which the sound effect is reproduced, in accordance with a characteristic of the sound detected by the sound detection means.
  • the player can intuitively recognize whether or not the drawing-related processing is being executed.
  • the drawing processing program further causes the computer to function as cursor display means (S 57 ) and animation display means (S 57 ).
  • the cursor display means displays a predetermined cursor image at the designated position.
  • the animation display means animates the cursor when the drawing-related processing execution means is executing the predetermined drawing-related processing.
  • the animation display means changes a speed of the animation in accordance with a characteristic of the sound detected by the sound detection means.
  • the player can visually recognize whether or not the drawing-related processing is being executed.
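The cursor feedback above (animate while drawing-related processing runs, faster for a louder sound) could be realized with a frame-stepping rule like the following; the step formula and frame counts are hypothetical.

```python
def next_cursor_frame(current_frame, frame_count, volume, base_speed=1):
    """Advance the cursor animation; a louder detected sound advances
    more frames per update, so the animation visibly speeds up."""
    step = max(1, round(base_speed + volume * 3))
    return (current_frame + step) % frame_count
```

With `volume = 0` the cursor cycles at its base speed; at full volume it cycles several times faster, giving the player an at-a-glance indication that (and how strongly) the drawing-related processing is being executed.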
  • the pointing device is a touch panel.
  • an intuitive way of operation can be provided to the player.
  • the drawing processing program further causes the computer to function as shot image obtaining means (S 1 ), and shot image display means (S 21 ).
  • the shot image obtaining means obtains image data of an image shot by predetermined shooting means.
  • the shot image display means displays, on the display screen, the shot image.
  • the drawing-related processing execution means executes the drawing-related processing on the shot image.
  • editing of a shot image, or the like, can be provided together with a novel way of operation.
  • the drawing processing program further causes the computer to function as specified sound determination means for determining whether or not the sound detected by the sound detection means is a predetermined sound.
  • the drawing-related processing execution means executes the drawing-related processing only when the specified sound determination means determines that the sound detected by the sound detection means is the predetermined sound.
  • the drawing-related processing can be executed by identifying a specified sound, such as a sound of the player blowing breath, and thereby a novel way of enjoyment can be provided to the player.
  • a twentieth aspect is an information processing apparatus capable of using a pointing device ( 13 ) for designating a position on a display screen ( 12 ), and of using sound input means ( 42 ), the information processing apparatus comprising input coordinate detection means ( 31 ), sound detection means ( 31 ), and drawing-related processing execution means ( 31 ).
  • the input coordinate detection means continuously obtains a designated position on the display screen, based on a designation performed by the pointing device.
  • the sound detection means detects that a sound which satisfies a predetermined condition is inputted to the sound input means.
  • the drawing-related processing execution means executes, while the sound detection means detects that the sound is inputted, predetermined drawing-related processing on a position based on the designated position.
  • the sound detection means is placed in proximity of the display screen.
  • since the display screen and the sound detection means are placed at positions close to each other, it becomes possible to provide an effect of intuitive rendering, for example, an effect in which, when a sound is uttered toward the screen, drawing is performed on the display screen in response to the sound.
  • a painting program and a painting game which allow the player to enjoy drawing through a novel way of operation can be provided.
  • FIG. 1 is an external view of a hand-held game apparatus 10 according to one embodiment of the present invention.
  • FIG. 2 is a block diagram of the hand-held game apparatus 10 according to the one embodiment of the present invention.
  • FIG. 3 shows an example of a screen of a game assumed in the present embodiment
  • FIG. 4 shows an example of a shot image
  • FIG. 5 shows an example of the screen of the game assumed in the present embodiment
  • FIG. 6 shows a relationship between the shot image and a canvas
  • FIG. 7 shows an example of the screen of the game assumed in the present embodiment
  • FIG. 8 is a drawing for illustrating an operation in the game assumed in the present embodiment
  • FIG. 9 shows an example of an image drawn in the game assumed in the present embodiment
  • FIG. 10 shows an example of the screen of the game assumed in the present embodiment
  • FIG. 11 shows an example of the screen of the game assumed in the present embodiment
  • FIG. 12 shows an example of the screen of the game assumed in the present embodiment
  • FIG. 13 is an illustrative diagram showing a memory map of a main memory 32 shown in FIG. 2 ;
  • FIG. 14 shows an example of a data configuration of a drawing tool master 327 ;
  • FIG. 15 shows an example of a data configuration of a spray table 332 ;
  • FIG. 16 is a drawing for illustrating a drawing area
  • FIG. 17 is a flowchart showing graffiti game processing according to the present embodiment of the present invention.
  • FIG. 18 is a flowchart showing the detail of the camera processing shown in step S 1 in FIG. 17 ;
  • FIG. 19 is a flowchart showing the detail of graffiti processing shown in step S 2 in FIG. 17 ;
  • FIG. 20 is a flowchart showing the detail of pen processing shown in step S 29 in FIG. 19 ;
  • FIG. 21 is a flowchart showing the detail of spray drawing processing shown in step S 43 in FIG. 20 ;
  • FIG. 22 is a flowchart showing the detail of eraser processing shown in step S 31 in FIG. 19 ;
  • FIG. 23 is a flowchart showing the detail of spray eraser processing shown in step S 73 in FIG. 22 ;
  • FIG. 24 is a drawing for illustrating the processing outline of blow-off processing
  • FIG. 25 is a drawing for illustrating the processing outline of the blow-off processing
  • FIG. 26 is a drawing for illustrating the processing outline of the blow-off processing
  • FIG. 27 is a flowchart showing the detail of the blow-off processing.
  • FIG. 28 shows an example of a spray line formed by a plurality of colors.
  • FIG. 1 is an external view of a game apparatus 1 which executes a drawing processing program according to the present invention.
  • a hand-held game apparatus is shown.
  • the game apparatus 1 has a camera, and thus functions as a shooting apparatus for shooting an image with the camera, displaying the shot image on a screen, and saving data of the shot image.
  • the game apparatus 1 is a foldable hand-held game apparatus and is shown in a state (opened state) where the game apparatus 1 is opened.
  • the game apparatus 1 is configured so as to have a size which allows a user to hold the game apparatus 1 with both hands or one hand even in the state where the game apparatus 1 is opened.
  • the game apparatus 1 includes a lower housing 11 and an upper housing 21 .
  • the lower housing 11 and the upper housing 21 are connected to each other such that the game apparatus 1 can be opened or closed (folded).
  • the lower housing 11 and the upper housing 21 are each formed in a plate-like shape of a horizontally long rectangle, and connected to each other rotatably around long-side portions thereof.
  • usually, the user uses the game apparatus 1 in the opened state, and keeps the game apparatus 1 in the closed state when not using it.
  • in the example shown in FIG. 1 , in addition to the closed state and the opened state, the game apparatus 1 is capable of maintaining the angle between the lower housing 11 and the upper housing 21 at any angle ranging between the closed state and the opened state by frictional force generated at a connection portion, and the like.
  • the upper housing 21 can be stationary at any angle with respect to the lower housing 11 .
  • a lower LCD (Liquid Crystal Display) 12 is provided in the lower housing 11 .
  • the lower LCD 12 has a horizontally long shape, and is located such that a long-side direction thereof corresponds to a long-side direction of the lower housing 11 .
  • an LCD is used as a display device provided in the game apparatus 1 in the present embodiment, any other display devices such as a display device using an EL (Electro Luminescence), and the like may be used.
  • the game apparatus 1 can use a display device of any resolution.
  • the lower LCD 12 is used mainly for displaying, in real time, an image to be shot by an inner camera 23 or an outer camera 25 .
  • operation buttons 14 A to 14 K, and a touch panel 13 are provided as input devices.
  • the direction input button 14 A, the operation button 14 B, the operation button 14 C, the operation button 14 D, the operation button 14 E, the power button 14 F, the start button 14 G, and the select button 14 H are provided on an inner main surface of the lower housing 11 which is located inside when the upper housing 21 and the lower housing 11 are folded.
  • the direction input button 14 A is used, for example, for a selection operation, and the like.
  • the operation buttons 14 B to 14 E are used, for example, for a determination operation, a cancellation operation, and the like.
  • the power button 14 F is used for turning on/off the power of the game apparatus 1 .
  • the direction input button 14 A and the power button 14 F are provided on the inner main surface of the lower housing 11 and to one of the left and the right (to the left in FIG. 1 ) of the lower LCD 12 provided in the vicinity of a center of the inner main surface of the lower housing 11 .
  • the operation buttons 14 B to 14 E, the start button 14 G, and the select button 14 H are provided on the inner main surface of the lower housing 11 and to the other one of the left and the right (to the right in FIG. 1 ) of the lower LCD 12 .
  • the direction input button 14 A, the operation buttons 14 B to 14 E, the start button 14 G, and the select button 14 H are used for performing various operations with respect to the game apparatus 1 .
  • buttons 14 I to 14 K are omitted in FIG. 1 .
  • the L button 14 I is provided at a left end portion of an upper side surface of the lower housing 11
  • the R button 14 J is provided at a right end portion of the upper side surface of the lower housing 11 .
  • the L button 14 I and the R button 14 J are used, for example, for performing a shooting instruction operation (shutter operation) with respect to the game apparatus 1 .
  • the volume button 14 K is provided on a left side surface of the lower housing 11 .
  • the volume button 14 K is used for adjusting the volume of speakers of the game apparatus 1 .
  • the game apparatus 1 further includes the touch panel 13 as an input device other than the operation buttons 14 A to 14 K.
  • the touch panel 13 is mounted on the lower LCD 12 so as to cover a screen of the lower LCD 12 .
  • a resistive film type touch panel is used as the touch panel 13 .
  • the touch panel 13 is not limited to the resistive film type, and any press-type touch panel may be used.
  • the touch panel 13 having the same resolution (detection accuracy) as that of the lower LCD 12 is used, for example.
  • the resolutions of the touch panel 13 and the lower LCD 12 may not necessarily be the same as each other.
  • an insertion opening (indicated by a dotted line in FIG. 1 ) is provided.
  • the insertion opening is capable of accommodating a touch pen 27 which is used for performing an operation with respect to the touch panel 13 .
  • a finger of the user instead of the touch pen 27 , can be used for operating the touch panel 13 .
  • an insertion opening (indicated by a two-dot chain line in FIG. 1 ) is formed for accommodating a memory card 28 .
  • a connector (not shown) is provided inside the insertion opening for electrically connecting the game apparatus 1 to the memory card 28 .
  • the memory card 28 is, for example, an SD (Secure Digital) memory card, and detachably mounted to the connector.
  • the memory card 28 is used, for example, for storing (saving) an image shot by the game apparatus 1 , and for loading an image generated by other apparatuses into the game apparatus 1 .
  • an insertion opening (indicated by a chain line in FIG. 1 ) is formed for accommodating a cartridge 29 . Also inside the insertion opening, a connector (not shown) is provided for electrically connecting the game apparatus 1 to the cartridge 29 .
  • the cartridge 29 is a storage medium storing the drawing processing program, a game program, and the like, and is detachably mounted in the insertion opening provided in the lower housing 11 .
  • Three LEDs 15 A to 15 C are mounted to a left side part of the connection portion where the lower housing 11 and the upper housing 21 are connected to each other.
  • the game apparatus 1 is capable of performing wireless communication with another apparatus, and the first LED 15 A is lit up while the power of the game apparatus 1 is ON.
  • the second LED 15 B is lit up while the game apparatus 1 is being charged.
  • the third LED 15 C is lit up while wireless communication is established.
  • the three LEDs 15 A to 15 C can notify the user of a state of ON/OFF of the power of the game apparatus 1 , a state of charge of the game apparatus 1 , and a state of communication establishment of the game apparatus 1 .
  • an upper LCD 22 is provided in the upper housing 21 .
  • the upper LCD 22 has a horizontally long shape, and is located such that a long side direction thereof corresponds to a long side direction of the upper housing 21 .
  • a display device which is of any other type or has any other resolution may be used instead of the upper LCD 22 .
  • a touch panel may be provided so as to cover the upper LCD 22 .
  • an operation illustration screen is displayed for teaching roles of the operation buttons 14 A to 14 K and the touch panel 13 to the user.
  • the inner camera 23 is mounted in an inner main surface of the upper housing 21 and at the vicinity of the connection portion.
  • the outer camera 25 is mounted in a surface opposite to the inner main surface in which the inner camera 23 is mounted, namely, in an outer main surface of the upper housing 21 (which is a surface located on the outside of the game apparatus 1 in the closed state, and which is a back surface of the upper housing 21 shown in FIG. 1 ).
  • the outer camera 25 is indicated by a dashed line.
  • the inner camera 23 is capable of shooting an image in a direction in which the inner main surface of the upper housing 21 faces
  • the outer camera 25 is capable of shooting an image in a direction opposite to the shooting direction of the inner camera 23 , namely, in a direction in which the outer main surface of the upper housing 21 faces.
  • the two cameras 23 and 25 are provided such that the shooting directions thereof are opposite to each other.
  • the user can shoot a view seen from the game apparatus 1 toward the user with the inner camera 23 as well as a view seen from the game apparatus 1 in a direction opposite to a direction toward the user with the outer camera 25 .
  • a microphone (a microphone 42 shown in FIG. 2 ) is accommodated as a sound input device.
  • a microphone hole 16 is formed to allow the microphone 42 to detect a sound from outside the game apparatus 1 .
  • the position at which the microphone 42 is accommodated and the position of the microphone hole 16 are not necessarily in the connection portion, and, for example, the microphone 42 may be accommodated in the lower housing 11 and the microphone hole 16 may be formed in the lower housing 11 so as to correspond to the accommodating position of the microphone 42 .
  • a fourth LED 26 (indicated by a dashed line in FIG. 1 ) is mounted in the outer main surface of the upper housing 21 .
  • the fourth LED 26 is lit up at a time when shooting is performed by the outer camera 25 (when the shutter button is pressed).
  • the fourth LED 26 is lit up while a moving picture is shot by the outer camera 25 .
  • the fourth LED 26 can notify a person to be shot and people around the person that shooting has been performed (is being performed) by the game apparatus 1 .
  • sound holes 24 are formed in the inner main surface of the upper housing 21 and to the left and right of the upper LCD 22 provided in the vicinity of a center of the inner main surface of the upper housing 21 .
  • the speakers are accommodated in the upper housing 21 and at the back of the sound holes 24 .
  • the sound holes 24 are holes for releasing a sound from the speakers to the outside of the game apparatus 1 therethrough.
  • the inner camera 23 and the outer camera 25 which are configurations for shooting an image
  • the upper LCD 22 which is display means for displaying, for example, an operation illustration screen upon shooting
  • the touch panel 13 and the buttons 14 A to 14 K, which are the input devices for performing an operation input with respect to the game apparatus 1 , and
  • the lower LCD 12 which is display means for displaying a game screen are provided in the lower housing 11 .
  • the user can hold the lower housing 11 and perform an input with respect to the input device while viewing a shot image (an image shot by the camera) displayed on the lower LCD 12 .
  • FIG. 2 is a block diagram showing an example of the internal configuration of the game apparatus 1 .
  • the game apparatus 1 includes electronic components including a CPU 31 , a main memory 32 , a memory control circuit 33 , a storage data memory 34 , a preset data memory 35 , a memory card interface (memory card I/F) 36 , a cartridge I/F 44 , a wireless communication module 37 , a local communication module 38 , a real time clock (RTC) 39 , a power circuit 40 , an interface circuit (I/F circuit) 41 , and the like.
  • These electronic components are mounted on an electronic circuit substrate and accommodated in the lower housing 11 (or may be accommodated in the upper housing 21 ).
  • the CPU 31 is information processing means for executing a predetermined program.
  • the predetermined program is stored in a memory (e.g., the storage data memory 34 ) in the game apparatus 1 , or in the memory cards 28 and/or 29 , and the CPU 31 executes later-described graffiti processing by executing the predetermined program.
  • a program executed by the CPU 31 may be stored in advance in a memory in the game apparatus 1 , may be obtained from the memory card 28 and/or the cartridge 29 , or may be obtained from another apparatus by means of communication with the other apparatus.
  • the program may be downloaded and obtained from a predetermined server via the Internet, or a predetermined program stored in a stationary game apparatus may be downloaded and obtained by performing communication with the stationary game apparatus.
  • the main memory 32 , the memory control circuit 33 , and the preset data memory 35 are connected to the CPU 31 .
  • the storage data memory 34 is connected to the memory control circuit 33 .
  • the main memory 32 is storage means used as a work area and a buffer area of the CPU 31 .
  • the main memory 32 stores various data used in the graffiti processing, and stores a program obtained from the outside (the memory cards 28 and 29 , another apparatus, or the like).
  • a PSRAM (Pseudo-SRAM) may be used as the main memory 32 , for example.
  • the storage data memory 34 is storage means for storing a program executed by the CPU 31 , data of an image shot by the inner camera 23 and the outer camera 25 , and the like.
  • the storage data memory 34 is constructed of a nonvolatile storage medium, which is, in the present embodiment, a NAND flash memory, for example.
  • the memory control circuit 33 is a circuit for controlling reading of data from the storage data memory 34 or writing of data to the storage data memory 34 in accordance with an instruction from the CPU 31 .
  • the preset data memory 35 is storage means for storing data (preset data) of various parameters which are set in advance in the game apparatus 1 , and the like.
  • a flash memory connected to the CPU 31 via an SPI (Serial Peripheral Interface) bus can be used as the preset data memory 35 .
  • the memory card I/F 36 is connected to the CPU 31 .
  • the memory card I/F 36 reads data from the memory card 28 mounted to the connectors or writes data to the memory card 28 , in accordance with an instruction from the CPU 31 .
  • data of images shot by the outer camera 25 is written to the memory card 28
  • image data stored in the memory card 28 is read from the memory card 28 to be stored in the storage data memory 34 .
  • the cartridge I/F 44 is connected to the CPU 31 .
  • the cartridge I/F 44 reads out data from the cartridge 29 mounted to the connector or writes data to the cartridge 29 , in accordance with an instruction from the CPU 31 .
  • an application program which can be executed by the information processing apparatus 10 is read out from the cartridge 29 to be executed by the CPU 31 , and data associated with the application program (e.g., saved data in a game) is written to the cartridge 29 .
  • the graffiti game program according to the present invention may be supplied to a computer system not only from an external storage medium such as the cartridge 29 , but also via a wired or wireless communication line.
  • the graffiti game program may be stored in advance in a nonvolatile storage unit in the computer system.
  • an information storage medium for storing the color conversion program is not limited to the above nonvolatile storage unit, and may be a CD-ROM, a DVD, or an optical disc-shaped storage medium similar to them.
  • the wireless communication module 37 has a function of connecting to a wireless LAN, for example, by a method conformed to the IEEE 802.11b/g standard.
  • the local communication module 38 has a function of wirelessly communicating with a game apparatus of the same type by a predetermined communication method.
  • the wireless communication module 37 and the local communication module 38 are connected to the CPU 31 .
  • the CPU 31 is capable of receiving data from and transmitting data to another apparatus via the Internet by using the wireless communication module 37 , and capable of receiving data from and transmitting data to another game apparatus of the same type by using the local communication module 38 .
  • the RTC 39 and the power circuit 40 are connected to the CPU 31 .
  • the RTC 39 counts time, and outputs the time to the CPU 31 .
  • the CPU 31 is capable of calculating current time (date) and the like, based on the time counted by the RTC 39 .
  • the power circuit 40 controls electric power from a power supply (typically, a battery accommodated in the lower housing 11 ) of the game apparatus 1 to supply the electric power to each electronic component of the game apparatus 1 .
  • the game apparatus 1 includes the microphone 42 and an amplifier 43 .
  • the microphone 42 and the amplifier 43 are connected to the I/F circuit 41 .
  • the microphone 42 detects a voice produced by the user toward the game apparatus 1 , and outputs a sound signal indicating the voice to the I/F circuit 41 .
  • the amplifier 43 amplifies the sound signal from the I/F circuit 41 , and causes the speakers (not shown) to output the sound signal.
  • the I/F circuit 41 is connected to the CPU 31 .
  • the touch panel 13 is connected to the I/F circuit 41 .
  • the I/F circuit 41 includes a sound control circuit for controlling the microphone 42 and the amplifier (the speakers), and a touch panel control circuit for controlling the touch panel 13 .
  • the sound control circuit performs A/D conversion or D/A conversion with respect to the sound signal, and converts the sound signal into sound data in a predetermined format.
  • the touch panel control circuit generates touched position data in a predetermined format, based on a signal from the touch panel 13 , and outputs the touched position data to the CPU 31 .
  • the touched position data is data indicating coordinate of a position at which an input is performed with respect to an input surface of the touch panel 13 .
  • the touch panel control circuit reads a signal from the touch panel 13 and generates touched position data, once every predetermined time period.
  • the CPU 31 is capable of recognizing a position at which an input is performed with respect to the touch panel 13 by obtaining the touched position data via the I/F circuit 41 .
  • An operation button 14 includes the above operation buttons 14 A to 14 K, and is connected to the CPU 31 .
  • the operation button 14 outputs, to the CPU 31 , operation data indicating an input state with respect to each of the buttons 14 A to 14 K (whether or not each button is pressed).
  • the CPU 31 obtains the operation data from the operation button 14 , and executes processing in accordance with an input with respect to the operation button 14 .
  • the inner camera 23 and the outer camera 25 are connected to the CPU 31 .
  • Each of the inner camera 23 and the outer camera 25 shoots an image in accordance with an instruction from the CPU 31 , and outputs data of the shot image to the CPU 31 .
  • the CPU 31 gives a shooting instruction to the inner camera 23 or the outer camera 25 , and the camera which has received the shooting instruction shoots an image and transmits image data to the CPU 31 .
  • the lower LCD 12 and the upper LCD 22 are connected to the CPU 31 .
  • Each of the lower LCD 12 and the upper LCD 22 displays an image thereon in accordance with an instruction from the CPU 31 .
  • FIG. 3 is an example of a screen of the game assumed in the present embodiment.
  • a game screen is displayed on the lower LCD 12
  • a toolbar 103 is displayed at the top of the game screen
  • a canvas 101 which covers most of the game screen, is displayed under the toolbar 103 .
  • a drawing tool icon 111 is displayed on the lower LCD 12
  • a line-type icon 112 is displayed on the toolbar 103 .
  • FIG. 3 shows a state in which the canvas 101 is being touched by the touch pen 27 , and a cursor 102 is displayed at the touched position on the canvas 101 .
  • the player can draw a picture on the canvas 101 by moving the touch pen 27 on the canvas 101 , and the present invention provides a novel way of performing the drawing operation, as described later.
  • the player can enjoy drawing graffiti on an image shot by the outer camera (or image shot by the inner camera 23 ) of the game apparatus 1 .
  • the shot image is placed as a “base picture” in the area of the canvas 101 so as to overlap with the canvas 101 , whereby the player can enjoy drawing graffiti on the shot image as shown in FIG. 5 .
  • FIG. 6 is a schematic view showing a positional relationship between the shot image and the canvas 101 .
  • the concept of the application is that two layers of a base-picture layer 105 and a handwriting layer 106 are used, and the shot image corresponds to the base-picture layer 105 .
  • the canvas 101 corresponds to the handwriting layer 106 .
  • the handwriting layer 106 is, as it were, a transparent sheet, and conceptually, processing in which the transparent sheet is overlapped on the shot image and graffiti is drawn on the sheet is performed. In other words, processing in which the shot image is directly edited (graffiti is directly drawn on the shot image) is not performed in the present embodiment.
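The two-layer model described above can be sketched in Python (the patent gives no code; the pixel values and the transparency sentinel below are illustrative assumptions):

```python
# A minimal sketch of the base-picture layer 105 / handwriting layer 106
# concept: the handwriting layer is a "transparent sheet" composited over
# the shot image, and the shot image itself is never edited directly.

TRANSPARENT = None  # handwriting-layer pixels start out "see-through"

def composite(base_layer, handwriting_layer):
    """Return the displayed image: handwriting pixels where something was
    drawn, base-picture pixels everywhere else."""
    return [
        [hw if hw is not TRANSPARENT else base
         for base, hw in zip(base_row, hw_row)]
        for base_row, hw_row in zip(base_layer, handwriting_layer)
    ]

base = [["B"] * 4 for _ in range(2)]            # shot image ("base picture")
canvas = [[TRANSPARENT] * 4 for _ in range(2)]  # blank handwriting layer
canvas[0][1] = "R"                              # one graffiti dot

shown = composite(base, canvas)
print(shown[0])   # ['B', 'R', 'B', 'B']
print(base[0])    # ['B', 'B', 'B', 'B']  -- the shot image is untouched
```

Because only the handwriting layer changes, erasing graffiti simply restores transparency and the base picture shows through again.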
  • a drawing operation with respect to the canvas 101 in the application will be described.
  • a picture can be drawn by moving the touch pen 27 on the canvas 101 .
  • two types of drawing tools, i.e., a “pen” and an “eraser”, can be used for drawing in a game.
  • the “pen” is a tool for drawing something on the canvas
  • the “eraser” is a tool for erasing a content drawn on the canvas 101 .
  • a line of uniform thickness, or a “spray” can be selected as a type (line type) of a drawn line.
  • the “pen” or the “eraser” can be selected by operating the drawing tool icon 111 on the toolbar 103 .
  • the line type of the selected tool can be selected by operating the line-type icon 112 .
  • use of the line of uniform thickness or use of the “spray” can be selected.
  • the thickness of the line of uniform thickness can also be designated, and of the four icons of the line-type icon 112 , left three icons represent the respective thicknesses.
  • the rightmost icon on which a picture of a propeller is displayed represents the “spray”.
  • the drawing color (the color of the line or the spray) can also be selected.
  • the line of uniform thickness can be drawn at a touched position as shown in FIG. 3 and FIG. 5 .
  • a line is drawn at the same time as touch is performed.
  • the drawing is not performed by only touching the touch panel, unlike in the case of using the line of uniform thickness.
  • referring to FIG. 7 to FIG. 10 , a drawing operation performed when the “spray” is selected as the line type will be described.
  • the player operates the toolbar 103 to select the “pen” as a drawing tool. Specifically, every time the player touches the drawing tool icon 111 , the drawing tool switches in the order of “pen” → “eraser” → “pen” . . .
  • an image content of the drawing tool icon 111 also switches between an image of a pen tip and an image of an eraser.
  • the player touches the rightmost icon of a propeller image among the four icons of the line-type icon 112 , and thereby can select the “spray” as the line-type.
  • the cursor 102 whose image represents a propeller is displayed at the touched position as shown in FIG. 7 .
  • nothing is drawn on the canvas 101 (in the case of using the “pen”, at least a “dot” is drawn at the touched position as of this moment). Therefore, in this state, even if the player moves the touch pen 27 while touching the canvas 101 with the touch pen 27 , nothing is drawn on the canvas 101 .
  • the thickness and the density of the spray line change in accordance with the strength at which the player blows breath. For example, when the player weakly blows breath, the spray line which is thin and dilute (has the reduced number of dots) as shown in FIG. 11 can be drawn. When the player strongly blows breath, the spray line which is thick and dense (has the increased number of dots) as shown in FIG. 12 can be drawn. In addition, how strongly the player blows is reflected in the spray line in real time. For example, when the player desires to draw one spray line, if the player strongly blows at the beginning of the drawing and thereafter, the strength at which the player blows is gradually decreased, the spray line is drawn such that the increased number of dots are drawn when the drawing begins, as shown in FIG. 12 , and that the number of dots is gradually decreased with the progression of the drawing. In addition, in the application, when such a spray line is drawn, a spraying sound is reproduced as sound effect.
  • the outline (principle) of drawing processing of the spray line performed in the present embodiment will be described.
  • as shown in FIG. 8 , when the player blows on the cursor 102 (touch panel 13 ) while touching the canvas 101 with the touch pen 27 , a sound produced by the player blowing breath is inputted to the microphone 42 .
  • the volume of the sound (hereinafter, referred to as microphone input sound) inputted to the microphone 42 is detected.
  • the thickness and the like of the spray line are determined in accordance with the magnitude of the detected volume, and the spray line is drawn on the canvas 101 .
  • the drawing processing of the spray line is executed.
  • the volume of the sound effect reproduced when the spray line is drawn is varied in accordance with the magnitude of the detected microphone input sound.
  • the “eraser” will be described.
  • the case where the player operates the toolbar 103 to select the eraser as the drawing tool, and to select the line of uniform thickness as the “line type”, will be described.
  • the cursor 102 which is of eraser type is displayed.
  • An operation performed in this case conforms with that performed in the case where the “pen” is selected and the line of uniform thickness is selected, and a line (uniform line and spray line) drawn at the touched position can be erased.
  • next, the case where the “spray” is selected as the “line type” will be described.
  • the cursor 102 which is of propeller type is displayed as in the case where the “pen” is selected and the “spray” is selected as the line type. Then, when the player blows on the cursor 102 , the spray line or the line of uniform thickness drawn within a predetermined range can be erased in accordance with the strength (that is, the magnitude of the microphone input sound) at which the player blows breath, and the touched position.
  • processing is performed such that drawing on the canvas 101 can be performed only after a touch input and an operation (sound input to the microphone 42 ) of blowing breath are performed, as in the case of the “spray”.
  • FIG. 13 is an illustrative diagram showing a memory map of the main memory 32 shown in FIG. 2 .
  • the main memory 32 includes a program storage area 321 and a data storage area 325 . Data in the program storage area 321 and a part of data in the data storage area 325 are obtained by copying, onto the main memory 32 , data stored in advance in a ROM of the cartridge 29 .
  • the programs and the data may be stored in, for example, the storage data memory 34 instead of the cartridge 29 , and may be copied from the storage data memory 34 onto the main memory 32 when the programs are executed.
  • the latest input coordinate and the input coordinate just prior to the latest input coordinate can be saved as touched position data 3262 .
  • the game apparatus 1 repeatedly detects an input to the touch panel 13 at intervals of a unit of time. When an input is detected, the data which has been saved as the latest input coordinate is saved as the input coordinate just prior to the latest input coordinate. When the player is touching the touch panel 13 , data indicating the coordinate of the touched position is saved as the latest input coordinate in the touched position data 3262 . When the player is not touching the touch panel 13 , a value indicating NULL is saved as the latest input coordinate in the touched position data 3262 .
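The bookkeeping described above can be sketched as follows (a hypothetical Python rendering; the class and field names are assumptions, not from the patent):

```python
# A minimal sketch of the touched position data 3262 update: each unit of
# time, the old "latest" coordinate shifts into the "just prior" slot and
# the new reading (an (x, y) tuple, or NULL when not touching) becomes
# the latest input coordinate.

NULL = None  # stands in for the NULL value saved when the panel is untouched

class TouchedPositionData:
    def __init__(self):
        self.latest = NULL   # latest input coordinate
        self.prior = NULL    # input coordinate just prior to the latest

    def update(self, reading):
        """reading is an (x, y) tuple while touching, or NULL."""
        self.prior = self.latest
        self.latest = reading

tpd = TouchedPositionData()
tpd.update((10, 20))           # pen down
tpd.update((12, 22))           # pen moved
print(tpd.prior, tpd.latest)   # (10, 20) (12, 22)
tpd.update(NULL)               # pen lifted
print(tpd.prior, tpd.latest)   # (12, 22) None
```

Keeping the prior coordinate lets drawing code connect consecutive samples into a continuous stroke rather than isolated dots.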
  • the program storage area 321 stores a program which is to be executed by the CPU 31 and which includes a main processing program 322 , a camera processing program 323 , a graffiti processing program 324 , and the like.
  • a main processing program 322 is a program corresponding to processing shown by a flowchart in FIG. 17 described later.
  • a camera processing program 323 is a program for causing the CPU 31 to execute processing for obtaining a shot image by using the outer camera 25
  • a graffiti processing program 324 is a program for causing the CPU 31 to execute the processing, shown referring to FIG. 5 and the like, for drawing graffiti on the shot image.
  • the data storage area 325 stores operation data 326 , a drawing tool master 327 , drawing color data 328 , shot image data 329 , current tool data 330 , sound effect data 331 , a spray table 332 , sound characteristic data 333 , and the like.
  • the operation data 326 is data indicating a content of an operation performed by the player with respect to the game apparatus 1 , and includes the operation button data 3261 and the touched position data 3262 .
  • the operation button data 3261 is data indicating an input state of each of the operation buttons 14 A to 14 K.
  • the touched position data 3262 is data indicating coordinate (input coordinate) of a touched position inputted to the touch panel 13 .
  • the input coordinate is repeatedly obtained and saved as the touched position data 3262 . Note that in the present embodiment, it is possible to save the latest input coordinate and input coordinate just prior to the latest input coordinate as the touched position data 3262 .
  • the drawing tool master 327 is a table associated with the drawing tools described above.
  • FIG. 14 shows an example of a table configuration of the drawing tool master 327 .
  • the drawing tool master 327 shown in FIG. 14 includes the type 3271 , the line type 3272 , and the cursor image data 3273 .
  • the type 3271 is data indicating drawing tools, which are, in the present embodiment, the “pen” and the “eraser”.
  • the line type 3272 is data indicating the line types, which are, in the present embodiment, a line (hereinafter, referred to as uniform line) of uniform thickness, and the “spray”.
  • the cursor image data is image data to be displayed as the cursor 102 .
  • image data indicating an image of a pen tip is stored as the cursor image data.
  • the cursor image data stores image data of a propeller as described above.
  • the drawing color data 328 is data indicating the color of a line or the like drawn on the canvas 101 when the type of the drawing tool is the “pen”.
  • the shot image data 329 is data indicating an image shot by the outer camera 25 .
  • the current tool data 330 is data indicating the type of the drawing tool (pen or eraser) currently selected and the line type (uniform line or spray).
  • the sound effect data 331 is data of a sound effect to be reproduced upon drawing.
  • the spray table 332 is a table which defines the size of an area in which drawing is performed and the number of dots to be drawn, so as to associate the size and the number with the volume of the above-described microphone input sound.
  • FIG. 15 shows an example of a data configuration of the spray table 332 .
  • the spray table shown in FIG. 15 includes a volume 3321 , an area size 3322 , and a dot number 3323 .
  • the volume 3321 indicates a range of magnitudes of the volume of the microphone input sound. Note that in the present embodiment, the magnitude of the volume is represented as a value from 0 to 100.
  • the area size 3322 indicates a drawing area in which the spray line is drawn by performing the drawing processing once.
  • the shape of the drawing area is circular, and a value indicating the radius of the drawing area is defined as a value of the area size 3322 .
  • the dot number 3323 defines the number of dots to be drawn in the drawing area. As shown by an example in FIG. 15 , for example, when the volume 3321 indicates “11 to 30”, dots whose number is indicated by the dot number 3323 are drawn in a circular area 201 having a size shown in FIG. 16 ( a ). When the volume 3321 indicates “31 to 50”, an increased number of dots are drawn in the circular area 201 having an increased size, as shown in FIG. 16 ( b ) (note that dotted lines indicating the circular area 201 in FIG. 16 are just drawn as a matter of convenience, and are not displayed on the screen).
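The table lookup described above can be sketched as follows; the exact volume brackets, radii, and dot counts below are illustrative assumptions (only the bands “11 to 30” and “31 to 50” are named in the text):

```python
# A minimal sketch of the spray table 332: each row maps a range of
# microphone-input volumes (0 to 100) to the radius of the circular
# drawing area (area size 3322) and the number of dots (dot number 3323).

# (low, high, area_radius, dot_count)
SPRAY_TABLE = [
    (11, 30, 8, 10),
    (31, 50, 12, 25),
    (51, 100, 16, 45),
]

def lookup_spray(volume):
    """Return (radius, dots) for a microphone-input volume, or None if
    the volume falls below every band (nothing is drawn)."""
    for low, high, radius, dots in SPRAY_TABLE:
        if low <= volume <= high:
            return radius, dots
    return None

print(lookup_spray(20))   # (8, 10)
print(lookup_spray(40))   # (12, 25) -- louder blow: bigger area, more dots
print(lookup_spray(5))    # None
```

Banding the volume this way keeps the on-screen spray stable even though the raw microphone level fluctuates from frame to frame.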
  • the sound characteristic data 333 is data indicating a characteristic of a sound inputted to the microphone 42 , and specifically, is data indicating the volume, frequency, tone, and the like. Note that in the present embodiment, data indicating the volume of the microphone input sound is stored as the sound characteristic data 333 .
  • the data storage area 325 stores, in addition to the above-described data, various flags such as a reproduction flag used for indicating whether or not reproduction of a sound effect is being performed, various image data, and the like.
  • FIG. 17 is a flowchart showing the flow of the graffiti game processing executed by the game apparatus 1 .
  • the CPU 31 of the game apparatus 1 executes a starting program stored in a boot ROM which is not shown, thereby initializing each unit such as the main memory 32 .
  • a game program stored in the cartridge 29 is read into the main memory 32 , thereby starting execution of the application program.
  • a game image is displayed on the lower LCD 12 , and thereby the application is started.
  • the CPU 31 displays an inquiry screen for inquiring whether or not to execute camera processing (step S 1 ). That is, in the application processing, it is also possible to execute the graffiti processing described later without performing shooting by using the camera. In other words, it is also possible to perform drawing processing with respect to the canvas 101 which is blank, as shown in FIG. 3 , without using the base picture shown in FIG. 5 and the like.
  • the CPU 31 obtains the operation data 326 from the main memory 32 (step S 2 ). Next, it is determined whether or not a content indicated by the operation data is an instruction of executing camera processing (step S 3 ). As a result, if the content is not an instruction of executing camera processing (NO in step S 3 ), the CPU 31 proceeds to processing in step S 5 described later. On the other hand, if the content is an instruction of executing camera processing (YES in step S 3 ), the CPU 31 executes camera processing (step S 4 ). In the camera processing, processing for shooting an image which is to be used as the base picture by using the outer camera 25 and saving the shot image is executed. Next, the CPU 31 executes the graffiti processing (step S 5 ). In the graffiti processing, processing for displaying the screen as shown in FIG. 5 and enabling graffiti to be drawn on the image shot by the camera is executed.
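The flow of steps S 1 to S 5 can be sketched as follows (hypothetical Python; the operation string and callback structure are assumptions used only to show the branch):

```python
# A minimal sketch of the main processing: after the inquiry screen (S1)
# and obtaining the operation data (S2), camera processing runs only if
# it was requested (S3/S4), and graffiti processing always runs (S5) --
# so graffiti can be drawn on a blank canvas without shooting first.

def main_processing(operation, camera_processing, graffiti_processing):
    """operation is the content of the operation data obtained in step S2."""
    if operation == "execute camera processing":   # step S3
        camera_processing()                        # step S4: shoot base picture
    graffiti_processing()                          # step S5: always reached

calls = []
main_processing("execute camera processing",
                lambda: calls.append("camera"),
                lambda: calls.append("graffiti"))
print(calls)  # ['camera', 'graffiti']
main_processing("other operation",
                lambda: calls.append("camera"),
                lambda: calls.append("graffiti"))
print(calls)  # ['camera', 'graffiti', 'graffiti']
```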
  • FIG. 18 is a flowchart showing the detail of the camera processing shown in step S 4 .
  • the CPU 31 performs initialization processing (step S 11 ).
  • initialization processing various parameters (shooting magnification, exposure time, and the like) for shooting which are defined as initial values in advance are set.
  • the CPU 31 displays, on the lower LCD 12 , a video being shot by the outer camera 25 (step S 12 ).
  • the CPU 31 obtains the operation data 326 from the main memory 32 (step S 13 ). Thereafter, the CPU 31 determines whether or not a content of an operation performed by the player which is indicated by the operation data 326 indicates that the shutter button is pressed down (step S 14 ). As a result of the determination, if the shutter is pressed down (YES in step S 14 ), the CPU 31 performs processing of storing an image shot by the outer camera 25 . That is, the image shot by the outer camera 25 is stored as the shot image data 329 in the main memory 32 (step S 15 ). Thereafter, the CPU 31 returns to the processing in step S 12 to repeat the processing.
  • step S 14 if the shutter is not pressed (NO in step S 14 ), next, the CPU 31 determines whether or not a content of an operation indicated by the operation data 326 is an operation of an instruction of ending the camera processing (step S 16 ). As a result, if the content is an instruction of ending the camera processing (YES in step S 16 ), the CPU 31 ends the camera processing. On the other hand, if the content is not an instruction of ending the camera processing (NO in step S 16 ), the CPU 31 executes other processing based on the operation data 326 (step S 17 ). For example, the CPU 31 executes setting of control of zoom magnification, exposure control, or the like. Thereafter, the CPU 31 returns to step S 12 to repeat processing therefrom. Description of the camera processing is finished here.
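The camera-processing loop (steps S 12 to S 17 ) can be sketched as follows (hypothetical Python; the operation strings and the stored image value are illustrative assumptions):

```python
# A minimal sketch of the camera processing loop: each pass redisplays
# the live video (S12), then reacts to the operation data -- store the
# shot on a shutter press (S14/S15), leave the loop on an end
# instruction (S16), otherwise handle other settings (S17) -- and repeats.

def camera_processing(operations):
    """operations is a sequence of operation-data contents, one per pass."""
    shot_image_data = []
    for op in operations:                 # each pass redisplays the video (S12)
        if op == "shutter":               # S14: shutter pressed?
            shot_image_data.append("image from outer camera 25")  # S15
        elif op == "end":                 # S16: end instruction
            break
        else:
            pass                          # S17: zoom magnification, exposure, etc.
    return shot_image_data

print(camera_processing(["zoom", "shutter", "end", "shutter"]))
# ['image from outer camera 25']  -- the loop stops at "end"
```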
  • FIG. 19 is a flowchart showing the detail of the graffiti processing shown in step S 5 .
  • the CPU 31 executes initial processing with respect to the graffiti processing (step S 21 ). Specifically, the CPU 31 generates and displays a game screen as shown in FIG. 3 and the like.
  • the CPU 31 reads out the shot image data 329 from the main memory 32 , and displays, as a “base picture”, a shot image which has been shot through the camera processing, on the canvas 101 . At this time, if the camera processing has not been performed, nothing is stored in the shot image data 329 , and therefore, in this case, the CPU 31 displays nothing on the canvas 101 .
  • the CPU 31 sets, as an initial value of the current tool data 330 , data indicating that the drawing tool is the “pen” and the line type is the “uniform line”. That is, at the start of the graffiti processing, the “pen” whose line-type is the “uniform line” is selected as the drawing tool.
  • the CPU 31 obtains the operation data 326 from the main memory 32 (step S 22 ). Thereafter, the CPU 31 determines whether or not a content of an operation indicated by the operation data 326 is an instruction of ending the graffiti processing (step S 23 ). As a result of the determination, if the content is an instruction of ending the graffiti processing (YES in step S 23 ), the CPU 31 ends the graffiti processing.
  • next, the CPU 31 determines whether or not the content of the operation is an operation of selecting the type of the drawing tool (step S 24 ).
  • if so, processing of selecting the drawing tool is executed based on the content of the operation data 326 (step S 25 ).
  • the player touches the drawing tool icon 111 on the screen as shown in FIG. 3 and the like. Every time this operation of touching the drawing tool icon 111 is detected, the CPU 31 alternately sets the “pen” and the “eraser” as the drawing tool in the current tool data 330 .
  • the drawing tool switches between the “pen” and the “eraser”. Moreover, the player touches one of the four icons of the line-type icon 112 .
  • when the CPU 31 detects the touched icon (more accurately, the coordinate where the icon is displayed), the line type corresponding to the content of the icon is set in the current tool data 330 .
  • the “spray” is set as the line type in the current tool data 330 .
  • the “uniform line” is set as the line type, and data indicating the thickness of the line is also set in the current tool data 330 in accordance with the touched icon. Referring to an example in FIG.
  • the CPU 31 executes processing of setting data indicating the drawing tool in the current tool data 330 based on operation data, and thereafter, the CPU 31 returns to step S 22 to repeat processing therefrom.
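The selection processing described above can be sketched as follows (hypothetical Python; the field names and icon indices are assumptions):

```python
# A minimal sketch of step S25: each touch of the drawing tool icon 111
# toggles pen/eraser in the current tool data 330, and touching one of
# the four line-type icons 112 sets the line type -- the rightmost
# (propeller) icon selects "spray", the left three select the uniform
# line at one of three thicknesses.

current_tool = {"tool": "pen", "line_type": "uniform", "thickness": 1}

def touch_drawing_tool_icon():
    current_tool["tool"] = "eraser" if current_tool["tool"] == "pen" else "pen"

def touch_line_type_icon(icon_index):
    if icon_index == 3:                      # rightmost (propeller) icon
        current_tool["line_type"] = "spray"
    else:                                    # left three icons: thicknesses
        current_tool["line_type"] = "uniform"
        current_tool["thickness"] = icon_index + 1

touch_drawing_tool_icon()
print(current_tool["tool"])        # eraser
touch_drawing_tool_icon()
print(current_tool["tool"])        # pen
touch_line_type_icon(3)
print(current_tool["line_type"])   # spray
```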
  • step S 24 if the content of an operation is not an operation of selecting the drawing tool (NO in step S 24 ), next, the CPU 31 refers to the operation data 326 and thereby determines whether or not a touch input (more accurately, touch input to an area, of the touch panel 13 , corresponding to an area in which the canvas 101 is displayed) to the canvas 101 is being performed (step S 26 ). Specifically, the CPU 31 refers to the latest input coordinate stored in the touched position data 3262 , and determines whether or not the latest input coordinate is in the area in which the canvas 101 is displayed.
  • the CPU 31 executes processing of displaying the cursor 102 at the position where the touch input is being performed (step S 27 ). More specifically, first, the CPU 31 refers to the drawing tool master 327 and obtains a piece of the cursor image data 3272 which corresponds to the drawing tool currently selected. Then, the CPU 31 refers to the touched position data 3262 and displays, as the cursor 102 , an image based on the piece of the cursor image data 3272 at the position where the touch input is being performed.
  • FIG. 20 is a flowchart showing the detail of the pen processing.
  • the CPU 31 determines whether or not the line type indicated by the current tool data 330 is the “spray” (step S 41 ).
  • If not (NO in step S 41 ), the CPU 31 executes pen drawing processing (step S 42 ). That is, based on the touched position, processing of drawing a uniform line of the thickness currently selected is executed. Thereafter, the CPU 31 ends the pen processing.
  • If the line type is the “spray” (YES in step S 41 ), the CPU 31 executes spray drawing processing for drawing a spray line as described above referring to FIG. 10 and the like (step S 43 ), and thereafter, ends the pen processing.
  • FIG. 21 is a flowchart showing the detail of the spray drawing processing shown in step S 43 .
  • the CPU 31 detects the volume of a sound (microphone input sound) inputted to the microphone 42 , and stores the volume as the sound characteristic data 333 (step S 51 ).
  • the CPU 31 determines whether or not the volume indicated by the sound characteristic data 333 is equal to or larger than a predetermined threshold value which is set in advance (step S 52 ). As a result of the determination, if the volume of the microphone input sound is equal to or larger than the predetermined threshold value (YES in step S 52 ), the CPU 31 refers to the spray table 332 , and determines the size of an area in which a spray line is drawn, and the number of dots to be drawn, based on the magnitude of the volume (step S 53 ).
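The determination in step S 53 can be sketched as a table lookup keyed on volume. The threshold and table values below are illustrative assumptions; the actual contents of the spray table 332 are not specified here:

```python
# Hypothetical spray table: each row gives the minimum volume at which
# the row applies, the radius of the drawing area, and the dot count.
SPRAY_TABLE = [
    (0.8, 24, 40),   # loud blow  -> wide area, many dots
    (0.5, 16, 20),
    (0.2,  8,  8),   # quiet blow -> small area, few dots
]

THRESHOLD = 0.2  # predetermined threshold value (step S52)

def spray_parameters(volume):
    """Return (area_radius, dot_count) for a microphone volume in [0, 1],
    or None when the volume is below the threshold (NO in step S52)."""
    if volume < THRESHOLD:
        return None
    for min_volume, radius, dots in SPRAY_TABLE:
        if volume >= min_volume:
            return (radius, dots)
    return SPRAY_TABLE[-1][1:]  # smallest entry as a fallback
```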
  • the CPU 31 determines a volume at which a sound effect reproduced upon drawing a spray line is reproduced (step S 54 ).
  • the CPU 31 sets a speed at which an animation display of the cursor 102 is reproduced (step S 55 ).
  • While a spray line is drawn, an animation in which a propeller rotates is displayed as the cursor 102 , which is of propeller type.
  • the CPU 31 executes processing in which the speed at which the propeller rotates is set in accordance with the magnitude of the volume of the microphone input sound. For example, the speed at which the animation is reproduced is set such that if the volume of the microphone input sound is larger, the propeller rotates faster.
  • For example, the setting may be such that when the volume of the microphone input sound is large, the image is changed every frame, and that when the volume of the microphone input sound is not large, the image is redrawn only every ten frames.
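That frame-interval setting can be sketched as a simple mapping from volume to the number of frames between cursor-animation redraws (the linear mapping and the ten-frame maximum follow the example above; the function itself is an assumption):

```python
def frames_per_redraw(volume, max_interval=10):
    """Map microphone volume in [0, 1] to the number of frames between
    cursor-animation updates: a loud blow updates every frame, a quiet
    one only every `max_interval` frames (steps S55/S83 sketch)."""
    volume = max(0.0, min(1.0, volume))
    # Linear interpolation from max_interval frames (silence) down to 1.
    return max(1, round(max_interval - volume * (max_interval - 1)))
```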
  • the CPU 31 places the above-described drawing area such that the center of the drawing area coincides with the touched position, and draws dots to form a spray line on the canvas 101 (in the drawing area) in accordance with a content of the determination in step S 53 (step S 56 ).
  • The dots to form a spray line may be randomly placed in the drawing area, or may be drawn around the touched position such that the density of the dots is greatest at the touched position and gradually decreases as the dots become more distant from the touched position.
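The second placement scheme above, in which density falls off with distance from the touched position, can be sketched as follows (sampling the radial offset uniformly is one assumed way to obtain the centre-biased density):

```python
import math, random

def spray_dots(center, radius, count, rng=random):
    """Place `count` dots inside a circular drawing area whose centre is
    the touched position (step S56 sketch).  Sampling the radial offset
    uniformly (rather than uniformly by area) biases dots toward the
    centre, so the density is greatest at the touched position and
    falls off outward."""
    cx, cy = center
    dots = []
    for _ in range(count):
        angle = rng.uniform(0.0, 2.0 * math.pi)
        dist = rng.uniform(0.0, radius)   # centre-biased radial sample
        dots.append((cx + dist * math.cos(angle),
                     cy + dist * math.sin(angle)))
    return dots
```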
  • the CPU 31 displays an animation (animation in which a propeller rotates) of the cursor 102 in accordance with the speed, set in step S 55 , at which the animation is reproduced (step S 57 ).
  • the CPU 31 determines whether or not the reproduction flag is set at OFF (step S 58 ).
  • the reproduction flag is a flag indicating whether or not a sound effect is being reproduced, and when a sound effect is not being reproduced, the reproduction flag is set at OFF.
  • the CPU 31 sets the reproduction flag to ON (step S 59 ).
  • the CPU 31 refers to the sound effect data 331 , and starts reproducing a sound effect (spraying sound of a spray) for drawing of a spray line at a volume set in step S 54 (step S 60 ). Thereafter, the CPU 31 ends the spray processing.
  • If the CPU 31 determines that the reproduction flag is not OFF (NO in step S 58 ), since it is considered that a sound effect is being reproduced, the CPU 31 ends the spray processing without executing the processing in steps S 59 and S 60 .
  • Next, processing (NO in step S 52 ) performed when, as a result of the determination in step S 52 , the volume indicated by the sound characteristic data 333 is smaller than the predetermined threshold value will be described.
  • In this case, the CPU 31 determines whether or not the reproduction flag is set at ON (step S 61 ). As a result, if the reproduction flag is ON (YES in step S 61 ), the CPU 31 stops the reproduction of the sound effect which has been started in step S 60 (step S 62 ). Then, the CPU 31 sets the reproduction flag to OFF (step S 63 ).
  • If the reproduction flag is not ON (NO in step S 61 ), the CPU 31 ends the spray drawing processing without executing the processing in steps S 62 and S 63 . Description of the spray drawing processing will be finished here.
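The reproduction-flag logic of steps S 58 to S 63 can be sketched as a small state machine (the class and the log list are illustrative stand-ins for actual audio output):

```python
class SoundEffectPlayer:
    """Sketch of the reproduction-flag logic in steps S58 to S63: the
    spraying sound effect starts only when it is not already playing,
    and stops as soon as the volume drops below the threshold."""

    def __init__(self):
        self.reproducing = False   # the reproduction flag
        self.log = []              # stands in for actual audio output

    def update(self, above_threshold, volume=0.0):
        if above_threshold:
            if not self.reproducing:              # S58: flag is OFF
                self.reproducing = True           # S59: set flag ON
                self.log.append(("start", volume))  # S60: start playback
        else:
            if self.reproducing:                  # S61: flag is ON
                self.log.append(("stop", None))   # S62: stop playback
                self.reproducing = False          # S63: set flag OFF
```

Calling `update` once per frame reproduces the behaviour in which a continuing blow starts the sound effect exactly once and silence stops it.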
  • If the current tool data 330 does not indicate the “pen” (NO in step S 28 ), next, the CPU 31 determines whether or not the drawing tool indicated by the current tool data 330 is the “eraser” (step S 30 ). As a result of the determination, if the drawing tool is not the “eraser” (NO in step S 30 ), the CPU 31 returns to step S 22 to repeat processing therefrom.
  • If the current tool data 330 indicates the “eraser” (YES in step S 30 ), the CPU 31 executes eraser processing (step S 31 ).
  • In the eraser processing, processing in which, when a microphone input sound of a volume equal to or larger than a predetermined value is inputted, a spray line and the like which are drawn at the touched position are erased, is executed.
  • FIG. 22 is a flowchart showing the detail of the eraser processing. As shown in FIG. 22 , first, the CPU 31 determines whether or not the line type indicated by the current tool data 330 is the “spray” (step S 71 ).
  • If the line type is not the “spray” (NO in step S 71 ), the CPU 31 executes pen eraser processing (step S 72 ). That is, the CPU 31 executes processing of, based on the touched position, erasing a uniform line or a spray line at the thickness of a line currently selected. Thereafter, the CPU 31 ends the eraser processing.
  • FIG. 23 is a flowchart showing the detail of the spray eraser processing shown in step S 73 . Since, in FIG. 23 , the processing in steps S 51 and S 52 , and the processing in steps S 57 to S 63 are the same as the processing in the corresponding steps described referring to FIG. 21 , detailed description thereof is omitted and the other processing will mainly be described here.
  • In step S 52 , the CPU 31 determines whether or not the volume of the microphone input sound is equal to or larger than a predetermined threshold value, and as a result, if the volume is equal to or larger than the predetermined threshold value (YES in step S 52 ), the CPU 31 determines the size of an area (hereinafter, referred to as erasing area) to be erased, in accordance with the magnitude of the volume (step S 81 ).
  • a method for determining the erasing area conforms to a method for determining the drawing area for the spray line. That is, the CPU 31 refers to the spray table 332 and obtains the area size 3322 corresponding to the magnitude of the volume. Then, based on this size, the CPU 31 determines the size of the erasing area. Note that, similarly to the above-described drawing area, the shape of the erasing area is circular.
  • the CPU 31 determines the volume at which the sound effect for erasing the spray line or the like is reproduced, in accordance with the magnitude of the volume (step S 82 ).
  • the CPU 31 determines the speed at which an animation of the cursor for the erasing is reproduced (step S 83 ). That is, as in step S 55 , the CPU 31 determines the speed at which the propeller rotates, in accordance with the magnitude of the volume of the microphone input sound.
  • the CPU 31 places the erasing area such that the center of the erasing area coincides with the touched position, and erases the spray line drawn within the erasing area (step S 84 ).
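The erasure in step S 84 can be sketched as removing every dot whose distance from the touched position lies within the circular erasing area (representing dots as coordinate pairs is an assumption):

```python
import math

def erase_in_area(dots, touched_position, radius):
    """Remove dots of a spray line that lie within the circular erasing
    area centred on the touched position (step S84 sketch)."""
    tx, ty = touched_position
    return [(x, y) for x, y in dots
            if math.hypot(x - tx, y - ty) > radius]
```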
  • Next, the CPU 31 displays the animation of the cursor (step S 57 ), and proceeds to processing of determining whether or not the reproduction flag is OFF (step S 58 ). Since processing in step S 58 and subsequent steps is the same as the processing in the respective steps described above referring to FIG. 21 , detailed description thereof is omitted. Description of the spray eraser processing is finished here.
  • Next, the CPU 31 determines whether or not touch off has been performed, based on the operation data 326 (step S 33 ). That is, the CPU 31 determines whether a state in which a touch input continues to be performed is interrupted, or a state in which a touch is not being performed continues. As a result of the determination, if touch off has been performed (YES in step S 33 ), the CPU 31 erases the cursor 102 (step S 34 ). Thereafter, the CPU 31 returns to step S 22 to repeat processing therefrom.
  • Otherwise (NO in step S 33 ), the CPU 31 directly returns to step S 22 to repeat processing therefrom. Specifically, if the touched position data 3262 indicates that the latest input coordinate is NULL and the input coordinate just prior to the latest input coordinate is the coordinate of a touched position, the CPU 31 determines that touch off has been performed, and if the touched position data 3262 indicates that the latest input coordinate is NULL and the input coordinate just prior to the latest input coordinate is also NULL, the CPU 31 determines that a state in which a touch is not being performed continues. Description of the graffiti processing is finished here.
  • drawing on the canvas 101 can be performed when two types of inputs, that is, a touch input to the canvas 101 and a sound input to the microphone 42 are performed.
  • the spray line is drawn while a touch input and a sound input to the microphone 42 continue to be performed (the player continues to blow). Therefore, by changing the strength at which the player blows breath, the thickness of the spray line can be changed in real time. Thus, it becomes possible to provide a novel way of enjoyment in which, depending on how the player blows breath, the thickness (corresponding to so-called pen pressure) of the spray line can be changed, that is, a content to be drawn can be changed.
  • In the blow-off processing, processing in which dots forming the spray line present near or at a touched position are blown off in accordance with breath being blown on the touch panel may be executed.
  • FIG. 24 shows, as an example, a state in which there are five dots 211 a to 211 e above a touched position 210 .
  • the distance of a movement of a dot nearer to the touched position 210 is longer, and the distance of a movement of a dot farther from the touched position 210 is shorter. That is, in the processing, a dot nearer to a position (in this case, touched position 210 ) on which the player blows is subjected to stronger blow and thereby blown off farther.
  • FIG. 27 is a flowchart showing the detail of the blow-off processing.
  • the blow-off processing is executed in place of the spray eraser processing in step S 73 .
  • the blow-off processing may be executed together with the spray eraser processing.
  • the CPU 31 detects a volume (step S 51 ), and determines whether or not the detected volume is equal to or larger than a predetermined value (step S 52 ). Since this processing is the same as the processing in steps S 51 and S 52 in FIG. 23 , detailed description thereof is omitted.
  • If the volume is smaller than the predetermined threshold value (NO in step S 52 ), the CPU 31 proceeds to processing in step S 61 . Since processing to be performed in this case is also the same as the processing from step S 61 in FIG. 23 , description thereof is omitted.
  • If the CPU 31 determines that the volume is equal to or larger than the predetermined threshold value (YES in step S 52 ), the CPU 31 determines the size of an area (hereinafter, referred to as blow-off area) for the blow-off processing, in accordance with the magnitude of the volume (step S 81 ).
  • the size of the area is determined by referring to the spray table 332 and obtaining the area size 3322 corresponding to the magnitude of the volume, as in step S 53 .
  • the CPU 31 calculates straight lines (see FIG. 25 ) which connect, with a touched position, the respective dots forming a spray line present within the blow-off area (step S 83 ).
  • the CPU 31 moves the dots within the blow-off area in accordance with the directions and the lengths of the respective calculated straight lines (see FIG. 26 ). Note that at this time, if a moved dot overlaps with a position of another dot, a dot which is nearer to the touched position before the movement is drawn over the other dot.
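The movement described above, in which a dot nearer to the touched position is blown off farther along the straight line connecting it with the touched position, can be sketched as follows (the linear falloff of the travel distance is an assumption; the embodiment only requires that nearer dots move farther):

```python
import math

def blow_off(dots, touched_position, radius, strength):
    """Move each dot inside the blow-off area outward along the straight
    line connecting it with the touched position (FIGS. 25 and 26
    sketch).  Dots nearer the touched position travel farther; dots
    outside the area are left in place."""
    tx, ty = touched_position
    moved = []
    for x, y in dots:
        dist = math.hypot(x - tx, y - ty)
        if dist == 0.0 or dist >= radius:
            moved.append((x, y))   # at the centre or outside the area
            continue
        travel = strength * (1.0 - dist / radius)   # nearer -> farther
        ux, uy = (x - tx) / dist, (y - ty) / dist   # outward direction
        moved.append((x + travel * ux, y + travel * uy))
    return moved
```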
  • Since the processing from step S 58 is the same as the corresponding processing described above referring to FIG. 21 , description thereof is omitted.
  • In the above embodiment, the CPU 31 detects a sound produced when the player blows breath on the touch panel 13 and performs processing based on the volume thereof; at this time, any other sound can be used (for example, a sound of clapping hands can be used). That is, the type and the content of the sound are not identified.
  • the present invention is not limited thereto, and a sound of “blow” may be identified.
  • a method of detection or identification of the “sound of blow” may be of any type.
  • For example, a waveform pattern of a sound segment included in a sound of blowing (sound of breath) may be stored in advance, the stored sound segment and a sound segment of an inputted sound may be compared with each other, and thereby it may be determined whether or not the player has blown.
  • Alternatively, through fast Fourier transform (FFT) processing, a characteristic of an inputted sound, such as tone or frequency, may be calculated or identified, and a content of drawing processing may be changed in accordance with the characteristic of the inputted sound.
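One illustrative way such a frequency characteristic could be obtained is a discrete Fourier transform over a buffer of microphone samples; this is merely a sketch, not the patent's own method:

```python
import cmath, math

def dominant_frequency(samples, sample_rate):
    """Estimate the dominant frequency of an input sound with a plain
    discrete Fourier transform (an FFT would compute the same result
    faster).  Returns the frequency, in Hz, of the strongest bin."""
    n = len(samples)
    best_bin, best_mag = 0, 0.0
    for k in range(1, n // 2):            # skip the DC component
        s = sum(samples[t] * cmath.exp(-2j * math.pi * k * t / n)
                for t in range(n))
        if abs(s) > best_mag:
            best_bin, best_mag = k, abs(s)
    return best_bin * sample_rate / n
```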
  • the reproduction of the sound effect may be executed such that a fade-in/fade-out effect is used upon start and end of the reproduction of the sound effect. This can prevent a noise (for example, noise of a sound “putsu”) upon start of reproduction from being generated.
  • As a drawing color used upon drawing, only one color may be used, or a plurality of colors may be used at the same time.
  • For example, if the “pen” is used as the drawing tool, an edged line (whose edge has a color different from the color of the part of the line other than the edge) may be used.
  • dots forming a spray line may have colors different from each other. For example, when “gray” and “black” are designated as the drawing colors, the spray line which is formed by both a gray dot and a black dot may be drawn (see FIG. 28 ).
  • In addition, when dots overlap with each other, the dots may be displayed as one dot having a color obtained by mixing the colors thereof with each other.
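The colour mixing for overlapping dots can be sketched as a per-channel average of the overlapping colours (averaging, and 8-bit RGB triples, are assumptions):

```python
def mix_colors(colors):
    """When dots of different colours come to overlap, display one dot
    whose colour is the per-channel average of the overlapping colours
    (sketch of the mixing described above)."""
    n = len(colors)
    return tuple(sum(c[i] for c in colors) // n for i in range(3))
```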
  • the spray line which includes various colors in a mixed manner and cannot be predicted by the player can be displayed through the above-described blow-off processing, whereby a new way of enjoyment using the blow-off processing can be provided to the player.
  • a translucence effect may be used for erasing the spray line or the like. That is, instead of erasing the spray line or the like at the moment when the player blows breath, processing in which the spray line or the like is gradually diluted, and finally, cleanly erased may be executed.
  • the drawing processing using the “uniform line” may also be executed only when a microphone input sound is being inputted.
  • the pen pressure may be changed in accordance with the magnitude of the microphone input sound. For example, when the magnitude of the microphone input sound is small (when the strength at which the player blows breath is weak), a “faded line” or a “line of a dilute color” may be drawn, and when the magnitude of the microphone input sound is large (when the strength at which the player blows breath is strong), a “clear line” or a “line of a deep color” may be drawn.
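The pen-pressure idea above can be sketched as a mapping from microphone volume to line thickness and colour density (the linear mapping and the value ranges are assumptions):

```python
def line_style(volume, max_thickness=8):
    """Map microphone volume in [0, 1] to a pen style: a weak blow gives
    a thin, dilute line and a strong blow a thick, deep one (sketch of
    the pen-pressure variation described above)."""
    volume = max(0.0, min(1.0, volume))
    thickness = max(1, round(volume * max_thickness))
    opacity = volume            # 0.0 = fully dilute, 1.0 = deep colour
    return thickness, opacity
```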
  • drawing of the “uniform line” with the “pen” may be executed without a microphone input sound, and the thickness of the line may be changed in real time by a breath being blown on the touch panel 13 while the “uniform line” is being drawn with the “pen”.
  • drawing of the spray line may continue during about 1 to 2 seconds, for example, instead of immediately stopping drawing of the spray line. That is, processing in which even if the player stops blowing breath, rotation of the propeller continues during a short time and the spray line is drawn during the short time, may be executed.
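The variation above, in which the spray continues for a short time after the player stops blowing, can be sketched as a per-frame hold counter (the 60-frames-per-second rate and the hold length are assumptions):

```python
class SprayDecay:
    """Sketch of the variation above: when the player stops blowing, the
    propeller keeps turning and the spray keeps being drawn for a short
    hold time before stopping."""

    HOLD_FRAMES = 90   # about 1.5 seconds at an assumed 60 fps

    def __init__(self):
        self.remaining = 0

    def update(self, blowing):
        """Call once per frame; returns True while spraying should occur."""
        if blowing:
            self.remaining = self.HOLD_FRAMES   # blowing refills the hold
        elif self.remaining > 0:
            self.remaining -= 1                 # silence counts it down
        return self.remaining > 0
```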
  • The image created in the above embodiment, on which graffiti has been drawn, may be saved.
  • At this time, only data corresponding to the above-described handwriting layer 106 may be saved, or data obtained by combining data corresponding to the handwriting layer 106 with a shot image may be saved as a composite image.
  • the present invention is not limited thereto, and the present invention is applicable to general painting software which does not use the outer camera 25 , that is, which does not allow graffiti to be drawn on a shot image or the like.
  • a hand-held game apparatus having two display devices is described as an example.
  • the present invention is applicable to a hand-held terminal which has a single display device and has a touch panel on a screen of the display device.
  • a touch panel is used as an example of a device which detects a designated position, in an operation area, designated by the player.
  • a so-called pointing device which allows the player to designate a position in a predetermined area may be used as the device.
  • a mouse which is capable of designating any position on a screen, or a tablet which designates any position on an operation surface having no display screen.
  • Further, the pointing device may be one in which a device including shooting means for remotely shooting, for example, a display screen, or a marker positioned in the periphery of the display screen, obtains a shot image by being pointed toward the display screen, and in which, from the position of the display screen or the marker on the shot image, a coordinate on the display screen corresponding to the position at which the device has pointed is calculated.

Abstract

First, a designated position on a display screen is continuously obtained based on a designation performed by a pointing device. Next, it is detected that a sound which satisfies a predetermined condition is inputted to sound input means. Then, while it is detected that the sound which satisfies the predetermined condition is inputted, predetermined drawing-related processing is executed at a position based on the obtained designated position.

Description

    CROSS REFERENCE TO RELATED APPLICATION
  • The disclosure of Japanese Patent Application No. 2009-000443, filed on Jan. 5, 2009, is incorporated herein by reference.
  • BACKGROUND OF THE INVENTION
  • 1. Field of the Invention
  • The present invention relates to a computer-readable storage medium having stored therein a drawing processing program which allows a player to draw a desired picture, and more particularly, to a computer-readable storage medium having stored therein a drawing processing program which allows a player to draw a desired picture by using a pointing device such as a touch panel.
  • 2. Description of the Background Art
  • Conventionally, there has been known an apparatus for editing an image by a user operating a touch pen to perform an input to a touch panel (for example, Japanese Laid-Open Patent Publication No. 2003-191567, and Japanese Laid-Open Patent Publication No. 2006-129257). Such an apparatus is capable of, by using the touch pen, editing (e.g., drawing graffiti on) an image obtained by shooting an object (user) itself. In addition, at this time, the thickness or the line type of a pen can also be selected.
  • However, an apparatus as described above has the following problem. In the above apparatus, drawing is performed while an input is being performed by the touch pen with respect to the touch panel. Therefore, the user can perform an operation as if the user performs drawing on paper by using a pen. On the other hand, such an operation is commonplace, and therefore, does not give sufficient freshness to the user.
  • SUMMARY OF THE INVENTION
  • Therefore, an object of the present invention is to provide a drawing processing program and an information processing apparatus which enable drawing to be performed through a nonconventional and novel way of operation.
  • The present invention has the following features to achieve the objects mentioned above. Note that reference numerals, supplementary explanations, and the like in the parentheses indicate an example of the correspondence relationship with an embodiment described below in order to aid in understanding the present invention and are not intended to limit, in any way, the scope of the present invention.
  • A first aspect is a computer-readable storage medium having stored therein a drawing processing program which is executed by a computer of an information processing apparatus in which a pointing device for designating a position on a display screen, and sound input means can be used, the drawing processing program causing the computer to function as designated position detection means (S22), sound detection means (S51), and drawing-related processing execution means (S56). The designated position detection means continuously obtains a designated position on the display screen, based on a designation performed by the pointing device. Here, upon obtaining the designated position, if, for example, the pointing device is a touch panel, a touch operation with respect to the touch panel corresponds to a designation operation. Alternatively, if the pointing device is an operation device including shooting means for shooting a shooting target and is capable of, based on shooting information obtained by the shooting means, designating any position on a screen, pressing down a predetermined button provided to the operation device while the position is being designated corresponds to a designation operation. The sound detection means detects that a sound which satisfies a predetermined condition is inputted to the sound input means. Here, a determination using the predetermined condition may be a determination using a predetermined threshold value. That is, the predetermined condition may be such that when a sound having a magnitude of certain level is inputted, it is determined that a sound is inputted, or may be such that specified sound determination means described later determines that a predetermined sound is inputted.
The drawing-related processing execution means executes, while the sound detection means detects that the sound which satisfies the predetermined condition is inputted, predetermined drawing-related processing on a position based on the designated position obtained by the designated position detection means. Here, the drawing-related processing includes processing of drawing a line (straight line or curved line), a dot, or an image formed by a collection of the line or the dot on the display screen, and in addition, includes processing of working upon (editing) an image or the like which has been already drawn, and processing of erasing an image or the like which has been already displayed.
  • According to the first aspect, a painting program providing a novel way of operation can be provided.
  • In a second aspect based on the first aspect, the drawing-related processing execution means changes a content of the drawing-related processing to be executed, based on the sound detected by the sound detection means, and in accordance with a characteristic of the sound. Here, the characteristic of the sound is, for example, a volume, a frequency, a tone, or the like.
  • According to the second aspect, since a content of the drawing-related processing to be executed is changed depending on a content of an inputted sound, a novel way of enjoyment can be provided to the player.
  • In a third aspect based on the second aspect, the drawing-related processing execution means sequentially changes a content of the drawing-related processing to be executed, in a coordinated manner with changes in chronological order in the characteristic of the sound repeatedly detected by the sound detection means.
  • According to the third aspect, since a content of the drawing-related processing is changed in real time in accordance with a change in an inputted sound, a novel way of enjoyment can be provided to the player. Further, while drawing is performed based on the position detected by the designated position detection means, the content of the drawing-related processing is changed in accordance with an input from the sound input means which is means other than the designated position detection means. Therefore, the player can change the content of the drawing-related processing while the player continues to designate a position for drawing.
  • In a fourth aspect based on the second aspect, sound analysis means executes the predetermined drawing-related processing only when a volume of the inputted sound is equal to or larger than a predetermined threshold value.
  • According to the fourth aspect, since the drawing-related processing is executed only when the volume of the inputted sound is larger than a certain degree, a novel way of enjoyment can be provided to the player.
  • In a fifth aspect based on the second aspect, the drawing-related processing execution means executes, as the drawing-related processing, processing of drawing a line which connects, in chronological order, the position based on the designated position sequentially obtained by the input coordinate detection means.
  • According to the fifth aspect, since a handwritten image can be drawn on the display screen through a novel way of operation, a novel way of enjoyment can be provided to the player.
  • In a sixth aspect based on the fifth aspect, the sound analysis means changes at least one of a thickness of the line and a density of a color in which the line is drawn, in accordance with a volume of the inputted sound.
  • According to the sixth aspect, since the thickness or the density of the line to be drawn is changed in accordance with the volume of the inputted sound, a novel way of enjoyment can be provided to the player.
  • In a seventh aspect based on the second aspect, the drawing-related processing execution means executes, as the drawing-related processing, processing of drawing one or more dots in a drawing range which is a predetermined range including therein the position based on the designated position.
  • According to the seventh aspect, for example, a feeling of performing drawing by using a spray, together with a novel way of operation, can be provided to the player.
  • In an eighth aspect based on the second aspect, the sound analysis means changes at least one of a size of the drawing range and a number of the dots to be drawn in the drawing range, in accordance with a volume of the inputted sound.
  • In a ninth aspect based on the eighth aspect, the drawing-related processing execution means draws the dots such that an area density of the number of the dots which are nearer to the position based on the designated position is higher, and that an area density of the number of the dots which are farther from the position based on the designated position is lower.
  • In a tenth aspect based on the eighth aspect, the drawing-related processing execution means draws the dots at random positions in the drawing area.
  • According to the eighth to the tenth aspects, various types of drawing can be performed in accordance with the volume of the inputted sound.
  • In an eleventh aspect based on the seventh aspect, the drawing-related processing execution means executes, as the drawing-related processing, processing of moving the dots drawn on the display screen in a predetermined direction, based on the position based on the designated position, and the sound input detected by the sound detection means.
  • In a twelfth aspect based on the eleventh aspect, the drawing-related processing execution means includes movement content calculation means for calculating: a direction of a line connecting each of the dots displayed on the display screen, with a reference point which is the position based on the designated position; and a distance from the reference point to each of the dots displayed on the display screen. In addition, the drawing-related processing execution means moves the dots displayed on the screen, based on the direction and the distance calculated by the movement content calculation means.
  • According to the eleventh and twelfth aspects, dots which have been already drawn can be moved by using a sound input, and thereby a novel way of enjoyment can be provided to the player.
  • In a thirteenth aspect based on the first aspect, the drawing processing program further causes the computer to function as sound effect reproduction means (S60) for causing predetermined sound output means to output a predetermined sound effect when the drawing-related processing execution means is executing the predetermined drawing-related processing.
  • In a fourteenth aspect based on the thirteenth aspect, the sound effect reproduction means changes a volume at which the sound effect is reproduced, in accordance with a characteristic of the sound detected by the sound detection means.
  • According to the thirteenth and fourteenth aspects, the player can intuitively recognize whether or not the drawing-related processing is being executed.
  • In a fifteenth aspect based on the first aspect, the drawing processing program further causes the computer to function as cursor display means (S57) and animation display means (S57). The cursor display means displays a predetermined cursor image at the designated position. The animation display means animates the cursor when the drawing-related processing execution means is executing the predetermined drawing-related processing.
  • In a sixteenth aspect based on the fifteenth aspect, the animation display means changes a speed of the animation in accordance with a characteristic of the sound detected by the sound detection means.
  • According to the fifteenth and sixteenth aspects, the player can visually recognize whether or not the drawing-related processing is being executed.
  • In a seventeenth aspect based on the first aspect, the pointing device is a touch panel.
  • According to the seventeenth aspect, an intuitive way of operation can be provided to the player.
  • In an eighteenth aspect based on the first aspect, the drawing processing program further causes the computer to function as shot image obtaining means (S1), and shot image display means (S21). The shot image obtaining means obtains image data of an image shot by predetermined shooting means. The shot image display means displays, on the display screen, the shot image. In addition, the drawing-related processing execution means executes the drawing-related processing on the shot image.
  • According to the eighteenth aspect, editing of an image with respect to a shot image, or the like can be provided together with a novel way of operation.
  • In a nineteenth aspect based on the first aspect, the drawing processing program further causes the computer to function as specified sound determination means for determining whether or not the sound detected by the sound detection means is a predetermined sound. In addition, the drawing-related processing execution means executes the drawing-related processing only when the specified sound determination means determines that the sound detected by the sound detection means is the predetermined sound.
  • According to the nineteenth aspect, it becomes possible to execute the drawing-related processing by identifying a specified sound such as a sound of the player blowing breath, and thereby a novel way of enjoyment can be provided to the player. In addition, it becomes possible to execute the drawing-related processing only when the player utters a specified sound, whereby the drawing-related processing can be prevented from being executed in response to a sound involuntarily uttered.
  • A twentieth aspect is an information processing apparatus capable of using a pointing device (13) for designating a position on a display screen (12), and of using sound input means (42), the information processing apparatus comprising input coordinate detection means (31), sound detection means (31), and drawing-related processing execution means (31). The input coordinate detection means continuously obtains a designated position on the display screen, based on a designation performed by the pointing device. The sound detection means detects that a sound which satisfies a predetermined condition is inputted to the sound input means. The drawing-related processing execution means executes, while the sound detection means detects that the sound is inputted, predetermined drawing-related processing on a position based on the designated position.
  • According to the twentieth aspect, the same effect as that of the drawing processing program of the above aspects can be obtained.
  • In a twenty-first aspect based on the twentieth aspect, the sound detection means is placed in proximity of the display screen.
  • According to the twenty-first aspect, since the display screen and the sound detection means are placed at positions which are close to each other, it becomes possible to provide an effect of intuitive rendering, which is, for example, an effect of, when a sound is uttered toward the screen, performing drawing on the display screen in response to the sound.
  • According to the above aspects, a painting program and a painting game which allow the player to enjoy drawing through a novel way of operation can be provided.
  • These and other objects, features, aspects and advantages of the present invention will become more apparent from the following detailed description of the present invention when taken in conjunction with the accompanying drawings.
  • BRIEF DESCRIPTION OF THE DRAWINGS
  • FIG. 1 is an external view of a hand-held game apparatus 10 according to one embodiment of the present invention;
  • FIG. 2 is a block diagram of the hand-held game apparatus 10 according to the one embodiment of the present invention;
  • FIG. 3 shows an example of a screen of a game assumed in the present embodiment;
  • FIG. 4 shows an example of a shot image;
  • FIG. 5 shows an example of the screen of the game assumed in the present embodiment;
  • FIG. 6 shows a relationship between the shot image and a canvas;
  • FIG. 7 shows an example of the screen of the game assumed in the present embodiment;
  • FIG. 8 is a drawing for illustrating an operation in the game assumed in the present embodiment;
  • FIG. 9 shows an example of an image drawn in the game assumed in the present embodiment;
  • FIG. 10 shows an example of the screen of the game assumed in the present embodiment;
  • FIG. 11 shows an example of the screen of the game assumed in the present embodiment;
  • FIG. 12 shows an example of the screen of the game assumed in the present embodiment;
  • FIG. 13 is an illustrative diagram showing a memory map of a main memory 32 shown in FIG. 2;
  • FIG. 14 shows an example of a data configuration of a drawing tool master 327;
  • FIG. 15 shows an example of a data configuration of a spray table 332;
  • FIG. 16 is a drawing for illustrating a drawing area;
  • FIG. 17 is a flowchart showing graffiti game processing according to the present embodiment;
  • FIG. 18 is a flowchart showing the detail of the camera processing shown in step S1 in FIG. 17;
  • FIG. 19 is a flowchart showing the detail of graffiti processing shown in step S2 in FIG. 17;
  • FIG. 20 is a flowchart showing the detail of pen processing shown in step S29 in FIG. 19;
  • FIG. 21 is a flowchart showing the detail of spray drawing processing shown in step S43 in FIG. 20;
  • FIG. 22 is a flowchart showing the detail of eraser processing shown in step S31 in FIG. 19;
  • FIG. 23 is a flowchart showing the detail of spray eraser processing shown in step S73 in FIG. 22;
  • FIG. 24 is a drawing for illustrating the processing outline of blow-off processing;
  • FIG. 25 is a drawing for illustrating the processing outline of the blow-off processing;
  • FIG. 26 is a drawing for illustrating the processing outline of the blow-off processing;
  • FIG. 27 is a flowchart showing the detail of the blow-off processing; and
  • FIG. 28 shows an example of a spray line formed by a plurality of colors.
  • DESCRIPTION OF THE PREFERRED EMBODIMENTS
  • The following will describe embodiments of the present invention with reference to the drawings. The present invention is not limited to the embodiments.
  • FIG. 1 is an external view of a game apparatus 1 which executes a drawing processing program according to the present invention. Here, as an example of the game apparatus 1, a hand-held game apparatus is shown. The game apparatus 1 has a camera, and thus functions as a shooting apparatus for shooting an image with the camera, displaying the shot image on a screen, and saving data of the shot image.
  • As shown in FIG. 1, the game apparatus 1 is a foldable hand-held game apparatus and is shown in a state (opened state) where the game apparatus 1 is opened. The game apparatus 1 is configured so as to have a size which allows a user to hold the game apparatus 1 with both hands or one hand even in the state where the game apparatus 1 is opened.
  • The game apparatus 1 includes a lower housing 11 and an upper housing 21. The lower housing 11 and the upper housing 21 are connected to each other such that the game apparatus 1 can be opened or closed (folded). In the example of FIG. 1, the lower housing 11 and the upper housing 21 are each formed in a plate-like shape of a horizontally long rectangle, and connected to each other rotatably around long-side portions thereof. Usually, the user uses the game apparatus 1 in the opened state. When not using the game apparatus 1, the user keeps the game apparatus 1 in a closed state. In the example shown in FIG. 1, in addition to the closed state and the opened state, the game apparatus 1 is capable of maintaining an angle between the lower housing 11 and the upper housing 21 at any angle ranging between the closed state and the opened state by frictional force generated at a connection portion, and the like. In other words, the upper housing 21 can be stationary at any angle with respect to the lower housing 11.
  • In the lower housing 11, a lower LCD (Liquid Crystal Display) 12 is provided. The lower LCD 12 has a horizontally long shape, and is located such that a long-side direction thereof corresponds to a long-side direction of the lower housing 11. Note that although an LCD is used as a display device provided in the game apparatus 1 in the present embodiment, any other display devices such as a display device using an EL (Electro Luminescence), and the like may be used. In addition, the game apparatus 1 can use a display device of any resolution. Note that although details will be described later, the lower LCD 12 is used mainly for displaying, in real time, an image to be shot by an inner camera 23 or an outer camera 25.
  • In the lower housing 11, operation buttons 14A to 14K, and a touch panel 13 are provided as input devices. As shown in FIG. 1, among the operation buttons 14A to 14K, the direction input button 14A, the operation button 14B, the operation button 14C, the operation button 14D, the operation button 14E, the power button 14F, the start button 14G, and the select button 14H are provided on an inner main surface of the lower housing 11 which is located inside when the upper housing 21 and the lower housing 11 are folded. The direction input button 14A is used, for example, for a selection operation, and the like. The operation buttons 14B to 14E are used, for example, for a determination operation, a cancellation operation, and the like. The power button 14F is used for turning on/off the power of the game apparatus 1. In the example shown in FIG. 1, the direction input button 14A and the power button 14F are provided on the inner main surface of the lower housing 11 and to one of the left and the right (to the left in FIG. 1) of the lower LCD 12 provided in the vicinity of a center of the inner main surface of the lower housing 11. Further, the operation buttons 14B to 14E, the start button 14G, and the select button 14H are provided on the inner main surface of the lower housing 11 and to the other one of the left and the right (to the right in FIG. 1) of the lower LCD 12. The direction input button 14A, the operation buttons 14B to 14E, the start button 14G, and the select button 14H are used for performing various operations with respect to the game apparatus 1.
  • Note that the operation buttons 14I to 14K are omitted in FIG. 1. For example, the L button 14I is provided at a left end portion of an upper side surface of the lower housing 11, and the R button 14J is provided at a right end portion of the upper side surface of the lower housing 11. The L button 14I and the R button 14J are used, for example, for performing a shooting instruction operation (shutter operation) with respect to the game apparatus 1. In addition, the volume button 14K is provided on a left side surface of the lower housing 11. The volume button 14K is used for adjusting the volume of speakers of the game apparatus 1.
  • In addition, the game apparatus 1 further includes the touch panel 13 as an input device other than the operation buttons 14A to 14K. The touch panel 13 is mounted on the lower LCD 12 so as to cover a screen of the lower LCD 12. Note that in the present embodiment, for example, a resistive film type touch panel is used as the touch panel 13. However, the touch panel 13 is not limited to the resistive film type, and any press-type touch panel may be used. In addition, in the present embodiment, the touch panel 13 having the same resolution (detection accuracy) as that of the lower LCD 12 is used, for example. However, the resolutions of the touch panel 13 and the lower LCD 12 may not necessarily be the same as each other. In addition, in a right side surface of the lower housing 11, an insertion opening (indicated by a dotted line in FIG. 1) is provided. The insertion opening is capable of accommodating a touch pen 27 which is used for performing an operation with respect to the touch panel 13. Note that although an input with respect to the touch panel 13 is usually performed by using the touch pen 27, a finger of the user, instead of the touch pen 27, can be used for operating the touch panel 13.
  • In the right side surface of the lower housing 11, an insertion opening (indicated by a two-dot chain line in FIG. 1) is formed for accommodating a memory card 28. Inside the insertion opening, a connector (not shown) is provided for electrically connecting the game apparatus 1 to the memory card 28. The memory card 28 is, for example, an SD (Secure Digital) memory card, and detachably mounted to the connector. The memory card 28 is used, for example, for storing (saving) an image shot by the game apparatus 1, and for loading an image generated by other apparatuses into the game apparatus 1.
  • Further, in the upper side surface of the lower housing 11, an insertion opening (indicated by a chain line in FIG. 1) is formed for accommodating a cartridge 29. Also inside the insertion opening, a connector (not shown) is provided for electrically connecting the game apparatus 1 to the cartridge 29. The cartridge 29 is a storage medium storing the drawing processing program, a game program, and the like, and is detachably mounted in the insertion opening provided in the lower housing 11.
  • Three LEDs 15A to 15C are mounted to a left side part of the connection portion where the lower housing 11 and the upper housing 21 are connected to each other. Here, the game apparatus 1 is capable of performing wireless communication with another apparatus, and the first LED 15A is lit up while the power of the game apparatus 1 is ON. The second LED 15B is lit up while the game apparatus 1 is being charged. The third LED 15C is lit up while wireless communication is established. Thus, the three LEDs 15A to 15C can notify the user of a state of ON/OFF of the power of the game apparatus 1, a state of charge of the game apparatus 1, and a state of communication establishment of the game apparatus 1.
  • On the other hand, in the upper housing 21, an upper LCD 22 is provided. The upper LCD 22 has a horizontally long shape, and is located such that a long side direction thereof corresponds to a long side direction of the upper housing 21. Note that similarly to the lower LCD 12, a display device which is of any other type or has any other resolution may be used instead of the upper LCD 22. Note that a touch panel may be provided so as to cover the upper LCD 22. For example, on the upper LCD 22, an operation illustration screen is displayed for teaching roles of the operation buttons 14A to 14K and the touch panel 13 to the user.
  • In addition, in the upper housing 21, two cameras (the inner camera 23 and the outer camera 25) are provided. As shown in FIG. 1, the inner camera 23 is mounted in an inner main surface of the upper housing 21 and at the vicinity of the connection portion. On the other hand, the outer camera 25 is mounted in a surface opposite to the inner main surface in which the inner camera 23 is mounted, namely, in an outer main surface of the upper housing 21 (which is a surface located on the outside of the game apparatus 1 in the closed state, and which is a back surface of the upper housing 21 shown in FIG. 1). Note that in FIG. 1, the outer camera 25 is indicated by a dashed line. Thus, the inner camera 23 is capable of shooting an image in a direction in which the inner main surface of the upper housing 21 faces, and the outer camera 25 is capable of shooting an image in a direction opposite to the shooting direction of the inner camera 23, namely, in a direction in which the outer main surface of the upper housing 21 faces. As described above, in the present embodiment, the two cameras 23 and 25 are provided such that the shooting directions thereof are opposite to each other. For example, the user can shoot a view seen from the game apparatus 1 toward the user with the inner camera 23 as well as a view seen from the game apparatus 1 in a direction opposite to a direction toward the user with the outer camera 25.
  • Note that in the inner main surface of the upper housing 21 and at the vicinity of the connection portion, a microphone (a microphone 42 shown in FIG. 2) is accommodated as a sound input device. In the inner main surface of the upper housing 21 and at the vicinity of the connection portion, a microphone hole 16 is formed to allow the microphone 42 to detect a sound from outside the game apparatus 1. The position at which the microphone 42 is accommodated and the position of the microphone hole 16 are not necessarily in the connection portion, and, for example, the microphone 42 may be accommodated in the lower housing 11 and the microphone hole 16 may be formed in the lower housing 11 so as to correspond to the accommodating position of the microphone 42.
  • In addition, in the outer main surface of the upper housing 21, a fourth LED 26 (indicated by a dashed line in FIG. 1) is mounted. The fourth LED 26 is lit up at a time when shooting is performed by the outer camera 25 (when the shutter button is pressed). In addition, the fourth LED 26 is lit up while a moving picture is shot by the outer camera 25. The fourth LED 26 can notify a person to be shot and people around the person that shooting has been performed (is being performed) by the game apparatus 1.
  • In addition, sound holes 24 are formed in the inner main surface of the upper housing 21 and to the left and right of the upper LCD 22 provided in the vicinity of a center of the inner main surface of the upper housing 21. The speakers are accommodated in the upper housing 21 and at the back of the sound holes 24. The sound holes 24 are holes for releasing a sound from the speakers to the outside of the game apparatus 1 therethrough.
  • As described above, the inner camera 23 and the outer camera 25 which are configurations for shooting an image, and the upper LCD 22 which is display means for displaying, for example, an operation illustration screen upon shooting are provided in the upper housing 21. On the other hand, the input devices for performing an operation input with respect to the game apparatus 1 (the touch panel 13 and the buttons 14A to 14K), and the lower LCD 12 which is display means for displaying a game screen are provided in the lower housing 11. Thus, when using the game apparatus 1, the user can hold the lower housing 11 and perform an input with respect to the input device while viewing a shot image (an image shot by the camera) displayed on the lower LCD 12.
  • The following will describe an internal configuration of the game apparatus 1 with reference to FIG. 2. Note that FIG. 2 is a block diagram showing an example of the internal configuration of the game apparatus 1.
  • As shown in FIG. 2, the game apparatus 1 includes electronic components including a CPU 31, a main memory 32, a memory control circuit 33, a storage data memory 34, a preset data memory 35, a memory card interface (memory card I/F) 36, a cartridge I/F 44, a wireless communication module 37, a local communication module 38, a real time clock (RTC) 39, a power circuit 40, an interface circuit (I/F circuit) 41, and the like. These electronic components are mounted on an electronic circuit substrate and accommodated in the lower housing 11 (or may be accommodated in the upper housing 21).
  • The CPU 31 is information processing means for executing a predetermined program. In the present embodiment, the predetermined program is stored in a memory (e.g., the storage data memory 34) in the game apparatus 1, or in the memory cards 28 and/or 29, and the CPU 31 executes later-described graffiti processing by executing the predetermined program. Note that a program executed by the CPU 31 may be stored in advance in a memory in the game apparatus 1, may be obtained from the memory card 28 and/or the cartridge 29, or may be obtained from another apparatus by means of communication with the other apparatus. For example, the program may be downloaded and obtained from a predetermined server via the Internet, or a predetermined program stored in a stationary game apparatus may be downloaded and obtained by performing communication with the stationary game apparatus.
  • The main memory 32, the memory control circuit 33, and the preset data memory 35 are connected to the CPU 31. In addition, the storage data memory 34 is connected to the memory control circuit 33. The main memory 32 is storage means used as a work area and a buffer area of the CPU 31. In other words, the main memory 32 stores various data used in the graffiti processing, and stores a program obtained from the outside (the memory cards 28 and 29, another apparatus, or the like). In the present embodiment, for example, a PSRAM (Pseudo-SRAM) is used as the main memory 32. The storage data memory 34 is storage means for storing a program executed by the CPU 31, data of an image shot by the inner camera 23 and the outer camera 25, and the like. The storage data memory 34 is constructed of a nonvolatile storage medium, which is, in the present embodiment, a NAND flash memory, for example. The memory control circuit 33 is a circuit for controlling reading of data from the storage data memory 34 or writing of data to the storage data memory 34 in accordance with an instruction from the CPU 31. The preset data memory 35 is storage means for storing data (preset data) of various parameters which are set in advance in the game apparatus 1, and the like. A flash memory connected to the CPU 31 via an SPI (Serial Peripheral Interface) bus can be used as the preset data memory 35.
  • The memory card I/F 36 is connected to the CPU 31. The memory card I/F 36 reads data from the memory card 28 mounted to the connectors or writes data to the memory card 28, in accordance with an instruction from the CPU 31. In the present embodiment, data of images shot by the outer camera 25 is written to the memory card 28, and image data stored in the memory card 28 is read from the memory card 28 to be stored in the storage data memory 34.
  • The cartridge I/F 44 is connected to the CPU 31. The cartridge I/F 44 reads out data from the cartridge 29 mounted to the connector or writes data to the cartridge 29, in accordance with an instruction from the CPU 31. In the present embodiment, an application program which can be executed by the game apparatus 1 is read out from the cartridge 29 to be executed by the CPU 31, and data associated with the application program (e.g., saved data in a game) is written to the cartridge 29.
  • Note that the graffiti game program according to the present invention may be supplied to a computer system not only from an external storage medium such as the cartridge 29, but also via a wired or wireless communication line. In addition, the graffiti game program may be stored in advance in a nonvolatile storage unit in the computer system. Note that an information storage medium for storing the graffiti game program is not limited to the above nonvolatile storage unit, and may be a CD-ROM, a DVD, or a similar optical disc-shaped storage medium.
  • The wireless communication module 37 has a function of connecting to a wireless LAN, for example, by a method conforming to the IEEE 802.11b/g standard. The local communication module 38 has a function of wirelessly communicating with a game apparatus of the same type by a predetermined communication method. The wireless communication module 37 and the local communication module 38 are connected to the CPU 31. The CPU 31 is capable of receiving data from and transmitting data to another apparatus via the Internet by using the wireless communication module 37, and capable of receiving data from and transmitting data to another game apparatus of the same type by using the local communication module 38.
  • In addition, the RTC 39 and the power circuit 40 are connected to the CPU 31. The RTC 39 counts time, and outputs the time to the CPU 31. For example, the CPU 31 is capable of calculating current time (date) and the like, based on the time counted by the RTC 39. The power circuit 40 controls electric power from a power supply (typically, a battery accommodated in the lower housing 11) of the game apparatus 1 to supply the electric power to each electronic component of the game apparatus 1.
  • In addition, the game apparatus 1 includes the microphone 42 and an amplifier 43. The microphone 42 and the amplifier 43 are connected to the I/F circuit 41. The microphone 42 detects a voice produced by the user toward the game apparatus 1, and outputs a sound signal indicating the voice to the I/F circuit 41. The amplifier 43 amplifies the sound signal from the I/F circuit 41, and causes the speakers (not shown) to output the sound signal. The I/F circuit 41 is connected to the CPU 31.
  • In addition, the touch panel 13 is connected to the I/F circuit 41. The I/F circuit 41 includes a sound control circuit for controlling the microphone 42 and the amplifier 43 (the speakers), and a touch panel control circuit for controlling the touch panel 13. The sound control circuit performs A/D conversion or D/A conversion with respect to the sound signal, and converts the sound signal into sound data in a predetermined format. The touch panel control circuit generates touched position data in a predetermined format, based on a signal from the touch panel 13, and outputs the touched position data to the CPU 31. For example, the touched position data is data indicating coordinates of a position at which an input is performed with respect to an input surface of the touch panel 13. Note that the touch panel control circuit reads a signal from the touch panel 13 and generates touched position data once every predetermined time period. The CPU 31 is capable of recognizing a position at which an input is performed with respect to the touch panel 13 by obtaining the touched position data via the I/F circuit 41.
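  • The periodic read described above can be sketched as a simple polling loop. This is a hypothetical illustration only: the function names `read_raw_signal` and `to_coordinates` and the 60 Hz default period are assumptions, not part of the embodiment.

```python
import time

def poll_touch_panel(read_raw_signal, to_coordinates, period_s=1 / 60):
    """Sketch of the touch panel control circuit: once every predetermined
    time period, read the raw panel signal and yield touched position data
    (coordinates on the input surface), or None while nothing is touched."""
    while True:
        raw = read_raw_signal()             # signal from the touch panel
        yield to_coordinates(raw) if raw is not None else None
        time.sleep(period_s)                # wait until the next sample
```

A consumer such as the CPU side would iterate over this generator to obtain one touched-position sample per period.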
  • An operation button 14 includes the above operation buttons 14A to 14K, and is connected to the CPU 31. The operation button 14 outputs, to the CPU 31, operation data indicating an input state with respect to each of the buttons 14A to 14K (whether or not each button is pressed). The CPU 31 obtains the operation data from the operation button 14, and executes processing in accordance with an input with respect to the operation button 14.
  • The inner camera 23 and the outer camera 25 are connected to the CPU 31. Each of the inner camera 23 and the outer camera 25 shoots an image in accordance with an instruction from the CPU 31, and outputs data of the shot image to the CPU 31. In the present embodiment, the CPU 31 gives a shooting instruction to the inner camera 23 or the outer camera 25, and the camera which has received the shooting instruction shoots an image and transmits image data to the CPU 31.
  • In addition, the lower LCD 12 and the upper LCD 22 are connected to the CPU 31. Each of the lower LCD 12 and the upper LCD 22 displays an image thereon in accordance with an instruction from the CPU 31.
  • Next, referring to FIG. 3 to FIG. 12, the outline of an application assumed in the present embodiment will be described. A game assumed in the present embodiment is a painting application which allows the player to draw a picture by using the touch pen 27. FIG. 3 is an example of a screen of the game assumed in the present embodiment. As shown in FIG. 3, a game screen is displayed on the lower LCD 12, a toolbar 103 is displayed at the top of the game screen, and a canvas 101, which covers most of the game screen, is displayed under the toolbar 103. On the toolbar, a drawing tool icon 111, a line-type icon 112, and the like are displayed. In addition, FIG. 3 shows a state in which the canvas 101 is being touched by the touch pen 27, and a cursor 102 is displayed at the touched position on the canvas 101. In the application, the player can draw a picture on the canvas 101 by moving the touch pen 27 on the canvas 101, and the present invention provides a novel way of performing the drawing operation, as described later.
  • In addition, in the application, the player can enjoy drawing graffiti on an image shot by the outer camera 25 (or by the inner camera 23) of the game apparatus 1. For example, when an image shown in FIG. 4 is shot by the outer camera 25, the shot image is placed as a “base picture” in the area of the canvas 101 so as to overlap with the canvas 101, whereby the player can enjoy drawing graffiti on the shot image as shown in FIG. 5.
  • FIG. 6 is a schematic view showing a positional relationship between the shot image and the canvas 101. The concept of the application is that two layers of a base-picture layer 105 and a handwriting layer 106 are used, and the shot image corresponds to the base-picture layer 105. The canvas 101 corresponds to the handwriting layer 106. The handwriting layer 106 is, as it were, a transparent sheet, and conceptually, processing in which the transparent sheet is overlapped on the shot image and graffiti is drawn on the sheet is performed. In other words, processing in which the shot image is directly edited (graffiti is directly drawn on the shot image) is not performed in the present embodiment.
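  • The two-layer model described above might be sketched as follows. This is a hypothetical illustration: the pixel-grid representation and the `BLANK` marker are assumptions introduced for the sketch, not taken from the embodiment.

```python
# BLANK marks a transparent pixel of the handwriting layer 106, through
# which the base-picture layer 105 (the shot image) shows.
BLANK = None

def composite(base_layer, handwriting_layer):
    """Overlay the handwriting layer on the base-picture layer without
    modifying the base picture; each layer is a 2-D list of pixels."""
    return [
        [hw if hw is not BLANK else base
         for base, hw in zip(base_row, hw_row)]
        for base_row, hw_row in zip(base_layer, handwriting_layer)
    ]

base = [["sky", "sky"], ["grass", "grass"]]      # the shot image
graffiti = [[BLANK, "ink"], [BLANK, BLANK]]      # the transparent sheet
print(composite(base, graffiti))  # → [['sky', 'ink'], ['grass', 'grass']]
```

Note that erasing merely restores a handwriting pixel to `BLANK`; the shot image itself is never edited, matching the description above.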
  • Next, a drawing operation with respect to the canvas 101 in the application will be described. As described above, in the application, a picture can be drawn by moving the touch pen 27 on the canvas 101. Here, in the application, two types, i.e., a “pen” and an “eraser”, can be used as drawing tools for drawing in the game. The “pen” is a tool for drawing something on the canvas, and the “eraser” is a tool for erasing a content drawn on the canvas 101. Upon using the “pen” or the “eraser”, a line of uniform thickness or a “spray” can be selected as a type (line type) of a drawn line. In the application, upon selection of the drawing tools, the “pen” or the “eraser” can be selected by operating the drawing tool icon 111 on the toolbar 103. The line type of the selected tool can be selected by operating the line-type icon 112. Specifically, use of the line of uniform thickness or use of the “spray” can be selected. At this time, the thickness of the line of uniform thickness can also be designated, and of the four icons of the line-type icon 112, the left three icons represent the respective thicknesses. In addition, of the four icons, the rightmost icon, on which a picture of a propeller is displayed, represents the “spray”. In addition, when the “pen” is selected as a drawing tool, the drawing color (color of the line or the spray) can also be designated.
  • Next, an operation performed when the “pen” is used as a drawing tool will be described. When the “pen” is selected and the line of uniform thickness is selected as the line-type, the line of uniform thickness can be drawn at a touched position as shown in FIG. 3 and FIG. 5. Here, in the case of the pen, a line is drawn at the same time as touch is performed. Specifically, after the player operates the toolbar to select the “pen” and select the line of uniform thickness as the line-type, when the player brings the touch pen 27 into contact with the canvas 101 (touch panel 13), a touch input is detected, and at the same time, a dot (when the touch pen 27 is not being moved) or a line segment (when the touch pen 27 is being moved on the canvas 101) is drawn at the touched position.
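  • The per-frame pen behavior described above (a dot while the touch pen 27 is stationary, a line segment while it is being moved) can be sketched as follows. The helper names and the dictionary-based canvas are assumptions for illustration, not the embodiment's implementation.

```python
def interpolate(p0, p1):
    """Integer points on the segment from p0 to p1, inclusive."""
    (x0, y0), (x1, y1) = p0, p1
    steps = max(abs(x1 - x0), abs(y1 - y0), 1)
    return [(round(x0 + (x1 - x0) * t / steps),
             round(y0 + (y1 - y0) * t / steps))
            for t in range(steps + 1)]

def pen_update(canvas, prev_pos, cur_pos, color="black"):
    """One frame of "pen" drawing with the line of uniform thickness:
    a dot when the touch has just begun or the pen is not being moved,
    a line segment from the previous touched position when it is."""
    start = cur_pos if prev_pos is None else prev_pos
    for point in interpolate(start, cur_pos):
        canvas[point] = color

canvas = {}
pen_update(canvas, None, (2, 2))      # touch begins: a single dot
pen_update(canvas, (2, 2), (5, 2))    # pen moved: a segment is drawn
```

Interpolating between consecutive touched positions keeps the drawn line continuous even though the touch panel is only sampled once per predetermined time period.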
  • On the other hand, when the “spray” is selected as the line type, a “spray line” as described later can be drawn at a touched position; unlike in the case of using the line of uniform thickness, however, the drawing is not performed merely by touching the touch panel. Hereinafter, referring to FIG. 7 to FIG. 10, a drawing operation performed when the “spray” is selected as the line type will be described. First, the player operates the toolbar 103 to select the “pen” as a drawing tool. Specifically, every time the player touches the drawing tool icon 111, the drawing tool switches to “pen”→“eraser”→“pen” . . . (at this time, an image content of the drawing tool icon 111 also switches between an image of a pen tip and an image of an eraser). Then, after selecting the “pen”, the player touches the rightmost icon of a propeller image among the four icons of the line-type icon 112, and thereby can select the “spray” as the line type. Thereafter, when the player touches a desired position on the canvas 101, the cursor 102 whose image represents a propeller is displayed at the touched position as shown in FIG. 7. However, as of this moment, nothing is drawn on the canvas 101 (in the case of using the line of uniform thickness, at least a “dot” would be drawn at the touched position as of this moment). Therefore, in this state, even if the player moves the touch pen 27 while touching the canvas 101 with the touch pen 27, nothing is drawn on the canvas 101.
  • In this state, in order to perform drawing on the canvas 101, the player performs an operation of blowing breath on the cursor 102, as shown in FIG. 8. Then, an animation of a rotating propeller is displayed as the cursor 102, and an image which looks like sprayed ink (an image made of a collection of multiple dots) is displayed at the touched position, as shown in FIG. 9. Then, by moving the touch pen 27 while blowing breath on the cursor 102, a line (hereinafter, referred to as a spray line) made of a collection of multiple dots, that is, a line which looks as if drawn with a spray, is drawn along the trajectory of the touch pen 27, as shown in FIG. 10. Thus, in the case of the “spray”, by blowing breath on the cursor 102 which is of propeller type, drawing can be performed as if ink were sprayed on the canvas 101.
  • Moreover, in the application, the thickness and the density of the spray line change in accordance with the strength at which the player blows breath. For example, when the player blows breath weakly, a spray line which is thin and sparse (has a reduced number of dots) as shown in FIG. 11 can be drawn. When the player blows breath strongly, a spray line which is thick and dense (has an increased number of dots) as shown in FIG. 12 can be drawn. In addition, how strongly the player blows is reflected in the spray line in real time. For example, when the player desires to draw one spray line, if the player blows strongly at the beginning of the drawing and thereafter gradually decreases the strength of the blowing, the spray line is drawn such that an increased number of dots are drawn when the drawing begins, as shown in FIG. 12, and the number of dots is gradually decreased as the drawing progresses. In addition, in the application, when such a spray line is drawn, a spraying sound is reproduced as a sound effect.
  • Here, the outline (principle) of drawing processing of the spray line performed in the present embodiment will be described. As shown in FIG. 8, when the player blows on the cursor 102 (touch panel 13) while touching the canvas 101 with the touch pen 27, a sound produced by the player blowing breath is inputted to the microphone 42. In the application, the volume of a sound (hereinafter, referred to as microphone input sound) inputted to the microphone 42 is detected, the thickness and the like of the spray line are determined in accordance with the magnitude of the detected volume, and the spray line is drawn on the canvas 101. That is, in the present embodiment, in the case of using the “spray”, when the two types of inputs, i.e., a “touch input” to the touch panel 13 and a “sound input” to the microphone 42 are performed at the same time, the drawing processing of the spray line is executed. In addition, the volume of the sound effect reproduced when the spray line is drawn is varied in accordance with the magnitude of the detected microphone input sound.
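The gating condition described above, that the spray line is drawn only when a touch input and a sound input occur simultaneously, can be sketched as follows. This is an illustrative model only; the function name and the threshold value are assumptions, not taken from the patent.

```python
VOLUME_THRESHOLD = 10  # assumed threshold on the 0-100 volume scale

def should_draw_spray(touch_position, mic_volume):
    """Return True only when both inputs are present at the same time:
    a touch input on the canvas and a sufficiently loud microphone input."""
    if touch_position is None:  # no touch input to the touch panel
        return False
    return mic_volume >= VOLUME_THRESHOLD  # sound input to the microphone
```

Either input alone is not enough: touching without blowing, or blowing without touching, draws nothing.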
  • Next, of the drawing tools described above, the “eraser” will be described. The case where the player operates the toolbar 103 to select the eraser as the drawing tool, and to select the line of uniform thickness as the “line type”, will be described. In this case, when the player touches a position on the canvas 101, the cursor 102 which is of eraser type is displayed. An operation performed in this case conforms with that performed in the case where the “pen” is selected and the line of uniform thickness is selected, and a line (uniform line and spray line) drawn at the touched position can be erased. Next, the case where the “spray” is selected as the “line-type” will be described. In this case, when the player touches a position on the canvas 101, the cursor 102 which is of propeller type is displayed as in the case where the “pen” is selected and the “spray” is selected as the line type. Then, when the player blows on the cursor 102, the spray line or the line of uniform thickness drawn within a predetermined range can be erased in accordance with the strength (that is, the magnitude of the microphone input sound) at which the player blows breath, and the touched position.
  • Thus, in the present embodiment, processing is performed such that drawing on the canvas 101 can be performed only after a touch input and an operation (sound input to the microphone 42) of blowing breath are performed, as in the case of the “spray”. Thus, it becomes possible to provide a drawing application having a nonconventional and novel way of operation.
  • Next, the detail of application processing performed by the game apparatus 1 will be described. Firstly, data which is stored in the main memory 32 when the application processing is performed will be described. FIG. 13 is an illustrative diagram showing a memory map of the main memory 32 shown in FIG. 2. Referring to FIG. 13, the main memory 32 includes a program storage area 321 and a data storage area 325. Data in the program storage area 321 and a part of the data in the data storage area 325 are obtained by copying, onto the main memory 32, data stored in advance in a ROM of the cartridge 29. Note that the programs and the data may be stored in, for example, the save data memory 37 instead of the cartridge 29, and may be copied from the save data memory 37 onto the main memory 32 when the programs are executed. In the present embodiment, the latest input coordinate and the input coordinate just prior to the latest input coordinate can be saved as the touched position data 3262. The game apparatus 1 repeatedly detects an input to the touch panel 13 at intervals of a unit of time. When an input is detected, data which has been saved as the latest input coordinate is saved as the input coordinate just prior to the latest input coordinate. When the player is touching the touch panel 13, data indicating the coordinate of the touched position is saved as the latest input coordinate in the touched position data 3262. When the player is not touching the touch panel 13, data indicating NULL is saved as the latest input coordinate in the touched position data 3262.
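The sampling behaviour of the touched position data 3262 can be sketched as a small container that shifts the latest coordinate into the "just prior" slot on every sample. This is an illustrative sketch; the class and method names are invented, and NULL is modeled as Python's None.

```python
class TouchedPositionData:
    """Models the touched position data 3262: holds the latest input
    coordinate and the input coordinate just prior to it."""

    def __init__(self):
        self.latest = None  # NULL until a touch is sampled
        self.prior = None

    def sample(self, coordinate):
        """Called once per unit of time. coordinate is an (x, y) tuple
        while the player is touching the panel, or None otherwise."""
        self.prior = self.latest   # the old latest becomes "just prior"
        self.latest = coordinate   # save the new sample (None = NULL)
```

This two-slot history is what later allows the touch-off determination in step S33 to distinguish "just released" from "still not touching".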
  • The program storage area 321 stores a program which is to be executed by the CPU 31 and which includes a main processing program 322, a camera processing program 323, a graffiti processing program 324, and the like.
  • The main processing program 322 is a program corresponding to processing shown by a flowchart in FIG. 17 described later. The camera processing program 323 is a program for causing the CPU 31 to execute processing for obtaining a shot image by using the outer camera 25, and the graffiti processing program 324 is a program for causing the CPU 31 to execute the processing, described referring to FIG. 5 and the like, for drawing graffiti on the shot image.
  • The data storage area 325 stores operation data 326, a drawing tool master 327, drawing color data 328, shot image data 329, current tool data 330, sound effect data 331, a spray table 332, sound characteristic data 333, and the like.
  • The operation data 326 is data indicating a content of an operation performed by the player with respect to the game apparatus 1, and includes the operation button data 3261 and the touched position data 3262. The operation button data 3261 is data indicating an input state of each of the operation buttons 14A to 14K. The touched position data 3262 is data indicating coordinate (input coordinate) of a touched position inputted to the touch panel 13. In the present embodiment, while the player is touching the touch panel 13, the input coordinate is repeatedly obtained and saved as the touched position data 3262. Note that in the present embodiment, it is possible to save the latest input coordinate and input coordinate just prior to the latest input coordinate as the touched position data 3262.
  • The drawing tool master 327 is a table associated with the drawing tools described above. FIG. 14 shows an example of a data configuration of the drawing tool master 327. The drawing tool master 327 shown in FIG. 14 includes the type 3271, the line type 3272, and the cursor image data 3273. The type 3271 is data indicating drawing tools, which are, in the present embodiment, the “pen” and the “eraser”. The line type 3272 is data indicating the line types, which are, in the present embodiment, a line (hereinafter, referred to as uniform line) of uniform thickness, and the “spray”. The cursor image data 3273 is image data to be displayed as the cursor 102. In the present embodiment, when the type 3271 is the “pen” and the line type 3272 is the “uniform line”, image data indicating an image of a pen tip is stored as the cursor image data. When the type 3271 is the “pen” and the line type 3272 is the “spray”, image data of a propeller as described above is stored as the cursor image data. When the type 3271 is the “eraser” and the line type 3272 is the “uniform line”, image data of an eraser is stored as the cursor image data, and when the line type 3272 is the “spray”, an image of a propeller is stored as the cursor image data.
  • The drawing color data 328 is data indicating the color of a line or the like drawn on the canvas 101 when the type of the drawing tool is the “pen”. The shot image data 329 is data indicating an image shot by the outer camera 25. The current tool data 330 is data indicating the type of the drawing tool (pen or eraser) currently selected and the line type (uniform line or spray). The sound effect data 331 is data of a sound effect to be reproduced upon drawing.
  • The spray table 332 is a table which defines the size of an area in which drawing is performed and the number of dots to be drawn, so as to associate the size and the number with the volume of the above-described microphone input sound. FIG. 15 shows an example of a data configuration of the spray table 332. The spray table shown in FIG. 15 includes a volume 3321, an area size 3322, and a dot number 3323. The volume 3321 indicates a range of magnitudes of the volume of the microphone input sound. Note that in the present embodiment, the magnitude of the volume is represented as a value from 0 to 100. The area size 3322 indicates a drawing area in which the spray line is drawn by performing the drawing processing once. In the present embodiment, the shape of the drawing area is circular, and a value indicating the radius of the drawing area is defined as a value of the area size 3322. The dot number 3323 defines the number of dots to be drawn in the drawing area. As shown by an example in FIG. 15, for example, when the volume 3321 indicates “11 to 30”, dots whose number is indicated by the dot number 3323 are drawn in a circular area 201 having a size shown in FIG. 16 (a). When the volume 3321 indicates “31 to 50”, an increased number of dots are drawn in the circular area 201 having an increased size, as shown in FIG. 16 (b) (note that dotted lines indicating the circular area 201 in FIG. 16 are just drawn as a matter of convenience, and are not displayed on the screen).
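The spray table 332 lookup can be sketched as a range search over volume bands. The volume bands below follow the example ranges mentioned for FIG. 15 (“11 to 30”, “31 to 50”), but the radii and dot numbers are made-up illustrative values, since the patent does not list the concrete figures.

```python
# (volume low, volume high, drawing-area radius, number of dots) -
# radii and dot counts are invented for illustration.
SPRAY_TABLE = [
    (11,  30,  8, 10),
    (31,  50, 12, 20),
    (51, 100, 16, 35),
]

def lookup_spray(volume):
    """Return (radius, dot_number) for a microphone input volume (0-100),
    or None when the volume falls below every defined range."""
    for low, high, radius, dots in SPRAY_TABLE:
        if low <= volume <= high:
            return radius, dots
    return None
```

A louder microphone input thus maps to both a larger circular drawing area and a denser collection of dots, matching FIG. 16 (a) and (b).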
  • Referring to FIG. 13 again, the sound characteristic data 333 is data indicating a characteristic of a sound inputted to the microphone 42, and specifically, is data indicating the volume, frequency, tone, and the like. Note that in the present embodiment, data indicating the volume of the microphone input sound is stored as the sound characteristic data 333.
  • In processing described later, the data storage area 325 stores, in addition to the above-described data, various flags such as a reproduction flag used for indicating whether or not reproduction of a sound effect is being performed, various image data, and the like.
  • Next, a flow of application processing (hereinafter, referred to as graffiti game processing) executed by the game apparatus 1 will be described referring to FIG. 17 to FIG. 21. FIG. 17 is a flowchart showing the flow of the graffiti game processing executed by the game apparatus 1. When the game apparatus 1 is powered on, the CPU 31 of the game apparatus 1 executes a starting program stored in a boot ROM which is not shown, thereby initializing each unit such as the main memory 32. Then, a game program stored in the cartridge 29 is read into the main memory 32, thereby starting execution of the application program. As a result, a game image is displayed on the lower LCD 12, and thereby the application is started.
  • Referring to FIG. 17, first, the CPU 31 displays an inquiry screen for inquiring whether or not to execute camera processing (step S1). That is, in the application processing, it is also possible to execute the graffiti processing described later without performing shooting by using the camera. In other words, it is also possible to perform drawing processing with respect to the canvas 101 which is blank, as shown in FIG. 3, without using the base picture shown in FIG. 5 and the like.
  • Next, the CPU 31 obtains the operation data 326 from the main memory 32 (step S2). Next, it is determined whether or not a content indicated by the operation data is an instruction of executing camera processing (step S3). As a result, if the content is not an instruction of executing camera processing (NO in step S3), the CPU 31 proceeds to processing in step S5 described later. On the other hand, if the content is an instruction of executing camera processing (YES in step S3), the CPU 31 executes camera processing (step S4). In the camera processing, processing for shooting an image which is to be used as the base picture by using the outer camera 25 and saving the shot image is executed. Next, the CPU 31 executes the graffiti processing (step S5). In the graffiti processing, processing for displaying the screen as shown in FIG. 5 and enabling graffiti to be drawn on the image shot by the camera is executed.
  • FIG. 18 is a flowchart showing the detail of the camera processing shown in step S4. As shown in FIG. 18, first, the CPU 31 performs initialization processing (step S11). In the initialization processing, various parameters (shooting magnification, exposure time, and the like) for shooting which are defined as initial values in advance are set.
  • Next, the CPU 31 displays, on the lower LCD 12, a video being shot by the outer camera 25 (step S12).
  • Next, the CPU 31 obtains the operation data 326 from the main memory 32 (step S13). Thereafter, the CPU 31 determines whether or not a content of an operation performed by the player which is indicated by the operation data 326 indicates that the shutter button is pressed down (step S14). As a result of the determination, if the shutter is pressed down (YES in step S14), the CPU 31 performs processing of storing an image shot by the outer camera 25. That is, the image shot by the outer camera 25 is stored as the shot image data 329 in the main memory 32 (step S15). Thereafter, the CPU 31 returns to the processing in step S12 to repeat the processing.
  • On the other hand, as a result of the determination in step S14, if the shutter is not pressed (NO in step S14), next, the CPU 31 determines whether or not a content of an operation indicated by the operation data 326 is an operation of an instruction of ending the camera processing (step S16). As a result, if the content is an instruction of ending the camera processing (YES in step S16), the CPU 31 ends the camera processing. On the other hand, if the content is not an instruction of ending the camera processing (NO in step S16), the CPU 31 executes other processing based on the operation data 326 (step S17). For example, the CPU 31 executes setting of control of zoom magnification, exposure control, or the like. Thereafter, the CPU 31 returns to step S12 to repeat processing therefrom. Description of the camera processing is finished here.
  • Next, the graffiti processing shown in step S5 will be described. FIG. 19 is a flowchart showing the detail of the graffiti processing shown in step S5. As shown in FIG. 19, first, the CPU 31 executes initial processing with respect to the graffiti processing (step S21). Specifically, the CPU 31 generates and displays a game screen as shown in FIG. 3 and the like. In addition, the CPU 31 reads out the shot image data 329 from the main memory 32, and displays, as a “base picture”, a shot image which has been shot through the camera processing, on the canvas 101. At this time, if the camera processing has not been performed, nothing is stored in the shot image data 329, and therefore, in this case, the CPU 31 displays nothing on the canvas 101. That is, the canvas 101 which is blank is displayed. In addition, the CPU 31 sets, as an initial value of the current tool data 330, data indicating that the drawing tool is the “pen” and the line type is the “uniform line”. That is, at the start of the graffiti processing, the “pen” whose line-type is the “uniform line” is selected as the drawing tool.
  • When the initial processing is finished, next, the CPU 31 obtains the operation data 326 from the main memory 32 (step S22). Thereafter, the CPU 31 determines whether or not a content of an operation indicated by the operation data 326 is an instruction of ending the graffiti processing (step S23). As a result of the determination, if the content is an instruction of ending the graffiti processing (YES in step S23), the CPU 31 ends the graffiti processing.
  • On the other hand, if the content is not an instruction of ending the graffiti processing (NO in step S23), next, the CPU 31 determines whether or not the content of an operation is an operation of selecting the type of the drawing tool (step S24). As a result, if the content is an operation of selecting the drawing tool (YES in step S24), processing of selecting the drawing tool is executed based on the content of the operation data 326 (step S25). Here, an example of the selection operation will be described. First, the player touches the drawing tool icon 111 on the screen as shown in FIG. 3 and the like. Every time this operation of touching the drawing tool icon 111 is detected, the CPU 31 alternately sets the “pen” and the “eraser” as the drawing tool in the current tool data 330. That is, every time the player touches the drawing tool icon 111, the drawing tool switches between the “pen” and the “eraser”. Moreover, the player touches one of the four icons of the line-type icon 112. When the CPU 31 detects the touched icon (more accurately, the coordinate where the icon is displayed), the line type corresponding to the content of the icon is set in the current tool data 330. When the rightmost icon is touched among the four icons, the “spray” is set as the line type in the current tool data 330. On the other hand, when the other icons are touched, the “uniform line” is set as the line type, and data indicating the thickness of the line is also set in the current tool data 330 in accordance with the touched icon. Referring to an example in FIG. 3, one of three types of thicknesses of lines can be selected. The thickness of the uniform line indicated by the rightmost icon is the thinnest, and the third icon from the left indicates the thickest uniform line.
Thus, the CPU 31 executes processing of setting data indicating the drawing tool in the current tool data 330 based on operation data, and thereafter, the CPU 31 returns to step S22 to repeat processing therefrom.
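The selection behaviour just described can be sketched as two small functions. The icon indexing (0 to 3, left to right) and the function names are illustrative assumptions, not taken from the patent.

```python
def toggle_tool(current_tool):
    """Each touch of the drawing tool icon 111 alternates the tool
    between "pen" and "eraser" in the current tool data."""
    return "eraser" if current_tool == "pen" else "pen"

def select_line_type(icon_index):
    """Of the four line-type icons, the rightmost (index 3 here, by
    assumption) selects the spray; the others select a uniform line."""
    return "spray" if icon_index == 3 else "uniform line"
```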
  • On the other hand, as a result of the determination in step S24, if the content of an operation is not an operation of selecting the drawing tool (NO in step S24), next, the CPU 31 refers to the operation data 326 and thereby determines whether or not a touch input (more accurately, a touch input to an area, of the touch panel 13, corresponding to an area in which the canvas 101 is displayed) to the canvas 101 is being performed (step S26). Specifically, the CPU 31 refers to the latest input coordinate stored in the touched position data 3262, and determines whether or not the latest input coordinate is in the area in which the canvas 101 is displayed. As a result, if a touch input to the canvas 101 is being performed (YES in step S26), the CPU 31 executes processing of displaying the cursor 102 at the position where the touch input is being performed (step S27). More specifically, first, the CPU 31 refers to the drawing tool master 327 and obtains a piece of the cursor image data 3273 which corresponds to the drawing tool currently selected. Then, the CPU 31 refers to the touched position data 3262 and displays, as the cursor 102, an image based on the piece of the cursor image data 3273 at the position where the touch input is being performed.
  • Subsequently to the display of the cursor 102, the CPU 31 determines whether or not the drawing tool currently selected, that is, the drawing tool indicated by the current tool data 330 is the “pen” (step S28). As a result, if the drawing tool is the “pen”, the CPU 31 executes pen processing (step S29). FIG. 20 is a flowchart showing the detail of the pen processing. As shown in FIG. 20, first, the CPU 31 determines whether or not the line type indicated by the current tool data 330 is the “spray” (step S41). As a result, if the line type is not the “spray” (NO in step S41), the CPU executes pen drawing processing (step S42). That is, based on the touched position, processing of drawing a uniform line of a thickness currently selected is executed. Thereafter, the CPU 31 ends the pen processing.
  • On the other hand, as a result of the determination in step S41, if the line type is the “spray” (YES in step S41), the CPU 31 executes spray drawing processing for drawing a spray line as described above referring to FIG. 10 and the like (step S43), and thereafter, ends the pen processing.
  • FIG. 21 is a flowchart showing the detail of the spray drawing processing shown in step S43. As shown in FIG. 21, first, the CPU 31 detects the volume of a sound (microphone input sound) inputted to the microphone 42, and stores the volume as the sound characteristic data 333 (step S51).
  • Next, the CPU 31 determines whether or not the volume indicated by the sound characteristic data 333 is equal to or larger than a predetermined threshold value which is set in advance (step S52). As a result of the determination, if the volume of the microphone input sound is equal to or larger than the predetermined threshold value (YES in step S52), the CPU 31 refers to the spray table 332, and determines the size of an area in which a spray line is drawn, and the number of dots to be drawn, based on the magnitude of the volume (step S53).
  • Next, in accordance with the magnitude of the volume indicated by the sound characteristic data 333, the CPU 31 determines a volume at which a sound effect reproduced upon drawing a spray line is reproduced (step S54).
  • Next, in accordance with the magnitude of the volume indicated by the sound characteristic data 333, the CPU 31 sets a speed at which an animation display of the cursor 102 is reproduced (step S55). As described above, in the present embodiment, when a spray line is drawn, an animation in which a propeller rotates is displayed as the cursor 102 which is of propeller type. The CPU 31 executes processing in which the speed at which the propeller rotates is set in accordance with the magnitude of the volume of the microphone input sound. For example, the speed at which the animation is reproduced is set such that if the volume of the microphone input sound is larger, the propeller rotates faster. For example, in the case where the animation in which the propeller rotates includes three images, setting may be performed such that when the volume of the microphone input sound is large, the image may be changed for every one frame, and that when the volume of the microphone input sound is not large, the image may be redrawn for every ten frames.
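The frame-interval example just given can be sketched as follows. The boundary volume between "large" and "not large" is an assumption; the patent gives only the one-frame and ten-frame intervals for a three-image animation.

```python
LOUD_VOLUME = 50  # assumed boundary between a large and a small volume

def propeller_image_index(frame_count, mic_volume):
    """Pick which of the three propeller animation images to show.
    A loud input advances the image every frame; a quieter input only
    every ten frames, so the propeller appears to rotate more slowly."""
    interval = 1 if mic_volume >= LOUD_VOLUME else 10
    return (frame_count // interval) % 3
```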
  • Next, the CPU 31 places the above-described drawing area such that the center of the drawing area coincides with the touched position, and draws dots to form a spray line on the canvas 101 (in the drawing area) in accordance with a content of the determination in step S53 (step S56). Here, the dots to form a spray line may be randomly placed in the drawing area, or may be drawn around the touched position such that the density of the dots is greatest at the touched position and gradually decreases as the dots become more distant from the touched position.
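Of the two placement options above, the random one can be sketched like this. The function name and the injectable random source are illustrative; the square root keeps the dots uniformly distributed over the circular drawing area rather than clustered at its center.

```python
import math
import random

def place_spray_dots(center, radius, dot_number, rng=random):
    """Scatter dot_number dots uniformly at random inside the circular
    drawing area whose center coincides with the touched position."""
    dots = []
    for _ in range(dot_number):
        angle = rng.uniform(0.0, 2.0 * math.pi)
        # sqrt makes the distribution uniform over the disc's area
        distance = radius * math.sqrt(rng.uniform(0.0, 1.0))
        dots.append((center[0] + distance * math.cos(angle),
                     center[1] + distance * math.sin(angle)))
    return dots
```

The second option (density greatest at the touched position) could be obtained by dropping the square root, which biases the sampled distances toward the center.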
  • Next, the CPU 31 displays an animation (animation in which a propeller rotates) of the cursor 102 in accordance with the speed, set in step S55, at which the animation is reproduced (step S57).
  • Next, the CPU 31 determines whether or not the reproduction flag is set at OFF (step S58). The reproduction flag is a flag indicating whether or not a sound effect is being reproduced, and when a sound effect is not being reproduced, the reproduction flag is set at OFF. As a result of the determination, if the reproduction flag is OFF (YES in step S58), the CPU 31 sets the reproduction flag to ON (step S59). Then, the CPU 31 refers to the sound effect data 331, and starts reproducing a sound effect (spraying sound of a spray) for drawing of a spray line at a volume set in step S54 (step S60). Thereafter, the CPU 31 ends the spray processing.
  • On the other hand, as a result of the determination in step S58, if the CPU 31 determines that the reproduction flag is not OFF (NO in step S58), since it is considered that a sound effect is being reproduced, the CPU 31 ends the spray processing without executing the processing in steps S59 and S60.
  • Next, processing (NO in step S52) performed when, as a result of the determination in step S52, the volume indicated by the sound characteristic data 333 is smaller than the predetermined threshold value, will be described. In this case, next, the CPU 31 determines whether or not the reproduction flag is set at ON (step S61). As a result, if the reproduction flag is ON (YES in step S61), the CPU 31 stops the reproduction of the sound effect which has been started in step S60 (step S62). Then, the CPU 31 sets the reproduction flag to OFF (step S63).
  • On the other hand, as a result of the determination in step S61, if the reproduction flag is not ON (NO in step S61), the CPU 31 ends the spray drawing processing without executing the processing in steps S62 and S63. Description of the spray drawing processing will be finished here.
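The reproduction-flag handling in steps S52 and S58 to S63 amounts to a small state machine: the spraying sound starts when the volume first reaches the threshold and stops when it falls back below it. In this sketch the actual audio calls are replaced by an event log, and the threshold value is assumed.

```python
THRESHOLD = 10  # assumed value of the predetermined threshold

class SoundEffectController:
    def __init__(self):
        self.reproduction_flag = False  # OFF: no sound effect playing
        self.events = []                # stands in for start/stop audio calls

    def update(self, mic_volume):
        """One pass of the flag logic, run each time the volume is sampled."""
        if mic_volume >= THRESHOLD:
            if not self.reproduction_flag:  # steps S58-S60: start once
                self.reproduction_flag = True
                self.events.append("start")
        else:
            if self.reproduction_flag:      # steps S61-S63: stop once
                self.reproduction_flag = False
                self.events.append("stop")
```

Checking the flag before starting or stopping prevents the sound effect from being restarted on every frame while the player keeps blowing.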
  • Referring to FIG. 19 again, as a result of the determination in step S28, if the current tool data 330 does not indicate the “pen” (NO in step S28), next, the CPU 31 determines whether or not the drawing tool indicated by the current tool data 330 is the “eraser” (step S30). As a result of the determination, if the drawing tool is not the “eraser” (NO in step S30), the CPU 31 returns to step S22 to repeat processing therefrom.
  • On the other hand, as a result of the determination in step S30, if the current tool data 330 indicates the “eraser” (YES in step S30), the CPU 31 executes eraser processing (step S31). In the eraser processing, processing in which, when a microphone input sound of a volume equal to or larger than a predetermined value is inputted, a spray line and the like which are drawn at the touched position are erased, is executed. FIG. 22 is a flowchart showing the detail of the eraser processing. As shown in FIG. 22, first, the CPU 31 determines whether or not the line type indicated by the current tool data 330 is the “spray” (step S71). As a result, if the line type is not the “spray” (NO in step S71), the CPU 31 executes pen eraser processing (step S72). That is, the CPU 31 executes processing of, based on the touched position, erasing a uniform line or a spray line with the thickness currently selected. Thereafter, the CPU 31 ends the eraser processing.
  • On the other hand, as a result of the determination in step S71, if the line type is the “spray”, the CPU 31 executes spray eraser processing (step S73). FIG. 23 is a flowchart showing the detail of the spray eraser processing shown in step S73. Since, in FIG. 23, the processing in steps S51 and S52, and the processing in steps S57 to S63 are the same as the processing in the corresponding steps described referring to FIG. 21, detailed description thereof is omitted and the other processing will mainly be described here.
  • As shown in FIG. 23, in step S52, the CPU 31 determines whether or not the volume of the microphone input sound is equal to or larger than a predetermined threshold value, and as a result, if the volume is equal to or larger than the predetermined threshold value (YES in step S52), the CPU 31 determines the size of an area (hereinafter, referred to as erasing area) to be erased, in accordance with the magnitude of the volume (step S81). A method for determining the erasing area conforms to a method for determining the drawing area for the spray line. That is, the CPU 31 refers to the spray table 332 and obtains the area size 3322 corresponding to the magnitude of the volume. Then, based on this size, the CPU 31 determines the size of the erasing area. Note that, similarly to the above-described drawing area, the shape of the erasing area is circular.
  • Next, the CPU 31 determines the volume at which the sound effect for erasing the spray line or the like is reproduced, in accordance with the magnitude of the volume (step S82).
  • Next, the CPU 31 determines the speed at which an animation of the cursor for the erasing is reproduced (step S83). That is, as in step S55, the CPU 31 determines the speed at which the propeller rotates, in accordance with the magnitude of the volume of the microphone input sound.
  • Next, the CPU 31 places the erasing area such that the center of the erasing area coincides with the touched position, and erases the spray line drawn within the erasing area (step S84).
  • Thereafter, the CPU 31 displays the animation of the cursor (step S57), and proceeds to processing of determining whether or not the reproduction flag is OFF (step S58). Since processing in step S58 and subsequent steps is the same as the processing in the respective steps described above referring to FIG. 21, detailed description thereof is omitted. Description of the spray eraser processing is finished here.
  • Referring to FIG. 19 again, next, processing (NO in step S26) performed when, as a result of the determination in step S26, a touch input is not being performed, will be described. In this case, next, the CPU 31 determines whether or not touch off has been performed, based on the operation data 326 (step S33). That is, the CPU 31 determines whether a state in which a touch input continues to be performed is interrupted, or a state in which a touch is not being performed continues. Specifically, if the touched position data 3262 indicates that the latest input coordinate is NULL and the input coordinate just prior to the latest input coordinate is the coordinate of a touched position, the CPU 31 determines that touch off has been performed, and if the touched position data 3262 indicates that the latest input coordinate is NULL and the input coordinate just prior to the latest input coordinate is also NULL, the CPU 31 determines that a state in which a touch is not being performed continues. As a result of the determination, if touch off has been performed (YES in step S33), the CPU 31 erases the cursor 102 (step S34). Thereafter, the CPU 31 returns to step S22 to repeat processing therefrom. On the other hand, if touch off has not been performed (NO in step S33), the CPU 31 directly returns to step S22 to repeat processing therefrom. Description of the graffiti processing is finished here.
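The touch-off determination in step S33 can be sketched as a classification over the two saved coordinates. The function name and the returned labels are illustrative; NULL is modeled as None.

```python
def classify_touch_state(latest, prior):
    """Classify the touch state from the latest input coordinate and
    the input coordinate just prior to it (step S33 of the patent)."""
    if latest is None and prior is not None:
        return "touch off"            # a continuing touch was just interrupted
    if latest is None and prior is None:
        return "no touch continuing"  # the player is still not touching
    return "touching"
```

Only the "touch off" case triggers erasing of the cursor 102; the other two cases simply return to step S22.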
  • Thus, in the present embodiment, in the case where the player performs drawing by using the “spray”, drawing on the canvas 101 can be performed when two types of inputs, that is, a touch input to the canvas 101 and a sound input to the microphone 42 are performed. As a result, a novel game having a nonconventional and new feeling of operation can be provided.
  • In addition, in the present embodiment, the spray line is drawn while a touch input and a sound input to the microphone 42 continue to be performed (the player continues to blow). Therefore, by changing the strength at which the player blows breath, the thickness of the spray line can be changed in real time. Thus, it becomes possible to provide a novel way of enjoyment in which, depending on how the player blows breath, the thickness (corresponding to so-called pen pressure) of the spray line can be changed, that is, a content to be drawn can be changed.
  • In the above-described eraser processing, processing of erasing the spray line drawn at a touched position is performed. Other than such processing, processing (hereinafter, referred to as blow-off processing) in which dots forming the spray line present near or at a touched position are blown off in accordance with breath blown on the touch panel may be executed. Hereinafter, the outline of the blow-off processing will be described. For example, it is assumed that the positional relationship between a touched position and dots is as shown in FIG. 24. FIG. 24 shows, as an example, a state in which five dots 211 a to 211 e are present above a touched position 210. In this state, when the player blows on the touched position 210 (that is, a sound input to the microphone 42 is performed), first, as shown in FIG. 25, straight lines 212 a to 212 e which respectively connect the dots 211 a to 211 e with the touched position 210 are calculated. Then, as shown in FIG. 26, processing of moving the dots 211 in accordance with the lengths and the directions of the respective straight lines 212 is executed. In the example in FIG. 26, the dots 211 are moved along the respective straight lines 212, in the directions opposite to the directions toward the touched position 210. The distances of the movements are inversely proportional to the lengths of the respective straight lines. That is, a dot nearer to the touched position 210 moves a longer distance, and a dot farther from the touched position 210 moves a shorter distance. In other words, in the processing, a dot nearer to the position on which the player blows (in this case, the touched position 210) is subjected to a stronger blow and is thereby blown off farther.
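The movement rule just described may be sketched as follows. This is an illustrative Python reconstruction, not code from the embodiment; the `strength` constant and all names are assumptions.

```python
import math

def blow_off(dots, touch, strength=100.0, min_dist=1e-6):
    """Move each dot away from the touched position along the straight line
    connecting them (cf. FIGS. 25 and 26): the movement distance is
    inversely proportional to the line's length, so nearer dots fly farther."""
    moved = []
    for (x, y) in dots:
        dx, dy = x - touch[0], y - touch[1]
        length = math.hypot(dx, dy)
        if length < min_dist:      # dot exactly at the touch: leave it in place
            moved.append((x, y))
            continue
        step = strength / length   # inversely proportional to the line length
        # Move opposite to the direction toward the touched position.
        moved.append((x + dx / length * step, y + dy / length * step))
    return moved
```

For a touch at the origin, a dot at distance 10 travels twice as far as a dot at distance 20, matching the "nearer dot is blown off farther" behavior.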
  • Next, referring to FIG. 27, the detail of the blow-off processing will be described. FIG. 27 is a flowchart showing the detail of the blow-off processing. Here, the case where, in the flowchart of the graffiti processing described referring to FIG. 19, the blow-off processing is executed in place of the spray eraser processing in step S73, will be described as an example. As a matter of course, instead of executing the blow-off processing in place of the spray eraser processing, the blow-off processing may be executed together with the spray eraser processing.
  • As shown in FIG. 27, first, the CPU 31 detects a volume (step S51), and determines whether or not the detected volume is equal to or larger than a predetermined value (step S52). Since this processing is the same as the processing in steps S51 and S52 in FIG. 23, detailed description thereof is omitted.
  • As a result of the determination in step S52, if the volume is smaller than the predetermined threshold value (NO in step S52), the CPU 31 proceeds to processing in step S61. Since processing to be performed in this case is also the same as the processing from step S61 in FIG. 23, description thereof is omitted.
  • On the other hand, as a result of the determination in step S52, if the CPU 31 determines that the volume is equal to or larger than the predetermined threshold value (YES in step S52), the CPU 31 determines the size of an area (hereinafter, referred to as blow-off area) for the blow-off processing, in accordance with the magnitude of the volume (step S81). The size of the area is determined by referring to the spray table 332 and obtaining the area size 3322 corresponding to the magnitude of the volume, as in step S53.
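A lookup of the area size 3322 in accordance with the volume may be sketched as follows. The table contents below are invented for illustration; the actual values of the spray table 332 are not disclosed.

```python
# Hypothetical stand-in for the spray table 332: upper bounds of volume
# ranges (normalized 0.0-1.0) mapped to a blow-off/erasing area size.
SPRAY_TABLE = [
    (0.25, 8),    # quiet input  -> small area
    (0.50, 16),
    (0.75, 24),
    (1.01, 32),   # loudest input -> largest area
]

def area_size_for_volume(volume):
    """Return the area size whose volume range contains `volume`."""
    for upper_bound, size in SPRAY_TABLE:
        if volume < upper_bound:
            return size
    return SPRAY_TABLE[-1][1]
```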
  • Next, the CPU 31 calculates straight lines (see FIG. 25) which connect the respective dots forming a spray line present within the blow-off area with the touched position (step S83).
  • Next, the CPU 31 moves the dots within the blow-off area in accordance with the directions and the lengths of the respective calculated straight lines (see FIG. 26). Note that at this time, if a moved dot overlaps with the position of another dot, the dot which was nearer to the touched position before the movement is drawn over the other dot.
  • The dots are moved through the above processing, and thereafter, the CPU 31 proceeds to processing in step S58. Since the processing from step S58 is the same as the corresponding processing described above referring to FIG. 21, description thereof is omitted.
  • Thus, by performing processing in which, when the player blows breath on a touched position, the dots are moved as if grains of sand were being blown away, it becomes possible to provide the player with a novel way of enjoyment.
  • In addition, in the above embodiment, the CPU 31 detects a sound produced when the player blows breath on the touch panel 13 and performs processing based on the volume thereof; at this time, any other sound can be used (for example, a sound of clapping hands can be used). That is, the type and the content of the sound are not identified. However, the present invention is not limited thereto, and a sound of a "blow" may be identified. A method of detecting or identifying the "sound of a blow" may be of any type. For example, there can be considered a method in which a waveform pattern of a sound segment included in a sound of a blow (sound of breath) is stored in advance, the stored sound segment and a sound segment of an inputted sound are compared with each other, and thereby it is determined whether or not the player has blown. Alternatively, there may be used a method in which the spectrum of an inputted sound is calculated by using a fast Fourier transform (FFT), the calculated spectrum and the spectrum of a sound of a blow which is stored in advance are compared with each other, and thereby it is determined whether or not the player has blown.
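As one possible sketch of the spectrum-comparison approach, the fragment below computes a magnitude spectrum with a naive DFT (a real implementation would use an FFT for speed) and compares it against a stored reference spectrum of a blow sound. All names and the tolerance value are illustrative assumptions, not details from the embodiment.

```python
import cmath
import math

def magnitude_spectrum(samples):
    """Naive DFT magnitude spectrum over the first half of the bins
    (an FFT would be used in practice for efficiency)."""
    n = len(samples)
    return [abs(sum(samples[t] * cmath.exp(-2j * math.pi * k * t / n)
                    for t in range(n)))
            for k in range(n // 2)]

def is_blow(samples, reference_spectrum, tolerance=1.0):
    """Compare the input's spectrum against a stored 'sound of blow'
    spectrum; treat the input as a blow when the distance between the
    peak-normalized spectra is small."""
    spec = magnitude_spectrum(samples)
    norm = max(max(spec), 1e-9)
    ref_norm = max(max(reference_spectrum), 1e-9)
    dist = sum((a / norm - b / ref_norm) ** 2
               for a, b in zip(spec, reference_spectrum)) ** 0.5
    return dist < tolerance
```

A sound whose energy sits in the same frequency bins as the stored spectrum is accepted; a tone at a different frequency is rejected.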
  • In addition, instead of using a volume, or using the type and the content of a specified sound as described above, a characteristic of an inputted sound, such as tone or frequency, may be calculated or identified, and a content of drawing processing may be changed in accordance with the characteristic of the inputted sound.
  • In addition, the reproduction of the sound effect may be executed such that a fade-in/fade-out effect is used upon start and end of the reproduction of the sound effect. This can prevent a noise (for example, a popping noise) from being generated upon start of reproduction.
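A minimal sketch of such a fade, assuming a buffer of normalized samples: the first and last few milliseconds are ramped linearly so that playback does not start or stop on a discontinuity. The fade length is an illustrative choice.

```python
def apply_fade(samples, sample_rate, fade_ms=10):
    """Linearly ramp the first and last `fade_ms` milliseconds of a
    sound-effect buffer to avoid a click (pop) at the start and end
    of reproduction."""
    n = min(len(samples) // 2, int(sample_rate * fade_ms / 1000))
    out = list(samples)
    for i in range(n):
        gain = i / n
        out[i] *= gain                  # fade-in at the start
        out[len(out) - 1 - i] *= gain   # fade-out at the end
    return out
```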
  • In addition, as a drawing color used upon drawing, only one color may be used, or a plurality of colors may be used at the same time. In an exemplary case where a plurality of colors are used at the same time, an edged line (a line whose edge has a color different from the color of the part other than the edge) may be used if the "pen" is used as the drawing tool, for example. In addition, if the "spray" is used, the dots forming a spray line may have colors different from each other. For example, when "gray" and "black" are designated as the drawing colors, a spray line formed by both gray dots and black dots may be drawn (see FIG. 28).
  • In addition, in the case where a plurality of drawing colors are used, when the above-described blow-off processing is executed and dots having different colors overlap with each other as a result of movements of the dots, the dots may be displayed as one dot having a color obtained by mixing the colors thereof with each other. Thus, when the spray line is drawn by using multiple drawing colors, the spray line which includes various colors in a mixed manner and cannot be predicted by the player can be displayed through the above-described blow-off processing, whereby a new way of enjoyment using the blow-off processing can be provided to the player.
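One plausible mixing rule for overlapping dots is a per-channel average of their RGB colors; the embodiment does not fix a specific formula, so the following is only an illustrative sketch.

```python
def mix_colors(colors):
    """Blend the RGB colors of dots that landed on the same position
    after the blow-off movement, by averaging each channel."""
    n = len(colors)
    return tuple(sum(c[i] for c in colors) // n for i in range(3))
```

For instance, a gray dot overlapping a black dot would be displayed as one darker gray dot.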
  • In addition, in the spray eraser processing, a translucence effect may be used for erasing the spray line or the like. That is, instead of erasing the spray line or the like at the moment when the player blows breath, processing in which the spray line or the like is gradually diluted and, finally, cleanly erased may be executed.
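The gradual dilution can be modeled as a per-frame decay of the line's alpha value until it falls below a small cutoff, at which point the line is removed outright. The decay rate and cutoff below are assumptions for illustration.

```python
def frames_until_erased(alpha=255, rate=0.85, cutoff=8):
    """Simulate the translucence-based erase: each frame the line's
    alpha is scaled down by `rate`; once it drops below `cutoff`
    the line is treated as cleanly erased. Returns the frame count."""
    frames = 0
    while alpha >= cutoff:
        alpha = int(alpha * rate)
        frames += 1
    return frames
```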
  • In addition, other than the drawing processing using the "spray line" formed by multiple dots, the drawing processing using the "uniform line" may also be executed only when a microphone input sound is being inputted. Further, in this case, the pen pressure may be changed in accordance with the magnitude of the microphone input sound. For example, when the magnitude of the microphone input sound is small (when the strength at which the player blows breath is weak), a "faded line" or a "line of a dilute color" may be drawn, and when the magnitude of the microphone input sound is large (when the strength at which the player blows breath is strong), a "clear line" or a "line of a deep color" may be drawn. In addition, drawing of the "uniform line" with the "pen" may be executed without a microphone input sound, and the thickness of the line may be changed in real time by a breath being blown on the touch panel 13 while the "uniform line" is being drawn with the "pen".
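A mapping from microphone volume to pen pressure along these lines might look as follows; the thresholds, thickness range, and opacity formula are all illustrative assumptions.

```python
def pen_style_for_volume(volume, threshold=0.1):
    """Map the microphone input volume (normalized 0.0-1.0) to a line
    style: weak breath -> thin, dilute line; strong breath -> thick,
    deep line. Returns None when no qualifying sound is detected."""
    if volume < threshold:
        return None                      # below threshold: nothing is drawn
    thickness = 1 + int(volume * 9)      # 1..10 pixels
    opacity = min(1.0, 0.2 + volume)     # dilute color for weak blowing
    return thickness, round(opacity, 2)
```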
  • In addition, upon drawing in the spray processing, when the volume of a microphone input sound becomes smaller than a predetermined threshold value (that is, when the player stops blowing) while a touch input continues to be detected, drawing of the spray line may continue during about 1 to 2 seconds, for example, instead of immediately stopping drawing of the spray line. That is, processing in which even if the player stops blowing breath, rotation of the propeller continues during a short time and the spray line is drawn during the short time, may be executed.
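The "propeller keeps turning briefly" behavior can be sketched as a grace-period counter that is refilled while the volume is above the threshold and counts down after the player stops blowing. The frame rate and grace period (about 1.5 s at an assumed 60 fps) are illustrative.

```python
class SprayDecay:
    """Keep the spray drawing for a short grace period after the
    microphone volume falls below the threshold, so the spray line
    does not stop the instant the player stops blowing."""

    def __init__(self, grace_frames=90):        # ~1.5 s at 60 fps (assumed)
        self.grace_frames = grace_frames
        self.remaining = 0

    def update(self, volume, threshold=0.1):
        """Call once per frame; returns True while the spray should draw."""
        if volume >= threshold:
            self.remaining = self.grace_frames  # refill while blowing
        elif self.remaining > 0:
            self.remaining -= 1                 # wind down after blowing stops
        return self.remaining > 0
```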
  • In addition, it is understood that the image created in the above embodiment, on which graffiti has been drawn, may be saved. In this case, only data corresponding to the above-described handwriting layer 106 (see FIG. 6) may be saved, or data obtained by combining data corresponding to handwriting layer 106 with a shot image may be saved as a composite image.
  • In addition, in the above embodiment, the case where graffiti is drawn on an image shot by the outer camera is described as an example. However, the present invention is not limited thereto, and the present invention is applicable to general painting software which does not use the outer camera 25, that is, which does not allow graffiti to be drawn on a shot image or the like.
  • In addition, in the present embodiment, a hand-held game apparatus having two display devices is described as an example. However, the present invention is applicable to a hand-held terminal which has a single display device and has a touch panel on a screen of the display device. In addition, in the present embodiment, a touch panel is used as an example of a device which detects a designated position, in an operation area, designated by the player. However, a so-called pointing device which allows the player to designate a position in a predetermined area may be used as the device. For example, there may be used, as the device, a mouse which is capable of designating any position on a screen, or a tablet which designates any position on an operation surface having no display screen. Alternatively, there may be used a pointing device in which a device including shooting means for remotely shooting a display screen, or a marker positioned in the periphery of the display screen, obtains a shot image by being pointed toward the display screen, and in which, from the position of the display screen or the marker in the shot image, a coordinate on the display screen corresponding to the position at which the device has pointed is calculated.
  • While the invention has been described in detail, the foregoing description is in all aspects illustrative and not restrictive. It is understood that numerous other modifications and variations can be devised without departing from the scope of the invention.

Claims (21)

1. A computer-readable storage medium having stored therein a drawing processing program which is executed by a computer of an information processing apparatus in which a pointing device for designating a position on a display screen, and sound input means can be used, the drawing processing program causing the computer to function as:
designated position detection means for continuously obtaining a designated position on the display screen, based on a designation performed by the pointing device;
sound detection means for detecting that a sound which satisfies a predetermined condition is inputted to the sound input means; and
drawing-related processing execution means for, while the sound detection means detects that the sound is inputted, executing predetermined drawing-related processing at a position based on the designated position obtained by the designated position detection means.
2. The computer-readable storage medium having stored therein the drawing processing program according to claim 1, wherein the drawing-related processing execution means changes a content of the drawing-related processing to be executed, based on the sound detected by the sound detection means, and in accordance with a characteristic of the sound.
3. The computer-readable storage medium having stored therein the drawing processing program according to claim 2, wherein the drawing-related processing execution means sequentially changes a content of the drawing-related processing to be executed, in a coordinated manner with changes in chronological order in the characteristic of the sound repeatedly detected by the sound detection means.
4. The computer-readable storage medium having stored therein the drawing processing program according to claim 2, wherein the drawing-related processing execution means executes the predetermined drawing-related processing only when a volume of the inputted sound is equal to or larger than a predetermined threshold value.
5. The computer-readable storage medium having stored therein the drawing processing program according to claim 2, wherein the drawing-related processing execution means executes, as the drawing-related processing, processing of drawing a line which connects, in chronological order, the position based on the designated position sequentially obtained by the designated position detection means.
6. The computer-readable storage medium having stored therein the drawing processing program according to claim 5, wherein the drawing-related processing execution means changes at least one of a thickness of the line and a density of a color in which the line is drawn, in accordance with a volume of the inputted sound.
7. The computer-readable storage medium having stored therein the drawing processing program according to claim 2, wherein the drawing-related processing execution means executes, as the drawing-related processing, processing of drawing one or more dots in a drawing range which is a predetermined range including therein the position based on the designated position.
8. The computer-readable storage medium having stored therein the drawing processing program according to claim 7, wherein the drawing-related processing execution means changes at least one of a size of the drawing range and a number of the dots to be drawn in the drawing range, in accordance with a volume of the inputted sound.
9. The computer-readable storage medium having stored therein the drawing processing program according to claim 8, wherein the drawing-related processing execution means draws the dots such that an area density of the number of the dots which are nearer to the position based on the designated position is higher, and that an area density of the number of the dots which are farther from the position based on the designated position is lower.
10. The computer-readable storage medium having stored therein the drawing processing program according to claim 8, wherein the drawing-related processing execution means draws the dots at random positions in the drawing range.
11. The computer-readable storage medium having stored therein the drawing processing program according to claim 7, wherein the drawing-related processing execution means executes, as the drawing-related processing, processing of moving the dots drawn on the display screen in a predetermined direction, based on the position based on the designated position, and the sound input detected by the sound detection means.
12. The computer-readable storage medium having stored therein the drawing processing program according to claim 11, wherein the drawing-related processing execution means
includes movement content calculation means for calculating: a direction of a line connecting each of the dots displayed on the display screen, with a reference point which is the position based on the designated position; and a distance from the reference point to each of the dots displayed on the display screen, and
moves the dots displayed on the screen, based on the direction and the distance calculated by the movement content calculation means.
13. The computer-readable storage medium having stored therein the drawing processing program according to claim 1, wherein the drawing processing program further causes the computer to function as sound effect reproduction means for causing predetermined sound output means to output a predetermined sound effect when the drawing-related processing execution means is executing the predetermined drawing-related processing.
14. The computer-readable storage medium having stored therein the drawing processing program according to claim 13, wherein the sound effect reproduction means changes a volume at which the sound effect is reproduced, in accordance with a characteristic of the sound detected by the sound detection means.
15. The computer-readable storage medium having stored therein the drawing processing program according to claim 1, wherein the drawing processing program further causes the computer to function as:
cursor display means for displaying a predetermined cursor image at the designated position; and
animation display means for animating the cursor when the drawing-related processing execution means is executing the predetermined drawing-related processing.
16. The computer-readable storage medium having stored therein the drawing processing program according to claim 15, wherein the animation display means changes a speed of the animation in accordance with a characteristic of the sound detected by the sound detection means.
17. The computer-readable storage medium having stored therein the drawing processing program according to claim 1, wherein the pointing device is a touch panel.
18. The computer-readable storage medium having stored therein the drawing processing program according to claim 1, wherein the drawing processing program further causes the computer to function as:
shot image obtaining means for obtaining image data of an image shot by predetermined shooting means; and
shot image display means for displaying, on the display screen, the shot image based on the image data, and
the drawing-related processing execution means executes the drawing-related processing on the shot image.
19. The computer-readable storage medium having stored therein the drawing processing program according to claim 1, wherein
the drawing processing program further causes the computer to function as specified sound determination means for determining whether or not the sound detected by the sound detection means is a predetermined sound, and
the drawing-related processing execution means executes the drawing-related processing only when the specified sound determination means determines that the sound detected by the sound detection means is the predetermined sound.
20. An information processing apparatus in which a pointing device for designating a position on a display screen, and sound input means can be used, the information processing apparatus comprising:
designated position detection means for continuously obtaining a designated position on the display screen, based on a designation performed by the pointing device;
sound detection means for detecting that a sound which satisfies a predetermined condition is inputted to the sound input means; and
drawing-related processing execution means for, while the sound detection means detects that the sound is inputted, executing predetermined drawing-related processing at a position based on the designated position.
21. The information processing apparatus according to claim 20, wherein the sound detection means is placed in proximity of the display screen.
US12/646,306 2009-01-05 2009-12-23 Computer-readable storage medium having stored therein drawing processing program, and information processing apparatus Abandoned US20100210332A1 (en)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
JP2009-000443 2009-01-05
JP2009000443A JP5170771B2 (en) 2009-01-05 2009-01-05 Drawing processing program, information processing apparatus, information processing system, and information processing control method

Publications (1)

Publication Number Publication Date
US20100210332A1 true US20100210332A1 (en) 2010-08-19

Family

ID=42560413

Family Applications (1)

Application Number Title Priority Date Filing Date
US12/646,306 Abandoned US20100210332A1 (en) 2009-01-05 2009-12-23 Computer-readable storage medium having stored therein drawing processing program, and information processing apparatus

Country Status (2)

Country Link
US (1) US20100210332A1 (en)
JP (1) JP5170771B2 (en)

Cited By (16)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20110181619A1 (en) * 2010-01-22 2011-07-28 Samsung Electronics Co., Ltd. Apparatus and method for transmitting and receiving handwriting animation message
US20120174009A1 (en) * 2010-12-29 2012-07-05 Samsung Electronics Co., Ltd. Method for inputting memo in touch screen terminal and device thereof
US20120200540A1 (en) * 2010-06-01 2012-08-09 Kno, Inc. Utilization of temporal and spatial parameters to enhance the writing capability of an electronic device
US20120293555A1 (en) * 2010-01-15 2012-11-22 Akihiro Okano Information-processing device, method thereof and display device
US20130063367A1 (en) * 2011-09-13 2013-03-14 Changsoo Jang Air actuated device
US20130162671A1 (en) * 2011-12-27 2013-06-27 Yohei Fujita Image combining apparatus, terminal device, and image combining system including the image combining apparatus and terminal device
EP2680110A1 (en) * 2012-06-29 2014-01-01 Samsung Electronics Co., Ltd Method and apparatus for processing multiple inputs
US20140380226A1 (en) * 2013-06-21 2014-12-25 Sharp Kabushiki Kaisha Image display apparatus allowing operation of image screen and operation method thereof
US20150212793A1 (en) * 2012-10-17 2015-07-30 Tencent Technology (Shenzhen) Company Limited Mobile terminal and image processing method thereof
US20150228201A1 (en) * 2014-02-13 2015-08-13 Crayola, Llc Photo Strings
US20150242049A1 (en) * 2012-11-02 2015-08-27 Sony Corporation Display control device, display control method, and program
US20150324096A1 (en) * 2012-02-09 2015-11-12 Flixel Photos, Inc. Systems and methods for creation and sharing of selectively animated digital photos
US20170236318A1 (en) * 2016-02-15 2017-08-17 Microsoft Technology Licensing, Llc Animated Digital Ink
US20190011265A1 (en) * 2016-03-28 2019-01-10 Aisin Aw Co., Ltd. Server device, communication terminal, route retrieval system, and computer program
US10325407B2 (en) 2016-09-15 2019-06-18 Microsoft Technology Licensing, Llc Attribute detection tools for mixed reality
US10895954B2 (en) * 2017-06-02 2021-01-19 Apple Inc. Providing a graphical canvas for handwritten input

Families Citing this family (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP5636888B2 (en) * 2010-11-09 2014-12-10 ソニー株式会社 Information processing apparatus, program, and command generation method
JP2019109579A (en) * 2017-12-15 2019-07-04 フリュー株式会社 Photograph making game machine, editing method and program
JP6863918B2 (en) * 2018-03-19 2021-04-21 グリー株式会社 Control programs, control methods and information processing equipment
KR102203573B1 (en) * 2019-03-13 2021-01-15 (주)피플인사이드 System and program for providing drawing service using speech recognition


Family Cites Families (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JPH03196212A (en) * 1989-12-25 1991-08-27 Pfu Ltd Mouse cursor display control method
JPH087121A (en) * 1994-06-22 1996-01-12 Hitachi Ltd Information processor and attribute changing method
JP3416390B2 (en) * 1996-05-07 2003-06-16 シャープ株式会社 Drawing equipment
JPH10261099A (en) * 1997-03-17 1998-09-29 Casio Comput Co Ltd Image processor
JP2003263308A (en) * 2002-12-27 2003-09-19 Nec Infrontia Corp Screen control device and method
JP5078323B2 (en) * 2006-11-20 2012-11-21 株式会社カプコン Game device, program for realizing the game device, and recording medium

Patent Citations (18)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US4764965A (en) * 1982-10-14 1988-08-16 Tokyo Shibaura Denki Kabushiki Kaisha Apparatus for processing document data including voice data
US5689618A (en) * 1991-02-19 1997-11-18 Bright Star Technology, Inc. Advanced tools for speech synchronized animation
US5420607A (en) * 1992-09-02 1995-05-30 Miller; Robert F. Electronic paintbrush and color palette
US5802342A (en) * 1992-10-13 1998-09-01 Konami Co., Ltd. Image creating device loadable with external memory medium capable of storing an image creating program and created image data
US5600765A (en) * 1992-10-20 1997-02-04 Hitachi, Ltd. Display system capable of accepting user commands by use of voice and gesture inputs
US5768607A (en) * 1994-09-30 1998-06-16 Intel Corporation Method and apparatus for freehand annotation and drawings incorporating sound and for compressing and synchronizing sound
US5831615A (en) * 1994-09-30 1998-11-03 Intel Corporation Method and apparatus for redrawing transparent windows
US5715412A (en) * 1994-12-16 1998-02-03 Hitachi, Ltd. Method of acoustically expressing image information
US5838313A (en) * 1995-11-20 1998-11-17 Siemens Corporate Research, Inc. Multimedia-based reporting system with recording and playback of dynamic annotation
US20030080973A1 (en) * 1996-10-15 2003-05-01 Nikon Corporation Image recording and replay apparatus
US6320601B1 (en) * 1997-09-09 2001-11-20 Canon Kabushiki Kaisha Information processing in which grouped information is processed either as a group or individually, based on mode
US6297818B1 (en) * 1998-05-08 2001-10-02 Apple Computer, Inc. Graphical user interface having sound effects for operating control elements and dragging objects
US6724918B1 (en) * 1999-05-12 2004-04-20 The Board Of Trustees Of The Leland Stanford Junior University System and method for indexing, accessing and retrieving audio/video with concurrent sketch activity
US20040193428A1 (en) * 1999-05-12 2004-09-30 Renate Fruchter Concurrent voice to text and sketch processing with synchronized replay
US7458013B2 (en) * 1999-05-12 2008-11-25 The Board Of Trustees Of The Leland Stanford Junior University Concurrent voice to text and sketch processing with synchronized replay
US20080104503A1 (en) * 2006-10-27 2008-05-01 Qlikkit, Inc. System and Method for Creating and Transmitting Multimedia Compilation Data
US8180073B1 (en) * 2008-02-08 2012-05-15 Mark J. Grote System for creating and manipulating digital media
US20110099476A1 (en) * 2009-10-23 2011-04-28 Microsoft Corporation Decorating a display environment

Non-Patent Citations (27)

* Cited by examiner, † Cited by third party
Title
"SpaceTime Student Competition and Exhibition", ACM SIGGRAPH education committee 2008 annual report, pp. 21-35, education.siggraph.org., August 2008. *
Bickley et al., "Voice Input for Graphics and Text Creation: A Case Study", Proceedings of the 8th Annual Conference on Technology and Persons with Disabilities, pp. 32-36, 1993. *
Bickley et al., Case study of voice control of AutoCAD, Proceedings ECART 2, The Swedish Handicap Institute, Stockholm, 1993. *
Bolt et al., "'Put-that-there': Voice and Gesture at the Graphics Interface", Proceedings of the 7th Annual ACM Conference on Computer Graphics and Interactive Techniques (SIGGRAPH '80), v. 14, n. 3, pp. 262-270, 03 July 1980. *
Harada et al., "'Put-That-There': What, Where, How? Integrating Speech and Gesture in Interactive Workspaces. In At the Crossroads: The Interaction of HCI and Systems Issues in UbiComp, Ubicomp 2003, October 2003. *
Harada et al., "The Vocal Joystick: Evaluation of Voice-based Cursor Control Techniques", ACM Conference on Assistive Technologies - ASSETS, pp. 197-204, 2006. *
Harada et al., "VoiceDraw: A Hands-Free Voice-Driven Drawing Application for People with Motor Impairments", Proceedings of the 9th International ACM Conference on Computers and Accessibility (ASSETS '07), pp. 27-34, October 2007. *
Hashimi, "Beyond Using Voice as Voice", Proceedings of the 16th International Conference of Advance Studies in Systems Research, Information and Cybernetics, 2005. *
Hashimi, "Blowtter: A Voice-Controlled Plotter", Proceedings of the 20th BCS HCI Group Conference in co-operation with ACM on HCI Engage, vol. 2, pp. 41-44, 2006. *
Hashimi, "Paralinguistic Vocal Control of Interactive Media", dissertation, Middlesex University, May 2007. *
Iga et al., "Kirifuki: Inhaling and Exhaling Interaction with Visual Objects", Transactions of the Virtual Reality Society of Japan, TVRSJ vol. 7, no. 4, pp. 445-452, 2002. *
Igarashi et al., "Voice as Sound: Using Non-Verbal Voice Input for Interactive Control", Proceedings of the 14th Annual ACM Symposium on User Interface Software and Technology (UIST '01), pp. 155-156, 2001. *
Katsura et al., "livePic", Proceedings of SIGGRAPH '06, August 2006. *
Kendrick et al., "Tux Paint: Mousing Your Way to a Masterpiece", Red Hat Magazine, Issue #2, December 2004. *
Matsumura, "Blowing Windows", http://we-make-money-not-art.com/archives/2005/08/blowing-windows.php, August 2005. *
Miller, "Voice Recognition as an Alternative Computer Mouse for the Disabled", Proceedings of RESNA International '92, pp. 58-60, June 1992. *
Mosteller, "Fun craft: Paint fireworks", http://www.parentdish.com/2007/07/04/fun-craft-paint-fireworks/, July 2007. *
Okuno et al., "Jellyfish Party: Blowing Soap Bubbles in Mixed Reality Space", Proceedings of the Second IEEE and ACM International Symposium on Mixed and Augmented Reality (ISMAR '03), 2003. *
Rogerson, "Free virtual recorder for iPhone", http://www.musicradar.com/us/news/tech/free-virtual-recorder-for-iphone-186591, December 2008. *
Sawada et al., "BYU-BYU-View: A Wind Communication Interface", Proceedings of SIGGRAPH '07, August 2007. *
Schmandt et al., "The Intelligent Voice-Interactive Interface", Proceedings of the 1982 Conference on Human Factors in Computing Systems (CHI '82), pp. 363-366, 1982. *
Sharma et al., "Speech/Gesture Interface to a Visual-Computing Environment", IEEE Computer Graphics and Applications, pp. 29-36, March 2000. *
Shen et al., "Fun with Blow Painting", Proceeding of the Seventh ACM Conference on Creativity and Cognition (C&C '09), pp. 437-438, October 2009. *
Tan et al., "Utilizing Speech Recognition Technology to Increase Productivity -- Case Studies", Proceedings of RESNA International '92, pp. 61-63, June 1992. *
Wilcox et al., "Gaze and Voice Based Game Interaction: The Revenge of the Killer Penguins", Proceedings of SIGGRAPH '08, 2008. *
Wilson, "Art through a straw!", http://scrumdillydo.blogspot.com/2007/06/art-through-straw.html, June 2007. *
Yu-Wei Fu, "Blow Painting Interactive Application", http://www.fuinteractive.com/interactive-design/blowpainting-applicatio/, 2008. *

Cited By (30)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20120293555A1 (en) * 2010-01-15 2012-11-22 Akihiro Okano Information-processing device, method thereof and display device
US10277729B2 (en) * 2010-01-22 2019-04-30 Samsung Electronics Co., Ltd Apparatus and method for transmitting and receiving handwriting animation message
US20110181619A1 (en) * 2010-01-22 2011-07-28 Samsung Electronics Co., Ltd. Apparatus and method for transmitting and receiving handwriting animation message
US20120200540A1 (en) * 2010-06-01 2012-08-09 Kno, Inc. Utilization of temporal and spatial parameters to enhance the writing capability of an electronic device
US9141134B2 (en) * 2010-06-01 2015-09-22 Intel Corporation Utilization of temporal and spatial parameters to enhance the writing capability of an electronic device
US9996227B2 (en) 2010-06-01 2018-06-12 Intel Corporation Apparatus and method for digital content navigation
US9037991B2 (en) 2010-06-01 2015-05-19 Intel Corporation Apparatus and method for digital content navigation
US20120174009A1 (en) * 2010-12-29 2012-07-05 Samsung Electronics Co., Ltd. Method for inputting memo in touch screen terminal and device thereof
US20130063367A1 (en) * 2011-09-13 2013-03-14 Changsoo Jang Air actuated device
US20130162671A1 (en) * 2011-12-27 2013-06-27 Yohei Fujita Image combining apparatus, terminal device, and image combining system including the image combining apparatus and terminal device
US9875571B2 (en) * 2011-12-27 2018-01-23 Ricoh Company, Limited Image combining apparatus, terminal device, and image combining system including the image combining apparatus and terminal device
US9704281B2 (en) * 2012-02-09 2017-07-11 Flixel Photos Inc. Systems and methods for creation and sharing of selectively animated digital photos
US20150324096A1 (en) * 2012-02-09 2015-11-12 Flixel Photos, Inc. Systems and methods for creation and sharing of selectively animated digital photos
CN103529934A (en) * 2012-06-29 2014-01-22 三星电子株式会社 Method and apparatus for processing multiple inputs
EP2680110A1 (en) * 2012-06-29 2014-01-01 Samsung Electronics Co., Ltd Method and apparatus for processing multiple inputs
EP3693837A1 (en) * 2012-06-29 2020-08-12 Samsung Electronics Co., Ltd. Method and apparatus for processing multiple inputs
AU2013204564B2 (en) * 2012-06-29 2016-01-21 Samsung Electronics Co., Ltd. Method and apparatus for processing multiple inputs
US9286895B2 (en) 2012-06-29 2016-03-15 Samsung Electronics Co., Ltd. Method and apparatus for processing multiple inputs
US9977651B2 (en) * 2012-10-17 2018-05-22 Tencent Technology (Shenzhen) Company Limited Mobile terminal and image processing method thereof
US20150212793A1 (en) * 2012-10-17 2015-07-30 Tencent Technology (Shenzhen) Company Limited Mobile terminal and image processing method thereof
US9886128B2 (en) * 2012-11-02 2018-02-06 Sony Corporation Display control device, display control method, and program
US20150242049A1 (en) * 2012-11-02 2015-08-27 Sony Corporation Display control device, display control method, and program
US10198127B2 (en) 2012-11-02 2019-02-05 Sony Corporation Display control device, display control method, and program
US20140380226A1 (en) * 2013-06-21 2014-12-25 Sharp Kabushiki Kaisha Image display apparatus allowing operation of image screen and operation method thereof
US20150228201A1 (en) * 2014-02-13 2015-08-13 Crayola, Llc Photo Strings
WO2015123497A1 (en) * 2014-02-13 2015-08-20 Crayola, Llc Photo strings
US20170236318A1 (en) * 2016-02-15 2017-08-17 Microsoft Technology Licensing, Llc Animated Digital Ink
US20190011265A1 (en) * 2016-03-28 2019-01-10 Aisin Aw Co., Ltd. Server device, communication terminal, route retrieval system, and computer program
US10325407B2 (en) 2016-09-15 2019-06-18 Microsoft Technology Licensing, Llc Attribute detection tools for mixed reality
US10895954B2 (en) * 2017-06-02 2021-01-19 Apple Inc. Providing a graphical canvas for handwritten input

Also Published As

Publication number Publication date
JP2010157192A (en) 2010-07-15
JP5170771B2 (en) 2013-03-27

Similar Documents

Publication Publication Date Title
US20100210332A1 (en) Computer-readable storage medium having stored therein drawing processing program, and information processing apparatus
US10488941B2 (en) Combining virtual reality and augmented reality
US11093045B2 (en) Systems and methods to augment user interaction with the environment outside of a vehicle
JP6568902B2 (en) Interactive painting game and associated controller
US9454834B2 (en) Storage medium storing image processing program for implementing controlled image display according to input coordinate, and information processing device
WO2019141100A1 (en) Method and apparatus for displaying additional object, computer device, and storage medium
US8016671B2 (en) Game apparatus and storage medium storing game program
JP5415730B2 (en) Image processing program, image processing apparatus, image processing method, and image processing system
WO2010061584A2 (en) Handwritten input/output system, handwriting input sheet, information input system, and information input assistance sheet
JP4513830B2 (en) Drawing apparatus and drawing method
CN110622219B (en) Interactive augmented reality
US8851986B2 (en) Game program and game apparatus
US8376851B2 (en) Storage medium having game program stored therein and game apparatus
CN112044065B (en) Virtual resource display method, device, equipment and storage medium
US7724267B2 (en) Information processing program and information processing apparatus
US8643679B2 (en) Storage medium storing image conversion program and image conversion apparatus
JP5437726B2 (en) Information processing program, information processing apparatus, information processing system, and coordinate calculation method
US8421751B2 (en) Computer-readable storage medium having information processing program stored therein, information processing system, and information processing method
US20220019288A1 (en) Information processing apparatus, information processing method, and program
JP2008117083A (en) Coordinate indicating device, electronic equipment, coordinate indicating method, coordinate indicating program, and recording medium with the program recorded thereon
US20200150794A1 (en) Portable device and screen control method of portable device
JP2013033548A (en) Handwriting input/output system, handwriting input sheet, information input system, information input auxiliary sheet
KR102594106B1 (en) Control method of application for recording data and recording medium thereof
JP6267074B2 (en) Handwriting input / output system and optical reader
JP2014220006A (en) Handwriting input/output system and optical reading device

Legal Events

Date Code Title Description
AS Assignment

Owner name: NINTENDO CO., LTD., JAPAN

Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNOR:IMAI, DAIJI;REEL/FRAME:024157/0497

Effective date: 20100301

STCB Information on status: application discontinuation

Free format text: ABANDONED -- FAILURE TO RESPOND TO AN OFFICE ACTION