US20110304702A1 - Computer-Readable Storage Medium, Image Display Apparatus, Image Display System, and Image Display Method

Info

Publication number
US20110304702A1
Authority
US
United States
Prior art keywords
camera
real
virtual
virtual camera
eye
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Abandoned
Application number
US13/158,824
Inventor
Yuichiro Ito
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Nintendo Co Ltd
Original Assignee
Nintendo Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Family has litigation
First worldwide family litigation filed: https://patents.darts-ip.com/?family=44358250&utm_source=google_patent&utm_medium=platform_link&utm_campaign=public_patent_search&patent=US20110304702(A1) ("Global patent litigation dataset" by Darts-ip is licensed under a Creative Commons Attribution 4.0 International License.)
Application filed by Nintendo Co Ltd filed Critical Nintendo Co Ltd
Assigned to NINTENDO CO., LTD. Assignment of assignors interest (see document for details). Assignors: ITO, YUICHIRO
Publication of US20110304702A1

Classifications

    • H: ELECTRICITY
    • H04: ELECTRIC COMMUNICATION TECHNIQUE
    • H04N: PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N 13/00: Stereoscopic video systems; Multi-view video systems; Details thereof
    • H04N 13/20: Image signal generators
    • H04N 13/204: Image signal generators using stereoscopic image cameras
    • H04N 13/239: Image signal generators using stereoscopic image cameras using two 2D image sensors having a relative position equal to or related to the interocular distance
    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06T: IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 19/00: Manipulating 3D models or images for computer graphics
    • G06T 19/006: Mixed reality
    • H: ELECTRICITY
    • H04: ELECTRIC COMMUNICATION TECHNIQUE
    • H04N: PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N 13/00: Stereoscopic video systems; Multi-view video systems; Details thereof
    • H04N 13/10: Processing, recording or transmission of stereoscopic or multi-view image signals
    • H04N 13/106: Processing image signals
    • H04N 13/156: Mixing image signals
    • H: ELECTRICITY
    • H04: ELECTRIC COMMUNICATION TECHNIQUE
    • H04N: PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N 13/00: Stereoscopic video systems; Multi-view video systems; Details thereof
    • H04N 13/20: Image signal generators
    • H04N 13/275: Image signal generators from 3D object models, e.g. computer-generated stereoscopic image signals

Definitions

  • the present invention relates to a computer-readable storage medium, an image display apparatus, an image display system, and an image display method, and more particularly, to a computer-readable storage medium, an image display apparatus, an image display system, and an image display method for stereoscopically displaying a real world image onto which a 3-dimensional virtual object is superimposed, on a screen of a stereoscopic display apparatus capable of providing a stereoscopic view.
  • a stereoscopic display apparatus disclosed in Patent Literature 1 (Japanese Laid-Open Patent Publication No. 2008-146109) calculates the relative positions and relative orientations of a camera for a right eye and a camera for a left eye with respect to a marker placed in the real world, from respective images shot by the camera for a right eye and the camera for a left eye, which are attached to a head mounted display, and then, based on the result of the calculation, generates an image of a virtual object for a right eye and an image of the virtual object for a left eye.
  • the stereoscopic display apparatus superimposes the image of the virtual object for a right eye and the image of the virtual object for a left eye onto the images shot by the camera for a right eye and the camera for a left eye, respectively, and displays the resultant composite images on an LCD (liquid crystal display) for a right eye and an LCD for a left eye, respectively.
  • Non-Patent Literature 1 (Hirokazu Kato, Mark Billinghurst, “Marker Tracking and HMD Calibration for a Video-Based Augmented Reality Conferencing System,” Proceedings of the 2nd IEEE and ACM International Workshop on Augmented Reality (IWAR '99), p. 85, 1999) discloses a method of, based on the position and the orientation of a marker in an image shot by a camera, calculating the relative positions and the relative orientations of the marker and the camera with respect to each other in the real world.
  • an object of the present invention is to provide a computer-readable storage medium, an image display apparatus, an image display system, and an image display method that are capable of displaying a virtual object so as to be normally stereoscopically visible.
  • the present invention has the following features to attain the object mentioned above.
  • a computer-readable storage medium of the present invention is a computer-readable storage medium having stored therein an image display program which stereoscopically displays a real world image onto which a 3-dimensional virtual object is superimposed, on a screen of a stereoscopic display apparatus capable of providing a stereoscopic view, by using outputs from a real camera for a right eye and a real camera for a left eye, the image display program causing a computer to function as first position orientation calculation means, virtual camera setting means, right virtual space image generation means, left virtual space image generation means, and display control means.
  • the first position orientation calculation means recognizes a predetermined shooting target in a piece of real world image data outputted from one of the real camera for a right eye and the real camera for a left eye, among pieces of real world image data respectively outputted from the real camera for a right eye and the real camera for a left eye, and calculates position orientation information indicating relative positions and relative orientations of the one of the real cameras and the predetermined shooting target with respect to each other.
  • the virtual camera setting means determines the position and the orientation of a right virtual camera for generating an image for a right eye, and the position and the orientation of a left virtual camera for generating an image for a left eye, in a predetermined virtual space, by using the position orientation information calculated by the first position orientation calculation means.
  • the right virtual space image generation means generates a right virtual space image indicating the predetermined virtual space as it is looked at from the right virtual camera.
  • the left virtual space image generation means generates a left virtual space image indicating the predetermined virtual space as it is looked at from the left virtual camera.
  • the display control means superimposes the right virtual space image onto the piece of real world image data outputted from the real camera for a right eye, and superimposes the left virtual space image onto the piece of real world image data outputted from the real camera for a left eye, to output images for a stereoscopic view to the stereoscopic display apparatus.
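  • By way of illustration only, the per-frame flow of the five means described above can be sketched as follows in C++. This is not the patent's implementation; every type and function name is hypothetical, and marker recognition, rendering, and compositing are left as stubs:

```cpp
// Illustrative sketch only (hypothetical names, stubbed internals); it is not the
// patent's implementation. It follows the five means described above for one frame.
#include <optional>

struct Mat4  { float m[16]; };     // 4x4 transform (marker-to-camera pose / view matrix)
struct Image { /* pixel data omitted */ };

// First position orientation calculation means: returns the marker-to-camera
// transform if the predetermined shooting target is recognized, otherwise nothing.
std::optional<Mat4> recognizeMarker(const Image&) { return std::nullopt; }          // stub

// Virtual camera setting means (see the later sketches for the actual offset math).
Mat4 offsetViewLaterally(const Mat4& view, float /*interval*/) { return view; }     // placeholder

// Right/left virtual space image generation means and display control means.
Image renderVirtualSpace(const Mat4&)            { return {}; }                     // stub
Image composite(const Image& real, const Image&) { return real; }                   // stub

void drawStereoFrame(const Image& leftReal, const Image& rightReal, float interval) {
    // Recognize the shooting target in only one real world image (here, the left one).
    std::optional<Mat4> markerToLeft = recognizeMarker(leftReal);
    if (!markerToLeft) return;                                  // target not recognized this frame

    Mat4 leftView  = *markerToLeft;                             // left virtual camera from the recognition result
    Mat4 rightView = offsetViewLaterally(leftView, interval);   // right virtual camera derived from it

    Image leftOut  = composite(leftReal,  renderVirtualSpace(leftView));
    Image rightOut = composite(rightReal, renderVirtualSpace(rightView));
    (void)leftOut; (void)rightOut;                              // would be sent to the stereoscopic display here
}
```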
  • the virtual camera setting means may determine the position and the orientation of the right virtual camera for generating the image for a right eye, and the position and the orientation of the left virtual camera for generating the image for a left eye, in the predetermined virtual space, without using a result of recognition of the predetermined shooting target in the piece of real world image data outputted from the other one of the real cameras which is different from the one of the real cameras in the first position orientation calculation means.
  • the position and the orientation of one of the two virtual cameras are determined based on the position orientation information calculated based on the one, of the two pieces of real world image data, in which the predetermined shooting target can be recognized, and the position and the orientation of the other one of the two virtual cameras are determined based on the position and the orientation of the one of the two virtual cameras which have been determined. Therefore, even if the predetermined shooting target can be recognized only in one of the two pieces of real world image data respectively outputted from the two real cameras, a virtual object can be appropriately displayed in a stereoscopic manner.
  • a virtual object can be appropriately displayed in a stereoscopic manner merely by recognizing the predetermined shooting target in one of the two pieces of real world image data.
  • a load of processing on the computer can be reduced.
  • the virtual camera setting means may determine the positions and the orientations of the right virtual camera and the left virtual camera such that the relationship between the relative orientations of the right virtual camera and the left virtual camera with respect to each other is the same as the relationship between designed relative orientations of the real camera for a right eye and the real camera for a left eye with respect to each other.
  • the relative orientations of the right virtual camera and the left virtual camera can be set appropriately. Therefore, a virtual object can be appropriately displayed in a stereoscopic manner.
  • the virtual camera setting means may include: a first virtual camera orientation determination section for determining the orientation of one of the right virtual camera and the left virtual camera that corresponds to the one of the real cameras in the first position orientation calculation means, by using the position orientation information calculated by the first position orientation calculation means; and a second virtual camera orientation determination section for, based on the orientation of the one of the right virtual camera and the left virtual camera determined by the first virtual camera orientation determination section, determining the orientation of the other one of the right virtual camera and the left virtual camera such that the relationship between the relative orientations of the right virtual camera and the left virtual camera with respect to each other is the same as the relationship between designed relative orientations of the real camera for a right eye and the real camera for a left eye with respect to each other.
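  • A minimal sketch of the orientation determination described above, assuming the parallel design of the two outer imaging sections described later in the specification (so that the designed relative orientation of the real cameras is the identity); the names are illustrative:

```cpp
// Illustrative sketch (not from the patent). Because the outer imaging sections are
// designed so that their imaging directions are parallel, the designed relative
// orientation of the real cameras is the identity; keeping the same relationship
// between the virtual cameras then means giving both the same rotation.
struct Mat3 { float r[3][3]; };   // orientation of a virtual camera

Mat3 secondVirtualCameraOrientation(const Mat3& firstVirtualCameraOrientation) {
    // designed relative orientation = identity  =>  R_other = R_one
    return firstVirtualCameraOrientation;
}
```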
  • the image display program may further cause the computer to function as virtual camera relative position relationship determination means for determining a relative position relationship between the right virtual camera and the left virtual camera.
  • the virtual camera setting means may include: a first virtual camera position determination section for determining the position of one of the right virtual camera and the left virtual camera that corresponds to the one of the real cameras in the first position orientation calculation means, by using the position orientation information calculated by the first position orientation calculation means; and a second camera position determination section for determining the position of the other one of the right virtual camera and the left virtual camera so as to be separated, by the relative position determined by the virtual camera relative position relationship determination means, from the position of the one of the right virtual camera and the left virtual camera determined by the first virtual camera position determination section.
  • the “relative position relationship” mentioned above may be the distance between the left virtual camera and the right virtual camera, or may be the relative position of one of the virtual cameras with respect to the other one of the virtual cameras.
  • the virtual camera relative position relationship determination means may determine the relative position relationship between the right virtual camera and the left virtual camera, based on a result of recognition of the predetermined shooting target in the two pieces of real world image data respectively outputted from the two real cameras, or may determine the relative position relationship between the right virtual camera and the left virtual camera, based on the designed relative relationship between the real camera for a right eye and the real camera for a left eye.
  • the relative position relationship between the right virtual camera and the left virtual camera can be set appropriately. Therefore, a virtual object can be appropriately displayed in a stereoscopic manner.
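  • As an illustrative sketch of this choice (the names are hypothetical), the interval between the virtual cameras may be taken from a measurement when the shooting target is recognized in both images, and otherwise from the designed spacing of the real cameras:

```cpp
// Illustrative sketch (hypothetical names): pick the virtual camera interval either
// from a measurement (shooting target recognized in both real world images) or,
// failing that, from the designed spacing of the real camera for a right eye and
// the real camera for a left eye.
#include <optional>

float chooseVirtualCameraInterval(std::optional<float> measuredInterval,
                                  float designedInterval /* from the hardware design */) {
    return measuredInterval ? *measuredInterval : designedInterval;
}
```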
  • the virtual camera setting means may include: a first virtual camera position determination section for determining the position of one of the right virtual camera and the left virtual camera that corresponds to the one of the real cameras in the first position orientation calculation means, by using the position orientation information calculated by the first position orientation calculation means; a first virtual camera orientation determination section for determining the orientation of the one of the right virtual camera and the left virtual camera that corresponds to the one of the real cameras in the first position orientation calculation means, by using the position orientation information calculated by the first position orientation calculation means; and a second virtual camera position determination section for determining the position of the other one of the right virtual camera and the left virtual camera so as to be present, in a direction based on the orientation of the one of the right virtual camera and the left virtual camera determined by the first virtual camera orientation determination section, from the position of the one of the right virtual camera and the left virtual camera determined by the first virtual camera position determination section.
  • the second virtual camera position determination section may determine the position of the other one of the right virtual camera and the left virtual camera so as to be present, in the lateral direction of the orientation of the one of the right virtual camera and the left virtual camera determined by the first virtual camera orientation determination section, from the position of the one of the right virtual camera and the left virtual camera determined by the first virtual camera position determination section.
  • the image display program may further cause the computer to function as virtual cameras distance determination means for determining the distance between the right virtual camera and the left virtual camera.
  • the second virtual camera position determination section may determine the position of the other one of the right virtual camera and the left virtual camera so as to be separated by the distance determined by the virtual cameras distance determination means, in the direction based on the orientation of the one of the right virtual camera and the left virtual camera determined by the first virtual camera orientation determination section, from the position of the one of the right virtual camera and the left virtual camera determined by the first virtual camera position determination section.
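  • The lateral placement described above might be sketched as follows (illustrative names; the rotation columns are assumed to hold the camera's local axes expressed in the marker coordinate system):

```cpp
// Illustrative sketch (hypothetical names): the other virtual camera is placed at the
// determined distance along the lateral (local x) axis of the virtual camera whose
// pose was obtained from the recognition result. The rotation columns are assumed to
// hold that camera's local axes expressed in the marker coordinate system.
struct Vec3 { float x, y, z; };
struct Pose {
    float r[3][3];   // columns: local x, y, z axes in marker space
    Vec3  pos;       // camera position in marker space
};

Pose secondVirtualCameraPose(const Pose& first, float interval /* signed lateral offset */) {
    Vec3 lateral = { first.r[0][0], first.r[1][0], first.r[2][0] };   // camera's local x axis
    Pose second  = first;                                             // same orientation (parallel design)
    second.pos   = { first.pos.x + interval * lateral.x,
                     first.pos.y + interval * lateral.y,
                     first.pos.z + interval * lateral.z };
    return second;
}
```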
  • the virtual camera relative position relationship determination means may determine the relative position relationship between the right virtual camera and the left virtual camera, based on a parallax between the pieces of real world image data respectively outputted from the real camera for a right eye and the real camera for a left eye.
  • the relative position relationship between the right virtual camera and the left virtual camera is determined based on the parallax between the two pieces of real world image data. Therefore, even if the relative position relationship between the two real cameras is not known, or even if the relative position relationship between the two real cameras has error because of error in the attachment of the two real cameras, the right virtual camera and the left virtual camera can be set appropriately.
  • the relative position relationship which is determined by the relative position relationship determination means may be the interval between the right virtual camera and the left virtual camera.
  • the image display program may further cause the computer to function as second position orientation calculation means for recognizing the predetermined shooting target in the piece of real world image data outputted from the other one of the real cameras which is different from the one of the real cameras in the first position orientation calculation means, among the two pieces of real world image data respectively outputted from the two real cameras, and calculating position orientation information indicating relative positions and relative orientations of the other one of the real cameras and the predetermined shooting target with respect to each other.
  • the virtual camera relative position relationship determination means may calculate the distance between the one of the real cameras and the other one of the real cameras, based on the information, about the relative positions of the one of the real cameras and the predetermined shooting target, which is calculated by the first position orientation calculation means, and on the information, about the relative positions of the other one of the real cameras and the predetermined shooting target, which is calculated by the second position orientation calculation means.
  • the first position orientation calculation means may include first transform matrix generation means for, based on the piece of real world image data outputted from the one of the real cameras, generating a first transform matrix that transforms a coordinate value represented in a coordinate system whose origin is the position of the predetermined shooting target, into a coordinate value represented in a first shooting section coordinate system whose origin is the position of the one of the real cameras.
  • the second position orientation calculation means may include second transform matrix generation means for, based on the piece of real world image data outputted from the other one of the real cameras, generating a second transform matrix that transforms a coordinate value represented in a coordinate system whose origin is the position of the predetermined shooting target, into a coordinate value represented in a second shooting section coordinate system whose origin is the position of the other one of the real cameras.
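  • A sketch of how the distance between the two real cameras could be obtained from the first and second transform matrices (illustrative names; the transform is assumed to map marker coordinates to shooting section coordinates as p_camera = R p_marker + t):

```cpp
// Illustrative sketch (hypothetical names): given the first and second transform
// matrices (marker coordinates -> first/second shooting section coordinates,
// p_camera = R * p_marker + t), each real camera's position in the marker coordinate
// system is -R^T * t, and the distance between the two positions is the interval.
#include <cmath>

struct MarkerToCamera {
    float R[3][3];   // rotation part
    float t[3];      // translation part
};

static void cameraPositionInMarkerSpace(const MarkerToCamera& m, float out[3]) {
    // camera center c satisfies R*c + t = 0, hence c = -R^T * t
    for (int i = 0; i < 3; ++i)
        out[i] = -(m.R[0][i] * m.t[0] + m.R[1][i] * m.t[1] + m.R[2][i] * m.t[2]);
}

float realCameraInterval(const MarkerToCamera& first, const MarkerToCamera& second) {
    float a[3], b[3];
    cameraPositionInMarkerSpace(first, a);
    cameraPositionInMarkerSpace(second, b);
    const float dx = b[0] - a[0], dy = b[1] - a[1], dz = b[2] - a[2];
    return std::sqrt(dx * dx + dy * dy + dz * dz);
}
```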
  • the virtual camera relative position relationship determination means may execute processing of calculating the relative position relationship between the right virtual camera and the left virtual camera every time a new piece of real world image data is outputted from each of the two real cameras, and may determine the relative position relationship between the right virtual camera and the left virtual camera, based on results of a plurality of the calculations of the relative position relationship obtained by the processing being executed a plurality of times.
  • the virtual camera relative position relationship determination means may determine the relative position relationship between the right virtual camera and the left virtual camera, based on the results of the plurality of calculations of the relative position relationship, only when all the results of the plurality of calculations of the relative position relationship are within a predetermined range.
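  • One possible sketch of this accumulation (the window size and tolerance are made-up example values, not values from the patent):

```cpp
// Illustrative sketch: collect one interval calculation per pair of new real world
// images and adopt a value only when every sample in the window lies within a
// tolerance of the others. Window size and tolerance are made-up example values.
#include <algorithm>
#include <cstddef>
#include <optional>
#include <vector>

std::optional<float> settleInterval(std::vector<float>& samples, float newSample,
                                    std::size_t windowSize = 10, float tolerance = 1.0f) {
    samples.push_back(newSample);
    if (samples.size() < windowSize) return std::nullopt;   // keep collecting

    const auto [minIt, maxIt] = std::minmax_element(samples.begin(), samples.end());
    const bool allWithinRange = (*maxIt - *minIt) <= tolerance;

    float average = 0.0f;
    for (float s : samples) average += s;
    average /= static_cast<float>(samples.size());

    samples.clear();                                         // start a fresh window either way
    return allWithinRange ? std::optional<float>(average) : std::nullopt;
}
```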
  • the image display program may further cause the computer to function as depth variation determination means for determining whether or not the depth distance from each of the two real cameras to the predetermined shooting target has varied beyond a predetermined range in comparison with the depth distance as it was when the virtual camera relative position relationship determination means determined the relative position relationship between the right virtual camera and the left virtual camera.
  • the virtual camera relative position relationship determination means may determine again the relative position relationship between the right virtual camera and the left virtual camera, when the result of the determination by the depth variation determination means is positive.
  • the magnitude of influence of error in attachment of the two real cameras varies depending on the depth distance from each of the two real cameras to the predetermined shooting target. According to the above configuration, the influence of error in attachment of the two real cameras can appropriately be corrected as necessary at any time.
  • the image display program may further cause the computer to function as depth distance calculation means for calculating the depth distance from each of the two real cameras to the predetermined shooting target, based on a piece of real world image data that includes the predetermined shooting target.
  • the depth variation determination means may determine whether or not the depth distance from each of the two real cameras to the predetermined shooting target has varied beyond the predetermined range by comparing a reference depth distance calculated by the depth distance calculation means at a time when the virtual camera relative position relationship determination means determined the relative position relationship between the right virtual camera and the left virtual camera, with the latest depth distance calculated by the depth distance calculation means after the calculation of the reference depth distance.
  • the depth distance calculation means may execute processing of calculating the depth distance from each of the two real cameras to the predetermined shooting target every time a new piece of real world image data is outputted from each of the two real cameras, and may calculate the reference depth distance, based on results of a plurality of the calculations of the depth distance obtained by the processing being executed a plurality of times.
  • the depth distance calculation means may determine the reference depth distance, based on the results of the plurality of calculations of the depth distance, only when all the results of the plurality of calculations of the depth distance are within a predetermined range.
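  • A sketch of the depth variation check described above (the allowed deviation ratio is a made-up example value, not a value from the patent):

```cpp
// Illustrative sketch: the reference depth distance recorded when the interval was
// determined is compared with the latest depth distance; if the drift exceeds a
// predetermined range, the interval is determined again. The ratio is a made-up value.
struct IntervalCalibration {
    float interval;         // virtual camera interval adopted earlier
    float referenceDepth;   // depth distance to the shooting target at that time
};

bool shouldRedetermineInterval(const IntervalCalibration& cal, float latestDepth,
                               float allowedRatio = 0.25f) {
    float deviation = latestDepth - cal.referenceDepth;
    if (deviation < 0.0f) deviation = -deviation;
    return deviation > cal.referenceDepth * allowedRatio;    // varied beyond the predetermined range
}
```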
  • the image display program may further cause the computer to function as: second position orientation calculation means for recognizing the predetermined shooting target in the piece of real world image data outputted from the other one of the real cameras which is different from the one of the real cameras in the first position orientation calculation means, among the two pieces of real world image data respectively outputted from the two real cameras, and calculating position orientation information indicating relative positions and relative orientations of the other one of the real cameras and the predetermined shooting target with respect to each other; and real camera selection means for selecting one of the two real cameras in accordance with the position of the predetermined shooting target in at least one of the two pieces of real world image data respectively outputted from the two real cameras.
  • the virtual camera setting means may, if the real camera selected by the real camera selection means is the one of the real cameras, determine the position and the orientation of the right virtual camera, and the position and the orientation of the left virtual camera, by using the position orientation information indicating the relative positions and relative orientations of the one of the real cameras and the predetermined shooting target, which is calculated by the first position orientation calculation means, and if the real camera selected by the real camera selection means is the other one of the real cameras, determine the position and the orientation of the right virtual camera, and the position and the orientation of the left virtual camera, by using the position orientation information indicating the relative positions and relative orientations of the other one of the real cameras and the predetermined shooting target, which is calculated by the second position orientation calculation means.
  • the right virtual camera and the left virtual camera can be set appropriately. Therefore, it is possible to keep stereoscopically displaying a virtual object.
  • the real camera selection means may switch the real camera for a left eye to the real camera for a right eye, in accordance with entrance of the position of the predetermined shooting target in the piece of real world image data outputted from the real camera for a left eye of the two real cameras into a right end area of the piece of real world image data, and may switch the real camera for a right eye to the real camera for a left eye, in accordance with entrance of the position of the predetermined shooting target in the piece of real world image data outputted from the real camera for a right eye of the two real cameras into a left end area of the piece of real world image data.
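  • A sketch of this switching logic (the normalized end-area thresholds are made-up example values, not values from the patent):

```cpp
// Illustrative sketch: switch the main real world image when the shooting target
// enters the end area on the side at which it is about to leave the current main
// image. The normalized end-area thresholds are made-up example values.
enum class MainImage { Left, Right };

MainImage updateMainImage(MainImage current,
                          float markerX /* normalized: 0 = left edge, 1 = right edge of the main image */) {
    const float rightEndArea = 0.8f;   // right end area of the left real world image
    const float leftEndArea  = 0.2f;   // left end area of the right real world image

    if (current == MainImage::Left  && markerX > rightEndArea) return MainImage::Right;
    if (current == MainImage::Right && markerX < leftEndArea)  return MainImage::Left;
    return current;                    // otherwise keep the current main real world image
}
```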
  • the pieces of real world image data to be used by the first position orientation calculation means and the display control means may be outputted in real time from the real cameras.
  • the computer may be included in an information processing apparatus having the two real cameras and the stereoscopic display apparatus.
  • An image display apparatus stereoscopically displays a real world image onto which a 3-dimensional virtual object is superimposed, on a screen of a stereoscopic display apparatus capable of providing a stereoscopic view, by using outputs from a real camera for a right eye and a real camera for a left eye, and comprises first position orientation calculation means, virtual camera setting means, right virtual space image generation means, left virtual space image generation means, and display control means.
  • the first position orientation calculation means recognizes a predetermined shooting target in a piece of real world image data outputted from one of the real camera for a right eye and the real camera for a left eye, among pieces of real world image data respectively outputted from the real camera for a right eye and the real camera for a left eye, and calculates position orientation information indicating relative positions and relative orientations of the one of the real cameras and the predetermined shooting target with respect to each other.
  • the virtual camera setting means determines the position and the orientation of a right virtual camera for generating an image for a right eye, and the position and the orientation of a left virtual camera for generating an image for a left eye, in a predetermined virtual space, by using the position orientation information calculated by the first position orientation calculation means.
  • the right virtual space image generation means generates a right virtual space image indicating the predetermined virtual space as it is looked at from the right virtual camera.
  • the left virtual space image generation means generates a left virtual space image indicating the predetermined virtual space as it is looked at from the left virtual camera.
  • the display control means superimposes the right virtual space image onto the piece of real world image data outputted from the real camera for a right eye, and superimposes the left virtual space image onto the piece of real world image data outputted from the real camera for a left eye, to output images for a stereoscopic view to the stereoscopic display apparatus.
  • An image display system stereoscopically displays a real world image onto which a 3-dimensional virtual object is superimposed, on a screen of a stereoscopic display apparatus capable of providing a stereoscopic view, by using outputs from a real camera for a right eye and a real camera for a left eye, and comprises a predetermined shooting target, first position orientation calculation means, virtual camera setting means, right virtual space image generation means, left virtual space image generation means, and display control means.
  • the first position orientation calculation means recognizes the predetermined shooting target in a piece of real world image data outputted from one of the real camera for a right eye and the real camera for a left eye, among pieces of real world image data respectively outputted from the real camera for a right eye and the real camera for a left eye, and calculates position orientation information indicating relative positions and relative orientations of the one of the real cameras and the predetermined shooting target with respect to each other.
  • the virtual camera setting means determines the position and the orientation of a right virtual camera for generating an image for a right eye, and the position and the orientation of a left virtual camera for generating an image for a left eye, in a predetermined virtual space, by using the position orientation information calculated by the first position orientation calculation means.
  • the right virtual space image generation means generates a right virtual space image indicating the predetermined virtual space as it is looked at from the right virtual camera.
  • the left virtual space image generation means generates a left virtual space image indicating the predetermined virtual space as it is looked at from the left virtual camera.
  • the display control means superimposes the right virtual space image onto the piece of real world image data outputted from the real camera for a right eye, and superimposes the left virtual space image onto the piece of real world image data outputted from the real camera for a left eye, to output images for a stereoscopic view to the stereoscopic display apparatus.
  • An image display method stereoscopically displays a real world image onto which a 3-dimensional virtual object is superimposed, on a screen of a stereoscopic display apparatus capable of providing a stereoscopic view, by using outputs from a real camera for a right eye and a real camera for a left eye, and comprises a first position orientation calculation step, a virtual camera setting step, a right virtual space image generation step, a left virtual space image generation step, and a display control step.
  • the first position orientation calculation step recognizes a predetermined shooting target in a piece of real world image data outputted from one of the real camera for a right eye and the real camera for a left eye, among pieces of real world image data respectively outputted from the real camera for a right eye and the real camera for a left eye, and calculates position orientation information indicating relative positions and relative orientations of the one of the real cameras and the predetermined shooting target with respect to each other.
  • the virtual camera setting step determines the position and the orientation of a right virtual camera for generating an image for a right eye, and the position and the orientation of a left virtual camera for generating an image for a left eye, in a predetermined virtual space, by using the position orientation information calculated in the first position orientation calculation step.
  • the right virtual space image generation step generates a right virtual space image indicating the predetermined virtual space as it is looked at from the right virtual camera.
  • the left virtual space image generation step generates a left virtual space image indicating the predetermined virtual space as it is looked at from the left virtual camera.
  • the display control step superimposes the right virtual space image onto the piece of real world image data outputted from the real camera for a right eye, and superimposes the left virtual space image onto the piece of real world image data outputted from the real camera for a left eye, to output images for a stereoscopic view to the stereoscopic display apparatus.
  • FIG. 1 is a front view of a game apparatus 10 in its opened state
  • FIG. 2 is a side view of the game apparatus 10 in its opened state
  • FIG. 3 shows a left side view, a front view, a right side view, and a rear view of the game apparatus 10 in its closed state;
  • FIG. 4 is a cross-sectional view of an upper housing 21 shown in FIG. 1 taken along a line A-A′;
  • FIG. 5A is a diagram illustrating a state in which a slider 25 a of a 3D adjustment switch 25 is positioned at the lowermost position (a third position);
  • FIG. 5B is a diagram illustrating a state in which the slider 25 a of the 3D adjustment switch 25 is positioned at a position (a first position) above the lowermost position;
  • FIG. 5C is a diagram illustrating a state in which the slider 25 a of the 3D adjustment switch 25 is positioned at the uppermost position (a second position);
  • FIG. 6 is a block diagram illustrating an internal configuration of the game apparatus 10 ;
  • FIG. 7 shows an example of a stereoscopic image displayed on a screen of an upper LCD 22 ;
  • FIG. 8 shows another example of a stereoscopic image displayed on the screen of the upper LCD 22 ;
  • FIG. 9 shows a marker 61
  • FIG. 10 shows another example of a stereoscopic image displayed on the screen of the upper LCD 22 ;
  • FIG. 11 shows a memory map of a main memory 32 of the game apparatus 10 ;
  • FIG. 12 shows an example of variables stored in the main memory 32 ;
  • FIG. 13 is a flowchart showing a flow of marker processing
  • FIG. 14 is a flowchart showing a flow of main processing
  • FIG. 15 is a flowchart showing the details of updating processing
  • FIG. 16 is a flowchart showing the details of virtual cameras interval determination processing
  • FIG. 17 is a flowchart showing the details of view matrix generation processing
  • FIG. 18 is a flowchart showing the details of main real world image setting processing
  • FIG. 19 is a flowchart showing the details of display mode switching processing
  • FIG. 20 shows an example of a left real world image and a right real world image
  • FIG. 21 shows the position and the orientation of a left virtual camera 63 L calculated in accordance with a result of marker recognition processing
  • FIG. 22 shows the position and the orientation of a right virtual camera 63 R calculated in accordance with a result of marker recognition processing
  • FIG. 23 shows an area cut from the left real world image, based on a stereoscopic view zero distance
  • FIG. 24 shows an area cut from the right real world image, based on the stereoscopic view zero distance
  • FIG. 25 shows a position relationship between a virtual object 62 and a left virtual camera 63 L
  • FIG. 26 shows a method of generating an image for a left eye
  • FIG. 27 shows a method of calculating coordinates of a right virtual camera 63 R represented in a left virtual camera coordinate system
  • FIG. 28 shows ew and ed
  • FIG. 29 shows a method of calculating coordinates of the left virtual camera 63 L and coordinates of the right virtual camera 63 R represented in a marker coordinate system
  • FIG. 30 shows the position of the right virtual camera 63 R determined based on the position and the orientation of the left virtual camera 63 L;
  • FIG. 31 shows the position of the left virtual camera 63 L determined based on the position and the orientation of the right virtual camera 63 R;
  • FIG. 32 shows
  • FIG. 33 is a diagram illustrating a reason for calculating EyeWidth again
  • FIG. 34 is a diagram illustrating the reason for calculating EyeWidth again.
  • FIG. 35 is a diagram illustrating the reason for calculating EyeWidth again.
  • FIG. 36 is a diagram illustrating the reason for calculating EyeWidth again.
  • FIG. 37 is a diagram illustrating a left projection matrix
  • FIG. 38 is a diagram illustrating a right projection matrix
  • FIG. 39 is a diagram illustrating a method of determining whether or not to switch a main real world image from the left real world image to the right real world image.
  • FIG. 40 is a diagram illustrating a method of determining whether or not to switch the main real world image from the right real world image to the left real world image.
  • FIG. 1 to FIG. 3 are each a plan view of an outer appearance of a game apparatus 10 .
  • the game apparatus 10 is a hand-held game apparatus, and is configured to be foldable as shown in FIG. 1 to FIG. 3 .
  • FIG. 1 and FIG. 2 show the game apparatus 10 in an opened state
  • FIG. 3 shows the game apparatus 10 in a closed state.
  • FIG. 1 is a front view of the game apparatus 10 in the opened state
  • FIG. 2 is a right side view of the game apparatus 10 in the opened state.
  • the game apparatus 10 is able to shoot an image by means of an imaging section, display the shot image on a screen, and store data of the shot image.
  • the game apparatus 10 can execute a game program which is stored in an exchangeable memory card or a game program which is received from a server or another game apparatus, and can display, on the screen, an image generated by computer graphics processing, such as an image shot by a virtual camera set in a virtual space, for example.
  • the game apparatus 10 includes a lower housing 11 and an upper housing 21 as shown in FIG. 1 to FIG. 3 .
  • the lower housing 11 and the upper housing 21 are connected to each other so as to be openable and closable (foldable).
  • the lower housing 11 and the upper housing 21 are each formed in a horizontally long plate-like rectangular shape, and are connected to each other at long side portions thereof so as to be pivotable with respect to each other.
  • projections 11 A each of which projects in a direction orthogonal to an inner side surface (main surface) 11 B of the lower housing 11 are provided at the upper long side portion of the lower housing 11
  • a projection 21 A which projects from the lower side surface of the upper housing 21 in a direction orthogonal to the lower side surface of the upper housing 21 is provided at the lower long side portion of the upper housing 21 . Since the projections 11 A of the lower housing 11 and the projection 21 A of the upper housing 21 are connected to each other, the lower housing 11 and the upper housing 21 are foldably connected to each other.
  • a structure of the lower housing 11 will be described.
  • in the lower housing 11 , a lower LCD (Liquid Crystal Display) 12 , a touch panel 13 , operation buttons 14 A to 14 L ( FIG. 1 , FIG. 3 ), an analog stick 15 , an LED 16 A and an LED 16 B, an insertion opening 17 , and a microphone hole 18 are provided.
  • these components will be described in detail.
  • the lower LCD 12 is accommodated in the lower housing 11 .
  • the lower LCD 12 has a horizontally long shape, and is located such that a long side direction thereof corresponds to a long side direction of the lower housing 11 .
  • the lower LCD 12 is positioned at the center of the lower housing 11 .
  • the lower LCD 12 is provided on the inner side surface (main surface) of the lower housing 11 , and a screen of the lower LCD 12 is exposed at an opening of the lower housing 11 .
  • the number of pixels of the lower LCD 12 may be, for example, 256 dots × 192 dots (width × height).
  • the lower LCD 12 is a display device for displaying an image in a planar manner (not in a stereoscopically visible manner), which is different from the upper LCD 22 as described below.
  • although an LCD is used as a display device in the present embodiment, any other display device, such as a display device using an EL (Electro Luminescence) or the like, may be used.
  • a display device having any resolution may be used as the lower LCD 12 .
  • the game apparatus 10 includes the touch panel 13 as an input device.
  • the touch panel 13 is mounted on the screen of the lower LCD 12 .
  • the touch panel 13 may be, but is not limited to, a resistive film type touch panel.
  • a touch panel of any type such as electrostatic capacitance type may be used.
  • the touch panel 13 has the same resolution (detection accuracy) as that of the lower LCD 12 .
  • the resolution of the touch panel 13 and the resolution of the lower LCD 12 may not necessarily be the same.
  • the insertion opening 17 (indicated by dashed line in FIG. 1 and FIG. 3( d )) is provided on the upper side surface of the lower housing 11 .
  • the insertion opening 17 is used for accommodating a touch pen 28 which is used for performing an operation on the touch panel 13 .
  • an input on the touch panel 13 is usually made by using the touch pen 28
  • a finger of a user may be used for making an input on the touch panel 13 , besides the touch pen 28 .
  • the operation buttons 14 A to 14 L are each an input device for making a predetermined input. As shown in FIG. 1 , among operation buttons 14 A to 14 L, a cross button 14 A (a direction input button 14 A), a button 14 B, a button 14 C, a button 14 D, a button 14 E, a power button 14 F, a selection button 14 J, a HOME button 14 K, and a start button 14 L are provided on the inner side surface (main surface) of the lower housing 11 .
  • the cross button 14 A is cross-shaped, and includes buttons for indicating an upward, a downward, a leftward, or a rightward direction.
  • the button 14 B, button 14 C, button 14 D, and button 14 E are positioned so as to form a cross shape.
  • buttons 14 A to 14 E, the selection button 14 J, the HOME button 14 K, and the start button 14 L are assigned functions, respectively, in accordance with a program executed by the game apparatus 10 , as necessary.
  • the cross button 14 A is used for selection operation and the like, and the operation buttons 14 B to 14 E are used for, for example, determination operation and cancellation operation.
  • the power button 14 F is used for powering the game apparatus 10 on/off.
  • the analog stick 15 is a device for indicating a direction, and is provided to the left of the lower LCD 12 in an upper portion of the inner side surface of the lower housing 11 .
  • the cross button 14 A is provided to the left of the lower LCD 12 in the lower portion of the lower housing 11 . That is, the analog stick 15 is provided above the cross button 14 A.
  • the analog stick 15 and the cross button 14 A are positioned so as to be operated by a thumb of a left hand with which the lower housing is held.
  • the analog stick 15 is provided in the upper area, and thus the analog stick 15 is positioned such that a thumb of a left hand with which the lower housing 11 is held is naturally positioned on the position of the analog stick 15 , and the cross button 14 A is positioned such that the thumb of the left hand is positioned on the position of the cross button 14 A when the thumb of the left hand is slightly moved downward from the analog stick 15 .
  • the analog stick 15 has a top, corresponding to a key, which slides parallel to the inner side surface of the lower housing 11 .
  • the analog stick 15 acts in accordance with a program executed by the game apparatus 10 .
  • the analog stick 15 acts as an input device for moving the predetermined object in the 3-dimensional virtual space.
  • the predetermined object is moved in a direction in which the top corresponding to the key of the analog stick 15 slides.
  • a component which enables an analog input by being tilted by a predetermined amount in any direction, such as the upward, the downward, the rightward, the leftward, or the diagonal direction, may be used.
  • four buttons, that is, the button 14 B, the button 14 C, the button 14 D, and the button 14 E, which are positioned so as to form a cross shape, are positioned such that a thumb of a right hand with which the lower housing 11 is held is naturally positioned on the positions of the four buttons. Further, the four buttons and the analog stick 15 sandwich the lower LCD 12 , so as to be bilaterally symmetrical in position with respect to each other. Thus, depending on a game program, for example, a left-handed person can make a direction instruction input by using these four buttons.
  • the microphone hole 18 is provided on the inner side surface of the lower housing 11 .
  • a microphone (see FIG. 6 ) is provided as a sound input device described below, and the microphone detects a sound from the outside of the game apparatus 10 .
  • FIG. 3( a ) is a left side view of the game apparatus 10 in the closed state.
  • FIG. 3( b ) is a front view of the game apparatus 10 in the closed state.
  • FIG. 3( c ) is a right side view of the game apparatus 10 in the closed state.
  • FIG. 3( d ) is a rear view of the game apparatus 10 in the closed state.
  • an L button 14 G and an R button 14 H are provided on the upper side surface of the lower housing 11 .
  • the L button 14 G is positioned on the left end portion of the upper side surface of the lower housing 11 and the R button 14 H is positioned on the right end portion of the upper side surface of the lower housing 11 .
  • the L button 14 G and the R button 14 H can function as shutter buttons (imaging instruction buttons) of the imaging section, for example. Further, as shown in FIG. 3( a ), a sound volume button 14 I is provided on the left side surface of the lower housing 11 . The sound volume button 14 I is used for adjusting a sound volume of a speaker of the game apparatus 10 .
  • a cover section 11 C is provided on the left side surface of the lower housing 11 so as to be openable and closable.
  • a connector (not shown) is provided inside the cover section 11 C for electrically connecting between the game apparatus 10 and an external data storage memory 45 .
  • the external data storage memory 45 is detachably connected to the connector.
  • the external data storage memory 45 is used for, for example, recording (storing) data of an image shot by the game apparatus 10 .
  • the connector and the cover section 11 C may be provided on the right side surface of the lower housing 11 .
  • an insertion opening 11 D through which an external memory 44 having a game program stored therein is inserted is provided on the upper side surface of the lower housing 11 .
  • a connector (not shown) for electrically connecting between the game apparatus 10 and the external memory 44 in a detachable manner is provided inside the insertion opening 11 D.
  • a predetermined game program is executed by connecting the external memory 44 to the game apparatus 10 .
  • the connector and the insertion opening 11 D may be provided on another side surface (for example, the right side surface) of the lower housing 11 .
  • a first LED 16 A for notifying a user of an ON/OFF state of a power supply of the game apparatus 10 is provided on the lower side surface of the lower housing 11
  • a second LED 16 B for notifying a user of an establishment state of a wireless communication of the game apparatus 10 is provided on the right side surface of the lower housing 11 .
  • the game apparatus 10 can make wireless communication with other devices, and the second LED 16 B is lit up when the wireless communication is established.
  • the game apparatus 10 has a function of connecting to a wireless LAN in a method based on, for example, the IEEE 802.11b/g standard.
  • a wireless switch 19 for enabling/disabling the function of the wireless communication is provided on the right side surface of the lower housing 11 (see FIG. 3( c )).
  • as shown in FIG. 1 to FIG. 3 , in the upper housing 21 , an upper LCD (Liquid Crystal Display) 22 , an outer imaging section 23 (an outer imaging section (left) 23 a and an outer imaging section (right) 23 b ), an inner imaging section 24 , a 3D adjustment switch 25 , and a 3D indicator 26 are provided.
  • hereinafter, these components will be described in detail.
  • the upper LCD 22 is accommodated in the upper housing 21 .
  • the upper LCD 22 has a horizontally long shape, and is located such that a long side direction thereof corresponds to a long side direction of the upper housing 21 .
  • the upper LCD 22 is positioned at the center of the upper housing 21 .
  • the area of a screen of the upper LCD 22 is set so as to be greater than the area of the screen of the lower LCD 12 .
  • the screen of the upper LCD 22 is horizontally elongated as compared to the screen of the lower LCD 12 .
  • a rate of the horizontal width in the aspect ratio of the screen of the upper LCD 22 is set so as to be greater than a rate of the horizontal width in the aspect ratio of the screen of the lower LCD 12 .
  • the screen of the upper LCD 22 is provided on the inner side surface (main surface) 21 B of the upper housing 21 , and the screen of the upper LCD 22 is exposed at an opening of the upper housing 21 . Further, as shown in FIG. 2 , the inner side surface of the upper housing 21 is covered with a transparent screen cover 27 .
  • the screen cover 27 protects the screen of the upper LCD 22 , and integrates the upper LCD 22 and the inner side surface of the upper housing 21 with each other, thereby achieving unity.
  • the number of pixels of the upper LCD 22 may be, for example, 640 dots × 200 dots (the horizontal line × the vertical line).
  • although the upper LCD 22 is an LCD in the present embodiment, a display device using an EL (Electro Luminescence) or the like may be used. In addition, a display device having any resolution may be used as the upper LCD 22 .
  • the upper LCD 22 is a display device capable of displaying a stereoscopically visible image. Further, in the present embodiment, an image for a left eye and an image for a right eye are displayed by using substantially the same display area. Specifically, the upper LCD 22 may be a display device using a method in which the image for a left eye and the image for a right eye are alternately displayed in the horizontal direction in predetermined units (for example, every other line). Alternatively, a display device using a method in which the image for a left eye and the image for a right eye are alternately displayed for a predetermined time period may be used. Further, in the present embodiment, the upper LCD 22 is a display device capable of displaying an image which is stereoscopically visible with naked eyes.
  • a lenticular lens type display device or a parallax barrier type display device is used which enables the image for a left eye and the image for a right eye, which are alternately displayed in the horizontal direction, to be separately viewed by the left eye and the right eye, respectively.
  • the upper LCD 22 of a parallax barrier type is used.
  • the upper LCD 22 displays, by using the image for a right eye and the image for a left eye, an image (a stereoscopic image) which is stereoscopically visible with naked eyes.
  • the upper LCD 22 allows a user to view the image for a left eye with her/his left eye, and the image for a right eye with her/his right eye by utilizing a parallax barrier, so that a stereoscopic image (a stereoscopically visible image) exerting a stereoscopic effect for a user can be displayed. Further, the upper LCD 22 may disable the parallax barrier. When the parallax barrier is disabled, an image can be displayed in a planar manner (it is possible to display a planar visible image which is different from a stereoscopically visible image as described above. Specifically, a display mode is used in which the same displayed image is viewed with a left eye and a right eye).
  • the upper LCD 22 is a display device capable of switching between a stereoscopic display mode for displaying a stereoscopically visible image and a planar display mode (for displaying a planar visible image) for displaying an image in a planar manner.
  • the switching of the display mode is performed by the 3D adjustment switch 25 described later.
  • the imaging directions of the outer imaging section (left) 23 a and the outer imaging section (right) 23 b are each the same as the outward normal direction of the outer side surface 21 D. Further, these imaging sections are each designed so as to be positioned in a direction which is opposite to the normal direction of the display surface (inner side surface) of the upper LCD 22 by 180 degrees. Specifically, the imaging direction of the outer imaging section (left) 23 a and the imaging direction of the outer imaging section (right) 23 b are parallel to each other.
  • the outer imaging section (left) 23 a and the outer imaging section (right) 23 b can be used as a stereo camera depending on a program executed by the game apparatus 10 . Further, depending on a program, when any one of the two outer imaging sections ( 23 a and 23 b ) is used alone, the outer imaging section 23 may be used as a non-stereo camera. Further, depending on a program, images shot by the two outer imaging sections ( 23 a and 23 b ) may be combined with each other or may compensate for each other, thereby enabling imaging using an extended imaging range.
  • the outer imaging section 23 is structured so as to include two imaging sections, that is, the outer imaging section (left) 23 a and the outer imaging section (right) 23 b .
  • Each of the outer imaging section (left) 23 a and the outer imaging section (right) 23 b includes an imaging device, such as a CCD image sensor or a CMOS image sensor, having a common predetermined resolution, and a lens.
  • the lens may have a zooming mechanism.
  • the outer imaging section (left) 23 a and the outer imaging section (right) 23 b forming the outer imaging section 23 are aligned so as to be parallel to the horizontal direction of the screen of the upper LCD 22 .
  • the outer imaging section (left) 23 a and the outer imaging section (right) 23 b are positioned such that a straight line connecting between the two imaging sections is parallel to the horizontal direction of the screen of the upper LCD 22 .
  • Reference numerals 23 a and 23 b which are indicated as dashed lines in FIG. 1 represent the outer imaging section (left) 23 a and the outer imaging section (right) 23 b, respectively, which are positioned on the outer side surface reverse of the inner side surface of the upper housing 21 .
  • the outer imaging section (left) 23 a is positioned to the left of the upper LCD 22 and the outer imaging section (right) 23 b is positioned to the right of the upper LCD 22 .
  • the outer imaging section (left) 23 a shoots an image for a left eye, which is viewed by a left eye of a user
  • the outer imaging section (right) 23 b shoots an image for a right eye, which is viewed by a right eye of the user.
  • a distance between the outer imaging section (left) 23 a and the outer imaging section (right) 23 b is set so as to be approximately the same as a distance between both eyes of a person, that is, may be set so as to be within a range from 30 mm to 70 mm, for example.
  • the distance between the outer imaging section (left) 23 a and the outer imaging section (right) 23 b is not limited to a distance within the range described above.
  • the outer imaging section (left) 23 a and the outer imaging section (right) 23 b are secured to the housing, and the imaging directions thereof cannot be changed.
  • the outer imaging section (left) 23 a and the outer imaging section (right) 23 b are positioned to the left and to the right, respectively, of the upper LCD 22 (on the left side and the right side, respectively, of the upper housing 21 ) so as to be horizontally symmetrical with respect to the center of the upper LCD 22 .
  • the outer imaging section (left) 23 a and the outer imaging section (right) 23 b are positioned so as to be symmetrical with respect to a line which divides the upper LCD 22 into two equal parts, that is, the left part and the right part.
  • the outer imaging section (left) 23 a and the outer imaging section (right) 23 b are positioned at positions which are reverse of positions above the upper edge of the screen of the upper LCD 22 and which are on the upper portion of the upper housing 21 in an opened state. Specifically, when the upper LCD 22 is projected on the outer side surface of the upper housing 21 , the outer imaging section (left) 23 a and the outer imaging section (right) 23 b are positioned, on the outer side surface of the upper housing 21 , at a position above the upper edge of the screen of the upper LCD 22 having been projected.
  • the two imaging sections ( 23 a and 23 b ) of the outer imaging section 23 are positioned to the left and the right of the upper LCD 22 so as to be horizontally symmetrical with respect to the center of the upper LCD 22 . Therefore, when a user views the upper LCD 22 from the front thereof, the imaging direction of the outer imaging section 23 can be the same as the direction of the line of sight of the user. Further, the outer imaging section 23 is positioned at a position reverse of a position above the upper edge of the screen of the upper LCD 22 . Therefore, the outer imaging section 23 and the upper LCD 22 do not interfere with each other inside the upper housing 21 . Therefore, the upper housing 21 may have a reduced thickness as compared to a case where the outer imaging section 23 is positioned on a position reverse of a position of the screen of the upper LCD 22 .
  • the inner imaging section 24 is positioned on the inner side surface (main surface) 21 B of the upper housing 21 , and acts as an imaging section which has an imaging direction which is the same direction as the inward normal direction of the inner side surface.
  • the inner imaging section 24 includes an imaging device, such as a CCD image sensor and a CMOS image sensor, having a predetermined resolution, and a lens.
  • the lens may have a zooming mechanism.
  • the inner imaging section 24 is positioned, on the upper portion of the upper housing 21 , above the upper edge of the screen of the upper LCD 22 . Further, in this state, the inner imaging section 24 is positioned at the horizontal center of the upper housing 21 (on a line which separates the upper housing 21 (the screen of the upper LCD 22 ) into two equal parts, that is, the left part and the right part). Specifically, as shown in FIG. 1 and FIG. 3( b ), the inner imaging section 24 is positioned on the inner side surface of the upper housing 21 at a position reverse of the middle position between the left and the right imaging sections (the outer imaging section (left) 23 a and the outer imaging section (right) 23 b ) of the outer imaging section 23 .
  • that is, when the left and the right imaging sections of the outer imaging section 23 are projected on the inner side surface of the upper housing 21 , the inner imaging section 24 is positioned at the middle position between the left and the right imaging sections having been projected.
  • the dashed line 24 indicated in FIG. 3( b ) represents the inner imaging section 24 positioned on the inner side surface of the upper housing 21 .
  • the inner imaging section 24 is used for shooting an image in the direction opposite to that of the outer imaging section 23 .
  • the inner imaging section 24 is positioned on the inner side surface of the upper housing 21 at a position reverse of the middle position between the left and the right imaging sections of the outer imaging section 23 .
  • the inner imaging section 24 can shoot an image of a face of the user from the front thereof.
  • the left and the right imaging sections of the outer imaging section 23 do not interfere with the inner imaging section 24 inside the upper housing 21 , thereby enabling reduction of the thickness of the upper housing 21 .
  • the 3D adjustment switch 25 is a slide switch, and is used for switching a display mode of the upper LCD 22 as described above. Further, the 3D adjustment switch 25 is used for adjusting the stereoscopic effect of a stereoscopically visible image (stereoscopic image) which is displayed on the upper LCD 22 . As shown in FIG. 1 to FIG. 3 , the 3D adjustment switch 25 is provided at the end portions of the inner side surface and the right side surface of the upper housing 21 , and is positioned at a position at which the 3D adjustment switch 25 is visible to a user when the user views the upper LCD 22 from the front thereof. Further, an operation section of the 3D adjustment switch 25 projects on the inner side surface and the right side surface, and can be viewed and operated from both sides. All the switches other than the 3D adjustment switch 25 are provided on the lower housing 11 .
  • FIG. 4 is a cross-sectional view of the upper housing 21 shown in FIG. 1 taken along a line A-A′.
  • a recessed portion 21 C is formed at the right end portion of the inner side surface of the upper housing 21
  • the 3D adjustment switch 25 is provided in the recessed portion 21 C.
  • the 3D adjustment switch 25 is provided so as to be visible from the front surface and the right side surface of the upper housing 21 as shown in FIG. 1 and FIG. 2 .
  • a slider 25 a of the 3D adjustment switch 25 is slidable to any position in a predetermined direction (the height direction), and a display mode of the upper LCD 22 is determined in accordance with the position of the slider 25 a.
  • FIG. 5A to FIG. 5C are each a diagram illustrating a state in which the slider 25 a of the 3D adjustment switch 25 slides.
  • FIG. 5A is a diagram illustrating a state in which the slider 25 a of the 3D adjustment switch 25 is positioned at the lowermost position (a third position).
  • FIG. 5B is a diagram illustrating a state in which the slider 25 a of the 3D adjustment switch 25 is positioned above the lowermost position (a first position).
  • FIG. 5C is a diagram illustrating a state in which the slider 25 a of the 3D adjustment switch 25 is positioned at the uppermost position (a second position).
  • when the slider 25 a is positioned at the lowermost position (the third position), the upper LCD 22 is set to the planar display mode, and a planar image is displayed on the screen of the upper LCD 22 (the upper LCD 22 may remain set to the stereoscopic display mode, and the same image may be used for the image for a left eye and the image for a right eye, to perform planar display).
  • when the slider 25 a is positioned between the position shown in FIG. 5B (the first position, above the lowermost position) and the position shown in FIG. 5C (the uppermost position, that is, the second position), the upper LCD 22 is set to the stereoscopic display mode.
  • in this case, a stereoscopically visible image is displayed on the screen of the upper LCD 22 .
  • a manner in which the stereoscopic image is visible is adjusted in accordance with the position of the slider 25 a .
  • an amount of deviation in the horizontal direction between a position of an image for a right eye and a position of an image for a left eye is adjusted in accordance with the position of the slider 25 a.
  • the slider 25 a of the 3D adjustment switch 25 is configured so as to be fixed at the third position, and is slidable, along the height direction, to any position between the first position and the second position.
  • the slider 25 a is fixed at the third position by a projection (not shown) which projects, from the side surface of the 3D adjustment switch 25 , in the lateral direction shown in FIG. 5A , and does not slide upward from the third position unless a predetermined force or a force greater than the predetermined force is applied upward.
  • when the slider 25 a is positioned between the third position and the first position, the manner in which the stereoscopic image is visible is not adjusted; this range is intended as a margin.
  • the third position and the first position may be the same position, and, in this case, no margin is provided. Further, the third position may be provided between the first position and the second position.
  • a direction in which an amount of deviation in the horizontal direction between a position of an image for a right eye and a position of an image for a left eye is adjusted when the slider is moved from the third position toward the first position is opposite to a direction in which an amount of deviation in the horizontal direction between the position of the image for the right eye and the position of the image for the left eye is adjusted when the slider is moved from the third position toward the second position.
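To make the slider behavior above concrete, the following C++ sketch maps a normalized slider position to a display mode and a parallax scale. The normalization to [0, 1], the threshold value kFirstPos, and the behavior inside the margin are illustrative assumptions, not values taken from the apparatus.

```cpp
#include <algorithm>
#include <cstdio>

// Hypothetical slider travel normalized to [0, 1]:
// 0.0 = third (lowermost) position, kFirstPos = first position,
// 1.0 = second (uppermost) position. All numeric values are assumptions.
constexpr float kFirstPos = 0.15f;

struct DisplaySetting {
    bool  stereoscopic;   // true: stereoscopic display mode, false: planar display mode
    float parallaxScale;  // scale applied to the horizontal deviation between the eye images
};

DisplaySetting settingFromSlider(float sliderPos) {
    sliderPos = std::clamp(sliderPos, 0.0f, 1.0f);
    if (sliderPos <= 0.0f) {
        return {false, 0.0f};  // third position: planar display
    }
    if (sliderPos < kFirstPos) {
        return {true, 0.0f};   // margin between third and first position: no adjustment
    }
    // Between the first and second position the deviation grows with the slider position.
    const float t = (sliderPos - kFirstPos) / (1.0f - kFirstPos);
    return {true, t};
}

int main() {
    const float positions[] = {0.0f, 0.1f, 0.5f, 1.0f};
    for (float p : positions) {
        const DisplaySetting s = settingFromSlider(p);
        std::printf("slider=%.2f stereoscopic=%d parallaxScale=%.2f\n",
                    p, static_cast<int>(s.stereoscopic), s.parallaxScale);
    }
    return 0;
}
```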
  • the 3D indicator 26 indicates whether or not the upper LCD 22 is in the stereoscopic display mode.
  • the 3D indicator 26 is implemented as a LED, and is lit up when the stereoscopic display mode of the upper LCD 22 is enabled.
  • the 3D indicator 26 may be lit up only when the program processing for displaying a stereoscopically visible image is performed (namely, image processing in which an image for a left eye is different from an image for a right eye is performed in the case of the 3D adjustment switch being positioned between the first position and the second position) in a state where the upper LCD 22 is in the stereoscopic display mode.
  • the 3D indicator 26 is positioned near the screen of the upper LCD 22 on the inner side surface of the upper housing 21 .
  • when a user views the screen of the upper LCD 22 from the front thereof, the user can easily view the 3D indicator 26 . Therefore, even while viewing the screen of the upper LCD 22 , the user can easily recognize the display mode of the upper LCD 22 .
  • a speaker hole 21 E is provided on the inner side surface of the upper housing 21 . A sound is outputted through the speaker hole 21 E from a speaker 43 described below.
  • FIG. 6 is a block diagram illustrating an internal configuration of the game apparatus 10 .
  • the game apparatus 10 includes, in addition to the components described above, electronic components such as an information processing section 31 , a main memory 32 , an external memory interface (external memory I/F) 33 , an external data storage memory I/F 34 , an internal data storage memory 35 , a wireless communication module 36 , a local communication module 37 , a real-time clock (RTC) 38 , an acceleration sensor 39 , a power supply circuit 40 , an interface circuit (I/F circuit) 41 , and the like.
  • These electronic components are mounted on an electronic circuit substrate, and accommodated in the lower housing 11 (or the upper housing 21 ).
  • the information processing section 31 is information processing means which includes a CPU (Central Processing Unit) 311 for executing a predetermined program, a GPU (Graphics Processing Unit) 312 for performing image processing, and the like.
  • the CPU 311 of the information processing section 31 executes a program stored in a memory (for example, the external memory 44 connected to the external memory I/F 33 or the internal data storage memory 35 ) inside the game apparatus 10 , thereby executing processing (for example, shooting processing, or image display processing described later) corresponding to the program.
  • the program executed by the CPU 311 of the information processing section 31 may be acquired from another device through communication with the other device.
  • the information processing section 31 further includes a VRAM (Video RAM) 313 .
  • the GPU 312 of the information processing section 31 generates an image in accordance with an instruction from the CPU 311 of the information processing section 31 , and renders the image in the VRAM 313 .
  • the GPU 312 of the information processing section 31 outputs the image rendered in the VRAM 313 , to the upper LCD 22 and/or the lower LCD 12 , and the image is displayed on the upper LCD 22 and/or the lower LCD 12 .
  • the external memory I/F 33 is an interface for detachably connecting to the external memory 44 .
  • the external data storage memory I/F 34 is an interface for detachably connecting to the external data storage memory 45 .
  • the main memory 32 is volatile storage means used as a work area and a buffer area for (the CPU 311 of) the information processing section 31 . That is, the main memory 32 temporarily stores various types of data used for the processing based on the program, and temporarily stores a program acquired from the outside (the external memory 44 , another device, or the like), for example.
  • the main memory 32 is, for example, a PSRAM (Pseudo-SRAM).
  • the external memory 44 is nonvolatile storage means for storing a program executed by the information processing section 31 .
  • the external memory 44 is implemented as, for example, a read-only semiconductor memory.
  • the information processing section 31 can load a program stored in the external memory 44 .
  • Predetermined processing is performed by the program loaded by the information processing section 31 being executed.
  • the external data storage memory 45 is implemented as a non-volatile readable and writable memory (for example, a NAND flash memory), and is used for storing predetermined data. For example, images shot by the outer imaging section 23 and/or images shot by another device are stored in the external data storage memory 45 .
  • when the external data storage memory 45 is connected to the external data storage memory I/F 34 , the information processing section 31 loads an image stored in the external data storage memory 45 , and the image can be displayed on the upper LCD 22 and/or the lower LCD 12 .
  • the internal data storage memory 35 is implemented as a non-volatile readable and writable memory (for example, a NAND flash memory), and is used for storing predetermined data. For example, data and/or programs downloaded through the wireless communication module 36 by wireless communication are stored in the internal data storage memory 35 .
  • the wireless communication module 36 has a function of connecting to a wireless LAN by using a method based on, for example, the IEEE 802.11b/g standard.
  • the local communication module 37 has a function of performing wireless communication with the same type of game apparatus in a predetermined communication method (for example, communication by a unique protocol, or infrared communication).
  • the wireless communication module 36 and the local communication module 37 are connected to the information processing section 31 .
  • the information processing section 31 can perform data transmission to and data reception from another device via the Internet by using the wireless communication module 36 , and can perform data transmission to and data reception from the same type of another game apparatus by using the local communication module 37 .
  • the acceleration sensor 39 is connected to the information processing section 31 .
  • the acceleration sensor 39 detects magnitudes of accelerations (linear accelerations) in the directions of the straight lines along the three axial (xyz axial) directions, respectively.
  • the acceleration sensor 39 is provided inside the lower housing 11 .
  • the long side direction of the lower housing 11 is defined as x axial direction
  • the short side direction of the lower housing 11 is defined as y axial direction
  • the direction orthogonal to the inner side surface (main surface) of the lower housing 11 is defined as z axial direction, thereby detecting magnitudes of the linear accelerations for the respective axes.
  • the acceleration sensor 39 is, for example, an electrostatic capacitance type acceleration sensor.
  • the acceleration sensor 39 may be an acceleration sensor for detecting a magnitude of acceleration for one axial direction or two-axial directions.
  • the information processing section 31 can receive data (acceleration data) representing accelerations detected by the acceleration sensor 39 , and detect an orientation and a motion of the game apparatus 10 .
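As an illustration of how the detected accelerations can be used to estimate the orientation of the game apparatus 10, the following sketch computes pitch and roll angles from one 3-axis sample under the assumption that the apparatus is held nearly still, so the measured vector is dominated by gravity; the function and structure names are hypothetical.

```cpp
#include <cmath>
#include <cstdio>

// One sample from the 3-axis acceleration sensor, in units of g.
struct Accel { float x, y, z; };

// Tilt estimate assuming the apparatus is nearly static, so the measured vector
// is dominated by gravity. Angles are returned in degrees.
void estimateTilt(const Accel& a, float* pitchDeg, float* rollDeg) {
    const float kRadToDeg = 57.29578f;
    *pitchDeg = std::atan2(a.y, std::sqrt(a.x * a.x + a.z * a.z)) * kRadToDeg;
    *rollDeg  = std::atan2(a.x, std::sqrt(a.y * a.y + a.z * a.z)) * kRadToDeg;
}

int main() {
    const Accel sample{0.0f, 0.26f, 0.97f};  // example reading: tilted slightly about the x axis
    float pitch = 0.0f, roll = 0.0f;
    estimateTilt(sample, &pitch, &roll);
    std::printf("pitch=%.1f deg, roll=%.1f deg\n", pitch, roll);
    return 0;
}
```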
  • the RTC 38 and the power supply circuit 40 are connected to the information processing section 31 .
  • the RTC 38 counts time, and outputs the time to the information processing section 31 .
  • the information processing section 31 calculates a current time (date) based on the time counted by the RTC 38 .
  • the power supply circuit 40 controls power from the power supply (the rechargeable battery accommodated in the lower housing 11 as described above) of the game apparatus 10 , and supplies power to each component of the game apparatus 10 .
  • the I/F circuit 41 is connected to the information processing section 31 .
  • the microphone 42 and the speaker 43 are connected to the I/F circuit 41 .
  • the speaker 43 is connected to the I/F circuit 41 through an amplifier which is not shown.
  • the microphone 42 detects a voice from a user, and outputs a sound signal to the I/F circuit 41 .
  • the amplifier amplifies a sound signal outputted from the I/F circuit 41 , and a sound is outputted from the speaker 43 .
  • the touch panel 13 is connected to the I/F circuit 41 .
  • the I/F circuit 41 includes a sound control circuit for controlling the microphone 42 and the speaker 43 (amplifier), and a touch panel control circuit for controlling the touch panel.
  • the sound control circuit performs A/D conversion and D/A conversion on the sound signal, and converts the sound signal to a predetermined form of sound data, for example.
  • the touch panel control circuit generates a predetermined form of touch position data based on a signal outputted from the touch panel 13 , and outputs the touch position data to the information processing section 31 .
  • the touch position data represents a coordinate of a position, on an input surface of the touch panel 13 , on which an input is made.
  • the touch panel control circuit reads a signal outputted from the touch panel 13 , and generates the touch position data every predetermined time.
  • the information processing section 31 acquires the touch position data, to recognize a position on which an input is made on the touch panel 13 .
  • the operation button 14 includes the operation buttons 14 A to 14 L described above, and is connected to the information processing section 31 .
  • Operation data representing an input state of each of the operation buttons 14 A to 14 I is outputted from the operation button 14 to the information processing section 31 , and the input state indicates whether or not each of the operation buttons 14 A to 14 I has been pressed.
  • the information processing section 31 acquires the operation data from the operation button 14 to perform processing in accordance with the input on the operation button 14 .
  • the lower LCD 12 and the upper LCD 22 are connected to the information processing section 31 .
  • the lower LCD 12 and the upper LCD 22 each display an image in accordance with an instruction from (the GPU 312 of) the information processing section 31 .
  • the information processing section 31 causes the upper LCD 22 to display a stereoscopic image (image that is stereoscopically visible).
  • the information processing section 31 is connected to an LCD controller (not shown) of the upper LCD 22 , and causes the LCD controller to set the parallax barrier to ON or OFF.
  • the parallax barrier is set to ON in the upper LCD 22
  • an image for a right eye and an image for a left eye, which are stored in the VRAM 313 of the information processing section 31 are outputted to the upper LCD 22 .
  • the LCD controller alternately repeats reading of pixel data of the image for a right eye for one line in the vertical direction, and reading of pixel data of the image for a left eye for one line in the vertical direction, thereby reading, from the VRAM 313 , the image for a right eye and the image for a left eye.
  • an image to be displayed is divided into the images for a right eye and the images for a left eye each of which is a rectangle-shaped image having one line of pixels aligned in the vertical direction, and an image, in which the rectangle-shaped image for the left eye which is obtained through the division, and the rectangle-shaped image for the right eye which is obtained through the division are alternately aligned, is displayed on the screen of the upper LCD 22 .
  • a user views the images through the parallax barrier in the upper LCD 22 , so that the image for the right eye is viewed by the user's right eye, and the image for the left eye is viewed by the user's left eye.
  • the stereoscopically visible image is displayed on the screen of the upper LCD 22 .
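The line-by-line reading described above amounts to interleaving the two eye images in vertical strips. The sketch below shows one way such an interleaved image could be assembled in software; the pixel format, buffer layout, and the choice of which eye occupies the even columns are assumptions for illustration only, not the actual operation of the LCD controller.

```cpp
#include <cstddef>
#include <cstdint>
#include <vector>

// Interleave a left-eye and a right-eye image column by column, producing the
// alternating vertical strips viewed through a parallax barrier. Both images are
// row-major with one 32-bit pixel per element and identical dimensions.
std::vector<std::uint32_t> interleaveColumns(const std::vector<std::uint32_t>& leftEye,
                                             const std::vector<std::uint32_t>& rightEye,
                                             int width, int height) {
    std::vector<std::uint32_t> out(static_cast<std::size_t>(width) * height);
    for (int y = 0; y < height; ++y) {
        for (int x = 0; x < width; ++x) {
            // Assumption: even columns carry right-eye pixels, odd columns left-eye pixels.
            const std::vector<std::uint32_t>& src = (x % 2 == 0) ? rightEye : leftEye;
            const std::size_t index = static_cast<std::size_t>(y) * width + x;
            out[index] = src[index];
        }
    }
    return out;
}
```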
  • the outer imaging section 23 and the inner imaging section 24 are connected to the information processing section 31 .
  • the outer imaging section 23 and the inner imaging section 24 each shoot an image in accordance with an instruction from the information processing section 31 , and output data of the shot image to the information processing section 31 .
  • the 3D adjustment switch 25 is connected to the information processing section 31 .
  • the 3D adjustment switch 25 transmits, to the information processing section 31 , an electrical signal in accordance with the position of the slider 25 a.
  • the 3D indicator 26 is connected to the information processing section 31 .
  • the information processing section 31 controls whether or not the 3D indicator 26 is to be lit up. For example, the information processing section 31 lights up the 3D indicator 26 when the upper LCD 22 is in the stereoscopic display mode.
  • the game apparatus 10 has the internal configuration as described above.
  • a composite image obtained by combining an image of the real world currently shot by the outer imaging section 23 (the outer imaging section (left) 23 a and the outer imaging section (right) 23 b ), and an image of a virtual object present in a 3-dimensional virtual space is displayed in a stereoscopically visible manner on the screen of the upper LCD 22 , based on an image display program.
  • Two images shot by the outer imaging section 23 are supplied to the upper LCD such that the two images have a predetermined parallax, and then are stereoscopically displayed.
  • FIG. 7 shows an example of a stereoscopic image displayed on the screen of the upper LCD 22 when three balls 60 a to 60 c (real objects) are shot by the outer imaging section 23 .
  • when the balls 60 a to 60 c are shot by the outer imaging section 23 , the balls 60 a to 60 c are displayed in a stereoscopically visible manner on the screen of the upper LCD 22 .
  • the ball 60 a which is nearest the outer imaging section 23 appears to be at the nearest position when viewed by the user
  • the ball 60 c which is farthest from the outer imaging section 23 appears to be at the farthest position when viewed by the user.
  • it is noted that in FIG. 7 , the state in which the balls are partially beyond the screen of the upper LCD 22 is shown for the purpose of facilitating the understanding; actually, the balls are displayed within the screen. The same applies to FIG. 8 and FIG. 10 described later.
  • it is noted that in FIG. 7 , since a marker 61 (see FIG. 8 ) described later is not shot by the outer imaging section 23 , a virtual object is not displayed on the screen of the upper LCD 22 .
  • although the states in which a display target appears to protrude when viewed by the user are shown in these figures, displaying in a stereoscopically visible manner includes not only the case where the display target appears to protrude when viewed by the user, but also the case where the display target is displayed with depth so as to appear to be in the back of the screen.
  • FIG. 8 shows an example of a stereoscopic image displayed on the screen of the upper LCD 22 when the marker 61 (real object) is being shot by the outer imaging section 23 .
  • a square including an arrow is drawn on the marker 61
  • the CPU 311 performs image processing such as pattern matching for an image obtained by the outer imaging section 23 , thereby determining whether or not the marker is included in the image.
  • the marker 61 When the marker 61 is being shot by the outer imaging section 23 , the marker 61 is displayed as a real world image in a stereoscopically visible manner on the upper LCD 22 , and a virtual object 62 (for example, a virtual object representing a dog) is displayed being superimposed onto the real world image, in a stereoscopically visible manner at the position of the marker 61 .
  • since directions (forward direction, rightward direction, and upward direction) are defined for the marker 61 , a virtual object can be placed with its orientation corresponding to the orientation of the marker 61 .
  • the virtual object 62 can be placed on the marker 61 such that the forward direction of the virtual object 62 coincides with the forward direction of the marker 61 .
  • FIG. 10 shows another example of a stereoscopic image displayed on the screen of the upper LCD 22 when the marker 61 is being shot by the outer imaging section 23 .
  • when the position and the direction of the marker 61 displayed on the screen of the upper LCD 22 have changed because the user has moved the game apparatus 10 , the position and the direction of the virtual object 62 change so as to follow the change. Therefore, it appears to the user as if the virtual object 62 were actually present in the real world.
  • FIG. 11 is a diagram showing a memory map of the main memory 32 of the game apparatus 10 .
  • the main memory 32 stores an image display program 70 , a latest left real world image 71 L, a latest right real world image 71 R, a display left real world image 72 L, a display right real world image 72 R, a stereoscopic view zero distance 73 , a virtual object information 74 , a left transform matrix 75 L, a right transform matrix 75 R, a left view matrix 76 L, a right view matrix 76 R, a left projection matrix 77 L, a right projection matrix 77 R, a display mode 78 , a marker recognition mode 79 , a main real world image identification information 80 , variables 81 , and the like.
  • the image display program 70 is a program for causing the CPU 311 to execute the image display processing.
  • the latest left real world image 71 L is the latest image shot by the outer imaging section (left) 23 a.
  • the latest right real world image 71 R is the latest image shot by the outer imaging section (right) 23 b.
  • the display left real world image 72 L is the latest image that has been determined to be displayed on the upper LCD 22 , among images shot by the outer imaging section (left) 23 a.
  • the display right real world image 72 R is the latest image that has been determined to be displayed on the upper LCD 22 , among images shot by the outer imaging section (right) 23 b.
  • the stereoscopic view zero distance 73 is a variable indicating the distance, of an object from the outer imaging section 23 in the shooting direction, that allows the object to appear to be present at a position with the same depth as the screen of the upper LCD 22 when the object is stereoscopically displayed on the screen of the upper LCD 22 .
  • the stereoscopic view zero distance 73 is represented in units of centimeters.
  • the stereoscopic view zero distance 73 has a fixed value of 25 cm, but this value is merely an example.
  • the stereoscopic view zero distance 73 may be changed anytime as needed, in accordance with instructions from the user, or automatically by a computer.
  • the stereoscopic view zero distance 73 may be defined by a distance from the marker 61 or a ratio about the distance between the outer imaging section 23 and the marker 61 .
  • the stereoscopic view zero distance 73 does not necessarily need to be defined as a distance in the real world, but may be defined as a distance in the virtual world. If, for example, the size of the marker 61 is known, it is possible to cause the unit of length in the real world and the unit of length in the virtual space to coincide with each other. In this way, if it is possible to cause both units to coincide with each other, the stereoscopic view zero distance 73 can be set by using the unit of length in the real world.
  • the stereoscopic view zero distance 73 may be set by using the unit of length in the virtual world. It is noted that, as described later, in the present embodiment, it is possible to cause the unit of length in the real world and the unit of length in the virtual space to coincide with each other without using the size of the marker 61 .
  • the virtual object information 74 is information that relates to the aforementioned virtual object 62 , and includes 3D model data (polygon data) indicating the shape of the virtual object 62 , texture data indicating the pattern of the virtual object 62 , information about the position and the orientation of the virtual object 62 in the virtual space, and the like.
  • the left transform matrix 75 L is a coordinate transform matrix that transforms coordinates represented in a coordinate system (marker coordinate system) set based on the position and the orientation of the marker 61 , into coordinates represented in a coordinate system (outer imaging section (left) coordinate system) set based on the position and the orientation of the outer imaging section (left) 23 a, which matrix is calculated by recognizing the position and the orientation of the marker 61 in a left real world image.
  • the left transform matrix 75 L includes information about the relative position and relative orientation of the outer imaging section (left) 23 a with respect to the position and the orientation of the marker 61 . More specifically, the left transform matrix 75 L includes information about the position and the orientation of the outer imaging section (left) 23 a in the marker coordinate system.
  • the right transform matrix 75 R is a coordinate transform matrix that transforms coordinates represented in the marker coordinate system, into coordinates represented in a coordinate system (outer imaging section (right) coordinate system) set based on the position and the orientation of the outer imaging section (right) 23 b, which matrix is calculated by recognizing the position and the orientation of the marker 61 in a right real world image.
  • the right transform matrix 75 R includes information about the relative position and relative orientation of the outer imaging section (right) 23 b with respect to the position and the orientation of the marker 61 . More specifically, the right transform matrix 75 R includes information about the position and the orientation of the outer imaging section (right) 23 b in the marker coordinate system.
  • a transform matrix that transforms the marker coordinate system into the outer imaging section (left) coordinate system or the outer imaging section (right) coordinate system is referred to as a “marker-camera transform matrix”.
  • the left transform matrix 75 L and the right transform matrix 75 R are the “marker-camera transform matrices”.
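A marker-camera transform matrix of this kind can be modeled as a homogeneous transform combining a rotation and a translation. The following sketch shows a generic representation and how a point given in marker coordinates would be mapped into imaging-section coordinates; it illustrates the idea only and is not the patent's concrete data layout.

```cpp
#include <array>

struct Vec3 { float x, y, z; };

// Homogeneous 4x4 transform, row-major: the upper-left 3x3 block is the rotation
// and the rightmost column is the translation (marker coordinates -> camera coordinates).
struct Mat44 {
    std::array<float, 16> m;

    Vec3 transformPoint(const Vec3& p) const {
        return {
            m[0] * p.x + m[1] * p.y + m[2]  * p.z + m[3],
            m[4] * p.x + m[5] * p.y + m[6]  * p.z + m[7],
            m[8] * p.x + m[9] * p.y + m[10] * p.z + m[11],
        };
    }
};

// Example: the marker origin (0, 0, 0) in marker coordinates maps to the translation
// column, i.e. the marker's position as seen from the imaging section.
Vec3 markerPositionInCameraSpace(const Mat44& markerToCamera) {
    return markerToCamera.transformPoint({0.0f, 0.0f, 0.0f});
}
```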
  • the left view matrix 76 L is a coordinate transform matrix that transforms coordinates represented in a world coordinate system of the virtual world, into coordinates represented in a left virtual camera coordinate system, which matrix is used for drawing the virtual object 62 as it is looked at by the left virtual camera.
  • the left view matrix 76 L includes information about the position and the orientation of the left virtual camera in the world coordinate system of the virtual world.
  • the right view matrix 76 R is a coordinate transform matrix that transforms coordinates represented in the world coordinate system of the virtual world, into coordinates represented in a right virtual camera coordinate system, which matrix is used for drawing the virtual object 62 as it is looked at by the right virtual camera.
  • the right view matrix 76 R includes information about the position and the orientation of the right virtual camera in the world coordinate system of the virtual world.
  • the left projection matrix 77 L is a coordinate transform matrix that transforms coordinates represented in a left virtual camera coordinate system, into coordinates represented in a screen coordinate system, which matrix is used for drawing the virtual world (the virtual object 62 present in the virtual world) as it is looked at by the left virtual camera.
  • the right projection matrix 77 R is a coordinate transform matrix that transforms coordinates represented in a right virtual camera coordinate system, into coordinates represented in the screen coordinate system, which matrix is used for drawing the virtual world (the virtual object 62 present in the virtual world) as it is looked at by the right virtual camera.
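To illustrate how such view and projection matrices are typically combined when drawing the virtual object 62 for each eye, the sketch below chains them with a model matrix using the conventional order for column vectors (projection, then view, then model); the matrix type and this ordering convention are assumptions, not details taken from the patent.

```cpp
#include <array>

// Minimal row-major 4x4 matrix with multiplication, enough to chain transforms.
struct Mat4 {
    std::array<float, 16> m{};

    static Mat4 identity() {
        Mat4 r;
        r.m[0] = r.m[5] = r.m[10] = r.m[15] = 1.0f;
        return r;
    }

    Mat4 operator*(const Mat4& b) const {
        Mat4 r;  // value-initialized to all zeros
        for (int row = 0; row < 4; ++row)
            for (int col = 0; col < 4; ++col)
                for (int k = 0; k < 4; ++k)
                    r.m[row * 4 + col] += m[row * 4 + k] * b.m[k * 4 + col];
        return r;
    }
};

// One pair of matrices per eye: world -> eye space (view) and eye -> clip space (projection).
struct EyeMatrices {
    Mat4 view;        // corresponds to the left view matrix 76L or the right view matrix 76R
    Mat4 projection;  // corresponds to the left projection matrix 77L or the right projection matrix 77R
};

// The virtual object is drawn once per eye with that eye's combined matrix.
Mat4 combinedMatrixForEye(const EyeMatrices& eye, const Mat4& model) {
    // Conventional order for column vectors: clip = projection * view * model.
    return eye.projection * (eye.view * model);
}
```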
  • the display mode 78 is data indicating the current display mode. More specifically, the display mode 78 is data indicating a synchronous display mode in which the real world image and the virtual space image are superimposed and displayed being synchronized with each other, or an asynchronous display mode in which the real world image and the virtual space image are superimposed and displayed without synchronizing the real world image and the virtual space image with each other.
  • the marker recognition mode 79 is data indicating the current marker recognition mode. More specifically, the marker recognition mode 79 is data indicating a one-image recognition mode in which marker recognition processing is performed for only one of the left real world image and the right real world image, or a both-image recognition mode in which the marker recognition processing is performed for both the left real world image and the right real world image.
  • the main real world image identification information 80 is data indicating which of the left real world image and the right real world image is a main real world image.
  • the marker recognition processing is performed for only the main real world image, whereas the marker recognition processing is not performed for the other real world image (hereinafter, referred to as a sub real world image).
  • the variables 81 are used in executing the image display program 70 , and include variables shown in FIG. 12 . The meanings of the variables will be described below as needed.
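For orientation, the data described above could be grouped roughly as in the following sketch; the field types and the use of plain arrays and vectors are assumptions chosen to mirror the names in the description, not the actual layout of the main memory 32 (the virtual object information 74 and the variables 81 are omitted for brevity).

```cpp
#include <array>
#include <cstdint>
#include <vector>

using Mat4x4   = std::array<float, 16>;       // placeholder for a 4x4 transform matrix
using RgbImage = std::vector<std::uint32_t>;  // placeholder for one captured frame

// Rough sketch of the working data held in the main memory during image display.
struct ImageDisplayWorkData {
    RgbImage latestLeftRealWorldImage;               // 71L
    RgbImage latestRightRealWorldImage;              // 71R
    RgbImage displayLeftRealWorldImage;              // 72L
    RgbImage displayRightRealWorldImage;             // 72R
    float    stereoscopicViewZeroDistanceCm = 25.0f; // 73
    Mat4x4   leftTransformMatrix{};                  // 75L: marker -> outer imaging section (left)
    Mat4x4   rightTransformMatrix{};                 // 75R: marker -> outer imaging section (right)
    Mat4x4   leftViewMatrix{};                       // 76L
    Mat4x4   rightViewMatrix{};                      // 76R
    Mat4x4   leftProjectionMatrix{};                 // 77L
    Mat4x4   rightProjectionMatrix{};                // 77R
    bool     synchronousDisplayMode   = true;        // 78: synchronous vs. asynchronous
    bool     bothImageRecognitionMode = false;       // 79: both-image vs. one-image
    bool     leftIsMainRealWorldImage = true;        // 80
};
```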
  • the information processing section 31 (CPU 311 ) of the game apparatus 10 executes a boot program stored in a ROM, which is not shown, thereby initializing units such as the main memory 32 .
  • the image display program stored in the internal data storage memory 35 is loaded onto the main memory 32 , and then the CPU 311 of the information processing section 31 starts to execute the image display program.
  • FIG. 13 is a flowchart showing a flow of marker processing executed based on the image display program 70 by the CPU 311 .
  • the marker processing is executed in parallel with main processing described later.
  • the marker processing is executed when the CPU 311 is in an idle state.
  • the details of the marker processing will be described.
  • step S 10 in FIG. 13 the CPU 311 determines whether or not both real world images (that is, the left real world image and the right real world image) have been obtained by the outer imaging section 23 . If both real world images have been obtained, the CPU 311 proceeds to step S 11 .
  • the latest one of left real world images obtained by the outer imaging section 23 is stored, as the latest left real world image 71 L, in the main memory 32
  • the latest one of right real world images obtained by the outer imaging section 23 is stored, as the latest right real world image 71 R, in the main memory 32 .
  • the outer imaging section (left) 23 a and the outer imaging section (right) 23 b of the upper housing 21 are separated by a certain distance away (for example, 3.5 cm, hereinafter, referred to as imaging sections distance). Therefore, when the marker 61 is shot by the outer imaging section (left) 23 a and the outer imaging section (right) 23 b at the same time, a difference due to a parallax occurs between each of the position and the orientation of the marker 61 in the left real world image shot by the outer imaging section (left) 23 a, and each of the position and the orientation of the marker 61 in the right real world image shot by the outer imaging section (right) 23 b, as shown in FIG. 20 .
  • step S 11 the CPU 311 performs the marker recognition processing for the main real world image. More specifically, the CPU 311 determines whether or not the marker 61 is included in the main real world image by a pattern matching method or the like. If the marker 61 is included in the main real world image, the CPU 311 calculates the left transform matrix 75 L or the right transform matrix 75 R, based on the position and the orientation of the marker 61 in the main real world image (if the main real world image is the left real world image, the CPU 311 calculates the left transform matrix 75 L, and if the main real world image is the right real world image, the CPU 311 calculates the right transform matrix 75 R).
  • the left transform matrix 75 L is a matrix reflecting therein the position and the orientation of the outer imaging section (left) 23 a which are calculated based on the position and the orientation of the marker 61 in the left real world image. To be more exact, the left transform matrix 75 L is a coordinate transform matrix that transforms coordinates represented in the marker coordinate system (in which the position of the marker 61 in the real world is the origin, and the forward direction, the lateral direction, and the normal direction of the marker 61 correspond to the respective axes), into coordinates represented in the outer imaging section (left) coordinate system based on the position and the orientation of the outer imaging section (left) 23 a which are calculated based on the position and the orientation of the marker 61 in the left real world image.
  • the right transform matrix 75 R is a matrix reflecting therein the position and the orientation of the outer imaging section (right) 23 b which are calculated based on the position and the orientation of the marker 61 in the right real world image.
  • the right transform matrix 75 R is a coordinate transform matrix that transforms coordinates represented in the marker coordinate system, into coordinates represented in the outer imaging section (right) coordinate system based on the position and the orientation of the outer imaging section (right) 23 b which are calculated based on the position and the orientation of the marker 61 in the right real world image.
  • the position of the outer imaging section (right) 23 b indicated by the right transform matrix 75 R which is the result of the marker recognition for the right real world image coincides with a position separated by the certain distance (imaging sections distance), along the x-axis direction of the outer imaging section (left) coordinate system (which direction is the lateral direction of the game apparatus 10 and is included in the horizontal directions when the game apparatus 10 is used), from the position of the outer imaging section (left) 23 a indicated by the left transform matrix 75 L which is the result of the marker recognition for the left real world image.
  • the orientation of the outer imaging section (right) 23 b indicated by the right transform matrix 75 R, and the orientation of the outer imaging section (left) 23 a indicated by the left transform matrix 75 L are the same as the state of the attachment of the outer imaging section (left) 23 a and the outer imaging section (right) 23 b to the game apparatus 10 .
  • the game apparatus 10 is designed such that the outer imaging section (left) 23 a and the outer imaging section (right) 23 b are attached so as to be in parallel with each other.
  • the x-axis, the y-axis, and the z-axis of the outer imaging section (left) coordinate system are in parallel with the x-axis, the y-axis, and the z-axis of the outer imaging section (right) coordinate system.
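Under this ideal relationship, either imaging-section pose determines the other: the right pose has the same orientation as the left pose and a position shifted along the left coordinate system's x-axis by the imaging sections distance. The sketch below expresses that; the pose representation and the example 3.5 cm separation are illustrative assumptions.

```cpp
struct Vec3 { float x, y, z; };

// Pose of an imaging section in the marker coordinate system:
// three orthonormal axis vectors plus a position.
struct CameraPose {
    Vec3 xAxis, yAxis, zAxis;  // orientation (unit vectors)
    Vec3 position;
};

// Ideal right pose derived from the left pose: identical orientation, position
// shifted along the left coordinate system's x-axis by the imaging sections distance.
CameraPose idealRightFromLeft(const CameraPose& left, float sectionDistance) {
    CameraPose right = left;  // same orientation (parallel axes)
    right.position = {
        left.position.x + left.xAxis.x * sectionDistance,
        left.position.y + left.xAxis.y * sectionDistance,
        left.position.z + left.xAxis.z * sectionDistance,
    };
    return right;
}
// Example call: idealRightFromLeft(leftPose, 3.5f) for a 3.5 cm separation.
```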
  • however, the marker recognition has some error, and the attachment of the outer imaging section (left) 23 a and the outer imaging section (right) 23 b to the game apparatus 10 also has some error. Therefore, the relationship between each of the position and the orientation of the outer imaging section (left) 23 a indicated by the left transform matrix 75 L, and each of the position and the orientation of the outer imaging section (right) 23 b indicated by the right transform matrix 75 R, is not ideal.
  • for example, the left transform matrix 75 L and the right transform matrix 75 R may be such that the outer imaging section (left) 23 a and the outer imaging section (right) 23 b are rather close to each other or rather far away from each other, or such that the orientation of the outer imaging section (left) 23 a and the orientation of the outer imaging section (right) 23 b are different (that is, not in parallel with each other).
  • a matrix that transforms the marker coordinate system into the outer imaging section (left) coordinate system or the outer imaging section (right) coordinate system is designated as a view matrix of a virtual camera, to superimpose a CG image onto the real world image.
  • however, if the left transform matrix 75 L is merely designated as the left view matrix 76 L and the right transform matrix 75 R is merely designated as the right view matrix 76 R in stereoscopically displaying a virtual space image on the upper LCD 22 , a normal stereoscopic view of the virtual object 62 might not be provided.
  • the position and the orientation (view matrix) of one of the virtual cameras are determined such that the relationship between each of the position and the orientation of the left virtual camera 63 L, and each of the position and the orientation of the right virtual camera 63 R, is ideal.
  • when the recognition of the marker 61 in the left real world image or the right real world image has failed, a null value is stored in the left transform matrix 75 L or the right transform matrix 75 R, respectively, whereby the failure of the recognition is recorded.
  • step S 12 the CPU 311 determines whether or not the current marker recognition mode is the one-image recognition mode. Then, if the current marker recognition mode is the one-image recognition mode, the CPU 311 proceeds to step S 13 , and if the current marker recognition mode is not the one-image recognition mode (that is, the marker recognition mode is the both-image recognition mode), the CPU 311 proceeds to step S 14 .
  • step S 13 the CPU 311 determines, as a failure, the result of the marker recognition for one of the left real world image and the right real world image that is not the main real world image (hereinafter, the one is referred to as a sub real world image). More specifically, if the main real world image is the left real world image, a null value is stored in the right transform matrix 75 R, and if the main real world image is the right real world image, a null value is stored in the left transform matrix 75 L.
  • step S 14 the CPU 311 performs the marker recognition processing for the sub real world image. More specifically, the CPU 311 determines whether or not the marker 61 is included in the sub real world image by a pattern matching method or the like. If the marker 61 is included in the sub real world image, the CPU 311 calculates the left transform matrix 75 L or the right transform matrix 75 R, based on the position and the orientation of the marker 61 in the sub real world image (if the sub real world image is the left real world image, the CPU 311 calculates the left transform matrix 75 L, and if the sub real world image is the right real world image, the CPU 311 calculates the right transform matrix 75 R).
  • step S 15 the CPU 311 determines whether or not the current display mode is the synchronous display mode. Then, if the current display mode is the synchronous display mode, the CPU 311 proceeds to step S 17 , and if the current display mode is not the synchronous display mode (that is, the current display mode is the asynchronous display mode), the CPU 311 proceeds to step S 16 .
  • step S 16 the CPU 311 stores the latest left real world image 71 L and the latest right real world image 71 R as the display left real world image 72 L and the display right real world image 72 R, respectively, in the main memory 32 . Then, the CPU 311 returns to step S 10 .
  • step S 17 the CPU 311 determines whether or not the current marker recognition mode is the one-image recognition mode. Then, if the current marker recognition mode is the one-image recognition mode, the CPU 311 proceeds to step S 18 , and if the current marker recognition mode is not the one-image recognition mode (that is, the current marker recognition mode is the both-image recognition mode), the CPU 311 proceeds to step S 19 .
  • step S 18 the CPU 311 determines whether or not the marker recognition for the main real world image has been successful. More specifically, if the main real world image is the left real world image, the CPU 311 determines whether or not a valid matrix which is not a null value is stored in the left transform matrix 75 L, and if the main real world image is the right real world image, the CPU 311 determines whether or not a valid matrix which is not a null value is stored in the right transform matrix 75 R. Then, if the marker recognition for the main real world image has been successful, the CPU 311 proceeds to step S 16 , and if the marker recognition for the main real world image has not been successful, the CPU 311 returns to step S 10 .
  • step S 19 the CPU 311 determines whether or not at least one of the marker recognition for the left real world image and the marker recognition for the right real world image has been successful. More specifically, the CPU 311 determines whether or not valid matrices which are not null values are stored in the left transform matrix 75 L and the right transform matrix 75 R. Then, if at least one of the marker recognition for the left real world image and the marker recognition for the right real world image has been successful, the CPU 311 proceeds to step S 16 , and if not, the CPU 311 returns to step S 10 .
  • FIG. 14 is a flowchart showing a flow of main processing executed based on the image display program 70 by the CPU 311 .
  • the main processing is executed in parallel with the marker processing.
  • the details of the main processing will be described.
  • step S 20 in FIG. 14 the CPU 311 places the virtual object 62 at a predetermined position in the 3-dimensional virtual space.
  • the virtual object 62 is placed at the origin of the virtual space (the origin of the world coordinate system).
  • step S 21 the CPU 311 executes updating processing.
  • the CPU 311 updates variables used for drawing the virtual object 62 .
  • the details of the updating processing will be described later.
  • step S 22 the CPU 311 executes virtual object processing.
  • the virtual object processing relates to the virtual object 62 placed in the virtual space. For example, as needed, the size of the virtual object 62 is changed, or the virtual object 62 is caused to take a predetermined action (The virtual object 62 is caused to move in the virtual space. For example, if the virtual object 62 is caused to move around the origin of the virtual space, the virtual object 62 walking around the origin of the marker coordinate system is displayed). It is noted that control of a movement of the virtual object 62 is realized by changing the coordinates of the position of the virtual object 62 in the world coordinate system in the virtual space.
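Such a movement is realized purely by updating the object's world-coordinate position each frame, as noted above. The sketch below walks a virtual object in a circle around the world-coordinate origin (the position of the marker 61); the radius, angular speed, and the choice of which axis is kept constant are arbitrary illustrative assumptions.

```cpp
#include <cmath>

struct Vec3 { float x, y, z; };

// Per-frame update that walks a virtual object in a circle of the given radius
// around the world-coordinate origin (the position of the marker 61).
Vec3 walkAroundOrigin(float timeSeconds, float radiusCm, float angularSpeedRadPerSec) {
    const float angle = angularSpeedRadPerSec * timeSeconds;
    // The second coordinate is kept at 0 so the object stays on the marker plane
    // (which axis is "up" in the marker coordinate system is an assumption here).
    return {radiusCm * std::cos(angle), 0.0f, radiusCm * std::sin(angle)};
}
```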
  • step S 23 in accordance with the stereoscopic view zero distance 73 , the CPU 311 determines a drawing area that is an area, of the left real world image, used for display. More specifically, as shown in FIG. 23 , a point that is separated by the value of the stereoscopic view zero distance 73 (for example, 30 cm) in the shooting direction from the outer imaging section 23 , and that is at the same distance from the outer imaging section (left) 23 a and the outer imaging section (right) 23 b, is defined as a stereoscopic view zero point. An area whose center overlaps with the stereoscopic view zero point as they are viewed from the outer imaging section (left) 23 a, is determined as the drawing area of the left real world image.
  • the length in the lateral direction of the drawing area is obtained as a ratio on a segment, taken from a line perpendicular to the shooting direction of the outer imaging section (left) 23 a, between the lines creating the angle of view of the outer imaging section (left) 23 a .
  • the ratio is applied to the length in the lateral direction of the real world image, whereby the length of the drawing area of the real world image is determined.
  • the length in the height direction of the drawing area is determined based on the length in the lateral direction so as to adapt to the ratio of the height and the width of the display screen.
  • similarly, in step S 27 described later for determining a drawing area of the right real world image, an area whose center overlaps with the stereoscopic view zero point as viewed from the outer imaging section (right) 23 b is determined as the drawing area of the right real world image, as shown in FIG. 24 .
  • as a result, the position of the stereoscopic view zero point on the image for a left eye and the position of the stereoscopic view zero point on the image for a right eye coincide with each other on the screen, and appear to have the same depth as the screen when viewed by the user.
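The geometry above can be summarized as follows: the stereoscopic view zero point lies half the imaging-sections distance to the side of each camera's optical axis, so the drawing area of each real world image is shifted horizontally toward the other imaging section by a corresponding fraction of the image width. The sketch below computes that shift in pixels; the angle of view, image width, and the other values in the example call are assumptions, not specifications of the apparatus.

```cpp
#include <cmath>
#include <cstdio>

// Horizontal shift (in pixels) of the drawing-area center needed so that the
// stereoscopic view zero point appears at the center of each eye's image.
// zeroDistanceCm : stereoscopic view zero distance, along the shooting direction
// separationCm   : distance between the two outer imaging sections
// fovXDegrees    : full horizontal angle of view of one imaging section
// imageWidthPx   : width of the captured real world image in pixels
float drawingAreaShiftPx(float zeroDistanceCm, float separationCm,
                         float fovXDegrees, int imageWidthPx) {
    const float halfFovRad = fovXDegrees * 0.5f * 3.14159265f / 180.0f;
    // Half-width of the visible segment at the zero distance, taken on a line
    // perpendicular to the shooting direction.
    const float halfVisibleWidthCm = zeroDistanceCm * std::tan(halfFovRad);
    // The zero point lies half the camera separation off each optical axis.
    const float fraction = (separationCm * 0.5f) / halfVisibleWidthCm;
    return fraction * (imageWidthPx * 0.5f);
}

int main() {
    // Illustrative values: 25 cm zero distance, 3.5 cm separation,
    // 60-degree horizontal field of view, 640-pixel-wide capture.
    const float shift = drawingAreaShiftPx(25.0f, 3.5f, 60.0f, 640);
    std::printf("drawing-area center offset: %.1f px toward the other imaging section\n", shift);
    return 0;
}
```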
  • step S 24 the CPU 311 draws a drawing area of the display left real world image 72 L which is determined in step S 23 , in a predetermined storage area (hereinafter, referred to as a left frame buffer) for temporarily storing an image for a left eye to be supplied to the upper LCD 22 , in the VRAM 313 .
  • step S 25 the CPU 311 determines whether or not the value of an AR-Active (see FIG. 12 ) is true.
  • the AR-Active is a variable (flag) indicating whether or not a virtual object can be displayed being superimposed onto the real world image. If a virtual object can be displayed being superimposed onto the real world image, the value of the AR-Active is set at true, and if a virtual object cannot be displayed being superimposed onto the real world image (for example, in the case where the marker 61 cannot be recognized at all), the value of the AR-Active is set at false (initial value). If the value of the AR-Active is true, the CPU 311 proceeds to step S 26 , and if the value of the AR-Active is not true (that is, the value of the AR-Active is false), the CPU 311 proceeds to step S 27 .
  • step S 26 for example, the CPU 311 overwrites the left frame buffer with the virtual space (hereinafter, referred to as a virtual space image for a left eye) as it is viewed from the left virtual camera 63 L, as shown in FIG. 23 (actually and typically, the GPU 312 draws the virtual space in accordance with an instruction from the CPU 311 ).
  • the virtual space image for a left eye is superimposed onto the left real world image drawn in the left frame buffer in step S 24 , as shown in FIG. 26 .
  • the resultant image in the left frame buffer is supplied, as an image for a left eye, to the upper LCD 22 at a predetermined timing.
  • the background of the virtual space in the virtual space image for a left eye is transparent. Therefore, superimposing the virtual space image onto the real world image generates an image in which the virtual object 62 is present on the real world image.
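The overwrite described above behaves like a per-pixel composite in which transparent background pixels of the virtual space image leave the real world image visible. The sketch below expresses that rule, assuming a 32-bit pixel with an 8-bit alpha channel in the most significant byte, which is an illustrative choice rather than the actual frame-buffer format of the VRAM 313.

```cpp
#include <cstddef>
#include <cstdint>
#include <vector>

// Composite the rendered virtual space image over the real world image already in
// the frame buffer. Pixels are 0xAARRGGBB; alpha 0 means "transparent background"
// and keeps the real world pixel, any other value keeps the virtual pixel
// (no blending, matching a plain overwrite with transparency).
void compositeVirtualOverReal(std::vector<std::uint32_t>& frameBuffer,
                              const std::vector<std::uint32_t>& virtualImage) {
    for (std::size_t i = 0; i < frameBuffer.size() && i < virtualImage.size(); ++i) {
        const std::uint32_t v = virtualImage[i];
        if ((v >> 24) != 0) {    // non-transparent virtual pixel
            frameBuffer[i] = v;  // the virtual object covers the real world image
        }
    }
}
```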
  • step S 27 the CPU 311 determines a drawing area of the right real world image in accordance with the stereoscopic view zero distance 73 . It is noted that the details of processing of determining the drawing area are omitted because the processing is the same as processing of determining the drawing area of the left real world image.
  • step S 28 the CPU 311 draws a drawing area of the display right real world image 72 R which is determined in step S 27 , in a predetermined storage area (hereinafter, referred to as a right frame buffer) for temporarily storing an image for a right eye to be supplied to the upper LCD 22 , in the VRAM 313 .
  • step S 29 the CPU 311 determines whether or not the value of the AR-Active is true. If the value of the AR-Active is true, the CPU 311 proceeds to step S 30 , and if the value of the AR-Active is not true (that is, the value of the AR-Active is false), the CPU 311 proceeds to step S 31 .
  • step S 30 the CPU 311 overwrites the right frame buffer with the virtual space (hereinafter, referred to as a virtual space image for a right eye) as it is viewed from the right virtual camera 63 R (actually and typically, the GPU 312 draws the virtual space in accordance with an instruction from the CPU 311 ).
  • the virtual space image for a right eye is superimposed onto the right real world image drawn in the right frame buffer in step S 28 .
  • the resultant image in the right frame buffer is supplied, as an image for a right eye, to the upper LCD 22 at a predetermined timing.
  • step S 31 the CPU 311 waits for an interrupt signal (vertical synchronization interrupt) from the upper LCD 22 . If the interrupt signal is generated, the CPU 311 returns to step S 21 . As a result, steps S 21 to S 31 are repeated in cycles of a predetermined period (for example, 1/60 second).
  • the details of the updating processing in step S 21 in the main processing will be described with reference to the flowchart in FIG. 15 .
  • step S 40 in FIG. 15 the CPU 311 determines whether or not the result of the marker recognition (that is, the left transform matrix 75 L and the right transform matrix 75 R) in the aforementioned marker processing has been updated. If the result of the marker recognition has been updated, the CPU 311 proceeds to step S 41 , and if the result of the marker recognition has not been updated, the CPU 311 proceeds to step S 44 .
  • step S 41 the CPU 311 determines whether or not both the marker recognition for the left real world image and the marker recognition for the right real world image have been successful. If both the marker recognition for the left real world image and the marker recognition for the right real world image have been successful, the CPU 311 proceeds to step S 42 , and if at least one of them has failed, the CPU 311 proceeds to step S 44 .
  • step S 42 the CPU 311 determines whether or not the value of EyeWidth (see FIG. 12 ) is 0, or whether or not the value of EyeMeasure (see FIG. 12 ) is true.
  • The EyeWidth indicates the distance between the outer imaging section (left) 23 a and the outer imaging section (right) 23 b, which is based on the position relationship therebetween and is calculated by using the left transform matrix 75 L and the right transform matrix 75 R. It is noted that the distance does not need to be calculated on a scale of the real world, and may be calculated as a distance in the virtual space.
  • Specifically, a relationship between the position of the outer imaging section (left) 23 a and the position of the outer imaging section (right) 23 b is calculated based on the position of the marker 61, by using a relationship between the position of the marker 61 and the position of the outer imaging section (left) 23 a, which is indicated by the left transform matrix 75 L, and a relationship between the position of the marker 61 and the position of the outer imaging section (right) 23 b, which is indicated by the right transform matrix 75 R, whereby the distance is calculated. It is noted that if the distance is yet to be determined, the value of the distance is set at 0 (initial value). It is noted that in the present embodiment, the EyeWidth is a scalar value.
  • Alternatively, the EyeWidth may be a vector connecting the position of the outer imaging section (left) 23 a indicated by the left transform matrix 75 L, and the position of the outer imaging section (right) 23 b indicated by the right transform matrix 75 R.
  • Alternatively, the EyeWidth may be the magnitude of the component, perpendicular to the shooting direction, of the vector connecting the position of the outer imaging section (left) 23 a indicated by the left transform matrix 75 L, and the position of the outer imaging section (right) 23 b indicated by the right transform matrix 75 R. It is noted that, as described later, the EyeWidth is used for setting the distance (interval) between the left virtual camera 63 L and the right virtual camera 63 R in the virtual space.
  • The EyeMeasure is a variable (flag) indicating whether or not the EyeWidth needs to be calculated again. If the EyeWidth needs to be calculated again, the EyeMeasure is set at true, and if the EyeWidth does not need to be calculated again, the EyeMeasure is set at false (initial value). If the value of the EyeWidth is 0 or if the value of the EyeMeasure is true, the CPU 311 proceeds to step S 43, and if not (that is, if the value of the EyeWidth is not 0 and the value of the EyeMeasure is false), the CPU 311 proceeds to step S 44.
  • In step S 43, the CPU 311 executes virtual cameras interval determination processing. In the virtual cameras interval determination processing, the interval between the left virtual camera 63 L and the right virtual camera 63 R (that is, the EyeWidth) is determined and updated, as appropriate. The details of the virtual cameras interval determination processing will be described later.
  • In step S 44, the CPU 311 executes view matrix generation processing.
  • In the view matrix generation processing, the left view matrix 76 L and the right view matrix 76 R are calculated, based on the interval between the left virtual camera 63 L and the right virtual camera 63 R (that is, the EyeWidth) which has already been determined. The details of the view matrix generation processing will be described later.
  • In step S 45, the CPU 311 executes main real world image setting processing.
  • In the main real world image setting processing, the left real world image and the right real world image take turns serving as the main real world image, as appropriate (this is equivalent to the outer imaging section (left) 23 a and the outer imaging section (right) 23 b taking turns serving as a main imaging section).
  • The details of the main real world image setting processing will be described later.
  • In step S 46, the CPU 311 executes display mode switching processing, and ends the updating processing.
  • In the display mode switching processing, the display mode is switched between the synchronous display mode and the asynchronous display mode, as appropriate. The details of the display mode switching processing will be described later.
  • Next, the details of the virtual cameras interval determination processing in step S 43 in the updating processing will be described with reference to the flowchart in FIG. 16.
  • In the virtual cameras interval determination processing, the distance between the position of the outer imaging section (left) 23 a relative to the position of the marker 61, which is indicated by the left transform matrix 75 L, and the position of the outer imaging section (right) 23 b relative to the position of the marker 61, which is indicated by the right transform matrix 75 R, is calculated.
  • There are several methods for calculating the distance. In the present embodiment, the following method is employed as an example of the methods for the calculation.
  • In step S 50 in FIG. 16, the CPU 311 calculates the coordinates V 0, based on the left transform matrix 75 L and the right transform matrix 75 R.
  • With reference to FIG. 27 to FIG. 29, a method of calculating the V 0 will be described. It is noted that in FIG. 27 to FIG. 29, the shooting direction of the outer imaging section (left) 23 a and the shooting direction of the outer imaging section (right) 23 b are drawn so as to be greatly different from each other, for the purpose of emphasizing that a relationship between each of the position and the orientation of the outer imaging section (left) 23 a relative to the marker 61, which are calculated based on the left real world image, and each of the position and the orientation of the outer imaging section (right) 23 b relative to the marker 61, which are calculated based on the right real world image, is not always ideal, as previously described.
  • Multiplication by the inverse matrix of the right transform matrix 75 R corresponds to a transform of coordinates represented in the outer imaging section (right) coordinate system into coordinates represented in the marker coordinate system. Therefore, the multiplication by the inverse matrix transforms the coordinates V 1 (FIG. 28) represented in the outer imaging section (right) coordinate system into the coordinates V 0 represented in the marker coordinate system, as shown in FIG. 29.
  • The coordinates V 0 thus calculated indicate the difference between the position (the coordinates, of the origin of the marker coordinate system, that are represented in the outer imaging section (left) coordinate system) of the marker 61 relative to the outer imaging section (left) 23 a, which is calculated based on the left real world image, and the position (the coordinates, of the origin of the marker coordinate system, that are represented in the outer imaging section (right) coordinate system) of the marker 61 relative to the outer imaging section (right) 23 b, which is calculated based on the right real world image.
  • In the present embodiment, the difference between the relative positions of the marker 61 is considered as being due to the difference between the attachment positions of the outer imaging section (left) 23 a and the outer imaging section (right) 23 b, whereby the attachment positions of the outer imaging section (left) 23 a and the outer imaging section (right) 23 b are estimated.
  • In step S 51, based on the left transform matrix 75 L, the CPU 311 calculates the coordinates V 1, of the origin of the marker coordinate system, that are represented in the outer imaging section (left) coordinate system. Specifically, (0, 0, 0) is multiplied by the left transform matrix 75 L, thereby obtaining the coordinates V 1, of the origin of the marker coordinate system, that are represented in the outer imaging section (left) coordinate system.
  • It is noted that in the present embodiment, the coordinates, of the origin of the marker coordinate system, that are represented in the outer imaging section (left) coordinate system are defined as V 1.
  • Alternatively, the coordinates, of the origin of the marker coordinate system, that are represented in the outer imaging section (right) coordinate system may be defined as V 1.
  • In step S 52, the CPU 311 stores, in ew (see FIG. 12), the magnitude (the distance from the origin) (see FIG. 29) of the V 0 obtained in step S 50, and stores, in ed (see FIG. 12), the absolute value (see FIG. 27) of the z-axis component of the V 1 obtained in step S 51.
  • The value of the ew is calculated in units of length in the marker coordinate system. Therefore, the unit of the ew does not coincide with the unit of length in the real world.
  • However, the actual distance between the outer imaging section (left) 23 a and the outer imaging section (right) 23 b is known (for example, 3.5 cm). Therefore, by using the correspondence relationship between the ew and this actual distance, it is possible to match the unit of length in the virtual space to the unit of length in the real world. It is noted that if the size of the marker 61 is known, it is also possible to match the unit of length in the virtual space to the unit of length in the real world, by using a correspondence relationship between the size of the marker 61 calculated based on the result of recognition of the marker image, and the size of the marker 61 in the real world.
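  • The calculation of the ew and the ed in steps S 50 to S 52 can be summarized by the following minimal C++ sketch. It is an illustration only, not the program of the embodiment: the RigidTransform type and the function and variable names are assumptions, and a marker-camera transform is assumed to be stored as a rotation R and a translation t such that cameraCoords = R * markerCoords + t.

    #include <cmath>

    struct Vec3 { double x, y, z; };

    struct RigidTransform {
        double r[3][3];   // rotation (marker coordinates -> camera coordinates)
        Vec3   t;         // marker origin expressed in camera coordinates

        Vec3 apply(const Vec3& p) const {
            return { r[0][0]*p.x + r[0][1]*p.y + r[0][2]*p.z + t.x,
                     r[1][0]*p.x + r[1][1]*p.y + r[1][2]*p.z + t.y,
                     r[2][0]*p.x + r[2][1]*p.y + r[2][2]*p.z + t.z };
        }
        // Inverse of a rigid transform: p_marker = R^T * (p_camera - t).
        Vec3 applyInverse(const Vec3& p) const {
            const Vec3 d = { p.x - t.x, p.y - t.y, p.z - t.z };
            return { r[0][0]*d.x + r[1][0]*d.y + r[2][0]*d.z,
                     r[0][1]*d.x + r[1][1]*d.y + r[2][1]*d.z,
                     r[0][2]*d.x + r[1][2]*d.y + r[2][2]*d.z };
        }
    };

    // Steps S 50 to S 52: estimate the camera interval (ew) and the depth (ed),
    // both expressed in marker-coordinate units.
    void computeEwEd(const RigidTransform& leftTransform75L,
                     const RigidTransform& rightTransform75R,
                     double& ew, double& ed) {
        const Vec3 markerOrigin = { 0.0, 0.0, 0.0 };
        // V1: marker origin represented in the outer imaging section (left) coordinates.
        const Vec3 v1 = leftTransform75L.apply(markerOrigin);
        // V0: V1 treated as right-camera coordinates and mapped back into marker
        // coordinates; its magnitude reflects the offset between the two cameras.
        const Vec3 v0 = rightTransform75R.applyInverse(v1);
        ew = std::sqrt(v0.x*v0.x + v0.y*v0.y + v0.z*v0.z);   // camera interval estimate
        ed = std::fabs(v1.z);                                // depth to the marker
        // If the physical baseline (for example, 3.5 cm) is known, 3.5 / ew gives a
        // factor for converting marker-coordinate units to centimeters.
    }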
  • It is noted that the distance (ew) between the outer imaging section (left) 23 a and the outer imaging section (right) 23 b calculated as described above varies in accordance with the distance in the shooting direction between the outer imaging section 23 and the marker 61.
  • Therefore, when the distance in the shooting direction between the outer imaging section 23 and the marker 61 has changed, the distance (the EyeWidth described later) between the outer imaging section (left) 23 a and the outer imaging section (right) 23 b is calculated again by using the marker-camera transform matrices as they are at that time.
  • In the present embodiment, the ew indicates the magnitude of the V 0.
  • Alternatively, the ew may indicate the component of the V 0 in the direction of a line connecting the position of the outer imaging section (left) 23 a, which is indicated by the left transform matrix 75 L, and the position of the outer imaging section (right) 23 b, which is indicated by the right transform matrix 75 R.
  • The ew calculated as described above indicates the interval between the outer imaging section (left) 23 a and the outer imaging section (right) 23 b in the marker coordinate system.
  • The ew is used as the interval between the left virtual camera 63 L and the right virtual camera 63 R (in step S 65 or S 68 described later).
  • It is noted that there is another method of calculating the ew, other than the above-described method.
  • (0, 0, 0) is multiplied by the left transform matrix 75 L, thereby obtaining the coordinates V 1 , of the origin of the marker coordinate system, that are represented in the outer imaging section (left) coordinate system (that is, the position of the marker 61 relative to the outer imaging section (left) 23 a ).
  • (0, 0, 0) is multiplied by the right transform matrix 75 R, thereby obtaining coordinates Vr, of the origin of the marker coordinate system, that are represented in the outer imaging section (right) coordinate system (that is, the position of the marker 61 relative to the outer imaging section (right) 23 b ).
  • the ew may be obtained by calculating the distance between the coordinates V 1 and the coordinates Vr calculated as described above.
  • In step S 53, the CPU 311 determines whether or not the value of LogIndex (see FIG. 12) is larger than 1 (initial value). If the value of the LogIndex is larger than 1, the CPU 311 proceeds to step S 57, and if not (that is, if the value of the LogIndex is 1), the CPU 311 proceeds to step S 54.
  • The LogIndex is a variable for identifying each component of the arrays (LogWidth and LogDepth).
  • In step S 54, the CPU 311 stores the value of the ew obtained in step S 52 in the array LogWidth [LogIndex]. Moreover, the CPU 311 stores the value of the ed obtained in step S 52 in the array LogDepth [LogIndex]. Moreover, the value of the LogIndex is incremented.
  • The LogWidth is an array variable for storing a plurality of values of the ew.
  • The LogDepth is an array variable for storing a plurality of values of the ed.
  • In step S 55, the CPU 311 determines whether or not the value of the LogIndex is larger than the value of LogMax. If the value of the LogIndex is larger than the value of the LogMax, the CPU 311 proceeds to step S 56, and if not (that is, if the value of the LogIndex is equal to or smaller than the value of the LogMax), the CPU 311 ends the virtual cameras interval determination processing.
  • In step S 56, the CPU 311 calculates the average value of the components of the LogWidth, and stores the average value in the EyeWidth. Moreover, the CPU 311 calculates the average value of the components of the LogDepth, and stores the average value in the EyeDepth. Moreover, the CPU 311 sets the value of the EyeMeasure at false. Moreover, the CPU 311 sets the marker recognition mode 79 at the one-image recognition mode.
  • the EyeDepth is a variable indicating the depth distance (the depth, the distance in the shooting direction) to the origin of the marker coordinate system from the coordinates of the position of the outer imaging section (left) 23 a in the marker coordinate system, which is indicated by the left transform matrix 75 L (or the coordinates of the position of the outer imaging section (right) 23 b in the marker coordinate system, which is indicated by the right transform matrix 75 R).
  • the EyeDepth is used as a reference value for determination in step S 72 in FIG. 17 described later.
  • the initial value of the EyeDepth is 0. If processing in step S 56 has been finished, the virtual cameras interval determination processing is ended.
  • In step S 57, the CPU 311 determines whether or not: the absolute value of the difference between the value of the ew obtained in step S 52 and the value of the LogWidth [1] is smaller than 10 percent of the value of the LogWidth [1]; and the absolute value of the difference between the value of the ed obtained in step S 52 and the value of the LogDepth [1] is smaller than 10 percent of the value of the LogDepth [1]. If the result of the determination in step S 57 is positive, the CPU 311 proceeds to step S 54, and if the result of the determination in step S 57 is negative, the CPU 311 proceeds to step S 58.
  • In step S 58, the CPU 311 resets the value of the LogIndex at 1 (initial value), and ends the virtual cameras interval determination processing.
  • As described above, the ew and the ed are calculated based on the position of the outer imaging section (left) 23 a calculated based on the position and the orientation of the marker 61 in the left real world image, and the position of the outer imaging section (right) 23 b calculated based on the position and the orientation of the marker 61 in the right real world image.
  • The values of the ew and the values of the ed calculated in this manner are sequentially stored in the LogWidth and the LogDepth, respectively.
  • If a newly calculated value of the ew or the ed deviates by 10 percent or more from the first stored value, the LogIndex is reset, and values of the ew or values of the ed are stored again in the LogWidth or the LogDepth from the beginning.
  • For example, while the game apparatus 10 is being moved, the left real world image and the right real world image shot by the outer imaging section 23 are likely to be blurred, and the accuracy of the recognition of the marker 61 significantly decreases.
  • In such a situation, values of the ew or values of the ed which are sequentially detected are likely to vary significantly. It is desired to avoid determining the value of the EyeWidth and the value of the EyeDepth based on values of the ew and values of the ed, having low reliability, detected in such a situation.
  • Therefore, only when values of the ew and values of the ed that do not vary significantly have been obtained successively a predetermined number of times are the value of the EyeWidth and the value of the EyeDepth determined based on those values of the ew and those values of the ed.
  • Moreover, the accuracies of the value of the EyeWidth and the value of the EyeDepth can be enhanced because the average value of a plurality of values of the ew and the average value of a plurality of values of the ed obtained by calculations performed a plurality of times are used as the value of the EyeWidth and the value of the EyeDepth, respectively.
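  • A minimal C++ sketch of the sample-filtering logic in steps S 53 to S 58 is shown below. It is an assumption-laden illustration, not the actual program: LOG_MAX stands in for the LogMax value (which is not specified here), and the class and member names are illustrative.

    #include <cmath>
    #include <numeric>
    #include <vector>

    struct IntervalFilter {
        static constexpr int LOG_MAX = 10;       // stand-in for LogMax
        std::vector<double> logWidth, logDepth;  // LogWidth / LogDepth
        double eyeWidth = 0.0, eyeDepth = 0.0;   // EyeWidth / EyeDepth
        bool   eyeMeasure = true;                // EyeMeasure flag

        // Feed one (ew, ed) pair per frame; returns true once EyeWidth/EyeDepth are fixed.
        bool addSample(double ew, double ed) {
            if (!logWidth.empty()) {
                // Step S 57 -> S 58: discard the whole log if the new sample deviates
                // by 10 percent or more from the first logged sample.
                if (std::fabs(ew - logWidth.front()) >= 0.10 * logWidth.front() ||
                    std::fabs(ed - logDepth.front()) >= 0.10 * logDepth.front()) {
                    logWidth.clear();
                    logDepth.clear();
                    return false;
                }
            }
            logWidth.push_back(ew);                           // step S 54
            logDepth.push_back(ed);
            if (static_cast<int>(logWidth.size()) < LOG_MAX)  // step S 55
                return false;
            // Step S 56: adopt the averages of the consistent samples.
            eyeWidth   = std::accumulate(logWidth.begin(), logWidth.end(), 0.0) / logWidth.size();
            eyeDepth   = std::accumulate(logDepth.begin(), logDepth.end(), 0.0) / logDepth.size();
            eyeMeasure = false;   // (switching to the one-image recognition mode is omitted)
            return true;
        }
    };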
  • Next, the details of the view matrix generation processing in step S 44 in the updating processing (FIG. 15) will be described with reference to the flowchart in FIG. 17.
  • In step S 60 in FIG. 17, the CPU 311 determines whether or not the value of the EyeWidth is larger than 0. If the value of the EyeWidth is larger than 0, the CPU 311 proceeds to step S 63, and if not (that is, if the value of the EyeWidth is 0), the CPU 311 proceeds to step S 61.
  • In step S 61, the CPU 311 determines whether or not the display mode is the synchronous display mode. If the display mode is the synchronous display mode, the CPU 311 ends the view matrix generation processing, and if not (that is, if the display mode is the asynchronous display mode), the CPU 311 proceeds to step S 62.
  • In step S 62, the CPU 311 sets the value of the AR-Active at false. Then, the CPU 311 ends the view matrix generation processing.
  • In step S 63, the CPU 311 determines whether or not the marker recognition for the left real world image has been successful, with reference to the left transform matrix 75 L. If the marker recognition for the left real world image has been successful, the CPU 311 proceeds to step S 64, and if not (that is, if the value of the left transform matrix 75 L is a null value), the CPU 311 proceeds to step S 66.
  • In step S 64, the CPU 311 stores the value of the left transform matrix 75 L in the left view matrix 76 L.
  • In step S 65, the CPU 311 stores, in the right view matrix 76 R, the value of the left view matrix 76 L multiplied by a parallel movement matrix (−EyeWidth, 0, 0).
  • As a result, a position that is separated by the value of the EyeWidth in the positive direction of the x-axis in the left virtual camera coordinate system from the position of the left virtual camera 63 L in the world coordinate system of the virtual space is used as the position of the right virtual camera 63 R for generating a virtual space image for a right eye.
  • In addition, the orientation of the right virtual camera 63 R is the same as that of the left virtual camera 63 L (that is, the x-axis, the y-axis, and the z-axis in the left virtual camera coordinate system are in parallel with those in the right virtual camera coordinate system).
  • Thereby, the position and the orientation of the left virtual camera 63 L, and the position and the orientation of the right virtual camera 63 R are kept consistent with each other, whereby the virtual object 62 can be displayed on the upper LCD 22 so as to provide a normal stereoscopic view.
  • In step S 66, the CPU 311 determines whether or not the marker recognition for the right real world image has been successful, with reference to the right transform matrix 75 R. If the marker recognition for the right real world image has been successful, the CPU 311 proceeds to step S 67, and if not (that is, if the value of the right transform matrix 75 R is a null value), the CPU 311 ends the view matrix generation processing.
  • In step S 67, the CPU 311 stores the value of the right transform matrix 75 R in the right view matrix 76 R.
  • In step S 68, the CPU 311 stores, in the left view matrix 76 L, the value of the right view matrix 76 R multiplied by a parallel movement matrix (EyeWidth, 0, 0).
  • As a result, a position that is separated by the value of the EyeWidth in the negative direction of the x-axis in the right virtual camera coordinate system from the position of the right virtual camera 63 R in the world coordinate system of the virtual space, which has been set in step S 67, is used as the position of the left virtual camera 63 L for generating a virtual space image for a left eye, as shown in FIG. 31.
  • In addition, the orientation of the left virtual camera 63 L is the same as that of the right virtual camera 63 R (that is, the x-axis, the y-axis, and the z-axis in the left virtual camera coordinate system are in parallel with those in the right virtual camera coordinate system).
  • Thereby, the position and the orientation of the left virtual camera 63 L, and the position and the orientation of the right virtual camera 63 R are kept consistent with each other, whereby the virtual object 62 can be displayed on the upper LCD 22 so as to provide a normal stereoscopic view.
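  • The derivation in steps S 64 to S 68 can be sketched as follows, reusing the Vec3 and RigidTransform types from the earlier sketch (again only an illustration; the matrix convention of the embodiment may differ). A view matrix here maps virtual-space (marker) coordinates to virtual camera coordinates, so shifting the camera by the EyeWidth along its own x-axis corresponds to the parallel movement described above.

    // Derive the view matrix of the other virtual camera so that both virtual cameras
    // have the same orientation and are separated by eyeWidth along the camera x-axis.
    RigidTransform makeOtherViewMatrix(const RigidTransform& knownView,
                                       double eyeWidth,
                                       bool knownIsLeft) {
        RigidTransform other = knownView;   // identical rotation => identical orientation
        // Moving the camera by +EyeWidth along its own x-axis subtracts EyeWidth from
        // the x-component of every transformed point (steps S 65 / S 68).
        other.t.x += knownIsLeft ? -eyeWidth : +eyeWidth;
        return other;
    }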
  • As described above, in the present embodiment, the position and the orientation of one of the virtual cameras are set by using (more specifically, directly using) a “marker-camera transform matrix” calculated from an image shot by one of the outer imaging sections (for example, the outer imaging section (left) 23 a ), and the position and the orientation of the other one of the virtual cameras are set without using a “marker-camera transform matrix” calculated from an image shot by the other one of the outer imaging sections (for example, the outer imaging section (right) 23 b ).
  • In the marker processing, a transform matrix (the left transform matrix 75 L) for the outer imaging section (left) 23 a and a transform matrix (the right transform matrix 75 R) for the outer imaging section (right) 23 b are calculated as the “marker-camera transform matrices”. For setting the virtual cameras, however, the respective transform matrices 75 L and 75 R are not both used as they are. Instead, one transform matrix is set (one of the transform matrices 75 L and 75 R is directly used, or one new transform matrix (for example, using an average position and an average orientation) is generated from both the transform matrices 75 L and 75 R), and then the positions and the orientations of both the virtual cameras 63 L and 63 R are set by using the one transform matrix that has been set. In this way, problems on the accuracy of AR recognition can be solved.
  • In step S 69, the CPU 311 sets the value of the AR-Active at true. As a result, processing of displaying the virtual object 62 being superimposed onto the real world image is started, or restarted.
  • In step S 70, the CPU 311 determines whether or not the value of the EyeMeasure is true. If the value of the EyeMeasure is true, the CPU 311 ends the view matrix generation processing, and if not (that is, if the value of the EyeMeasure is false), the CPU 311 proceeds to step S 71.
  • In step S 71, based on the left view matrix 76 L determined in step S 64 or step S 68, the CPU 311 calculates the coordinates V, of the origin of the virtual space, that are represented in the left virtual camera coordinate system. Specifically, as shown in FIG. 32, (0, 0, 0) is multiplied by the left view matrix 76 L, thereby obtaining the coordinates V, of the origin of the virtual space, that are represented in the left virtual camera coordinate system. It is noted that in the present embodiment, the coordinates, of the origin of the virtual space, that are represented in the left virtual camera coordinate system are defined as V.
  • Alternatively, the coordinates, of the origin of the virtual space, that are represented in the right virtual camera coordinate system, which are calculated based on the right view matrix 76 R determined in step S 65 or step S 67, may be defined as V.
  • The V calculated as described above is substantially the same as the position of the marker 61 in the outer imaging section (left) coordinate system (it is noted that the value of the V is calculated in units of length in the virtual space or the marker coordinate system, and does not coincide with the unit of length in the real world).
  • In step S 72, the CPU 311 determines whether or not the absolute value of the difference between the absolute value of the z-axis component of the V obtained in step S 71 and the value of the EyeDepth is larger than 20 percent of the value of the EyeDepth. The absolute value of the z-axis component of the V is substantially the same as the distance (depth), in the shooting direction, from the outer imaging section 23 to the marker 61.
  • That is, in step S 72, the CPU 311 determines whether or not the depth distance (depth) from the outer imaging section 23 to the marker 61 has varied beyond a range of ±20 percent in comparison with the depth distance as it was when the value of the EyeWidth was calculated in the virtual cameras interval determination processing (FIG. 16). If the result of the determination is positive, the CPU 311 proceeds to step S 73, and if not, the CPU 311 ends the view matrix generation processing.
  • In step S 73, the CPU 311 sets the value of the EyeMeasure at true, resets the value of the LogIndex at 1, and sets the marker recognition mode 79 at the both-image recognition mode, thereby ending the view matrix generation processing. As a result, calculation of the EyeWidth in the virtual cameras interval determination processing is started again.
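  • A minimal sketch of the check in steps S 70 to S 73 follows, again reusing the types from the earlier sketches (the 20-percent threshold follows the description above; the switch of the marker recognition mode is reduced to a comment):

    #include <cmath>

    void checkDepthAndRequestRemeasure(const RigidTransform& leftView76L,
                                       double eyeDepth,
                                       bool& eyeMeasure,
                                       int& logIndex) {
        // Step S 71: V = position of the virtual-space origin in left virtual camera coordinates.
        const Vec3 v = leftView76L.apply({0.0, 0.0, 0.0});
        // Step S 72: has the depth to the marker changed by more than 20 percent since
        // the EyeWidth and the EyeDepth were last determined?
        if (std::fabs(std::fabs(v.z) - eyeDepth) > 0.20 * eyeDepth) {
            eyeMeasure = true;   // step S 73: request the interval determination again
            logIndex   = 1;      //            and restart the sample log
            // (the marker recognition mode would also be set to the both-image mode)
        }
    }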
  • The reason why the EyeWidth is calculated again when the depth distance (depth) from the outer imaging section 23 to the marker 61 has varied beyond a predetermined range in comparison with the depth distance as it was when the value of the EyeWidth was previously calculated in the virtual cameras interval determination processing is that the optimum interval (EyeWidth) between the virtual cameras varies in accordance with the depth distance (depth) from the outer imaging section 23 to the marker 61, owing to error in the attachment of the outer imaging section (left) 23 a and the outer imaging section (right) 23 b to the game apparatus 10.
  • For example, in the case where the shooting direction of the outer imaging section (left) 23 a and the shooting direction of the outer imaging section (right) 23 b are not in parallel with each other, it is assumed that when the depth distance from the outer imaging section 23 to the marker 61 is D 1, the value of the EyeWidth calculated in the virtual cameras interval determination processing is EyeWidth 1, as shown in FIG. 33.
  • the left virtual camera 63 L and the right virtual camera 63 R are placed with an interval of the EyeWidth 1 provided therebetween as shown in FIG.
  • Next, the details of the main real world image setting processing in step S 45 in the updating processing (FIG. 15) will be described with reference to the flowchart in FIG. 18.
  • In step S 80 in FIG. 18, the CPU 311 determines whether or not the value of the AR-Active is true. If the value of the AR-Active is true, the CPU 311 proceeds to step S 81, and if not (that is, if the value of the AR-Active is false), the CPU 311 ends the main real world image setting processing.
  • In step S 81, the CPU 311 generates the left projection matrix 77 L and the right projection matrix 77 R.
  • the left projection matrix 77 L is a matrix for specifying a drawing area of the virtual space as viewed from the left virtual camera 63 L.
  • the left projection matrix 77 L is generated so as to correspond to the drawing area of the left real world image determined in step S 23 in the main processing.
  • the right projection matrix 77 R is a matrix for specifying a drawing area of the virtual space as viewed from the right virtual camera 63 R.
  • the right projection matrix 77 R is generated so as to correspond to the drawing area of the right real world image determined in step S 27 in the main processing.
  • the projection matrix for the left virtual camera 63 L is set as a projection matrix that defines a view volume that has an angle of view such that the ratio of the angle of view in the horizontal direction of the outer imaging section (left) 23 a, and the angle of view in the horizontal direction of the left virtual camera 63 L is the same as the ratio representing the position and the size in the lateral direction of the drawing area with respect to the length in the lateral direction of the left real world image.
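  • A generic way to obtain such a projection matrix is to build an off-center frustum whose horizontal extent corresponds to the drawing area's position and size within the full camera image. The following C++ sketch is a standard construction given as an illustration only (it assumes an OpenGL-style clip space and a pinhole camera with the principal point at the image center; it is not the embodiment's exact formula).

    #include <array>

    using Mat4 = std::array<std::array<double, 4>, 4>;

    // tanHalfFovX/Y: half angles of view of the real camera.
    // u0,u1 and v0,v1: edges of the drawing area in normalized image coordinates (0..1).
    Mat4 projectionForDrawingArea(double tanHalfFovX, double tanHalfFovY,
                                  double u0, double u1, double v0, double v1,
                                  double n, double f) {
        const double l = n * tanHalfFovX * (2.0 * u0 - 1.0);
        const double r = n * tanHalfFovX * (2.0 * u1 - 1.0);
        const double b = n * tanHalfFovY * (2.0 * v0 - 1.0);
        const double t = n * tanHalfFovY * (2.0 * v1 - 1.0);
        Mat4 p{};  // zero-initialized
        p[0][0] = 2.0 * n / (r - l);   p[0][2] = (r + l) / (r - l);
        p[1][1] = 2.0 * n / (t - b);   p[1][2] = (t + b) / (t - b);
        p[2][2] = -(f + n) / (f - n);  p[2][3] = -2.0 * f * n / (f - n);
        p[3][2] = -1.0;
        return p;
    }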
  • In step S 82, the CPU 311 determines whether or not the left real world image is the main real world image. If the left real world image is the main real world image, the CPU 311 proceeds to step S 83, and if not (that is, if the right real world image is the main real world image), the CPU 311 proceeds to step S 86.
  • In step S 83, the CPU 311 multiplies (0, 0, 0, 1) by the left view matrix 76 L and the left projection matrix 77 L, thereby obtaining a vector V.
  • In step S 84, the CPU 311 determines whether or not the value (V.x/V.w) obtained by dividing the first component (x) of the vector V by the fourth component (w) is larger than 0.5. If the value (V.x/V.w) is larger than 0.5, the CPU 311 proceeds to step S 85, and if not, the CPU 311 ends the main real world image setting processing.
  • the value of the V.x/V.w indicates the position in the lateral direction, in the virtual space image for a left eye, where the origin of the world coordinate system in the virtual space is present (it is noted that the position in the lateral direction, in the virtual space image for a left eye, where the origin of the world coordinate system in the virtual space is present, is equivalent to the position in the lateral direction, in an image shot by the outer imaging section (left) 23 a, where the origin of the marker coordinate system is present). If the origin of the world coordinate system in the virtual space is at the middle of the virtual space image for a left eye, the value of the V.x/V.w is 0.
  • The value of the V.x/V.w approaches −1.0 as the origin of the world coordinate system in the virtual space approaches the left end of the virtual space image for a left eye, and approaches +1.0 as the origin approaches the right end of the virtual space image for a left eye. If the value of the V.x/V.w is larger than 0.5, the origin of the world coordinate system in the virtual space is in the right end area (diagonal-line area in FIG. 39) of the virtual space image for a left eye. In other words, the marker 61 is in the right end area of the left real world image shot by the outer imaging section (left) 23 a.
  • In step S 85, the CPU 311 changes the main real world image from the left real world image to the right real world image. That is, in the virtual space image shot by the left virtual camera 63 L, if the origin of the virtual space is present to the right of a position separated rightward by a predetermined distance (or by a predetermined percentage of the lateral width of the virtual space image) from the middle of the virtual space image, the CPU 311 changes the main real world image to the right real world image.
  • In this way, the CPU 311 changes the main real world image to the right real world image before the marker 61 disappears from the left real world image. Therefore, it is possible to continuously recognize the marker 61 in the one-image recognition mode. If the processing in step S 85 has been finished, the main real world image setting processing is ended.
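  • The switching test of steps S 83 to S 88 reduces to a comparison in normalized device coordinates, as in the following minimal sketch (illustrative names; vx and vw are the x and w components of (0, 0, 0, 1) transformed by the view and projection matrices of the virtual camera corresponding to the current main real world image):

    enum class MainImage { Left, Right };

    MainImage updateMainRealWorldImage(MainImage current, double vx, double vw) {
        const double ndcX = vx / vw;   // -1.0 at the left edge, +1.0 at the right edge
        if (current == MainImage::Left  && ndcX >  0.5) return MainImage::Right;  // S 84 - S 85
        if (current == MainImage::Right && ndcX < -0.5) return MainImage::Left;   // S 87 - S 88
        return current;   // the marker is still comfortably inside the main image
    }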
  • In step S 86, the CPU 311 multiplies (0, 0, 0, 1) by the right view matrix 76 R and the right projection matrix 77 R, thereby obtaining the vector V.
  • In step S 87, the CPU 311 determines whether or not the value (V.x/V.w) obtained by dividing the first component (x) of the vector V by the fourth component (w) is smaller than −0.5. If the value (V.x/V.w) is smaller than −0.5, the CPU 311 proceeds to step S 88, and if not, the CPU 311 ends the main real world image setting processing.
  • the value of the V.x/V.w indicates the position in the lateral direction, in the virtual space image for a right eye, where the origin of the world coordinate system in the virtual space is present. If the origin of the world coordinate system in the virtual space is at the middle of the virtual space image for a right eye, the value of the V.x/V.w is 0.
  • The value of the V.x/V.w approaches −1.0 as the origin of the world coordinate system in the virtual space approaches the left end of the virtual space image for a right eye, and approaches +1.0 as the origin approaches the right end of the virtual space image for a right eye.
  • If the value of the V.x/V.w is smaller than −0.5, the origin of the world coordinate system in the virtual space is in the left end area (diagonal-line area in FIG. 40) of the virtual space image for a right eye. In other words, the marker 61 is in the left end area of the right real world image shot by the outer imaging section (right) 23 b.
  • In step S 88, the CPU 311 changes the main real world image from the right real world image to the left real world image.
  • In this way, the CPU 311 changes the main real world image to the left real world image before the marker 61 disappears from the right real world image. Therefore, it is possible to continuously recognize the marker 61 in the one-image recognition mode. If the processing in step S 88 has been finished, the main real world image setting processing is ended.
  • Next, the details of the display mode switching processing in step S 46 in the updating processing (FIG. 15) will be described with reference to the flowchart in FIG. 19.
  • In step S 90 in FIG. 19, the CPU 311 determines whether or not the display mode is the synchronous display mode. If the display mode is the synchronous display mode, the CPU 311 proceeds to step S 91, and if not (that is, if the display mode is the asynchronous display mode), the CPU 311 proceeds to step S 96.
  • In step S 91, the CPU 311 determines whether or not: the EyeWidth is larger than 0; and the marker recognition for the left real world image or the marker recognition for the right real world image has been successful. Then, if the result of the determination is positive, the CPU 311 proceeds to step S 92, and if the result of the determination is negative (that is, if the EyeWidth is 0, or if both the marker recognition for the left real world image and the marker recognition for the right real world image have failed), the CPU 311 proceeds to step S 93.
  • In step S 92, the CPU 311 sets the value of SyncCount (see FIG. 12) at 20.
  • The SyncCount is a variable for determining the timing of switching the display mode from the synchronous display mode to the asynchronous display mode. If the processing in step S 92 has been finished, the display mode switching processing is ended.
  • In step S 93, the CPU 311 decrements the value of the SyncCount.
  • In step S 94, the CPU 311 determines whether or not the value of the SyncCount is larger than 0. If the value of the SyncCount is larger than 0, the CPU 311 ends the display mode switching processing, and if not (that is, if the value of the SyncCount is 0), the CPU 311 proceeds to step S 95.
  • In step S 95, the CPU 311 sets the value of the LogIndex at 1, sets the value of the EyeWidth and the value of the EyeDepth at 0, changes the display mode 78 from the synchronous display mode to the asynchronous display mode, and sets the marker recognition mode 79 at the both-image recognition mode. If the processing in step S 95 has been finished, the display mode switching processing is ended.
  • In step S 96, the CPU 311 determines whether or not: the EyeWidth is larger than 0; and the marker recognition for the left real world image or the marker recognition for the right real world image has been successful. Then, if the result of the determination is positive, the CPU 311 proceeds to step S 97, and if the result of the determination is negative (that is, if the EyeWidth is 0, or if both the marker recognition for the left real world image and the marker recognition for the right real world image have failed), the CPU 311 ends the display mode switching processing.
  • In step S 97, the CPU 311 sets the value of the SyncCount at 20, and changes the display mode from the asynchronous display mode to the synchronous display mode. If the processing in step S 97 has been finished, the display mode switching processing is ended.
  • the display mode switching processing appropriately switches the display mode between the synchronous display mode and the asynchronous display mode. More specifically, just after execution of the image display program has been started, the display mode is the asynchronous display mode, and the latest real world image is constantly displayed on the upper LCD 22 . Thereafter, at the time when the marker 61 has been recognized and it has become possible to display the virtual object 62 being superimposed onto the real world image displayed on the upper LCD 22 , the display mode is changed from the asynchronous display mode to the synchronous display mode.
  • In the synchronous display mode, in order to superimpose and display the virtual object 62 at a proper position in the real world image, a composite image obtained by superimposing the virtual object 62 onto the last real world image in which the marker 61 has been recognized (which is not always the latest real world image) is displayed on the upper LCD 22.
  • Thereafter, if a state in which the marker 61 cannot be recognized continues for a predetermined period (until the value of the SyncCount reaches 0), the display mode is changed from the synchronous display mode to the asynchronous display mode, and the latest real world image is constantly displayed on the upper LCD 22. In this way, when the marker 61 continues to be unrecognized, it is possible to prevent an old image from continuing to be displayed on the upper LCD 22.
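  • The display mode switching described above amounts to a small state machine with a grace counter, as in the following C++ sketch (an illustration with assumed names; the additional resets performed in step S 95, such as LogIndex, EyeWidth, EyeDepth, and the marker recognition mode, are reduced to a comment):

    enum class DisplayMode { Synchronous, Asynchronous };

    struct DisplayModeSwitcher {
        DisplayMode mode      = DisplayMode::Asynchronous;
        int         syncCount = 0;   // SyncCount

        // eyeWidthKnown: EyeWidth > 0; markerVisible: marker recognition succeeded for
        // the left or the right real world image.
        void update(bool eyeWidthKnown, bool markerVisible) {
            const bool trackingOk = eyeWidthKnown && markerVisible;
            if (mode == DisplayMode::Synchronous) {      // steps S 91 - S 95
                if (trackingOk) {
                    syncCount = 20;                      // step S 92: refresh the grace period
                } else if (--syncCount <= 0) {           // steps S 93 - S 94
                    mode = DisplayMode::Asynchronous;    // step S 95 (other state resets omitted)
                }
            } else if (trackingOk) {                     // steps S 96 - S 97
                syncCount = 20;
                mode = DisplayMode::Synchronous;
            }
        }
    };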
  • As described above, in the present embodiment, the position and the orientation of one of the virtual cameras are determined based on the position and the orientation of one of the outer imaging sections (the outer imaging section (left) 23 a or the outer imaging section (right) 23 b ) in the marker coordinate system, which are calculated from the result of the marker recognition for the corresponding one of the left real world image and the right real world image.
  • Then, the position and the orientation of the other one of the virtual cameras are determined such that the relationship between the position and the orientation of the left virtual camera 63 L and the position and the orientation of the right virtual camera 63 R is ideal. Therefore, the virtual object 62 can be displayed so as to provide a normal stereoscopic view, on a display apparatus capable of providing a stereoscopic view.
  • In addition, the interval between the two outer imaging sections is calculated based on the position of the outer imaging section (left) 23 a calculated based on the result of the marker recognition for the left real world image, and the position of the outer imaging section (right) 23 b calculated based on the result of the marker recognition for the right real world image.
  • The virtual cameras interval (EyeWidth) is determined based on the result of the calculation. A position separated, by the calculated interval in the direction perpendicular to the shooting direction in a coordinate system of one of the virtual cameras, from the position of the one of the virtual cameras is calculated, and the other one of the virtual cameras is set at the calculated position. As a result, both the virtual cameras are set so as to be placed along the direction perpendicular to the shooting direction.
  • Moreover, the marker recognition processing needs to be performed only for the main real world image. Therefore, the load of processing can be reduced in comparison with the case where the marker recognition processing is always performed for both the left real world image and the right real world image.
  • In addition, the interval (EyeWidth), between the outer imaging section (left) 23 a and the outer imaging section (right) 23 b in the marker coordinate system, that corresponds to the interval (for example, 3.5 cm) between the outer imaging section (left) 23 a and the outer imaging section (right) 23 b in the real world, is figured out based on the result of the virtual cameras interval determination processing.
  • Therefore, by using the EyeWidth, it is possible to perform processing on a scale of the real world, such as displaying a character (virtual object) having a height of 30 cm being superimposed onto the real world image, or moving a character (virtual object) displayed being superimposed onto the real world image at a velocity of 10 cm per second.
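  • For example, a length given in centimeters can be converted to marker-coordinate (virtual space) units using the ratio between the EyeWidth and the physical camera interval, as in the following minimal sketch (the 3.5 cm default is the example interval mentioned above; the function name is illustrative):

    double realCmToVirtualUnits(double lengthCm, double eyeWidth,
                                double realCameraIntervalCm = 3.5) {
        return lengthCm * (eyeWidth / realCameraIntervalCm);
    }

    // Example: the height of a character intended to appear 30 cm tall.
    // double characterHeight = realCmToVirtualUnits(30.0, eyeWidth);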
  • In the above embodiment, the position and the orientation of the marker 61 in the real world image are recognized, and the virtual object 62 is superimposed onto the real world image in accordance with the result of the recognition.
  • However, in other embodiments, the position and/or the orientation of any recognition target, instead of the marker 61, may be recognized, and the virtual object 62 may be superimposed onto the real world image in accordance with the result of the recognition.
  • An example of a recognition target is the face of a person.
  • In the above embodiment, a stereoscopic image is displayed on the upper LCD 22, based on the real world image being shot in real time by the outer imaging section 23. However, in other embodiments, a stereoscopic image may be displayed on the upper LCD 22, based on data of moving images shot in the past by the outer imaging section 23, an external stereo camera, or the like.
  • In the above embodiment, the outer imaging section 23 is provided in the game apparatus 10 in advance. However, in other embodiments, an external camera attachable to and detachable from the game apparatus 10 may be used.
  • In the above embodiment, the upper LCD 22 is provided in the game apparatus 10 in advance. However, in other embodiments, an external stereoscopic display attachable to and detachable from the game apparatus 10 may be used.
  • In the above embodiment, the virtual object 62 is placed at the position of the origin of the marker coordinate system. However, in other embodiments, the virtual object 62 may be placed at a position away from the origin of the marker coordinate system.
  • In the above embodiment, one virtual object is placed in the virtual space. However, in other embodiments, a plurality of virtual objects may be placed in the virtual space.
  • In the above embodiment, the interval (EyeWidth) between the outer imaging section (left) 23 a and the outer imaging section (right) 23 b in the marker coordinate system is calculated, and based on the interval, the position and the orientation of one of the left virtual camera 63 L and the right virtual camera 63 R are determined from the position and the orientation of the other one of the virtual cameras calculated based on the result of the marker recognition.
  • However, in other embodiments, the position and the orientation of the outer imaging section 23 may be calculated based on the position and the orientation of the outer imaging section (left) 23 a calculated based on the result of the marker recognition for the left real world image, and the position and the orientation of the outer imaging section (right) 23 b calculated based on the result of the marker recognition for the right real world image.
  • Then, based on the results of these calculations, the positions and/or the orientations of the left virtual camera 63 L and the right virtual camera 63 R may be determined.
  • the orientations of the left virtual camera 63 L and the right virtual camera 63 R may be determined such that both the orientations are intermediate between the orientation of the outer imaging section (left) 23 a calculated based on the result of the marker recognition for the left real world image, and the orientation of the outer imaging section (right) 23 b calculated based on the result of the marker recognition for the right real world image.
  • positions separated, by distances of EyeWidth/2 in the respective directions opposite to each other and perpendicular to the shooting direction of the virtual cameras, from a position in the virtual space corresponding to the average position of the position of the outer imaging section (left) 23 a calculated based on the result of the marker recognition for the left real world image, and the position of the outer imaging section (right) 23 b calculated based on the result of the marker recognition for the right real world image, may be calculated, and the positions of the left virtual camera 63 L and the right virtual camera 63 R may be determined so as to be the calculated positions.
  • In the above embodiment, the marker recognition processing is performed only for the main real world image. However, in other embodiments, the marker recognition processing may always be performed for both the left real world image and the right real world image.
  • In the above embodiment, the upper LCD 22 is a stereoscopic display apparatus of a parallax barrier type. However, in other embodiments, the upper LCD 22 may be a stereoscopic display apparatus of any other type, such as a lenticular lens type.
  • For example, in the case of a stereoscopic display apparatus of a lenticular lens type, an image for a left eye and an image for a right eye may be combined with each other by the CPU 311 or another processor, and the resultant composite image may be supplied to the stereoscopic display apparatus of a lenticular lens type.
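  • As an illustration of such combining (the actual pixel layout is display-specific; column interleaving is only one common arrangement and is an assumption here), the images for a left eye and a right eye could be merged as follows:

    #include <cstdint>
    #include <vector>

    std::vector<uint32_t> interleaveColumns(const std::vector<uint32_t>& left,
                                            const std::vector<uint32_t>& right,
                                            int width, int height) {
        std::vector<uint32_t> combined(static_cast<size_t>(width) * height);
        for (int y = 0; y < height; ++y) {
            for (int x = 0; x < width; ++x) {
                const size_t i = static_cast<size_t>(y) * width + x;
                // Even columns take the left-eye pixel, odd columns the right-eye pixel.
                combined[i] = (x % 2 == 0) ? left[i] : right[i];
            }
        }
        return combined;
    }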
  • In the above embodiment, the game apparatus 10 is used for superimposing a virtual object onto the real world image and displaying the resultant image. However, in other embodiments, any information processing apparatus or any information processing system (for example, a PDA (Personal Digital Assistant), a mobile phone, a personal computer, or a camera) may be used.
  • In the above embodiment, the image display processing is executed by using only one information processing apparatus (the game apparatus 10 ). However, in other embodiments, a plurality of information processing apparatuses, included in an image display system, that can communicate with each other may share the execution of the image display processing.

Abstract

First, the interval between a left virtual camera and a right virtual camera is determined based on the relative position and the relative orientation of an outer imaging section (left) with respect to a marker, which are calculated from a result of marker recognition in the left real world image, and on the relative position and the relative orientation of an outer imaging section (right) with respect to the marker, which are calculated from a result of marker recognition in the right real world image. After the interval between the left virtual camera and the right virtual camera is determined, the positions and the orientations of the left virtual camera and the right virtual camera are determined based on the determined interval such that the relationship between the position and the orientation of the left virtual camera, and the position and the orientation of the right virtual camera is ideal.

Description

    CROSS REFERENCE TO RELATED APPLICATION
  • The disclosures of Japanese Patent Application No. 2010-134545, filed on Jun. 11, 2010, Japanese Patent Application No. 2010-134575, filed on Jun. 11, 2010, and Japanese Patent Application No. 2010-135507, filed on Jun. 14, 2010 are incorporated herein by reference.
  • BACKGROUND OF THE INVENTION
  • 1. Field of the Invention
  • The present invention relates to a computer-readable storage medium, an image display apparatus, an image display system, and an image display method, and more particularly, to a computer-readable storage medium, an image display apparatus, an image display system, and an image display method for stereoscopically displaying a real world image onto which a 3-dimensional virtual object is superimposed, on a screen of a stereoscopic display apparatus capable of providing a stereoscopic view.
  • 2. Description of the Background Art
  • In recent years, development of an AR (Augmented Reality) technique has been advanced in which a virtual object is superimposed onto an image of a real world and the resultant image is displayed, whereby the virtual object appears to be present in the real world.
  • For example, a stereoscopic display apparatus disclosed in Patent Literature 1 (Japanese Laid-Open Patent Publication No. 2008-146109) calculates the relative positions and relative orientations of a camera for a right eye and a camera for a left eye with respect to a marker placed in the real world, from respective images shot by the camera for a right eye and the camera for a left eye, which are attached to a head mounted display, and then, based on the result of the calculation, generates an image of a virtual object for a right eye and an image for the virtual object for a left eye. Then, the stereoscopic display apparatus superimposes the image of the virtual object for a right eye and the image for the virtual object for a left eye onto the images shot by the camera for a right eye and the camera for a left eye, respectively, and displays the resultant composite images on an LCD (liquid crystal display) for a right eye and an LCD for a left eye, respectively.
  • In addition, Non-Patent Literature 1 (Hirokazu Kato, Mark Billinghurst, “Marker Tracking and HMD Calibration for a Video-Based Augmented Reality Conferencing System,” iwar, pp.85, 2nd IEEE and ACM International Workshop on Augmented Reality, 1999) discloses a method of, based on the position and the orientation of a marker in an image shot by a camera, calculating the relative positions and the relative orientations of the marker and the camera with respect to each other in the real world.
  • However, in the case where the relative positions and the relative orientations of the marker and the camera with respect to each other in the real world are calculated based on the position and the orientation of the marker in an image shot by the camera, if the image shot by the camera is blurred or if the accuracy of marker recognition is not perfect, error occurs in a result of calculation. Therefore, in the stereoscopic display apparatus disclosed in Patent Literature 1, the relative position and orientation of the camera for a right eye with respect to the marker calculated based on an image shot by the camera for a right eye are not always accurate, and also, the relative position and orientation of the camera for a left eye with respect to the marker calculated based on an image shot by the camera for a left eye are not always accurate. An image for a right eye including a virtual object, and an image for a left eye including the virtual object, that are generated based on a result of calculation that is not accurate, do not have consistency with each other, that is, they are inconsistent with each other. As a result, a user cannot normally stereoscopically view the virtual object.
  • SUMMARY OF THE INVENTION
  • Therefore, an object of the present invention is to provide a computer-readable storage medium, an image display apparatus, an image display system, and an image display method that are capable of displaying a virtual object so as to be normally stereoscopically visible.
  • The present invention has the following features to attain the object mentioned above.
  • A computer-readable storage medium of the present invention is a computer-readable storage medium having stored therein an image display program which stereoscopically displays a real world image onto which a 3-dimensional virtual object is superimposed, on a screen of a stereoscopic display apparatus capable of providing a stereoscopic view, by using outputs from a real camera for a right eye and a real camera for a left eye, the image display program causing a computer to function as first position orientation calculation means, virtual camera setting means, right virtual space image generation means, left virtual space image generation means, and display control means. The first position orientation calculation means recognizes a predetermined shooting target in a piece of real world image data outputted from one of the real camera for a right eye and the real camera for a left eye, among pieces of real world image data respectively outputted from the real camera for a right eye and the real camera for a left eye, and calculates position orientation information indicating relative positions and relative orientations of the one of the real cameras and the predetermined shooting target with respect to each other. The virtual camera setting means determines the position and the orientation of a right virtual camera for generating an image for a right eye, and the position and the orientation of a left virtual camera for generating an image for a left eye, in a predetermined virtual space, by using the position orientation information calculated by the first position orientation calculation means. The right virtual space image generation means generates a right virtual space image indicating the predetermined virtual space as it is looked at from the right virtual camera. The left virtual space image generation means generates a left virtual space image indicating the predetermined virtual space as it is looked at from the left virtual camera. The display control means superimposes the right virtual space image onto the piece of real world image data outputted from the real camera for a right eye, and superimposes the left virtual space image onto the piece of real world image data outputted from the real camera for a left eye, to output images for a stereoscopic view to the stereoscopic display apparatus.
  • It is noted that the virtual camera setting means may determine the position and the orientation of the right virtual camera for generating the image for a right eye, and the position and the orientation of the left virtual camera for generating the image for a left eye, in the predetermined virtual space, without using a result of recognition of the predetermined shooting target in the piece of real world image data outputted from the other one of the real cameras which is different from the one of the real camera in the first position orientation calculation means.
  • According to the above configuration, even if the predetermined shooting target can be recognized only in one of the two pieces of real world image data respectively outputted from the two real cameras, the position and the orientation of one of the two virtual cameras are determined based on the position orientation information calculated based on the one, of the two pieces of real world image data, in which the predetermined shooting target can be recognized, and the position and the orientation of the other one of the two virtual cameras are determined based on the position and the orientation of the one of the two virtual cameras which have been determined. Therefore, even if the predetermined shooting target can be recognized only in one of the two pieces of real world image data respectively outputted from the two real cameras, a virtual object can be appropriately displayed in a stereoscopic manner. In addition, even if the predetermined shooting target can be recognized in both the two pieces of real world image data respectively outputted from the two real cameras, a virtual object can be appropriately displayed in a stereoscopic manner merely by recognizing the predetermined shooting target in one of the two pieces of real world image data. Thus, a load of processing on the computer can be reduced.
  • In another preferable configuration, the virtual camera setting means may determine the positions and the orientations of the right virtual camera and the left virtual camera such that the relationship between the relative orientations of the right virtual camera and the left virtual camera with respect to each other is the same as the relationship between designed relative orientations of the real camera for a right eye and the real camera for a left eye with respect to each other.
  • According to the above configuration, even if the position orientation information calculated based on the piece of real world image data outputted from one of the two real cameras, and the position orientation information calculated based on the piece of real world image data outputted from the other one of the two real cameras do not accurately correspond to the respective relative orientations of the two real cameras because of error in recognition of the predetermined shooting target, error in attachment of the two real cameras, or the like, the relative orientations of the right virtual camera and the left virtual camera can be set appropriately. Therefore, a virtual object can be appropriately displayed in a stereoscopic manner.
  • In another preferable configuration, the virtual camera setting means may include: a first virtual camera orientation determination section for determining the orientation of one of the right virtual camera and the left virtual camera that corresponds to the one of the real cameras in the first position orientation calculation means, by using the position orientation information calculated by the first position orientation calculation means; and a second virtual camera orientation determination section for, based on the orientation of the one of the right virtual camera and the left virtual camera determined by the first virtual camera orientation determination section, determining the orientation of the other one of the right virtual camera and the left virtual camera such that the relationship between the relative orientations of the right virtual camera and the left virtual camera with respect to each other is the same as the relationship between designed relative orientations of the real camera for a right eye and the real camera for a left eye with respect to each other.
  • In another preferable configuration, the image display program may further cause the computer to function as virtual camera relative position relationship determination means for determining a relative position relationship between the right virtual camera and the left virtual camera. The virtual camera setting means may include: a first virtual camera position determination section for determining the position of one of the right virtual camera and the left virtual camera that corresponds to the one of the real cameras in the first position orientation calculation means, by using the position orientation information calculated by the first position orientation calculation means; and a second camera position determination section for determining the position of the other one of the right virtual camera and the left virtual camera so as to be separated, by the relative position determined by the virtual camera relative position relationship determination means, from the position of the one of the right virtual camera and the left virtual camera determined by the first virtual camera position determination section.
  • It is noted that the “relative position relationship” mentioned above may be the distance between the left virtual camera and the right virtual camera, or may be the relative position of one of the virtual cameras with respect to the other one of the virtual cameras.
  • It is noted that the virtual camera relative position relationship determination means may determine the relative position relationship between the right virtual camera and the left virtual camera, based on a result of recognition of the predetermined shooting target in the two pieces of real world image data respectively outputted from the two real cameras, or may determine the relative position relationship between the right virtual camera and the left virtual camera, based on the designed relative relationship between the real camera for a right eye and the real camera for a left eye.
  • According to the above configuration, even if the position orientation information calculated based on the piece of real world image data outputted from one of the two real cameras, and the position orientation information calculated based on the piece of real world image data outputted from the other one of the two real cameras do not accurately correspond to the relative position relationship between the two real cameras because of error in recognition of the predetermined shooting target, error in attachment of the two real cameras, or the like, the relative position relationship between the right virtual camera and the left virtual camera can be set appropriately. Therefore, a virtual object can be appropriately displayed in a stereoscopic manner.
  • In another preferable configuration, the virtual camera setting means may include: a first virtual camera position determination section for determining the position of one of the right virtual camera and the left virtual camera that corresponds to the one of the real cameras in the first position orientation calculation means, by using the position orientation information calculated by the first position orientation calculation means; a first virtual camera orientation determination section for determining the orientation of the one of the right virtual camera and the left virtual camera that corresponds to the one of the real cameras in the first position orientation calculation means, by using the position orientation information calculated by the first position orientation calculation means; and a second virtual camera position determination section for determining the position of the other one of the right virtual camera and the left virtual camera so as to be present, in a direction based on the orientation of the one of the right virtual camera and the left virtual camera determined by the first virtual camera orientation determination section, from the position of the one of the right virtual camera and the left virtual camera determined by the first virtual camera position determination section.
  • It is noted that the second virtual camera position determination section may determine the position of the other one of the right virtual camera and the left virtual camera so as to be present, in the lateral direction of the orientation of the one of the right virtual camera and the left virtual camera determined by the first virtual camera orientation determination section, from the position of the one of the right virtual camera and the left virtual camera determined by the first virtual camera position determination section.
  • In another preferable configuration, the image display program may further cause the computer to function as virtual cameras distance determination means for determining the distance between the right virtual camera and the left virtual camera. The second virtual camera position determination section may determine the position of the other one of the right virtual camera and the left virtual camera so as to be separated by the distance determined by the virtual cameras distance determination means, in the direction based on the orientation of the one of the right virtual camera and the left virtual camera determined by the first virtual camera orientation determination section, from the position of the one of the right virtual camera and the left virtual camera determined by the first virtual camera position determination section.
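  • The placement just described can be pictured as offsetting the second virtual camera along the lateral (local x) axis of the first one; the sketch below is only one plausible rendering of that idea, with an assumed 65 mm baseline and hypothetical names:

```python
import numpy as np

def place_other_virtual_camera(anchor_position: np.ndarray,
                               anchor_orientation: np.ndarray,
                               camera_distance: float) -> np.ndarray:
    # anchor_orientation is a 3x3 rotation whose first column is the anchor
    # camera's lateral (rightward) axis expressed in marker/world coordinates.
    lateral_axis = anchor_orientation[:, 0]
    return anchor_position + camera_distance * lateral_axis

# Left virtual camera placed from marker recognition; right virtual camera
# offset along the left camera's lateral axis by the determined distance.
left_position = np.array([0.0, 20.0, -300.0])
right_position = place_other_virtual_camera(left_position, np.eye(3), 65.0)
```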
  • In another preferable configuration, the virtual camera relative position relationship determination means may determine the relative position relationship between the right virtual camera and the left virtual camera, based on a parallax between the pieces of real world image data respectively outputted from the real camera for a right eye and the real camera for a left eye.
  • According to the above configuration, the relative position relationship between the right virtual camera and the left virtual camera is determined based on the parallax between the two pieces of real world image data. Therefore, even if the relative position relationship between the two real cameras is not known, or even if the relative position relationship between the two real cameras has error because of error in the attachment of the two real cameras, the right virtual camera and the left virtual camera can be set appropriately.
  • In another preferable configuration, the relative position relationship which is determined by the relative position relationship determination means may be the interval between the right virtual camera and the left virtual camera.
  • In another preferable configuration, the image display program may further cause the computer to function as second position orientation calculation means for recognizing the predetermined shooting target in the piece of real world image data outputted from the other one of the real cameras which is different from the one of the real cameras in the first position orientation calculation means, among the two pieces of real world image data respectively outputted from the two real cameras, and calculating position orientation information indicating relative positions and relative orientations of the other one of the real cameras and the predetermined shooting target with respect to each other. The virtual camera relative position relationship determination means may calculate the distance between the one of the real cameras and the other one of the real cameras, based on the information, about the relative positions of the one of the real cameras and the predetermined shooting target, which is calculated by the first position orientation calculation means, and on the information, about the relative positions of the other one of the real cameras and the predetermined shooting target, which is calculated by the second position orientation calculation means.
  • In another preferable configuration, the first position orientation calculation means may include first transform matrix generation means for, based on the piece of real world image data outputted from the one of the real cameras, generating a first transform matrix that transforms a coordinate value represented in a coordinate system whose origin is the position of the predetermined shooting target, into a coordinate value represented in a first shooting section coordinate system whose origin is the position of the one of the real cameras. The second position orientation calculation means may include second transform matrix generation means for, based on the piece of real world image data outputted from the other one of the real cameras, generating a second transform matrix that transforms a coordinate value represented in a coordinate system whose origin is the position of the predetermined shooting target, into a coordinate value represented in a second shooting section coordinate system whose origin is the position of the other one of the real cameras.
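  • A minimal sketch of how the distance between the two real cameras could be obtained from the two transform matrices described above (assuming homogeneous 4×4 marker-to-camera transforms; the identifiers are illustrative, not part of the embodiment):

```python
import numpy as np

def camera_position_in_marker_space(marker_to_camera: np.ndarray) -> np.ndarray:
    # marker_to_camera maps marker coordinates to the camera's coordinate
    # system; inverting it and taking the translation column gives the
    # camera's position expressed in the marker coordinate system.
    return np.linalg.inv(marker_to_camera)[:3, 3]

def inter_camera_distance(marker_to_left: np.ndarray,
                          marker_to_right: np.ndarray) -> float:
    left_pos = camera_position_in_marker_space(marker_to_left)
    right_pos = camera_position_in_marker_space(marker_to_right)
    return float(np.linalg.norm(right_pos - left_pos))
```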
  • In another preferable configuration, the virtual camera relative position relationship determination means may execute processing of calculating the relative position relationship between the right virtual camera and the left virtual camera every time a new piece of real world image data is outputted from each of the two real cameras, and may determine the relative position relationship between the right virtual camera and the left virtual camera, based on results of a plurality of the calculations of the relative position relationship obtained by the processing being executed a plurality of times.
  • According to the above configuration, influence of error in recognition of the predetermined shooting target in each piece of real world image data is reduced. Therefore, reliability of the relative position relationship between the right virtual camera and the left virtual camera determined by the virtual camera relative position relationship determination means is improved.
  • In another preferable configuration, the virtual camera relative position relationship determination means may determine the relative position relationship between the right virtual camera and the left virtual camera, based on the results of the plurality of calculations of the relative position relationship, only when all the results of the plurality of calculations of the relative position relationship are within a predetermined range.
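  • The acceptance rule described above (use the plural results only when they agree) could be sketched as follows; the tolerance value and the names are assumptions:

```python
def stable_average(samples, tolerance):
    """Return the average of the samples only when they all lie within the
    given tolerance of one another; otherwise return None so that the caller
    keeps collecting measurements."""
    if not samples:
        return None
    if max(samples) - min(samples) > tolerance:
        return None
    return sum(samples) / len(samples)

# Per-frame estimates of the interval between the virtual cameras.
interval = stable_average([64.2, 65.1, 64.8, 64.9], tolerance=2.0)
```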
  • According to the above configuration, reliability of the relative position relationship between the right virtual camera and the left virtual camera determined by the virtual camera relative position relationship determination means is improved.
  • In another preferable configuration, the image display program may further cause the computer to function as depth variation determination means for determining whether or not the depth distance from each of the two real cameras to the predetermined shooting target has varied beyond a predetermined range in comparison with the depth distance as it was when the virtual camera relative position relationship determination means determined the relative position relationship between the right virtual camera and the left virtual camera. The virtual camera relative position relationship determination means may determine again the relative position relationship between the right virtual camera and the left virtual camera, when the result of the determination by the depth variation determination means is positive.
  • The magnitude of influence of error in attachment of the two real cameras varies depending on the depth distance from each of the two real cameras to the predetermined shooting target. According to the above configuration, the influence of error in attachment of the two real cameras can appropriately be corrected as necessary at any time.
  • In another preferable configuration, the image display program may further cause the computer to function as depth distance calculation means for calculating the depth distance from each of the two real cameras to the predetermined shooting target, based on a piece of real world image data that includes the predetermined shooting target. The depth variation determination means may determine whether or not the depth distance from each of the two real cameras to the predetermined shooting target has varied beyond the predetermined range by comparing a reference depth distance calculated by the depth distance calculation means at a time when the virtual camera relative position relationship determination means determined the relative position relationship between the right virtual camera and the left virtual camera, with the latest depth distance calculated by the depth distance calculation means after the calculation of the reference depth distance.
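  • One plausible form of the re-determination trigger described above is a relative comparison between the reference depth distance and the latest depth distance (the ratio-based test and the threshold below are assumptions):

```python
def needs_redetermination(reference_depth: float,
                          latest_depth: float,
                          allowed_ratio: float = 0.2) -> bool:
    """Return True when the depth distance to the shooting target has varied
    beyond the predetermined range since the relative position relationship
    of the virtual cameras was last determined."""
    if reference_depth <= 0.0:
        return True
    return abs(latest_depth - reference_depth) / reference_depth > allowed_ratio
```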
  • In another preferable configuration, the depth distance calculation means may execute processing of calculating the depth distance from each of the two real cameras to the predetermined shooting target every time a new piece of real world image data is outputted from each of the two real cameras, and may calculate the reference depth distance, based on results of a plurality of the calculations of the depth distance obtained by the processing being executed a plurality of times.
  • According to the above configuration, influence of error in recognition of the predetermined shooting target in each piece of real world image data is reduced. Therefore, reliability of the depth distance from each of the two real cameras to the predetermined shooting target calculated by the depth distance calculation means is improved.
  • In another preferable configuration, the depth distance calculation means may determine the reference depth distance, based on the results of the plurality of calculations of the depth distance, only when all the results of the plurality of calculations of the depth distance are within a predetermined range.
  • According to the above configuration, reliability of the depth distance from each of the two real cameras to the predetermined shooting target calculated by the depth distance calculation means is improved.
  • In another preferable configuration, the image display program may further cause the computer to function as: second position orientation calculation means for recognizing the predetermined shooting target in the piece of real world image data outputted from the other one of the real cameras which is different from the one of the real cameras in the first position orientation calculation means, among the two pieces of real world image data respectively outputted from the two real cameras, and calculating position orientation information indicating relative positions and relative orientations of the other one of the real cameras and the predetermined shooting target with respect to each other; and real camera selection means for selecting one of the two real cameras in accordance with the position of the predetermined shooting target in at least one of the two pieces of real world image data respectively outputted from the two real cameras. The virtual camera setting means may, if the real camera selected by the real camera selection means is the one of the real cameras, determine the position and the orientation of the right virtual camera, and the position and the orientation of the left virtual camera, by using the position orientation information indicating the relative positions and relative orientations of the one of the real cameras and the predetermined shooting target, which is calculated by the first position orientation calculation means, and if the real camera selected by the real camera selection means is the other one of the real cameras, determine the position and the orientation of the right virtual camera, and the position and the orientation of the left virtual camera, by using the position orientation information indicating the relative positions and relative orientations of the other one of the real cameras and the predetermined shooting target, which is calculated by the second position orientation calculation means.
  • According to the above configuration, even if the predetermined shooting target is out of a shooting area of one of the real cameras, the right virtual camera and the left virtual camera can be set appropriately. Therefore, it is possible to keep stereoscopically displaying a virtual object.
  • In another preferable configuration, the real camera selection means may switch the real camera for a left eye to the real camera for a right eye, in accordance with entrance of the position of the predetermined shooting target in the piece of real world image data outputted from the real camera for a left eye of the two real cameras into a right end area of the piece of real world image data, and may switch the real camera for a right eye to the real camera for a left eye, in accordance with entrance of the position of the predetermined shooting target in the piece of real world image data outputted from the real camera for a right eye of the two real cameras into a left end area of the piece of real world image data.
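  • The switching rule above amounts to a hysteresis on the horizontal position of the shooting target in the selected image; a sketch (the edge-area width is an assumed parameter):

```python
def select_main_real_camera(current: str,
                            marker_x: float,
                            image_width: float,
                            edge_margin: float = 0.1) -> str:
    """current is 'left' or 'right'; marker_x is the horizontal position of
    the shooting target in the currently selected camera's image."""
    if current == 'left' and marker_x > image_width * (1.0 - edge_margin):
        return 'right'   # target entered the right end area of the left image
    if current == 'right' and marker_x < image_width * edge_margin:
        return 'left'    # target entered the left end area of the right image
    return current
```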
  • In another preferable configuration, the pieces of real world image data to be used by the first position orientation calculation means and the display control means may be outputted in real time from the real cameras.
  • According to the above configuration, it is possible to make a virtual object on the screen of the stereoscopic display apparatus appear as if the virtual object were present in real time in the real world.
  • In another preferable configuration, the computer may be included in an information processing apparatus having the two real cameras and the stereoscopic display apparatus.
  • An image display apparatus according to the present invention stereoscopically displays a real world image onto which a 3-dimensional virtual object is superimposed, on a screen of a stereoscopic display apparatus capable of providing a stereoscopic view, by using outputs from a real camera for a right eye and a real camera for a left eye, and comprises first position orientation calculation means, virtual camera setting means, right virtual space image generation means, left virtual space image generation means, and display control means. The first position orientation calculation means recognizes a predetermined shooting target in a piece of real world image data outputted from one of the real camera for a right eye and the real camera for a left eye, among pieces of real world image data respectively outputted from the real camera for a right eye and the real camera for a left eye, and calculates position orientation information indicating relative positions and relative orientations of the one of the real cameras and the predetermined shooting target with respect to each other. The virtual camera setting means determines the position and the orientation of a right virtual camera for generating an image for a right eye, and the position and the orientation of a left virtual camera for generating an image for a left eye, in a predetermined virtual space, by using the position orientation information calculated by the first position orientation calculation means. The right virtual space image generation means generates a right virtual space image indicating the predetermined virtual space as it is looked at from the right virtual camera. The left virtual space image generation means generates a left virtual space image indicating the predetermined virtual space as it is looked at from the left virtual camera. The display control means superimposes the right virtual space image onto the piece of real world image data outputted from the real camera for a right eye, and superimposes the left virtual space image onto the piece of real world image data outputted from the real camera for a left eye, to output images for a stereoscopic view to the stereoscopic display apparatus.
  • An image display system according to the present invention stereoscopically displays a real world image onto which a 3-dimensional virtual object is superimposed, on a screen of a stereoscopic display apparatus capable of providing a stereoscopic view, by using outputs from a real camera for a right eye and a real camera for a left eye, and comprises a predetermined shooting target, first position orientation calculation means, virtual camera setting means, right virtual space image generation means, left virtual space image generation means, and display control means. The first position orientation calculation means recognizes the predetermined shooting target in a piece of real world image data outputted from one of the real camera for a right eye and the real camera for a left eye, among pieces of real world image data respectively outputted from the real camera for a right eye and the real camera for a left eye, and calculates position orientation information indicating relative positions and relative orientations of the one of the real cameras and the predetermined shooting target with respect to each other. The virtual camera setting means determines the position and the orientation of a right virtual camera for generating an image for a right eye, and the position and the orientation of a left virtual camera for generating an image for a left eye, in a predetermined virtual space, by using the position orientation information calculated by the first position orientation calculation means. The right virtual space image generation means generates a right virtual space image indicating the predetermined virtual space as it is looked at from the right virtual camera. The left virtual space image generation means generates a left virtual space image indicating the predetermined virtual space as it is looked at from the left virtual camera. The display control means superimposes the right virtual space image onto the piece of real world image data outputted from the real camera for a right eye, and superimposes the left virtual space image onto the piece of real world image data outputted from the real camera for a left eye, to output images for a stereoscopic view to the stereoscopic display apparatus.
  • An image display method according to the present invention stereoscopically displays a real world image onto which a 3-dimensional virtual object is superimposed, on a screen of a stereoscopic display apparatus capable of providing a stereoscopic view, by using outputs from a real camera for a right eye and a real camera for a left eye, and comprises a first position orientation calculation step, a virtual camera setting step, a right virtual space image generation step, a left virtual space image generation step, and a display control step. The first position orientation calculation step recognizes a predetermined shooting target in a piece of real world image data outputted from one of the real camera for a right eye and the real camera for a left eye, among pieces of real world image data respectively outputted from the real camera for a right eye and the real camera for a left eye, and calculates position orientation information indicating relative positions and relative orientations of the one of the real cameras and the predetermined shooting target with respect to each other. The virtual camera setting step determines the position and the orientation of a right virtual camera for generating an image for a right eye, and the position and the orientation of a left virtual camera for generating an image for a left eye, in a predetermined virtual space, by using the position orientation information calculated in the first position orientation calculation step. The right virtual space image generation step generates a right virtual space image indicating the predetermined virtual space as it is looked at from the right virtual camera. The left virtual space image generation step generates a left virtual space image indicating the predetermined virtual space as it is looked at from the left virtual camera. The display control step superimposes the right virtual space image onto the piece of real world image data outputted from the real camera for a right eye, and superimposes the left virtual space image onto the piece of real world image data outputted from the real camera for a left eye, to output images for a stereoscopic view to the stereoscopic display apparatus.
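  • Taken together, the steps of the method can be outlined per frame as in the sketch below; the marker recognition, virtual camera setting, rendering, and compositing routines are passed in as placeholders because their implementations are not prescribed here:

```python
def display_frame(left_image, right_image, stereo_display,
                  recognize_marker, set_virtual_cameras,
                  render_virtual_space, composite):
    # 1. First position orientation calculation: recognize the shooting
    #    target in the image from one real camera (the left one is assumed).
    pose = recognize_marker(left_image)
    if pose is None:
        stereo_display(left_image, right_image)   # real world images only
        return

    # 2. Virtual camera setting: both virtual cameras from the single pose.
    left_cam, right_cam = set_virtual_cameras(pose)

    # 3. Right/left virtual space image generation.
    left_virtual = render_virtual_space(left_cam)
    right_virtual = render_virtual_space(right_cam)

    # 4. Display control: superimpose each virtual space image onto the
    #    matching real world image and output the pair for a stereoscopic view.
    stereo_display(composite(left_image, left_virtual),
                   composite(right_image, right_virtual))
```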
  • According to the present invention, it is possible to display a virtual object so as to be normally stereoscopically visible.
  • These and other objects, features, aspects and advantages of the present invention will become more apparent from the following detailed description of the present invention when taken in conjunction with the accompanying drawings.
  • BRIEF DESCRIPTION OF THE DRAWINGS
  • FIG. 1 is a front view of a game apparatus 10 in its opened state;
  • FIG. 2 is a side view of the game apparatus 10 in its opened state;
  • FIG. 3 shows a left side view, a front view, a right side view, and a rear view of the game apparatus 10 in its closed state;
  • FIG. 4 is a cross-sectional view of an upper housing 21 shown in FIG. 1 taken along a line A-A′;
  • FIG. 5A is a diagram illustrating a state in which a slider 25 a of a 3D adjustment switch 25 is positioned at the lowermost position (a third position);
  • FIG. 5B is a diagram illustrating a state in which the slider 25 a of the 3D adjustment switch 25 is positioned at a position (a first position) above the lowermost position;
  • FIG. 5C is a diagram illustrating a state in which the slider 25 a of the 3D adjustment switch 25 is positioned at the uppermost position (a second position);
  • FIG. 6 is a block diagram illustrating an internal configuration of the game apparatus 10;
  • FIG. 7 shows an example of a stereoscopic image displayed on a screen of an upper LCD 22;
  • FIG. 8 shows another example of a stereoscopic image displayed on the screen of the upper LCD 22;
  • FIG. 9 shows a marker 61;
  • FIG. 10 shows another example of a stereoscopic image displayed on the screen of the upper LCD 22;
  • FIG. 11 shows a memory map of a main memory 32 of the game apparatus 10;
  • FIG. 12 shows an example of variables stored in the main memory 32;
  • FIG. 13 is a flowchart showing a flow of marker processing;
  • FIG. 14 is a flowchart showing a flow of main processing;
  • FIG. 15 is a flowchart showing the details of updating processing;
  • FIG. 16 is a flowchart showing the details of virtual cameras interval determination processing;
  • FIG. 17 is a flowchart showing the details of view matrix generation processing;
  • FIG. 18 is a flowchart showing the details of main real world image setting processing;
  • FIG. 19 is a flowchart showing the details of display mode switching processing;
  • FIG. 20 shows an example of a left real world image and a right real world image;
  • FIG. 21 shows the position and the orientation of a left virtual camera 63L calculated in accordance with a result of marker recognition processing;
  • FIG. 22 shows the position and the orientation of a right virtual camera 63R calculated in accordance with a result of marker recognition processing;
  • FIG. 23 shows an area cut from the left real world image, based on a stereoscopic view zero distance;
  • FIG. 24 shows an area cut from the right real world image, based on the stereoscopic view zero distance;
  • FIG. 25 shows a position relationship between a virtual object 62 and a left virtual camera 63L;
  • FIG. 26 shows a method of generating an image for a left eye;
  • FIG. 27 shows a method of calculating coordinates of a right virtual camera 63R represented in a left virtual camera coordinate system;
  • FIG. 28 shows ew and ed;
  • FIG. 29 shows a method of calculating coordinates of the left virtual camera 63L and coordinates of the right virtual camera 63R represented in a marker coordinate system;
  • FIG. 30 shows the position of the right virtual camera 63R determined based on the position and the orientation of the left virtual camera 63L;
  • FIG. 31 shows the position of the left virtual camera 63L determined based on the position and the orientation of the right virtual camera 63R;
  • FIG. 32 shows |V.z|;
  • FIG. 33 is a diagram illustrating a reason for calculating EyeWidth again;
  • FIG. 34 is a diagram illustrating the reason for calculating EyeWidth again;
  • FIG. 35 is a diagram illustrating the reason for calculating EyeWidth again;
  • FIG. 36 is a diagram illustrating the reason for calculating EyeWidth again;
  • FIG. 37 is a diagram illustrating a left projection matrix;
  • FIG. 38 is a diagram illustrating a right projection matrix;
  • FIG. 39 is a diagram illustrating a method of determining whether or not to switch a main real world image from the left real world image to the right real world image; and
  • FIG. 40 is a diagram illustrating a method of determining whether or not to switch the main real world image from the right real world image to the left real world image.
  • DESCRIPTION OF THE PREFERRED EMBODIMENTS
  • (Structure of Game Apparatus)
  • Hereinafter, a game apparatus according to one embodiment of the present invention will be described. FIG. 1 to FIG. 3 are each a plan view of an outer appearance of a game apparatus 10. The game apparatus 10 is a hand-held game apparatus, and is configured to be foldable as shown in FIG. 1 to FIG. 3. FIG. 1 and FIG. 2 show the game apparatus 10 in an opened state, and FIG. 3 shows the game apparatus 10 in a closed state. FIG. 1 is a front view of the game apparatus 10 in the opened state, and FIG. 2 is a right side view of the game apparatus 10 in the opened state. The game apparatus 10 is able to shoot an image by means of an imaging section, display the shot image on a screen, and store data of the shot image. The game apparatus 10 can execute a game program which is stored in an exchangeable memory card or a game program which is received from a server or another game apparatus, and can display, on the screen, an image generated by computer graphics processing, such as an image shot by a virtual camera set in a virtual space, for example.
  • Initially, an external structure of the game apparatus 10 will be described with reference to FIG. 1 to FIG. 3. The game apparatus 10 includes a lower housing 11 and an upper housing 21 as shown in FIG. 1 to FIG. 3. The lower housing 11 and the upper housing 21 are connected to each other so as to be openable and closable (foldable). In the present embodiment, the lower housing 11 and the upper housing 21 are each formed in a horizontally long plate-like rectangular shape, and are connected to each other at long side portions thereof so as to be pivotable with respect to each other.
  • As shown in FIG. 1 and FIG. 2, projections 11A each of which projects in a direction orthogonal to an inner side surface (main surface) 11B of the lower housing 11 are provided at the upper long side portion of the lower housing 11, whereas a projection 21A which projects from the lower side surface of the upper housing 21 in a direction orthogonal to the lower side surface of the upper housing 21 is provided at the lower long side portion of the upper housing 21. Since the projections 11A of the lower housing 11 and the projection 21A of the upper housing 21 are connected to each other, the lower housing 11 and the upper housing 21 are foldably connected to each other.
  • (Description of Lower Housing)
  • Initially, a structure of the lower housing 11 will be described. As shown in FIG. 1 to FIG. 3, in the lower housing 11, a lower LCD (Liquid Crystal Display) 12, a touch panel 13, operation buttons 14A to 14L (FIG. 1, FIG. 3), an analog stick 15, an LED 16A and an LED 16B, an insertion opening 17, and a microphone hole 18 are provided. Hereinafter, these components will be described in detail.
  • As shown in FIG. 1, the lower LCD 12 is accommodated in the lower housing 11. The lower LCD 12 has a horizontally long shape, and is located such that a long side direction thereof corresponds to a long side direction of the lower housing 11. The lower LCD 12 is positioned at the center of the lower housing 11. The lower LCD 12 is provided on the inner side surface (main surface) of the lower housing 11, and a screen of the lower LCD 12 is exposed at an opening of the lower housing 11. When the game apparatus 10 is not used, the game apparatus 10 is in the closed state, thereby preventing the screen of the lower LCD 12 from becoming unclean and damaged. The number of pixels of the lower LCD 12 may be, for example, 256 dots×192 dots (width×height). The lower LCD 12 is a display device for displaying an image in a planar manner (not in a stereoscopically visible manner), which is different from the upper LCD 22 as described below. Although an LCD is used as a display device in the present embodiment, any other display device such as a display device using an EL (Electro Luminescence), or the like may be used. In addition, a display device having any resolution may be used as the lower LCD 12.
  • As shown in FIG. 1, the game apparatus 10 includes the touch panel 13 as an input device. The touch panel 13 is mounted on the screen of the lower LCD 12. In the present embodiment, the touch panel 13 may be, but is not limited to, a resistive film type touch panel. A touch panel of any type such as electrostatic capacitance type may be used. In the present embodiment, the touch panel 13 has the same resolution (detection accuracy) as that of the lower LCD 12. However, the resolution of the touch panel 13 and the resolution of the lower LCD 12 may not necessarily be the same. Further, the insertion opening 17 (indicated by dashed line in FIG. 1 and FIG. 3( d)) is provided on the upper side surface of the lower housing 11. The insertion opening 17 is used for accommodating a touch pen 28 which is used for performing an operation on the touch panel 13. Although an input on the touch panel 13 is usually made by using the touch pen 28, a finger of a user may be used for making an input on the touch panel 13, besides the touch pen 28.
  • The operation buttons 14A to 14L are each an input device for making a predetermined input. As shown in FIG. 1, among operation buttons 14A to 14L, a cross button 14A (a direction input button 14A), a button 14B, a button 14C, a button 14D, a button 14E, a power button 14F, a selection button 14J, a HOME button 14K, and a start button 14L are provided on the inner side surface (main surface) of the lower housing 11. The cross button 14A is cross-shaped, and includes buttons for indicating an upward, a downward, a leftward, or a rightward direction. The button 14B, button 14C, button 14D, and button 14E are positioned so as to form a cross shape. The buttons 14A to 14E, the selection button 14J, the HOME button 14K, and the start button 14L are assigned functions, respectively, in accordance with a program executed by the game apparatus 10, as necessary. For example, the cross button 14A is used for selection operation and the like, and the operation buttons 14B to 14E are used for, for example, determination operation and cancellation operation. The power button 14F is used for powering the game apparatus 10 on/off.
  • The analog stick 15 is a device for indicating a direction, and is provided to the left of the lower LCD 12 in an upper portion of the inner side surface of the lower housing 11. As shown in FIG. 1, the cross button 14A is provided to the left of the lower LCD 12 in the lower portion of the lower housing 11. That is, the analog stick 15 is provided above the cross button 14A. The analog stick 15 and the cross button 14A are positioned so as to be operated by a thumb of a left hand with which the lower housing is held. Further, the analog stick 15 is provided in the upper area, and thus the analog stick 15 is positioned such that a thumb of a left hand with which the lower housing 11 is held is naturally positioned on the position of the analog stick 15, and the cross button 14A is positioned such that the thumb of the left hand is positioned on the position of the cross button 14A when the thumb of the left hand is slightly moved downward from the analog stick 15. The analog stick 15 has a top, corresponding to a key, which slides parallel to the inner side surface of the lower housing 11. The analog stick 15 acts in accordance with a program executed by the game apparatus 10. For example, when a game in which a predetermined object appears in a 3-dimensional virtual space is executed by the game apparatus 10, the analog stick 15 acts as an input device for moving the predetermined object in the 3-dimensional virtual space. In this case, the predetermined object is moved in a direction in which the top corresponding to the key of the analog stick 15 slides. As the analog stick 15, a component which enables an analog input by being tilted by a predetermined amount, in any direction, such as the upward, the downward, the rightward, the leftward, or the diagonal direction, may be used.
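  • As a simple illustration of that behavior (a sketch only; the mapping is an assumption, not part of the embodiment), the stick deflection can be translated directly into a displacement of the object in the virtual space:

```python
def move_object(position, stick_x, stick_y, speed):
    # Move the object in the direction in which the analog stick's top
    # slides; stick_x and stick_y are deflections in the range [-1, 1].
    x, y, z = position
    return (x + stick_x * speed, y, z + stick_y * speed)

new_position = move_object((0.0, 0.0, 0.0), stick_x=0.5, stick_y=-1.0, speed=2.0)
```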
  • Four buttons, that is, the button 14B, the button 14C, the button 14D, and the button 14E, which are positioned so as to form a cross shape, are positioned such that a thumb of a right hand with which the lower housing 11 is held is naturally positioned on the positions of the four buttons. Further, the four buttons and the analog stick 15 sandwich the lower LCD 12, so as to be bilaterally symmetrical in position with respect to each other. Thus, depending on a game program, for example, a left-handed person can make a direction instruction input by using these four buttons.
  • Further, the microphone hole 18 is provided on the inner side surface of the lower housing 11. Under the microphone hole 18, a microphone (see FIG. 6) is provided as a sound input device described below, and the microphone detects a sound from the outside of the game apparatus 10.
  • FIG. 3( a) is a left side view of the game apparatus 10 in the closed state. FIG. 3( b) is a front view of the game apparatus 10 in the closed state. FIG. 3( c) is a right side view of the game apparatus 10 in the closed state. FIG. 3( d) is a rear view of the game apparatus 10 in the closed state. As shown in FIG. 3( b) and FIG. 3( d), an L button 14G and an R button 14H are provided on the upper side surface of the lower housing 11. The L button 14G is positioned on the left end portion of the upper side surface of the lower housing 11 and the R button 14H is positioned on the right end portion of the upper side surface of the lower housing 11. The L button 14G and the R button 14H can function as shutter buttons (imaging instruction buttons) of the imaging section, for example. Further, as shown in FIG. 3( a), a sound volume button 14I is provided on the left side surface of the lower housing 11. The sound volume button 14I is used for adjusting a sound volume of a speaker of the game apparatus 10.
  • As shown in FIG. 3( a), a cover section 11C is provided on the left side surface of the lower housing 11 so as to be openable and closable. Inside the cover section 11C, a connector (not shown) is provided for electrically connecting between the game apparatus 10 and an external data storage memory 45. The external data storage memory 45 is detachably connected to the connector. The external data storage memory 45 is used for, for example, recording (storing) data of an image shot by the game apparatus 10. The connector and the cover section 11C may be provided on the right side surface of the lower housing 11.
  • Further, as shown in FIG. 3( d), an insertion opening 11D through which an external memory 44 having a game program stored therein is inserted is provided on the upper side surface of the lower housing 11. A connector (not shown) for electrically connecting between the game apparatus 10 and the external memory 44 in a detachable manner is provided inside the insertion opening 11D. A predetermined game program is executed by connecting the external memory 44 to the game apparatus 10. The connector and the insertion opening 11D may be provided on another side surface (for example, the right side surface) of the lower housing 11.
  • Further, as shown in FIG. 1 and FIG. 3( c), a first LED 16A for notifying a user of an ON/OFF state of a power supply of the game apparatus 10 is provided on the lower side surface of the lower housing 11, and a second LED 16B for notifying a user of an establishment state of a wireless communication of the game apparatus 10 is provided on the right side surface of the lower housing 11. The game apparatus 10 can make wireless communication with other devices, and the second LED 16B is lit up when the wireless communication is established. The game apparatus 10 has a function of connecting to a wireless LAN in a method based on, for example, IEEE 802.11b/g standard. A wireless switch 19 for enabling/disabling the function of the wireless communication is provided on the right side surface of the lower housing 11 (see FIG. 3( c)).
  • A rechargeable battery (not shown) acting as a power supply for the game apparatus 10 is accommodated in the lower housing 11, and the battery can be charged through a terminal provided on a side surface (for example, the upper side surface) of the lower housing 11.
  • (Description of Upper Housing)
  • Next, a structure of the upper housing 21 will be described. As shown in FIG. 1 to FIG. 3, in the upper housing 21, an upper LCD (Liquid Crystal Display) 22, an outer imaging section 23 (an outer imaging section (left) 23 a and an outer imaging section (right) 23 b), an inner imaging section 24, a 3D adjustment switch 25, and a 3D indicator 26 are provided. Hereinafter, these components will be described in detail.
  • As shown in FIG. 1, the upper LCD 22 is accommodated in the upper housing 21. The upper LCD 22 has a horizontally long shape, and is located such that a long side direction thereof corresponds to a long side direction of the upper housing 21. The upper LCD 22 is positioned at the center of the upper housing 21. The area of a screen of the upper LCD 22 is set so as to be greater than the area of the screen of the lower LCD 12. Further, the screen of the upper LCD 22 is horizontally elongated as compared to the screen of the lower LCD 12. Specifically, a rate of the horizontal width in the aspect ratio of the screen of the upper LCD 22 is set so as to be greater than a rate of the horizontal width in the aspect ratio of the screen of the lower LCD 12.
  • The screen of the upper LCD 22 is provided on the inner side surface (main surface) 21B of the upper housing 21, and the screen of the upper LCD 22 is exposed at an opening of the upper housing 21. Further, as shown in FIG. 2, the inner side surface of the upper housing 21 is covered with a transparent screen cover 27. The screen cover 27 protects the screen of the upper LCD 22, and integrates the upper LCD 22 and the inner side surface of the upper housing 21 with each other, thereby achieving unity. The number of pixels of the upper LCD 22 may be, for example, 640 dots×200 dots (width×height). Although, in the present embodiment, the upper LCD 22 is an LCD, a display device using an EL (Electro Luminescence), or the like may be used. In addition, a display device having any resolution may be used as the upper LCD 22.
  • The upper LCD 22 is a display device capable of displaying a stereoscopically visible image. Further, in the present embodiment, an image for a left eye and an image for a right eye are displayed by using substantially the same display area. Specifically, the upper LCD 22 may be a display device using a method in which the image for a left eye and the image for a right eye are alternately displayed in the horizontal direction in predetermined units (for example, every other line). Alternatively, a display device using a method in which the image for a left eye and the image for a right eye are alternately displayed for a predetermined time period may be used. Further, in the present embodiment, the upper LCD 22 is a display device capable of displaying an image which is stereoscopically visible with naked eyes. A lenticular lens type display device or a parallax barrier type display device is used which enables the image for a left eye and the image for a right eye, which are alternately displayed in the horizontal direction, to be separately viewed by the left eye and the right eye, respectively. In the present embodiment, the upper LCD 22 of a parallax barrier type is used. The upper LCD 22 displays, by using the image for a right eye and the image for a left eye, an image (a stereoscopic image) which is stereoscopically visible with naked eyes. That is, the upper LCD 22 allows a user to view the image for a left eye with her/his left eye, and the image for a right eye with her/his right eye by utilizing a parallax barrier, so that a stereoscopic image (a stereoscopically visible image) exerting a stereoscopic effect for a user can be displayed. Further, the upper LCD 22 may disable the parallax barrier. When the parallax barrier is disabled, an image can be displayed in a planar manner (it is possible to display a planar visible image which is different from a stereoscopically visible image as described above. Specifically, a display mode is used in which the same displayed image is viewed with a left eye and a right eye). Thus, the upper LCD 22 is a display device capable of switching between a stereoscopic display mode for displaying a stereoscopically visible image and a planar display mode (for displaying a planar visible image) for displaying an image in a planar manner. The switching of the display mode is performed by the 3D adjustment switch 25 described later.
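  • For the line-interleaved style of display mentioned above, the image for a left eye and the image for a right eye are combined by taking alternating vertical lines from each, and the parallax barrier directs each set of lines to the corresponding eye; a minimal sketch of the interleaving itself (independent of any particular hardware):

```python
import numpy as np

def interleave_columns(left_image: np.ndarray, right_image: np.ndarray) -> np.ndarray:
    # Even columns come from the image for a left eye, odd columns from the
    # image for a right eye (images given as H x W x 3 arrays of equal shape).
    assert left_image.shape == right_image.shape
    combined = left_image.copy()
    combined[:, 1::2] = right_image[:, 1::2]
    return combined

frame = interleave_columns(np.zeros((240, 400, 3), dtype=np.uint8),
                           np.full((240, 400, 3), 255, dtype=np.uint8))
```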
  • Two imaging sections (23 a and 23 b) provided on the outer side surface (the back surface reverse of the main surface on which the upper LCD 22 is provided) 21D of the upper housing 21 are generically referred to as the outer imaging section 23. The imaging directions of the outer imaging section (left) 23 a and the outer imaging section (right) 23 b are each the same as the outward normal direction of the outer side surface 21D. Further, these imaging sections are each designed so as to be positioned in a direction which is opposite to the normal direction of the display surface (inner side surface) of the upper LCD 22 by 180 degrees. Specifically, the imaging direction of the outer imaging section (left) 23 a and the imaging direction of the outer imaging section (right) 23 b are parallel to each other. The outer imaging section (left) 23 a and the outer imaging section (right) 23 b can be used as a stereo camera depending on a program executed by the game apparatus 10. Further, depending on a program, when any one of the two outer imaging sections (23 a and 23 b) is used alone, the outer imaging section 23 may be used as a non-stereo camera. Further, depending on a program, images shot by the two outer imaging sections (23 a and 23 b) may be combined with each other or may compensate for each other, thereby enabling imaging using an extended imaging range. In the present embodiment, the outer imaging section 23 is structured so as to include two imaging sections, that is, the outer imaging section (left) 23 a and the outer imaging section (right) 23 b. Each of the outer imaging section (left) 23 a and the outer imaging section (right) 23 b includes an imaging device, such as a CCD image sensor or a CMOS image sensor, having a common predetermined resolution, and a lens. The lens may have a zooming mechanism.
  • As indicated by dashed lines in FIG. 1 and by solid lines in FIG. 3( b), the outer imaging section (left) 23 a and the outer imaging section (right) 23 b forming the outer imaging section 23 are aligned so as to be parallel to the horizontal direction of the screen of the upper LCD 22. Specifically, the outer imaging section (left) 23 a and the outer imaging section (right) 23 b are positioned such that a straight line connecting between the two imaging sections is parallel to the horizontal direction of the screen of the upper LCD 22. Reference numerals 23 a and 23 b which are indicated as dashed lines in FIG. 1 represent the outer imaging section (left) 23 a and the outer imaging section (right) 23 b, respectively, which are positioned on the outer side surface reverse of the inner side surface of the upper housing 21. As shown in FIG. 1, when a user views the screen of the upper LCD 22 from the front thereof, the outer imaging section (left) 23 a is positioned to the left of the upper LCD 22 and the outer imaging section (right) 23 b is positioned to the right of the upper LCD 22. When a program for causing the outer imaging section 23 to function as a stereo camera is executed, the outer imaging section (left) 23 a shoots an image for a left eye, which is viewed by a left eye of a user, and the outer imaging section (right) 23 b shoots an image for a right eye, which is viewed by a right eye of the user. A distance between the outer imaging section (left) 23 a and the outer imaging section (right) 23 b is set so as to be approximately the same as a distance between both eyes of a person, that is, may be set so as to be within a range from 30 mm to 70 mm, for example. However, the distance between the outer imaging section (left) 23 a and the outer imaging section (right) 23 b is not limited to a distance within the range described above.
  • In the present embodiment, the outer imaging section (left) 23 a and the outer imaging section (right) 23 b are secured to the housing, and the imaging directions thereof cannot be changed.
  • Further, the outer imaging section (left) 23 a and the outer imaging section (right) 23 b are positioned to the left and to the right, respectively, of the upper LCD 22 (on the left side and the right side, respectively, of the upper housing 21) so as to be horizontally symmetrical with respect to the center of the upper LCD 22. Specifically, the outer imaging section (left) 23 a and the outer imaging section (right) 23 b are positioned so as to be symmetrical with respect to a line which divides the upper LCD 22 into two equal parts, that is, the left part and the right part. Further, the outer imaging section (left) 23 a and the outer imaging section (right) 23 b are positioned at positions which are reverse of positions above the upper edge of the screen of the upper LCD 22 and which are on the upper portion of the upper housing 21 in an opened state. Specifically, when the upper LCD 22 is projected on the outer side surface of the upper housing 21, the outer imaging section (left) 23 a and the outer imaging section (right) 23 b are positioned, on the outer side surface of the upper housing 21, at a position above the upper edge of the screen of the upper LCD 22 having been projected.
  • As described above, the two imaging sections (23 a and 23 b) of the outer imaging section 23 are positioned to the left and the right of the upper LCD 22 so as to be horizontally symmetrical with respect to the center of the upper LCD 22. Therefore, when a user views the upper LCD 22 from the front thereof, the imaging direction of the outer imaging section 23 can be the same as the direction of the line of sight of the user. Further, the outer imaging section 23 is positioned at a position reverse of a position above the upper edge of the screen of the upper LCD 22. Therefore, the outer imaging section 23 and the upper LCD 22 do not interfere with each other inside the upper housing 21. Therefore, the upper housing 21 may have a reduced thickness as compared to a case where the outer imaging section 23 is positioned on a position reverse of a position of the screen of the upper LCD 22.
  • The inner imaging section 24 is positioned on the inner side surface (main surface) 21B of the upper housing 21, and acts as an imaging section which has an imaging direction which is the same direction as the inward normal direction of the inner side surface. The inner imaging section 24 includes an imaging device, such as a CCD image sensor and a CMOS image sensor, having a predetermined resolution, and a lens. The lens may have a zooming mechanism.
  • As shown in FIG. 1, when the upper housing 21 is in the opened state, the inner imaging section 24 is positioned, on the upper portion of the upper housing 21, above the upper edge of the screen of the upper LCD 22. Further, in this state, the inner imaging section 24 is positioned at the horizontal center of the upper housing 21 (on a line which separates the upper housing 21 (the screen of the upper LCD 22) into two equal parts, that is, the left part and the right part). Specifically, as shown in FIG. 1 and FIG. 3( b), the inner imaging section 24 is positioned on the inner side surface of the upper housing 21 at a position reverse of the middle position between the left and the right imaging sections (the outer imaging section (left) 23 a and the outer imaging section (right) 23 b) of the outer imaging section 23. Specifically, when the left and the right imaging sections of the outer imaging section 23 provided on the outer side surface of the upper housing 21 are projected on the inner side surface of the upper housing 21, the inner imaging section 24 is positioned at the middle position between the left and the right imaging sections having been projected. The dashed line 24 indicated in FIG. 3( b) represents the inner imaging section 24 positioned on the inner side surface of the upper housing 21.
  • As described above, the inner imaging section 24 is used for shooting an image in the direction opposite to that of the outer imaging section 23. The inner imaging section 24 is positioned on the inner side surface of the upper housing 21 at a position reverse of the middle position between the left and the right imaging sections of the outer imaging section 23. Thus, when a user views the upper LCD 22 from the front thereof, the inner imaging section 24 can shoot an image of a face of the user from the front thereof. Further, the left and the right imaging sections of the outer imaging section 23 do not interfere with the inner imaging section 24 inside the upper housing 21, thereby enabling reduction of the thickness of the upper housing 21.
  • The 3D adjustment switch 25 is a slide switch, and is used for switching a display mode of the upper LCD 22 as described above. Further, the 3D adjustment switch 25 is used for adjusting the stereoscopic effect of a stereoscopically visible image (stereoscopic image) which is displayed on the upper LCD 22. As shown in FIG. 1 to FIG. 3, the 3D adjustment switch 25 is provided at the end portions of the inner side surface and the right side surface of the upper housing 21, and is positioned at a position at which the 3D adjustment switch 25 is visible to a user when the user views the upper LCD 22 from the front thereof. Further, an operation section of the 3D adjustment switch 25 projects on the inner side surface and the right side surface, and can be viewed and operated from both sides. All the switches other than the 3D adjustment switch 25 are provided on the lower housing 11.
  • FIG. 4 is a cross-sectional view of the upper housing 21 shown in FIG. 1 taken along a line A-A′. As shown in FIG. 4, a recessed portion 21C is formed at the right end portion of the inner side surface of the upper housing 21, and the 3D adjustment switch 25 is provided in the recessed portion 21C. The 3D adjustment switch 25 is provided so as to be visible from the front surface and the right side surface of the upper housing 21 as shown in FIG. 1 and FIG. 2. A slider 25 a of the 3D adjustment switch 25 is slidable to any position in a predetermined direction (the height direction), and a display mode of the upper LCD 22 is determined in accordance with the position of the slider 25 a.
  • FIG. 5A to FIG. 5C are each a diagram illustrating a state in which the slider 25 a of the 3D adjustment switch 25 slides. FIG. 5A is a diagram illustrating a state in which the slider 25 a of the 3D adjustment switch 25 is positioned at the lowermost position (a third position). FIG. 5B is a diagram illustrating a state in which the slider 25 a of the 3D adjustment switch 25 is positioned above the lowermost position (a first position). FIG. 5C is a diagram illustrating a state in which the slider 25 a of the 3D adjustment switch 25 is positioned at the uppermost position (a second position).
  • As shown in FIG. 5A, when the slider 25 a of the 3D adjustment switch 25 is positioned at the lowermost position (the third position), the upper LCD 22 is set to the planar display mode, and a planar image is displayed on the screen of the upper LCD 22 (the upper LCD 22 may remain set to the stereoscopic display mode, and the same image may be used for the image for a left eye and the image for a right eye, to perform planar display). On the other hand, when the slider 25 a is positioned between a position shown in FIG. 5B (a position (first position) above the lowermost position) and a position shown in FIG. 5C (the uppermost position (the second position)), the upper LCD 22 is set to the stereoscopic display mode. In this case, a stereoscopically visible image is displayed on the screen of the upper LCD 22. When the slider 25 a is positioned between the first position and the second position, a manner in which the stereoscopic image is visible is adjusted in accordance with the position of the slider 25 a. Specifically, an amount of deviation in the horizontal direction between a position of an image for a right eye and a position of an image for a left eye is adjusted in accordance with the position of the slider 25 a. The slider 25 a of the 3D adjustment switch 25 is configured so as to be fixed at the third position, and is slidable, along the height direction, to any position between the first position and the second position. For example, the slider 25 a is fixed at the third position by a projection (not shown) which projects, from the side surface of the 3D adjustment switch 25, in the lateral direction shown in FIG. 5A, and does not slide upward from the third position unless a predetermined force or a force greater than the predetermined force is applied upward. When the slider 25 a is positioned between the third position and the first position, the manner in which the stereoscopic image is visible is not adjusted, which is intended as a margin. In another embodiment, the third position and the first position may be the same position, and, in this case, no margin is provided. Further, the third position may be provided between the first position and the second position. In this case, a direction in which an amount of deviation in the horizontal direction between a position of an image for a right eye and a position of an image for a left eye is adjusted when the slider is moved from the third position toward the first position, is opposite to a direction in which an amount of deviation in the horizontal direction between the position of the image for the right eye and the position of the image for the left eye is adjusted when the slider is moved from the third position toward the second position.
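  • The adjustment described above can be thought of as mapping the slider position between the first position and the second position to an amount of horizontal deviation between the image for a left eye and the image for a right eye; the linear mapping and the maximum shift in the sketch below are assumptions:

```python
def horizontal_deviation(slider_position: float,
                         first_position: float,
                         second_position: float,
                         max_shift_pixels: float) -> float:
    """Map the 3D adjustment slider position to an amount of horizontal
    deviation; at or below the first position no deviation is applied."""
    if slider_position <= first_position:
        return 0.0
    t = (slider_position - first_position) / (second_position - first_position)
    return min(max(t, 0.0), 1.0) * max_shift_pixels

shift = horizontal_deviation(slider_position=0.7, first_position=0.2,
                             second_position=1.0, max_shift_pixels=12.0)
```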
  • The 3D indicator 26 indicates whether or not the upper LCD 22 is in the stereoscopic display mode. The 3D indicator 26 is implemented as an LED, and is lit up when the stereoscopic display mode of the upper LCD 22 is enabled. The 3D indicator 26 may be lit up only when the program processing for displaying a stereoscopically visible image is performed (namely, image processing in which an image for a left eye is different from an image for a right eye is performed in the case of the 3D adjustment switch being positioned between the first position and the second position) in a state where the upper LCD 22 is in the stereoscopic display mode. As shown in FIG. 1, the 3D indicator 26 is positioned near the screen of the upper LCD 22 on the inner side surface of the upper housing 21. Therefore, when a user views the screen of the upper LCD 22 from the front thereof, the user can easily view the 3D indicator 26, and thus can easily recognize the display mode of the upper LCD 22 even while viewing the screen.
  • Further, a speaker hole 21E is provided on the inner side surface of the upper housing 21. A sound is outputted through the speaker hole 21E from a speaker 43 described below.
  • (Internal Configuration of Game Apparatus 10)
  • Next, an internal electrical configuration of the game apparatus 10 will be described with reference to FIG. 6. FIG. 6 is a block diagram illustrating an internal configuration of the game apparatus 10. As shown in FIG. 6, the game apparatus 10 includes, in addition to the components described above, electronic components such as an information processing section 31, a main memory 32, an external memory interface (external memory I/F) 33, an external data storage memory I/F 34, an internal data storage memory 35, a wireless communication module 36, a local communication module 37, a real-time clock (RTC) 38, an acceleration sensor 39, a power supply circuit 40, an interface circuit (I/F circuit) 41, and the like. These electronic components are mounted on an electronic circuit substrate, and accommodated in the lower housing 11 (or the upper housing 21).
  • The information processing section 31 is information processing means which includes a CPU (Central Processing Unit) 311 for executing a predetermined program, a GPU (Graphics Processing Unit) 312 for performing image processing, and the like. The CPU 311 of the information processing section 31 executes a program stored in a memory (for example, the external memory 44 connected to the external memory I/F 33 or the internal data storage memory 35) inside the game apparatus 10, thereby executing processing (for example, shooting processing, or image display processing described later) corresponding to the program. The program executed by the CPU 311 of the information processing section 31 may be acquired from another device through communication with the other device. The information processing section 31 further includes a VRAM (Video RAM) 313. The GPU 312 of the information processing section 31 generates an image in accordance with an instruction from the CPU 311 of the information processing section 31, and renders the image in the VRAM 313. The GPU 312 of the information processing section 31 outputs the image rendered in the VRAM 313, to the upper LCD 22 and/or the lower LCD 12, and the image is displayed on the upper LCD 22 and/or the lower LCD 12.
  • To the information processing section 31, the main memory 32, the external memory I/F 33, the external data storage memory I/F 34, and the internal data storage memory 35 are connected. The external memory I/F 33 is an interface for detachably connecting to the external memory 44. The external data storage memory I/F 34 is an interface for detachably connecting to the external data storage memory 45.
  • The main memory 32 is volatile storage means used as a work area and a buffer area for (the CPU 311 of) the information processing section 31. That is, the main memory 32 temporarily stores various types of data used for the processing based on the program, and temporarily stores a program acquired from the outside (the external memory 44, another device, or the like), for example. In the present embodiment, for example, a PSRAM (Pseudo-SRAM) is used as the main memory 32.
• The external memory 44 is nonvolatile storage means for storing a program executed by the information processing section 31. The external memory 44 is implemented as, for example, a read-only semiconductor memory. When the external memory 44 is connected to the external memory I/F 33, the information processing section 31 can load a program stored in the external memory 44. The information processing section 31 executes the loaded program to perform predetermined processing. The external data storage memory 45 is implemented as a non-volatile readable and writable memory (for example, a NAND flash memory), and is used for storing predetermined data. For example, images shot by the outer imaging section 23 and/or images shot by another device are stored in the external data storage memory 45. When the external data storage memory 45 is connected to the external data storage memory I/F 34, the information processing section 31 loads an image stored in the external data storage memory 45, and the image can be displayed on the upper LCD 22 and/or the lower LCD 12.
  • The internal data storage memory 35 is implemented as a non-volatile readable and writable memory (for example, a NAND flash memory), and is used for storing predetermined data. For example, data and/or programs downloaded through the wireless communication module 36 by wireless communication are stored in the internal data storage memory 35.
• The wireless communication module 36 has a function of connecting to a wireless LAN by using a method based on, for example, the IEEE 802.11b/g standard. The local communication module 37 has a function of performing wireless communication with the same type of game apparatus in a predetermined communication method (for example, communication by a unique protocol, or infrared communication). The wireless communication module 36 and the local communication module 37 are connected to the information processing section 31. The information processing section 31 can perform data transmission to and data reception from another device via the Internet by using the wireless communication module 36, and can perform data transmission to and data reception from the same type of another game apparatus by using the local communication module 37.
  • The acceleration sensor 39 is connected to the information processing section 31. The acceleration sensor 39 detects magnitudes of accelerations (linear accelerations) in the directions of the straight lines along the three axial (xyz axial) directions, respectively. The acceleration sensor 39 is provided inside the lower housing 11. In the acceleration sensor 39, as shown in FIG. 1, the long side direction of the lower housing 11 is defined as x axial direction, the short side direction of the lower housing 11 is defined as y axial direction, and the direction orthogonal to the inner side surface (main surface) of the lower housing 11 is defined as z axial direction, thereby detecting magnitudes of the linear accelerations for the respective axes. The acceleration sensor 39 is, for example, an electrostatic capacitance type acceleration sensor. However, another type of acceleration sensor may be used. The acceleration sensor 39 may be an acceleration sensor for detecting a magnitude of acceleration for one axial direction or two-axial directions. The information processing section 31 can receive data (acceleration data) representing accelerations detected by the acceleration sensor 39, and detect an orientation and a motion of the game apparatus 10.
  • The RTC 38 and the power supply circuit 40 are connected to the information processing section 31. The RTC 38 counts time, and outputs the time to the information processing section 31. The information processing section 31 calculates a current time (date) based on the time counted by the RTC 38. The power supply circuit 40 controls power from the power supply (the rechargeable battery accommodated in the lower housing 11 as described above) of the game apparatus 10, and supplies power to each component of the game apparatus 10.
  • The I/F circuit 41 is connected to the information processing section 31. The microphone 42 and the speaker 43 are connected to the I/F circuit 41. Specifically, the speaker 43 is connected to the I/F circuit 41 through an amplifier which is not shown. The microphone 42 detects a voice from a user, and outputs a sound signal to the I/F circuit 41. The amplifier amplifies a sound signal outputted from the I/F circuit 41, and a sound is outputted from the speaker 43. The touch panel 13 is connected to the I/F circuit 41. The I/F circuit 41 includes a sound control circuit for controlling the microphone 42 and the speaker 43 (amplifier), and a touch panel control circuit for controlling the touch panel. The sound control circuit performs A/D conversion and D/A conversion on the sound signal, and converts the sound signal to a predetermined form of sound data, for example. The touch panel control circuit generates a predetermined form of touch position data based on a signal outputted from the touch panel 13, and outputs the touch position data to the information processing section 31. The touch position data represents a coordinate of a position, on an input surface of the touch panel 13, on which an input is made. The touch panel control circuit reads a signal outputted from the touch panel 13, and generates the touch position data every predetermined time. The information processing section 31 acquires the touch position data, to recognize a position on which an input is made on the touch panel 13.
  • The operation button 14 includes the operation buttons 14A to 14L described above, and is connected to the information processing section 31. Operation data representing an input state of each of the operation buttons 14A to 14I is outputted from the operation button 14 to the information processing section 31, and the input state indicates whether or not each of the operation buttons 14A to 14I has been pressed. The information processing section 31 acquires the operation data from the operation button 14 to perform processing in accordance with the input on the operation button 14.
  • The lower LCD 12 and the upper LCD 22 are connected to the information processing section 31. The lower LCD 12 and the upper LCD 22 each display an image in accordance with an instruction from (the GPU 312 of) the information processing section 31. In the present embodiment, the information processing section 31 causes the upper LCD 22 to display a stereoscopic image (image that is stereoscopically visible).
  • Specifically, the information processing section 31 is connected to an LCD controller (not shown) of the upper LCD 22, and causes the LCD controller to set the parallax barrier to ON or OFF. When the parallax barrier is set to ON in the upper LCD 22, an image for a right eye and an image for a left eye, which are stored in the VRAM 313 of the information processing section 31 are outputted to the upper LCD 22. More specifically, the LCD controller alternately repeats reading of pixel data of the image for a right eye for one line in the vertical direction, and reading of pixel data of the image for a left eye for one line in the vertical direction, thereby reading, from the VRAM 313, the image for a right eye and the image for a left eye. Thus, an image to be displayed is divided into the images for a right eye and the images for a left eye each of which is a rectangle-shaped image having one line of pixels aligned in the vertical direction, and an image, in which the rectangle-shaped image for the left eye which is obtained through the division, and the rectangle-shaped image for the right eye which is obtained through the division are alternately aligned, is displayed on the screen of the upper LCD 22. A user views the images through the parallax barrier in the upper LCD 22, so that the image for the right eye is viewed by the user's right eye, and the image for the left eye is viewed by the user's left eye. Thus, the stereoscopically visible image is displayed on the screen of the upper LCD 22.
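• For illustration only, the following is a minimal sketch, in Python with numpy, of the column interleaving described above: the output frame takes one vertical line of pixels alternately from the image for a left eye and the image for a right eye. The frame size and the assignment of even and odd columns to particular eyes are assumptions made for the example, not values taken from the embodiment.

```python
import numpy as np

def interleave_for_parallax_barrier(left_img: np.ndarray, right_img: np.ndarray) -> np.ndarray:
    """Build the displayed frame by taking one vertical line (column) of
    pixels alternately from the left-eye image and the right-eye image."""
    assert left_img.shape == right_img.shape
    out = np.empty_like(left_img)
    out[:, 0::2] = left_img[:, 0::2]    # even columns from the image for a left eye (assumed)
    out[:, 1::2] = right_img[:, 1::2]   # odd columns from the image for a right eye (assumed)
    return out

# Dummy 240x400 RGB frames stand in for the two images read from the VRAM.
left_eye = np.zeros((240, 400, 3), dtype=np.uint8)
right_eye = np.full((240, 400, 3), 255, dtype=np.uint8)
combined = interleave_for_parallax_barrier(left_eye, right_eye)
```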
  • The outer imaging section 23 and the inner imaging section 24 are connected to the information processing section 31. The outer imaging section 23 and the inner imaging section 24 each shoot an image in accordance with an instruction from the information processing section 31, and output data of the shot image to the information processing section 31.
  • The 3D adjustment switch 25 is connected to the information processing section 31. The 3D adjustment switch 25 transmits, to the information processing section 31, an electrical signal in accordance with the position of the slider 25 a.
  • The 3D indicator 26 is connected to the information processing section 31. The information processing section 31 controls whether or not the 3D indicator 26 is to be lit up. For example, the information processing section 31 lights up the 3D indicator 26 when the upper LCD 22 is in the stereoscopic display mode. The game apparatus 10 has the internal configuration as described above.
  • (Outline of Operation of Game Apparatus 10)
  • Hereinafter, the outline of operation of the game apparatus 10 according to the present embodiment will be described. In the present embodiment, a composite image obtained by combining an image of the real world currently shot by the outer imaging section 23 (the outer imaging section (left) 23 a and the outer imaging section (right) 23 b), and an image of a virtual object present in a 3-dimensional virtual space is displayed in a stereoscopically visible manner on the screen of the upper LCD 22, based on an image display program.
  • (Stereoscopic Display of Real World Image)
• Two images shot by the outer imaging section 23 are supplied to the upper LCD 22 such that the two images have a predetermined parallax, and then are stereoscopically displayed.
• FIG. 7 shows an example of a stereoscopic image displayed on the screen of the upper LCD 22 when three balls 60 a to 60 c (real objects) are shot by the outer imaging section 23. When the balls 60 a to 60 c are shot by the outer imaging section 23, the balls 60 a to 60 c are displayed in a stereoscopically visible manner on the screen of the upper LCD 22. As shown in FIG. 7, on the screen of the upper LCD 22, the ball 60 a which is nearest the outer imaging section 23 appears to be at the nearest position when viewed by the user, and the ball 60 c which is farthest from the outer imaging section 23 appears to be at the farthest position when viewed by the user. It is noted that in FIG. 7, the state in which balls are partially beyond the screen of the upper LCD 22 is shown for the purpose of facilitating the understanding, but actually, the balls are displayed in the screen. The same holds true for FIG. 8 and FIG. 10 described later. It is noted that in FIG. 7, since a marker 61 (see FIG. 8) described later is not shot by the outer imaging section 23, a virtual object is not displayed on the screen of the upper LCD 22. In addition, although in FIG. 7, FIG. 8, FIG. 10, and the like, the states in which a display target appears to protrude when viewed by the user are shown, displaying in a stereoscopically visible manner includes not only the case where the display target appears to protrude when viewed by the user, but also the case where the display target is displayed with depth so as to appear to be in the back of the screen.
  • (Stereoscopic Display of Real World Image and CG Image)
• FIG. 8 shows an example of a stereoscopic image displayed on the screen of the upper LCD 22 when the marker 61 (real object) is being shot by the outer imaging section 23. As shown in FIG. 8, a square including an arrow is drawn on the marker 61. The CPU 311 performs image processing such as pattern matching for an image obtained by the outer imaging section 23, thereby determining whether or not the marker 61 is included in the image. When the marker 61 is being shot by the outer imaging section 23, the marker 61 is displayed as a real world image in a stereoscopically visible manner on the upper LCD 22, and a virtual object 62 (for example, a virtual object representing a dog) is displayed being superimposed onto the real world image, in a stereoscopically visible manner at the position of the marker 61. It is noted that, as shown in FIG. 9, since directions (forward direction, rightward direction, and upward direction) are set for the marker 61, a virtual object can be placed with its orientation corresponding to the orientation of the marker 61. For example, the virtual object 62 can be placed on the marker 61 such that the forward direction of the virtual object 62 coincides with the forward direction of the marker 61.
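• The specific pattern matching method is not limited here; as a purely illustrative sketch, the following Python code compares a candidate image patch against a known marker pattern at the four possible rotations using normalized cross-correlation, which is one common stand-in for such matching. The 16x16 pattern size and the helper names are assumptions made for the example.

```python
import numpy as np

def match_marker(patch: np.ndarray, pattern: np.ndarray):
    """Return (best score, rotation index) after comparing the patch with
    the marker pattern rotated by 0, 90, 180, and 270 degrees."""
    def ncc(a, b):
        a = a - a.mean()
        b = b - b.mean()
        denom = np.linalg.norm(a) * np.linalg.norm(b)
        return float((a * b).sum() / denom) if denom else 0.0
    scores = [ncc(patch, np.rot90(pattern, k)) for k in range(4)]
    best = int(np.argmax(scores))
    return scores[best], best

# Hypothetical 16x16 binary marker pattern and a patch sampled from a camera image.
pattern = (np.random.rand(16, 16) > 0.5).astype(float)
patch = np.rot90(pattern, 1)                      # the marker seen rotated by 90 degrees
score, rotation = match_marker(patch, pattern)    # score near 1.0, rotation == 1 (with high probability)
```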
  • FIG. 10 shows another example of a stereoscopic image displayed on the screen of the upper LCD 22 when the marker 61 is being shot by the outer imaging section 23. As shown in FIG. 10, if the position and the direction of the marker 61 displayed on the screen of the upper LCD 22 have changed by the user moving the game apparatus 10, the position and the direction of the virtual object 62 change so as to follow the change. Therefore, it appears to the user as if the virtual object 62 were actually present in the real world.
  • Hereinafter, with reference to FIG. 11 to FIG. 40, the details of image display processing executed based on the image display program by the game apparatus 10 will be described.
  • (Memory Map)
• First, main data stored in the main memory 32 when the image display processing is performed will be described. FIG. 11 is a diagram showing a memory map of the main memory 32 of the game apparatus 10. As shown in FIG. 11, the main memory 32 stores an image display program 70, a latest left real world image 71L, a latest right real world image 71R, a display left real world image 72L, a display right real world image 72R, a stereoscopic view zero distance 73, virtual object information 74, a left transform matrix 75L, a right transform matrix 75R, a left view matrix 76L, a right view matrix 76R, a left projection matrix 77L, a right projection matrix 77R, a display mode 78, a marker recognition mode 79, main real world image identification information 80, variables 81, and the like.
  • The image display program 70 is a program for causing the CPU 311 to execute the image display processing.
  • The latest left real world image 71L is the latest image shot by the outer imaging section (left) 23 a.
  • The latest right real world image 71R is the latest image shot by the outer imaging section (right) 23 b.
  • The display left real world image 72L is the latest image that has been determined as an image that is to be displayed on the upper LCD 22, among images shot by the outer imaging section (left) 23 a.
• The display right real world image 72R is the latest image that has been determined as an image that is to be displayed on the upper LCD 22, among images shot by the outer imaging section (right) 23 b.
• The stereoscopic view zero distance 73 is a variable indicating the distance, of an object from the outer imaging section 23 in the shooting direction, that allows the object to appear to be present at a position with the same depth as the screen of the upper LCD 22 when the object is stereoscopically displayed on the screen of the upper LCD 22. For example, the stereoscopic view zero distance 73 is expressed in centimeters. In the present embodiment, the stereoscopic view zero distance 73 has a fixed value of 25 cm, but this value is merely an example. The stereoscopic view zero distance 73 may be changed anytime as needed, in accordance with instructions from the user, or automatically by a computer. The stereoscopic view zero distance 73 may be defined by a distance from the marker 61 or a ratio with respect to the distance between the outer imaging section 23 and the marker 61. Alternatively, the stereoscopic view zero distance 73 does not necessarily need to be defined as a distance in the real world, but may be defined as a distance in the virtual world. If, for example, the size of the marker 61 is known, it is possible to cause the unit of length in the real world and the unit of length in the virtual space to coincide with each other. In this way, if it is possible to cause both units to coincide with each other, the stereoscopic view zero distance 73 can be set by using the unit of length in the real world. However, if it is not possible to cause both units to coincide with each other, the stereoscopic view zero distance 73 may be set by using the unit of length in the virtual world. It is noted that, as described later, in the present embodiment, it is possible to cause the unit of length in the real world and the unit of length in the virtual space to coincide with each other without using the size of the marker 61.
  • The virtual object information 74 is information that relates to the aforementioned virtual object 62, and includes 3D model data (polygon data) indicating the shape of the virtual object 62, texture data indicating the pattern of the virtual object 62, information about the position and the orientation of the virtual object 62 in the virtual space, and the like.
  • The left transform matrix 75L is a coordinate transform matrix that transforms coordinates represented in a coordinate system (marker coordinate system) set based on the position and the orientation of the marker 61, into coordinates represented in a coordinate system (outer imaging section (left) coordinate system) set based on the position and the orientation of the outer imaging section (left) 23 a, which matrix is calculated by recognizing the position and the orientation of the marker 61 in a left real world image. The left transform matrix 75L includes information about the relative position and relative orientation of the outer imaging section (left) 23 a with respect to the position and the orientation of the marker 61. More specifically, the left transform matrix 75L includes information about the position and the orientation of the outer imaging section (left) 23 a in the marker coordinate system.
  • The right transform matrix 75R is a coordinate transform matrix that transforms coordinates represented in the marker coordinate system, into coordinates represented in a coordinate system (outer imaging section (right) coordinate system) set based on the position and the orientation of the outer imaging section (right) 23 b, which matrix is calculated by recognizing the position and the orientation of the marker 61 in a right real world image. The right transform matrix 75R includes information about the relative position and relative orientation of the outer imaging section (right) 23 b with respect to the position and the orientation of the marker 61. More specifically, the right transform matrix 75R includes information about the position and the orientation of the outer imaging section (right) 23 b in the marker coordinate system.
  • It is noted that in the present specification, a transform matrix that transforms the marker coordinate system into the outer imaging section (left) coordinate system or the outer imaging section (right) coordinate system, is referred to as a “marker-camera transform matrix”. The left transform matrix 75L and the right transform matrix 75R are the “marker-camera transform matrices”.
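• As a minimal numerical sketch of what such a "marker-camera transform matrix" carries, the following Python code (using 4x4 homogeneous matrices, an assumption about the representation) transforms the marker origin into camera coordinates and recovers the camera position in the marker coordinate system from the inverse matrix. The pose values are hypothetical.

```python
import numpy as np

def make_transform(rotation: np.ndarray, translation: np.ndarray) -> np.ndarray:
    """4x4 homogeneous matrix mapping marker coordinates to camera coordinates."""
    m = np.eye(4)
    m[:3, :3] = rotation
    m[:3, 3] = translation
    return m

def marker_to_camera(transform: np.ndarray, p_marker: np.ndarray) -> np.ndarray:
    """Transform a point in the marker coordinate system into the outer
    imaging section (camera) coordinate system."""
    return (transform @ np.append(p_marker, 1.0))[:3]

def camera_position_in_marker_space(transform: np.ndarray) -> np.ndarray:
    """The camera's own position in marker coordinates is the inverse
    transform applied to the camera-space origin."""
    return (np.linalg.inv(transform) @ np.array([0.0, 0.0, 0.0, 1.0]))[:3]

# Hypothetical left pose: no rotation, camera 30 marker units in front of the marker.
left_transform = make_transform(np.eye(3), np.array([0.0, 0.0, -30.0]))
print(marker_to_camera(left_transform, np.zeros(3)))        # marker origin in camera coordinates
print(camera_position_in_marker_space(left_transform))      # camera position in marker coordinates
```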
  • The left view matrix 76L is a coordinate transform matrix that transforms coordinates represented in a world coordinate system of the virtual world, into coordinates represented in a left virtual camera coordinate system, which matrix is used for drawing the virtual object 62 as it is looked at by the left virtual camera. The left view matrix 76L includes information about the position and the orientation of the left virtual camera in the world coordinate system of the virtual world.
  • The right view matrix 76R is a coordinate transform matrix that transforms coordinates represented in the world coordinate system of the virtual world, into coordinates represented in a right virtual camera coordinate system, which matrix is used for drawing the virtual object 62 as it is looked at by the right virtual camera. The right view matrix 76R includes information about the position and the orientation of the right virtual camera in the world coordinate system of the virtual world.
  • The left projection matrix 77L is a coordinate transform matrix that transforms coordinates represented in a left virtual camera coordinate system, into coordinates represented in a screen coordinate system, which matrix is used for drawing the virtual world (the virtual object 62 present in the virtual world) as it is looked at by the left virtual camera.
  • The right projection matrix 77R is a coordinate transform matrix that transforms coordinates represented in a right virtual camera coordinate system, into coordinates represented in the screen coordinate system, which matrix is used for drawing the virtual world (the virtual object 62 present in the virtual world) as it is looked at by the right virtual camera.
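• To make the roles of the view matrix and the projection matrix concrete, the sketch below chains the two to carry a world-coordinate vertex into normalized device coordinates. The OpenGL-style conventions and the field-of-view, aspect, and clip-plane values are assumptions for the example and are not taken from the embodiment.

```python
import numpy as np

def perspective(fov_y_deg: float, aspect: float, near: float, far: float) -> np.ndarray:
    """OpenGL-style perspective projection matrix (an assumed convention)."""
    f = 1.0 / np.tan(np.radians(fov_y_deg) / 2.0)
    m = np.zeros((4, 4))
    m[0, 0] = f / aspect
    m[1, 1] = f
    m[2, 2] = (far + near) / (near - far)
    m[2, 3] = (2.0 * far * near) / (near - far)
    m[3, 2] = -1.0
    return m

def project_vertex(view: np.ndarray, projection: np.ndarray, v_world: np.ndarray) -> np.ndarray:
    """World coordinates -> camera coordinates (view matrix) -> clip
    coordinates (projection matrix) -> normalized device coordinates."""
    clip = projection @ (view @ np.append(v_world, 1.0))
    return clip[:3] / clip[3]

# Hypothetical left virtual camera looking down -z from 30 units away.
view_left = np.eye(4)
view_left[2, 3] = -30.0
proj_left = perspective(60.0, 400.0 / 240.0, 1.0, 500.0)
print(project_vertex(view_left, proj_left, np.array([0.0, 0.0, 0.0])))
```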
  • The display mode 78 is data indicating the current display mode. More specifically, the display mode 78 is data indicating a synchronous display mode in which the real world image and the virtual space image are superimposed and displayed being synchronized with each other, or an asynchronous display mode in which the real world image and the virtual space image are superimposed and displayed without synchronizing the real world image and the virtual space image with each other.
  • The marker recognition mode 79 is data indicating the current marker recognition mode. More specifically, the marker recognition mode 79 is data indicating a one-image recognition mode in which marker recognition processing is performed for only one of the left real world image and the right real world image, or a both-image recognition mode in which the marker recognition processing is performed for both the left real world image and the right real world image.
  • The main real world image identification information 80 is data indicating which of the left real world image and the right real world image is a main real world image. In the above one-image recognition mode, the marker recognition processing is performed for only the main real world image, whereas the marker recognition processing is not performed for the other real world image (hereinafter, referred to as a sub real world image).
  • The variables 81 are used in executing the image display program 70, and include variables shown in FIG. 12. The meanings of the variables will be described below as needed.
• When the game apparatus 10 is powered on, the information processing section 31 (CPU 311) of the game apparatus 10 executes a boot program stored in a ROM, which is not shown, thereby initializing units such as the main memory 32. Next, the image display program stored in the internal data storage memory 35 is loaded onto the main memory 32, and then the CPU 311 of the information processing section 31 starts to execute the image display program.
  • Hereinafter, with reference to flowcharts shown in FIG. 13 to FIG. 19, the flow of processing executed based on the image display program will be described. It is noted that the flowcharts shown in FIG. 13 to FIG. 19 are merely an example. Therefore, the order of the processing steps may be changed as long as the same result is obtained. In addition, the values of the variables, and threshold values used in determination steps are merely an example, and other values may be used as the values of the variables or the threshold values as appropriate. In addition, although in the present embodiment, the CPU 311 executes processing of all steps in the flowcharts shown in FIG. 13 to FIG. 19, a processor or a dedicated circuit other than the CPU 311 may execute some of the steps in the flowcharts shown in FIG. 13 to FIG. 19.
  • (Marker Processing)
  • FIG. 13 is a flowchart showing a flow of marker processing executed based on the image display program 70 by the CPU 311. The marker processing is executed in parallel with main processing described later. In the present embodiment, the marker processing is executed when the CPU 311 is in an idle state. Hereinafter, the details of the marker processing will be described.
  • In step S10 in FIG. 13, the CPU 311 determines whether or not both real world images (that is, the left real world image and the right real world image) have been obtained by the outer imaging section 23. If both real world images have been obtained, the CPU 311 proceeds to step S11. The latest one of left real world images obtained by the outer imaging section 23 is stored, as the latest left real world image 71L, in the main memory 32, and the latest one of right real world images obtained by the outer imaging section 23 is stored, as the latest right real world image 71R, in the main memory 32.
• It is noted that, as previously described, the outer imaging section (left) 23 a and the outer imaging section (right) 23 b of the upper housing 21 are separated by a certain distance (for example, 3.5 cm; hereinafter referred to as the imaging sections distance). Therefore, when the marker 61 is shot by the outer imaging section (left) 23 a and the outer imaging section (right) 23 b at the same time, a difference due to a parallax occurs between each of the position and the orientation of the marker 61 in the left real world image shot by the outer imaging section (left) 23 a, and each of the position and the orientation of the marker 61 in the right real world image shot by the outer imaging section (right) 23 b, as shown in FIG. 20.
• In step S11, the CPU 311 performs the marker recognition processing for the main real world image. More specifically, the CPU 311 determines whether or not the marker 61 is included in the main real world image by a pattern matching method or the like. If the marker 61 is included in the main real world image, the CPU 311 calculates the left transform matrix 75L or the right transform matrix 75R, based on the position and the orientation of the marker 61 in the main real world image (if the main real world image is the left real world image, the CPU 311 calculates the left transform matrix 75L, and if the main real world image is the right real world image, the CPU 311 calculates the right transform matrix 75R).
  • It is noted that the left transform matrix 75L is a matrix reflecting therein the position and the orientation of the outer imaging section (left) 23 a which are calculated based on the position and the orientation of the marker 61 in the left real world image. To be more exact, as shown in FIG. 21, the left transform matrix 75L is a coordinate transform matrix that transforms coordinates represented in the marker coordinate system (in which the position of the marker 61 in the real world is the origin, and the forward direction, the lateral direction, and the normal direction of the marker 61 correspond to the respective axes), into coordinates represented in the outer imaging section (left) coordinate system based on the position and the orientation of the outer imaging section (left) 23 a which are calculated based on the position and the orientation of the marker 61 in the left real world image.
  • In addition, the right transform matrix 75R is a matrix reflecting therein the position and the orientation of the outer imaging section (right) 23 b which are calculated based on the position and the orientation of the marker 61 in the right real world image. To be more exact, as shown in FIG. 22, the right transform matrix 75R is a coordinate transform matrix that transforms coordinates represented in the marker coordinate system, into coordinates represented in the outer imaging section (right) coordinate system based on the position and the orientation of the outer imaging section (right) 23 b which are calculated based on the position and the orientation of the marker 61 in the right real world image.
• It is noted that if it is assumed that the accuracy of the marker recognition is sufficiently high so that there is no error, and that the accuracy in the attachment of the outer imaging section (left) 23 a and the outer imaging section (right) 23 b to the game apparatus 10 is sufficiently high so that there is no error, the position of the outer imaging section (right) 23 b indicated by the right transform matrix 75R which is the result of the marker recognition for the right real world image coincides with a position separated by the certain distance (imaging sections distance), along the x-axis direction of the outer imaging section (left) coordinate system (which direction is the lateral direction of the game apparatus 10 and is included in the horizontal directions when the game apparatus 10 is used), from the position of the outer imaging section (left) 23 a indicated by the left transform matrix 75L which is the result of the marker recognition for the left real world image. In addition, the orientation of the outer imaging section (right) 23 b indicated by the right transform matrix 75R, and the orientation of the outer imaging section (left) 23 a indicated by the left transform matrix 75L are the same as the state of the attachment of the outer imaging section (left) 23 a and the outer imaging section (right) 23 b to the game apparatus 10. In the present embodiment, the game apparatus 10 is designed such that the outer imaging section (left) 23 a and the outer imaging section (right) 23 b are attached so as to be in parallel with each other. Therefore, the x-axis, the y-axis, and the z-axis of the outer imaging section (left) coordinate system are in parallel with the x-axis, the y-axis, and the z-axis of the outer imaging section (right) coordinate system. However, actually, the marker recognition has some error, and the attachment of the outer imaging section (left) 23 a and the outer imaging section (right) 23 b to the game apparatus 10 has some error. Therefore, the relationship between each of the position and the orientation of the outer imaging section (left) 23 a indicated by the left transform matrix 75L, and each of the position and the orientation of the outer imaging section (right) 23 b indicated by the right transform matrix 75R, is not ideal. For example, the left transform matrix 75L and the right transform matrix 75R are such that the outer imaging section (left) 23 a and the outer imaging section (right) 23 b are rather close to each other or rather away from each other, or such that the orientation of the outer imaging section (left) 23 a and the orientation of the outer imaging section (right) 23 b are different. Alternatively, the left transform matrix 75L and the right transform matrix 75R are such that the orientation of the outer imaging section (left) 23 a and the orientation of the outer imaging section (right) 23 b are not in parallel with each other.
  • Here, in the AR technology, a matrix that transforms the marker coordinate system into the outer imaging section (left) coordinate system or the outer imaging section (right) coordinate system is designated as a view matrix of a virtual camera, to superimpose a CG image onto the real world image. However, if the left transform matrix 75L is merely designated as the left view matrix 76L, and the right transform matrix 75R is merely designated as the right view matrix 76R in stereoscopically displaying a virtual space image on the upper LCD 22, a normal stereoscopic view of the virtual object 62 might not be provided. Accordingly, in the present embodiment, as is obvious from the description below, based on the position and the orientation (view matrix) of one of the virtual cameras calculated from the result of the marker recognition (marker-camera transform matrix) for the corresponding one of the left real world image and the right real world image, the position and the orientation (view matrix) of the other one of the virtual cameras are determined such that the relationship between each of the position and the orientation of the left virtual camera 63L, and each of the position and the orientation of the right virtual camera 63R, is ideal.
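• The idea of fixing one virtual camera from the marker recognition result and then placing the other virtual camera at an ideal offset can be sketched as follows in Python. The sketch assumes column-vector 4x4 view matrices and treats EyeWidth as the separation along the first camera's x-axis, with the same orientation for both cameras; it is an illustration of the principle only, not the exact procedure described later.

```python
import numpy as np

def offset_view_matrix(view: np.ndarray, offset_x: float) -> np.ndarray:
    """Derive the second virtual camera's view matrix by shifting the first
    camera by offset_x along its own x-axis while keeping its orientation.
    Because a view matrix maps world coordinates into camera coordinates,
    moving the camera by +offset_x shifts every point by -offset_x in
    camera space."""
    shift = np.eye(4)
    shift[0, 3] = -offset_x
    return shift @ view

# Hypothetical values: the left view matrix obtained from marker recognition
# and an EyeWidth expressed in virtual-space (marker-coordinate) units.
left_view = np.eye(4)
eye_width = 5.0
right_view = offset_view_matrix(left_view, eye_width)   # right camera = left camera moved +x by EyeWidth
```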
  • It is noted that if the marker 61 has not been recognized in the main real world image in step S11, a null value is stored in the left transform matrix 75L or the right transform matrix 75R, whereby the failure of the recognition of the marker 61 in the left real world image or the right real world image is recorded.
  • In step S12, the CPU 311 determines whether or not the current marker recognition mode is the one-image recognition mode. Then, if the current marker recognition mode is the one-image recognition mode, the CPU 311 proceeds to step S13, and if the current marker recognition mode is not the one-image recognition mode (that is, the marker recognition mode is the both-image recognition mode), the CPU 311 proceeds to step S14.
  • In step S13, the CPU 311 determines, as a failure, the result of the marker recognition for one of the left real world image and the right real world image that is not the main real world image (hereinafter, the one is referred to as a sub real world image). More specifically, if the main real world image is the left real world image, a null value is stored in the right transform matrix 75R, and if the main real world image is the right real world image, a null value is stored in the left transform matrix 75L.
  • In step S14, the CPU 311 performs the marker recognition processing for the sub real world image. More specifically, the CPU 311 determines whether or not the marker 61 is included in the sub real world image by a pattern matching method or the like. If the marker 61 is included in the sub real world image, the CPU 311 calculates the left transform matrix 75L or the right transform matrix 75R, based on the position and the orientation of the marker 61 in the sub real world image (if the sub real world image is the left real world image, the CPU 311 calculates the left transform matrix 75L, and if the sub real world image is the right real world image, the CPU 311 calculates the right transform matrix 75R).
  • In step S15, the CPU 311 determines whether or not the current display mode is the synchronous display mode. Then, if the current display mode is the synchronous display mode, the CPU 311 proceeds to step S17, and if the current display mode is not the synchronous display mode (that is, the current display mode is the asynchronous display mode), the CPU 311 proceeds to step S16.
  • In step S16, the CPU 311 stores the latest left real world image 71L and the latest right real world image 71R as the display left real world image 72L and the display right real world image 72R, respectively, in the main memory 32. Then, the CPU 311 returns to step S10.
• In step S17, the CPU 311 determines whether or not the current marker recognition mode is the one-image recognition mode. Then, if the current marker recognition mode is the one-image recognition mode, the CPU 311 proceeds to step S18, and if the current marker recognition mode is not the one-image recognition mode (that is, the current marker recognition mode is the both-image recognition mode), the CPU 311 proceeds to step S19.
  • In step S18, the CPU 311 determines whether or not the marker recognition for the main real world image has been successful. More specifically, if the main real world image is the left real world image, the CPU 311 determines whether or not a valid matrix which is not a null value is stored in the left transform matrix 75L, and if the main real world image is the right real world image, the CPU 311 determines whether or not a valid matrix which is not a null value is stored in the right transform matrix 75R. Then, if the marker recognition for the main real world image has been successful, the CPU 311 proceeds to step S16, and if the marker recognition for the main real world image has not been successful, the CPU 311 returns to step S10.
• In step S19, the CPU 311 determines whether or not at least one of the marker recognition for the left real world image and the marker recognition for the right real world image has been successful. More specifically, the CPU 311 determines whether or not a valid matrix which is not a null value is stored in at least one of the left transform matrix 75L and the right transform matrix 75R. Then, if at least one of the marker recognition for the left real world image and the marker recognition for the right real world image has been successful, the CPU 311 proceeds to step S16, and if not, the CPU 311 returns to step S10.
  • (Main Processing)
  • FIG. 14 is a flowchart showing a flow of main processing executed based on the image display program 70 by the CPU 311. The main processing is executed in parallel with the marker processing. Hereinafter, the details of the main processing will be described.
  • In step S20 in FIG. 14, the CPU 311 places the virtual object 62 at a predetermined position in the 3-dimensional virtual space. In the present embodiment, the virtual object 62 is placed at the origin of the virtual space (the origin of the world coordinate system).
  • In step S21, the CPU 311 executes updating processing. In the updating processing, the CPU 311 updates variables used for drawing the virtual object 62. The details of the updating processing will be described later.
  • In step S22, the CPU 311 executes virtual object processing. The virtual object processing relates to the virtual object 62 placed in the virtual space. For example, as needed, the size of the virtual object 62 is changed, or the virtual object 62 is caused to take a predetermined action (The virtual object 62 is caused to move in the virtual space. For example, if the virtual object 62 is caused to move around the origin of the virtual space, the virtual object 62 walking around the origin of the marker coordinate system is displayed). It is noted that control of a movement of the virtual object 62 is realized by changing the coordinates of the position of the virtual object 62 in the world coordinate system in the virtual space.
• In step S23, in accordance with the stereoscopic view zero distance 73, the CPU 311 determines a drawing area that is an area, of the left real world image, used for display. More specifically, as shown in FIG. 23, a point that is separated by the value of the stereoscopic view zero distance 73 (for example, 30 cm) in the shooting direction from the outer imaging section 23, and that is at the same distance from the outer imaging section (left) 23 a and the outer imaging section (right) 23 b, is defined as a stereoscopic view zero point. An area whose center overlaps with the stereoscopic view zero point as viewed from the outer imaging section (left) 23 a is determined as the drawing area of the left real world image. For example, as shown in FIG. 23, the length in the lateral direction of the drawing area is obtained as a ratio of a segment that is perpendicular to the shooting direction of the outer imaging section (left) 23 a and that lies between the lines forming the angle of view of the outer imaging section (left) 23 a. The ratio is applied to the length in the lateral direction of the real world image, whereby the length in the lateral direction of the drawing area of the real world image is determined. It is noted that the length in the height direction of the drawing area is determined based on the length in the lateral direction so as to adapt to the ratio of the height and the width of the display screen.
  • It is noted that in step S27 for determining a drawing area of the right real world image, which step will be described later, an area whose center overlaps with the stereoscopic view zero point as they are viewed from the outer imaging section (right) 23 b, is determined as the drawing area of the right real world image, as shown in FIG. 24. As a result, on the screen of the upper LCD 22, the position of the stereoscopic view zero point on an image for a left eye, and the position of the stereoscopic view zero point on an image for a right eye coincide with each other, and appear to have the same depth when viewed by the user.
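• A simplified pinhole-camera model of this drawing-area determination is sketched below in Python: the stereoscopic view zero point lies on the midline between the two outer imaging sections at the stereoscopic view zero distance, so each camera sees it displaced laterally by half the imaging sections distance, and the crop window is shifted accordingly. The field of view, image sizes, and the sign convention of the shift are assumptions for the example, not values from the embodiment.

```python
import numpy as np

def crop_center_offset_px(zero_dist: float, baseline: float,
                          fov_x_deg: float, image_width_px: int) -> float:
    """Horizontal offset, in pixels, of the drawing-area center from the
    captured image center for one camera, so that the crop is centered on
    the stereoscopic view zero point (simplified pinhole model)."""
    half_view_width = zero_dist * np.tan(np.radians(fov_x_deg) / 2.0)
    return (baseline / 2.0) / half_view_width * (image_width_px / 2.0)

def crop_drawing_area(image: np.ndarray, offset_px: float, crop_w: int, crop_h: int) -> np.ndarray:
    """Cut out a crop_w x crop_h area whose center is shifted horizontally
    by offset_px from the image center."""
    h, w = image.shape[:2]
    cx = int(round(w / 2 + offset_px))
    cy = h // 2
    return image[cy - crop_h // 2: cy + crop_h // 2,
                 cx - crop_w // 2: cx + crop_w // 2]

# Hypothetical numbers: 3.5 cm baseline, 30 cm zero distance, 64 degree
# horizontal field of view, 640x480 captured frames, 400x240 display crop.
frame_left = np.zeros((480, 640, 3), dtype=np.uint8)
frame_right = np.zeros((480, 640, 3), dtype=np.uint8)
offset = crop_center_offset_px(zero_dist=30.0, baseline=3.5, fov_x_deg=64.0, image_width_px=640)
left_area = crop_drawing_area(frame_left, +offset, 400, 240)    # shifted toward the other camera
right_area = crop_drawing_area(frame_right, -offset, 400, 240)  # mirrored shift for the right camera
```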
  • In step S24, the CPU 311 draws a drawing area of the display left real world image 72L which is determined in step S23, in a predetermined storage area (hereinafter, referred to as a left frame buffer) for temporarily storing an image for a left eye to be supplied to the upper LCD 22, in the VRAM 313.
  • In step S25, the CPU 311 determines whether or not the value of an AR-Active (see FIG. 12) is true. The AR-Active is a variable (flag) indicating whether or not a virtual object can be displayed being superimposed onto the real world image. If a virtual object can be displayed being superimposed onto the real world image, the value of the AR-Active is set at true, and if a virtual object cannot be displayed being superimposed onto the real world image (for example, in the case where the marker 61 cannot be recognized at all), the value of the AR-Active is set at false (initial value). If the value of the AR-Active is true, the CPU 311 proceeds to step S26, and if the value of the AR-Active is not true (that is, the value of the AR-Active is false), the CPU 311 proceeds to step S27.
  • In step S26, for example, the CPU 311 overwrites the left frame buffer with the virtual space (hereinafter, referred to as a virtual space image for a left eye) as it is viewed from the left virtual camera 63L, as shown in FIG. 23 (actually and typically, the GPU 312 draws the virtual space in accordance with an instruction from the CPU 311). As a result, the virtual space image for a left eye is superimposed onto the left real world image drawn in the left frame buffer in step S24, as shown in FIG. 26. The resultant image in the left frame buffer is supplied, as an image for a left eye, to the upper LCD 22 at a predetermined timing. It is noted that the background of the virtual space in the virtual space image for a left eye is transparent. Therefore, superimposing the virtual space image onto the real world image generates an image in which the virtual object 62 is present on the real world image.
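• Because the background of the virtual space image is transparent, overwriting the frame buffer amounts to compositing the virtual space image over the real world image. The following is a small illustrative sketch of that compositing using an alpha channel; the buffer sizes and the use of a software alpha blend are assumptions, since the embodiment performs this step on the GPU.

```python
import numpy as np

def composite_over(real_rgb: np.ndarray, virtual_rgba: np.ndarray) -> np.ndarray:
    """Lay the virtual space image over the real world image, keeping the
    real image wherever the virtual image is transparent (alpha = 0)."""
    alpha = virtual_rgba[..., 3:4].astype(np.float32) / 255.0
    blended = (virtual_rgba[..., :3].astype(np.float32) * alpha
               + real_rgb.astype(np.float32) * (1.0 - alpha))
    return blended.astype(np.uint8)

# Hypothetical 240x400 buffers: a gray real world image and a virtual space
# image that is fully transparent except for one opaque square (the "object").
real = np.full((240, 400, 3), 128, dtype=np.uint8)
virtual = np.zeros((240, 400, 4), dtype=np.uint8)
virtual[70:170, 150:250] = (255, 0, 0, 255)
left_eye_frame = composite_over(real, virtual)
```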
  • In step S27, the CPU 311 determines a drawing area of the right real world image in accordance with the stereoscopic view zero distance 73. It is noted that the details of processing of determining the drawing area are omitted because the processing is the same as processing of determining the drawing area of the left real world image.
  • In step S28, the CPU 311 draws a drawing area of the display right real world image 72R which is determined in step S27, in a predetermined storage area (hereinafter, referred to as a right frame buffer) for temporarily storing an image for a right eye to be supplied to the upper LCD 22, in the VRAM 313.
  • In step S29, the CPU 311 determines whether or not the value of the AR-Active is true. If the value of the AR-Active is true, the CPU 311 proceeds to step S30, and if the value of the AR-Active is not true (that is, the value of the AR-Active is false), the CPU 311 proceeds to step S31.
  • In step S30, the CPU 311 overwrites the right frame buffer with the virtual space (hereinafter, referred to as a virtual space image for a right eye) as it is viewed from the right virtual camera 63R (actually and typically, the GPU 312 draws the virtual space in accordance with an instruction from the CPU 311). As a result, the virtual space image for a right eye is superimposed onto the right real world image drawn in the right frame buffer in step S28. The resultant image in the right frame buffer is supplied, as an image for a right eye, to the upper LCD 22 at a predetermined timing.
  • In step S31, the CPU 311 waits for an interrupt signal (vertical synchronization interrupt) from the upper LCD 22. If the interrupt signal is generated, the CPU 311 returns to step S21. As a result, steps S21 to S31 are repeated in cycles of a predetermined period (for example, 1/60 second).
  • (Updating Processing)
  • Next, the details of the updating processing in step S21 in the main processing will be described with reference to the flowchart in FIG. 15.
  • In step S40 in FIG. 15, the CPU 311 determines whether or not the result of the marker recognition (that is, the left transform matrix 75L and the right transform matrix 75R) in the aforementioned marker processing has been updated. If the result of the marker recognition has been updated, the CPU 311 proceeds to step S41, and if the result of the marker recognition has not been updated, the CPU 311 proceeds to step S44.
• In step S41, the CPU 311 determines whether or not both the marker recognition for the left real world image and the marker recognition for the right real world image have been successful. If both the marker recognition for the left real world image and the marker recognition for the right real world image have been successful, the CPU 311 proceeds to step S42, and if the marker recognition has failed for at least one of the left real world image and the right real world image, the CPU 311 proceeds to step S44.
• In step S42, the CPU 311 determines whether or not the value of EyeWidth (see FIG. 12) is 0, or whether or not the value of EyeMeasure (see FIG. 12) is true. The EyeWidth indicates the distance, between the outer imaging section (left) 23 a and the outer imaging section (right) 23 b, that is based on the position relationship therebetween, and is calculated by using the left transform matrix 75L and the right transform matrix 75R. It is noted that the distance does not need to be calculated on a scale of the real world, and may be calculated as a distance in the virtual space. Specifically, a relationship between the position of the outer imaging section (left) 23 a and the position of the outer imaging section (right) 23 b is calculated based on the position of the marker 61, by using a relationship between the position of the marker 61 and the position of the outer imaging section (left) 23 a, which is indicated by the left transform matrix 75L, and a relationship between the position of the marker 61 and the position of the outer imaging section (right) 23 b, which is indicated by the right transform matrix 75R, whereby the distance is calculated. It is noted that if the distance is yet to be determined, the value of the distance is set at 0 (initial value). It is noted that in the present embodiment, the EyeWidth is a scalar value. However, the EyeWidth may be a vector connecting the position of the outer imaging section (left) 23 a indicated by the left transform matrix 75L, and the position of the outer imaging section (right) 23 b indicated by the right transform matrix 75R. In addition, the EyeWidth may be the magnitude of the component, perpendicular to the shooting direction, of the vector connecting the position of the outer imaging section (left) 23 a indicated by the left transform matrix 75L, and the position of the outer imaging section (right) 23 b indicated by the right transform matrix 75R. It is noted that as described later, the EyeWidth is used for setting the distance (interval) between the left virtual camera 63L and the right virtual camera 63R in the virtual space. The EyeMeasure is a variable (flag) indicating whether or not the EyeWidth needs to be calculated again. If the EyeWidth needs to be calculated again, the EyeMeasure is set at true, and if the EyeWidth does not need to be calculated again, the EyeMeasure is set at false (initial value). If the value of the EyeWidth is 0 or if the value of the EyeMeasure is true, the CPU 311 proceeds to step S43, and if not (that is, if the value of the EyeWidth is not 0 and the value of the EyeMeasure is false), the CPU 311 proceeds to step S44.
• In step S43, the CPU 311 executes virtual cameras interval determination processing. In the virtual cameras interval determination processing, the interval between the left virtual camera 63L and the right virtual camera 63R (that is, the EyeWidth) is determined and updated, as appropriate. The details of the virtual cameras interval determination processing will be described later.
  • In step S44, the CPU 311 executes view matrix generation processing. In the view matrix generation processing, the left view matrix 76L and the right view matrix 76R are calculated, based on the interval between the left virtual camera 63L and the right virtual camera 63R (that is, the EyeWidth) which has been already determined. The details of the view matrix generation processing will be described later.
  • In step S45, the CPU 311 executes main real world image setting processing. In the main real world image setting processing, the left real world image and the right real world image take turns switching to the main real world image, as appropriate (this is equivalent to the outer imaging section (left) 23 a and the outer imaging section (right) 23 b taking turns switching to a main imaging section). The details of the main real world image setting processing will be described later.
  • In step S46, the CPU 311 executes display mode switching processing, and ends the updating processing. In the display mode switching processing, the display mode is switched between the synchronous display mode and the asynchronous display mode, as appropriate. The details of the display mode switching processing will be described later.
  • (Virtual Cameras Interval Determination Processing)
  • Next, the details of the virtual cameras interval determination processing in step S43 in the updating processing (FIG. 15) will be described with reference to the flowchart in FIG. 16. In this processing, as previously described, the distance between the position of the outer imaging section (left) 23 a relative to the position of the marker 61, which is indicated by the left transform matrix 75L, and the position of the outer imaging section (right) 23 b relative to the position of the marker 61, which is indicated by the right transform matrix 75R, is calculated. There are several methods for calculating the distance. In the present embodiment, the following method is employed as an example of the methods for the calculation.
  • In step S50 in FIG. 16, the CPU 311 calculates coordinates V0, based on the left transform matrix 75L and the right transform matrix 75R. Hereinafter, with reference to FIG. 27 to FIG. 29, a method of calculating the V0 will be described. It is noted that in FIG. 27 to FIG. 29, the shooting direction of the outer imaging section (left) 23 a, and the shooting direction of the outer imaging section (right) 23 b are drawn so as to be greatly different from each other, for the purpose of emphasizing that a relationship between each of the position and the orientation of the outer imaging section (left) 23 a relative to the marker 61, which are calculated based on the left real world image, and each of the position and the orientation of the outer imaging section (right) 23 b relative to the marker 61, which are calculated based on the right real world image, is not always ideal, as previously described.
• First, as shown in FIG. 27, (0, 0, 0) is multiplied by the left transform matrix 75L, thereby obtaining coordinates V1, of the origin of the marker coordinate system, that are represented in the outer imaging section (left) coordinate system. The coordinates V1 indicate the position, of the marker 61 relative to the outer imaging section (left) 23 a, which is calculated based on the left real world image. When the coordinates V1 are considered as coordinates represented in the outer imaging section (right) coordinate system, the coordinates V1 indicate a position shown in FIG. 28, in the outer imaging section (right) coordinate system. The resultant coordinates V1 are multiplied by the inverse matrix of the right transform matrix 75R. Multiplication by the inverse matrix of the right transform matrix 75R corresponds to transforming coordinates represented in the outer imaging section (right) coordinate system into coordinates represented in the marker coordinate system. Therefore, the multiplication by the inverse matrix transforms the coordinates V1 (FIG. 28) represented in the outer imaging section (right) coordinate system into the coordinates V0 represented in the marker coordinate system as shown in FIG. 29. The coordinates V0 thus calculated indicate the difference between the position (coordinates, of the origin of the marker coordinate system, that are represented in the outer imaging section (left) coordinate system) of the marker 61 relative to the outer imaging section (left) 23 a, which is calculated based on the left real world image, and the position (coordinates, of the origin of the marker coordinate system, that are represented in the outer imaging section (right) coordinate system) of the marker 61 relative to the outer imaging section (right) 23 b, which is calculated based on the right real world image. In the present embodiment, the difference between the relative positions of the marker 61 is considered as being due to the difference between the attachment positions of the outer imaging section (left) 23 a and the outer imaging section (right) 23 b, thereby estimating the attachment positions of the outer imaging section (left) 23 a and the outer imaging section (right) 23 b.
  • In step S51, based on the left transform matrix 75L, the CPU 311 calculates the coordinates V1, of the origin of the marker coordinate system, that are represented in the outer imaging section (left) coordinate system. Specifically, (0, 0, 0) is multiplied by the left transform matrix 75L, thereby obtaining the coordinates V1, of the origin of the marker coordinate system, that are represented in the outer imaging section (left) coordinate system. It is noted that in the present embodiment, the coordinates, of the origin of the marker coordinate system, that are represented in the outer imaging section (left) coordinate system, are defined as V1. However, instead, the coordinates, of the origin of the marker coordinate system, that are represented in the outer imaging section (right) coordinate system, may be defined as V1.
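  • As an illustration of steps S50 and S51, the following sketch computes V0 and V1 with homogeneous 4×4 matrices. It assumes (this is not taken from the embodiment's program code) that the left and right transform matrices map marker coordinates into the respective outer imaging section coordinate systems; the function and variable names are illustrative only.

```python
import numpy as np

def compute_v0_v1(left_transform, right_transform):
    """Sketch of steps S50/S51: V1 is the marker origin expressed in the
    outer imaging section (left) coordinate system; V0 is that same point
    mapped back into marker coordinates through the inverse of the right
    transform matrix (FIG. 27 to FIG. 29)."""
    origin = np.array([0.0, 0.0, 0.0, 1.0])

    # (0, 0, 0) multiplied by the left transform matrix 75L -> V1 (FIG. 27).
    v1 = left_transform @ origin

    # Treat V1 as if it were expressed in the outer imaging section (right)
    # coordinate system (FIG. 28) and transform it into marker coordinates
    # with the inverse of the right transform matrix 75R (FIG. 29).
    v0 = np.linalg.inv(right_transform) @ v1

    return v0[:3], v1[:3]
```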
  • In step S52, the CPU 311 stores, in ew (see FIG. 12), the magnitude (the distance from the origin) (see FIG. 29) of the V0 obtained in step S50, and stores, in ed (see FIG. 12), the absolute value (see FIG. 27) of the z-axis component of the V1 obtained in step S51. However, the value of ew is calculated in units of length in the marker coordinate system, and therefore the unit of ew does not coincide with the unit of length in the real world. Moreover, as previously described, the actual distance between the outer imaging section (left) 23 a and the outer imaging section (right) 23 b is known (for example, 3.5 cm). Therefore, by using the actual distance and the value of ew, it is possible to match the unit of length in the virtual space to the unit of length in the real world. It is noted that if the size of the marker 61 is known, it is also possible to match the unit of length in the virtual space to the unit of length in the real world, by using a correspondence relationship between the size of the marker 61 calculated based on the result of recognition of the marker image, and the size of the marker 61 in the real world.
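  • Continuing the sketch for step S52, ew and ed can then be derived from V0 and V1, and the known 3.5 cm baseline of the outer imaging section can be used to relate marker-coordinate units to real-world length. Again, the names and the exact form of the conversion are assumptions made for illustration.

```python
import numpy as np

REAL_CAMERA_BASELINE_CM = 3.5  # known real interval between the two outer imaging sections

def derive_ew_ed_and_scale(v0, v1):
    """ew: camera interval in marker-coordinate units (magnitude of V0).
    ed: depth to the marker origin (absolute z component of V1).
    cm_per_marker_unit: scale factor matching virtual-space length to the
    real world, based on the known 3.5 cm camera baseline."""
    ew = float(np.linalg.norm(v0))
    ed = float(abs(v1[2]))
    cm_per_marker_unit = REAL_CAMERA_BASELINE_CM / ew
    return ew, ed, cm_per_marker_unit
```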
  • It is noted that in the case where the shooting direction of the outer imaging section (left) 23 a and the shooting direction of the outer imaging section (right) 23 b are not in parallel with each other owing to error in the attachment of the outer imaging section (left) 23 a and the outer imaging section (right) 23 b to the game apparatus 10, the distance (ew) between the outer imaging section (left) 23 a and the outer imaging section (right) 23 b calculated as described above varies in accordance with the distance in the shooting direction between the outer imaging section 23 and the marker 61. Therefore, as described later, when the distance in the shooting direction between the outer imaging section 23 and the marker 61 has varied, the distance (EyeWidth described later) between the outer imaging section (left) 23 a and the outer imaging section (right) 23 b is calculated again by using the marker-camera transform matrix as it is at this time.
  • It is noted that in the present embodiment, the ew indicates the magnitude of the V0. However, the ew may indicate the component of the V0 in the direction of a line connecting the position of the outer imaging section (left) 23 a, which is indicated by the left transform matrix 75L, and the position of the outer imaging section (right) 23 b, which is indicated by the right transform matrix 75R.
  • The ew calculated as described above indicates the interval between the outer imaging section (left) 23 a and the outer imaging section (right) 23 b in the marker coordinate system. The ew is used as the interval between the left virtual camera 63L and the right virtual camera 63R (in step S65 or S68 described later).
  • It is noted that there is another method of calculating the ew other than the above-described method. For example, (0, 0, 0) is multiplied by the left transform matrix 75L, thereby obtaining the coordinates V1, of the origin of the marker coordinate system, that are represented in the outer imaging section (left) coordinate system (that is, the position of the marker 61 relative to the outer imaging section (left) 23 a). Moreover, (0, 0, 0) is multiplied by the right transform matrix 75R, thereby obtaining coordinates Vr, of the origin of the marker coordinate system, that are represented in the outer imaging section (right) coordinate system (that is, the position of the marker 61 relative to the outer imaging section (right) 23 b). The ew may be obtained by calculating the distance between the coordinates V1 and the coordinates Vr calculated as described above.
  • In step S53, the CPU 311 determines whether or not the value of LogIndex (see FIG. 12) is larger than 1 (initial value). If the value of the LogIndex is larger than 1, the CPU 311 proceeds to step S57, and if not (that is, if the value of the LogIndex is 1), the CPU 311 proceeds to step S54. The LogIndex is a variable for identifying each component of arrays (LogWidth and LogDepth).
  • In step S54, the CPU 311 stores the value of the ew obtained in step S52 in the array LogWidth [LogIndex]. Moreover, the CPU 311 stores the value of the ed obtained in step S52 in the array LogDepth [LogIndex]. Moreover, the value of the LogIndex is incremented. The LogWidth is an array variable for storing a plurality of values of the ew. The LogDepth is an array variable for storing a plurality of values of the ed.
  • In step S55, the CPU 311 determines whether or not the value of the LogIndex is larger than the value of LogMax. If the value of the LogIndex is larger than the value of the LogMax, the CPU 311 proceeds to step S56, and if not (that is, if the value of the LogIndex is equal to or smaller than the value of the LogMax), the CPU 311 ends the virtual cameras interval determination processing.
  • In step S56, the CPU 311 calculates the average value of the components of the LogWidth, and stores the average value in the EyeWidth. Moreover, the CPU 311 calculates the average value of the components of the LogDepth, and stores the average value in the EyeDepth. Moreover, the CPU 311 sets the value of the EyeMeasure at false. Moreover, the CPU 311 sets the marker recognition mode 79 at the one-image recognition mode. The EyeDepth is a variable indicating the depth distance (the depth, the distance in the shooting direction) to the origin of the marker coordinate system from the coordinates of the position of the outer imaging section (left) 23 a in the marker coordinate system, which is indicated by the left transform matrix 75L (or the coordinates of the position of the outer imaging section (right) 23 b in the marker coordinate system, which is indicated by the right transform matrix 75R). The EyeDepth is used as a reference value for determination in step S72 in FIG. 17 described later. The initial value of the EyeDepth is 0. If processing in step S56 has been finished, the virtual cameras interval determination processing is ended.
  • In step S57, the CPU 311 determines whether or not: the absolute value of the difference between the value of the ew obtained in step S52 and the value of the LogWidth [1] is smaller than 10 percent of the value of the LogWidth [1]; and the absolute value of the difference between the value of the ed obtained in step S52 and the value of the LogDepth [1] is smaller than 10 percent of the value of the LogDepth [1]. If the result of the determination in step S57 is positive, the CPU 311 proceeds to step S54, and if the result of the determination in step S57 is negative, the CPU 311 proceeds to step S58.
  • In step S58, the CPU 311 resets the value of the LogIndex at 1 (initial value), and ends the virtual cameras interval determination processing.
  • As described above, in the virtual cameras interval determination processing, the ew and the ed are calculated based on the position of the outer imaging section (left) 23 a calculated based on the position and the orientation of the marker 61 in the left real world image, and the position of the outer imaging section (right) 23 b calculated based on the position and the orientation of the marker 61 in the right real world image. The values of the ew and the values of the ed calculated in this manner are sequentially stored in the LogWidth and the LogDepth, respectively. At this time, if the value of the ew newly calculated is out of a predetermined range (±10%) from the value of the ew that was stored earliest in the LogWidth (that is, the value of the LogWidth [1]), or if the value of the ed newly calculated is out of a predetermined range (±10%) from the value of the ed that was stored earliest in the LogDepth (that is, the value of the LogDepth [1]), values of the ew or values of the ed are stored again in the LogWidth or the LogDepth from the beginning. Therefore, only when values of the ew and values of the ed which are sequentially calculated do not vary significantly (that is, values of the ew and the values of the ed have been stable, to a certain extent, for a certain period), the average value of the values of the ew and the average value of the values of the ed are stored in the EyeWidth and the EyeDepth.
  • It is noted that when the user is moving or turning the game apparatus 10, the left real world image and the right real world image shot by the outer imaging section 23 are likely to be blurred, and the accuracy of the recognition of the marker 61 significantly decreases. As a result, values of the ew or values of the ed which are sequentially detected are likely to vary significantly. It is desired to avoid determining the value of the EyeWidth and the value of the EyeDepth, based on values of the ew and values of the ed, having low reliability, detected in such a situation as described above. Therefore, in the present embodiment, only when values of the ew and values of the ed which are sequentially calculated do not vary significantly, the value of the EyeWidth and the value of the EyeDepth are determined based on the values of the ew and the values of the ed. In addition, the accuracies of the value of the EyeWidth and the value of the EyeDepth can be enhanced because the average value of a plurality of values of the ew and the average value of a plurality of values of the ed obtained by calculations performed a plurality of times are used as the value of the EyeWidth and the value of the EyeDepth, respectively.
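  • The logging and averaging of steps S53 through S58 can be summarized by the following sketch. The value of LogMax is not specified in this description, so the sample count used here is an assumption, as are the class and variable names; this is a sketch of the described procedure, not the embodiment's program code.

```python
class VirtualCameraIntervalEstimator:
    """Accumulate (ew, ed) samples and accept them only once they are stable,
    mirroring the LogWidth/LogDepth handling of steps S53-S58."""

    LOG_MAX = 10          # assumed number of samples to average (LogMax)
    TOLERANCE = 0.10      # +/- 10 percent stability window

    def __init__(self):
        self.log_width = []
        self.log_depth = []
        self.eye_width = 0.0
        self.eye_depth = 0.0

    def add_sample(self, ew, ed):
        """Return True when EyeWidth/EyeDepth have just been (re)determined."""
        if self.log_width:  # corresponds to LogIndex > 1 (step S53)
            ref_w, ref_d = self.log_width[0], self.log_depth[0]
            if (abs(ew - ref_w) >= self.TOLERANCE * ref_w or
                    abs(ed - ref_d) >= self.TOLERANCE * ref_d):
                # Step S58: the new sample deviates too much; restart the log.
                self.log_width.clear()
                self.log_depth.clear()
                return False

        # Step S54: store the sample.
        self.log_width.append(ew)
        self.log_depth.append(ed)

        # Steps S55/S56: once enough stable samples are logged, average them.
        if len(self.log_width) >= self.LOG_MAX:
            self.eye_width = sum(self.log_width) / len(self.log_width)
            self.eye_depth = sum(self.log_depth) / len(self.log_depth)
            self.log_width.clear()
            self.log_depth.clear()
            return True
        return False
```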
  • (View Matrix Generation Processing)
  • Next, the view matrix generation processing in step S44 in the updating processing (FIG. 15) will be described with reference to the flowchart in FIG. 17.
  • In step S60 in FIG. 17, the CPU 311 determines whether or not the value of the EyeWidth is larger than 0. If the value of the EyeWidth is larger than 0, the CPU 311 proceeds to step S63, and if not (that is, if the value of the EyeWidth is 0), the CPU 311 proceeds to step S61.
  • In step S61, the CPU 311 determines whether or not the display mode is the synchronous display mode. If the display mode is the synchronous display mode, the CPU 311 ends the view matrix generation processing, and if not (that is, if the display mode is the asynchronous display mode), the CPU 311 proceeds to step S62.
  • In step S62, the CPU 311 sets the value of the AR-Active at false. Then, the CPU 311 ends the view matrix generation processing.
  • In step S63, the CPU 311 determines whether or not the marker recognition for the left real world image has been successful, with reference to the left transform matrix 75L. If the marker recognition for the left real world image has been successful, the CPU 311 proceeds to step S64, and if not (that is, the value of the left transform matrix 75L is a null value), the CPU 311 proceeds to step S66.
  • In step S64, the CPU 311 stores the value of the left transform matrix 75L in the left view matrix 76L. This means that the position and the orientation of the outer imaging section (left) 23 a in the marker coordinate system, which are calculated based on the position and the orientation of the marker 61 in the left real world image, are directly used as the position and the orientation of the left virtual camera 63L for generating a virtual space image for a left eye.
  • In step S65, the CPU 311 stores, in the right view matrix 76R, the value of the left view matrix 76L multiplied by a parallel movement matrix (−EyeWidth, 0, 0). This means that a position that is separated by the value of the EyeWidth in the positive direction of the x-axis in the left virtual camera coordinate system from the position of the left virtual camera 63L in the world coordinate system of the virtual space, which has been set in step S64, is used as the position of the right virtual camera 63R for generating a virtual space image for a right eye, as shown in FIG. 30. In addition, the orientation of the right virtual camera 63R is the same as that of the left virtual camera 63L (that is, the x-axis, the y-axis, and the z-axis in the left virtual camera coordinate system are in parallel with those in the right virtual camera coordinate system). As a result, the position and the orientation of the left virtual camera 63L, and the position and the orientation of the right virtual camera 63R are kept consistent with each other, whereby the virtual object 62 can be displayed on the upper LCD 22 so as to provide a normal stereoscopic view.
  • In step S66, the CPU 311 determines whether or not the marker recognition for the right real world image has been successful, with reference to the right transform matrix 75R. If the marker recognition for the right real world image has been successful, the CPU 311 proceeds to step S67, and if not (that is, the value of the right transform matrix 75R is a null value), the CPU 311 ends the view matrix generation processing.
  • In step S67, the CPU 311 stores the value of the right transform matrix 75R in the right view matrix 76R. This means that the position and the orientation of the outer imaging section (right) 23 b in the marker coordinate system, which are calculated based on the position and the orientation of the marker 61 in the right real world image, are directly used as the position and the orientation of the right virtual camera 63R for generating a virtual space image for a right eye.
  • In step S68, the CPU 311 stores, in the left view matrix 76L, the value of the right view matrix 76R multiplied by a parallel movement matrix (EyeWidth, 0, 0). This means that a position that is separated by the value of the EyeWidth in the negative direction of the x-axis in the right virtual camera coordinate system from the position of the right virtual camera 63R in the world coordinate system of the virtual space, which has been set in step S67, is used as the position of the left virtual camera 63L for generating a virtual space image for a left eye, as shown in FIG. 31. In addition, the orientation of the left virtual camera 63L is the same as that of the right virtual camera 63R (that is, the x-axis, the y-axis, and the z-axis in the left virtual camera coordinate system are in parallel with those in the right virtual camera coordinate system). As a result, the position and the orientation of the left virtual camera 63L, and the position and the orientation of the right virtual camera 63R are kept consistent with each other, whereby the virtual object 62 can be displayed on the upper LCD 22 so as to provide normal stereoscopic viewing.
  • As described above, in the present embodiment, while the position and the orientation of one of the virtual cameras (for example, the left virtual camera 63L) are set by using (more specifically, directly using) a “marker-camera transform matrix” calculated from an image shot by one of the outer imaging sections (for example, the outer imaging section (left) 23 a), the position and the orientation of the other one of the virtual cameras (for example, the right virtual camera 63R) are set without using a “marker-camera transform matrix” calculated from an image shot by the other one of the outer imaging sections (for example, the outer imaging section (right) 23 b).
  • In providing a stereoscopic view of AR by using a stereo camera, two virtual cameras for right and left need to be set, and a transform matrix (the left transform matrix 75L) for the outer imaging section (left) 23 a and a transform matrix (the right transform matrix 75R) for the outer imaging section (right) 23 b are available as the “marker-camera transform matrices”. In the present embodiment, in setting the virtual cameras 63L and 63R, the respective transform matrices 75L and 75R are not both used. Instead, one transform matrix is set (one of the transform matrices 75L and 75R is directly used, or one new transform matrix (for example, using an average position and an average orientation) is generated from both the transform matrices 75L and 75R), and then the positions and the orientations of both the virtual cameras 63L and 63R are set by using the one transform matrix that has been set. In this way, problems with the accuracy of the AR recognition can be avoided.
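  • A minimal sketch of the view matrix generation in steps S63 through S68 follows. It assumes that a view matrix maps marker (world) coordinates into camera coordinates and that the parallel movement matrix is applied by pre-multiplication; both assumptions, as well as the names used, are illustrative rather than the embodiment's actual code.

```python
import numpy as np

def generate_view_matrices(left_transform, right_transform, eye_width):
    """Steps S63-S68 in outline: use whichever marker-camera transform matrix
    is available as one view matrix, and derive the other by shifting the
    camera by EyeWidth along that camera's own x-axis."""

    def parallel_move(view, dx):
        # Pre-multiplying by a parallel movement matrix (dx, 0, 0) shifts
        # points already expressed in the camera's frame by dx, i.e. it
        # moves the camera itself by -dx along its own x-axis.
        t = np.eye(4)
        t[0, 3] = dx
        return t @ view

    if left_transform is not None:                       # steps S63/S64
        left_view = left_transform
        # Step S65: the right camera sits EyeWidth in the +x direction of the
        # left virtual camera coordinate system.
        right_view = parallel_move(left_view, -eye_width)
    elif right_transform is not None:                    # steps S66/S67
        right_view = right_transform
        # Step S68: the left camera sits EyeWidth in the -x direction of the
        # right virtual camera coordinate system.
        left_view = parallel_move(right_view, +eye_width)
    else:
        return None, None                                # marker not recognized
    return left_view, right_view
```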
  • In step S69, the CPU 311 sets the value of AR-Active at true. As a result, processing of displaying the virtual object 62 being superimposed onto the real world image is started, or restarted.
  • In step S70, the CPU 311 determines whether or not the value of the EyeMeasure is true. If the value of the EyeMeasure is true, the CPU 311 ends the view matrix generation processing, and if not (that is, if the value of the EyeMeasure is false), the CPU 311 proceeds to step S71.
  • In step S71, based on the left view matrix 76L determined in step S64 or step S68, the CPU 311 calculates coordinates V, of the origin of the virtual space, that are represented in the left virtual camera coordinate system. Specifically, as shown in FIG. 32, (0, 0, 0) is multiplied by the left view matrix 76L, thereby obtaining the coordinates V, of the origin of the virtual space, that are represented in the left virtual camera coordinate system. It is noted that in the present embodiment, the coordinates, of the origin of the virtual space, that are represented in the left virtual camera coordinate system are defined as V. However, instead, coordinates, of the origin of the virtual space, that are represented in the right virtual camera coordinate system, which coordinates are calculated based on the right view matrix 76R determined in step S65 or step S67, may be defined as V. The V calculated as described above is substantially the same as the position of the marker 61 in the outer imaging section (left) coordinate system (it is noted that the value of the V is calculated in units of length in the virtual space or the marker coordinate system, and does not coincide with the unit of length in the real world).
  • In step S72, the CPU 311 determines whether or not the absolute value of the difference between the absolute value (|V.z|) of the z-coordinate value of the V and the value of the EyeDepth is larger than 20 percent of the value of the EyeDepth. If the absolute value of the difference is larger than 20 percent of the value of the EyeDepth, the CPU 311 proceeds to step S73, and if not, the CPU 311 ends the view matrix generation processing. The |V.z| is substantially the same as the distance (depth), in the shooting direction, from the outer imaging section 23 to the marker 61. That is, in step S72, the CPU 311 determines whether or not the depth distance (depth) from the outer imaging section 23 to the marker 61 has varied beyond a range of ±20 percent in comparison with the depth distance as it was when the value of the EyeWidth was calculated in the virtual cameras interval determination processing (FIG. 16).
  • In step S73, the CPU 311 sets the value of the EyeMeasure at true, resets the value of the LogIndex at 1, and sets the marker recognition mode 79 at the both-image recognition mode, thereby ending the view matrix generation processing. As a result, calculation of the EyeWidth in the virtual cameras interval determination processing is started again.
  • The reason why, as described above, the EyeWidth is calculated again when the depth distance (depth) from the outer imaging section 23 to the marker 61 has varied beyond a predetermined range in comparison with the depth distance as it was when the value of the EyeWidth was previously calculated in the virtual cameras interval determination processing, is that the optimum interval (EyeWidth) between the virtual cameras varies in accordance with the depth distance (depth) from the outer imaging section 23 to the marker 61, owing to error in the attachment of the outer imaging section (left) 23 a and the outer imaging section (right) 23 b to the game apparatus 10. For example, under the condition that the shooting direction of the outer imaging section (left) 23 a and the shooting direction of the outer imaging section (right) 23 b are not in parallel with each other, it is assumed that when the depth distance from the outer imaging section 23 to the marker 61 is D1, the value of the EyeWidth calculated in the virtual cameras interval determination processing is EyeWidth 1, as shown in FIG. 33. In this case, it is appropriate that in the virtual space, the left virtual camera 63L and the right virtual camera 63R are placed with an interval of the EyeWidth 1 provided therebetween as shown in FIG. 34 (as a result, for example, a virtual object placed at the origin of the virtual space is appropriately displayed in a stereoscopic manner as if the virtual object were really existent on the marker 61). Then, if the depth distance from the outer imaging section 23 to the marker 61 varies to D2 that is smaller than D1 as shown in FIG. 35, the value of the EyeWidth calculated in the virtual cameras interval determination processing is EyeWidth 2, which is smaller than EyeWidth 1. Therefore, it is appropriate that in the virtual space, the left virtual camera 63L and the right virtual camera 63R are placed with an interval of the EyeWidth 2 provided therebetween as shown in FIG. 36.
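  • The re-measurement trigger of steps S70 through S73 reduces to a simple tolerance check, sketched below under the same assumptions about the view matrix as before (illustrative names, not the embodiment's code).

```python
import numpy as np

def depth_changed_beyond_tolerance(left_view, eye_depth, tolerance=0.20):
    """Step S72: compare |V.z| (the current depth to the marker origin in the
    left virtual camera coordinate system) with EyeDepth; a deviation beyond
    +/-20 percent means EyeWidth should be measured again (step S73)."""
    v = left_view @ np.array([0.0, 0.0, 0.0, 1.0])  # origin of the virtual space
    current_depth = abs(v[2])                       # |V.z|
    return abs(current_depth - eye_depth) > tolerance * eye_depth
```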
  • (Main Real World Image Setting Processing)
  • Next, the details of the main real world image setting processing in step S45 in the updating processing (FIG. 15) will be described with reference to the flowchart in FIG. 18.
  • In step S80 in FIG. 18, the CPU 311 determines whether or not the value of the AR-Active is true. If the value of the AR-Active is true, the CPU 311 proceeds to step S81, and if not (that is, if the value of the AR-Active is false), the CPU 311 ends the main real world image setting processing.
  • In step S81, the CPU 311 generates the left projection matrix 77L and the right projection matrix 77R. The left projection matrix 77L is a matrix for specifying a drawing area of the virtual space as viewed from the left virtual camera 63L. In the present embodiment, as shown in FIG. 37, the left projection matrix 77L is generated so as to correspond to the drawing area of the left real world image determined in step S23 in the main processing. The right projection matrix 77R is a matrix for specifying a drawing area of the virtual space as viewed from the right virtual camera 63R. In the present embodiment, as shown in FIG. 38, the right projection matrix 77R is generated so as to correspond to the drawing area of the right real world image determined in step S27 in the main processing.
  • Specifically, the projection matrix for the left virtual camera 63L is set as a projection matrix that defines a view volume whose angle of view is such that the ratio between the horizontal angle of view of the outer imaging section (left) 23 a and the horizontal angle of view of the left virtual camera 63L corresponds to the ratio that the lateral position and size of the drawing area bear to the lateral length of the left real world image.
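  • One way to realize such a projection matrix is an off-center (asymmetric) frustum whose horizontal bounds correspond to the drawing area. The sketch below assumes that the drawing area is given as normalized horizontal bounds u0 and u1 (−1 at the left edge of the real image, +1 at the right edge), that the real camera's horizontal and vertical half-angles of view are known, and that an OpenGL-style clip-space convention is used; none of these details are specified by the embodiment.

```python
import math
import numpy as np

def drawing_area_projection(half_angle_h, half_angle_v, u0, u1, near, far):
    """Asymmetric frustum whose horizontal extent covers only the drawing
    area [u0, u1] of the real camera image, keeping the full vertical extent."""
    w = near * math.tan(half_angle_h)   # half-width of the full image at the near plane
    h = near * math.tan(half_angle_v)   # half-height of the full image at the near plane
    l, r = u0 * w, u1 * w               # horizontal bounds of the drawing area
    b, t = -h, h

    # Standard off-center perspective projection matrix.
    return np.array([
        [2 * near / (r - l), 0.0,                (r + l) / (r - l),            0.0],
        [0.0,                2 * near / (t - b), (t + b) / (t - b),            0.0],
        [0.0,                0.0,               -(far + near) / (far - near), -2 * far * near / (far - near)],
        [0.0,                0.0,               -1.0,                          0.0],
    ])
```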
  • In step S82, the CPU 311 determines whether or not the left real world image is the main real world image. If the left real world image is the main real world image, the CPU 311 proceeds to step S83, and if not (that is, if the right real world image is the main real world image), the CPU 311 proceeds to step S86.
  • In step S83, the CPU 311 multiplies (0, 0, 0, 1) by the left view matrix 76L and the left projection matrix 77L, thereby obtaining a vector V.
  • In step S84, the CPU 311 determines whether or not the value (V.x/V.w) obtained by dividing the first component (x) of the vector V by the fourth component (w) is larger than 0.5. If the value (V.x/V.w) is larger than 0.5, the CPU 311 proceeds to step S85, and if not, the CPU 311 ends the main real world image setting processing. The value of the V.x/V.w indicates the position in the lateral direction, in the virtual space image for a left eye, where the origin of the world coordinate system in the virtual space is present (it is noted that this position is equivalent to the position in the lateral direction, in an image shot by the outer imaging section (left) 23 a, where the origin of the marker coordinate system is present). If the origin of the world coordinate system in the virtual space is at the middle of the virtual space image for a left eye, the value of the V.x/V.w is 0. The value of the V.x/V.w approaches −1.0 as the origin of the world coordinate system in the virtual space approaches the left end of the virtual space image for a left eye, and approaches +1.0 as the origin approaches the right end of the virtual space image for a left eye. If the value of the V.x/V.w is larger than 0.5, the origin of the world coordinate system in the virtual space is in the right end area (diagonal-line area in FIG. 39) of the virtual space image for a left eye. In other words, the marker 61 is in the right end area of the left real world image shot by the outer imaging section (left) 23 a.
  • In step S85, the CPU 311 changes the main real world image from the left real world image to the right real world image. That is, in the virtual space image shot by the left virtual camera 63L, if the origin in the virtual space is present to the right of a position separated rightward by a predetermined distance (or by a predetermined percent of the lateral width of the virtual space image) from the middle of the virtual space image, the CPU 311 changes the main real world image to the right real world image. Alternatively, in the image shot by the outer imaging section (left) 23 a, if the origin of the marker coordinate system is present to the right of a position separated rightward by a predetermined distance (or by a predetermined percent of the lateral width of the shot image) from the middle of the shot image, the CPU 311 changes the main real world image to the right real world image. Thus, even if, for example, the position of the marker 61 in the left real world image gradually moves rightward to end up disappearing from the left real world image, the CPU 311 changes the main real world image to the right real world image before the marker 61 disappears from the left real world image. Therefore, it is possible to continuously recognize the marker 61 in the one-image recognition mode. If processing in step S85 has been finished, the main real world image setting processing is ended.
  • In step S86, the CPU 311 multiplies (0, 0, 0, 1) by the right view matrix 76R and the right projection matrix 77R, thereby obtaining the vector V.
  • In step S87, the CPU 311 determines whether or not the value (V.x/V.w) obtained by dividing the first component (x) of the vector V by the fourth component (w) is smaller than −0.5. If the value (V.x/V.w) is smaller than −0.5, the CPU 311 proceeds to step S88, and if not, the CPU 311 ends the main real world image setting processing. The value of the V.x/V.w indicates the position in the lateral direction, in the virtual space image for a right eye, where the origin of the world coordinate system in the virtual space is present. If the origin of the world coordinate system in the virtual space is at the middle of the virtual space image for a right eye, the value of the V.x/V.w is 0. The value of the V.x/V.w approaches −1.0 as the origin of the world coordinate system in the virtual space approaches the left end of the virtual space image for a right eye, and approaches +1.0 as the origin approaches the right end of the virtual space image for a right eye. If the value of the V.x/V.w is smaller than −0.5, the origin of the world coordinate system in the virtual space is in the left end area (diagonal-line area in FIG. 40) of the virtual space image for a right eye. In other words, the marker 61 is in the left end area of the right real world image shot by the outer imaging section (right) 23 b.
  • In step S88, the CPU 311 changes the main real world image from the right real world image to the left real world image. Thus, even if, for example, the position of the marker 61 in the right real world image gradually moves leftward to end up disappearing from the right real world image, the CPU 311 changes the main real world image to the left real world image before the marker 61 disappears from the right real world image. Therefore, it is possible to continuously recognize the marker 61 in the one-image recognition mode. If processing in step S88 has been finished, the main real world image setting processing is ended.
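  • Steps S82 through S88 amount to checking on which side of the image the projected marker origin lies, as sketched below (the "left"/"right" labels and the function name are illustrative, not the embodiment's code).

```python
import numpy as np

def update_main_real_world_image(current_main, view, projection):
    """Switch the main real world image before the marker 61 leaves it.
    'view' and 'projection' belong to the virtual camera that corresponds
    to the current main real world image."""
    v = projection @ view @ np.array([0.0, 0.0, 0.0, 1.0])
    ndc_x = v[0] / v[3]   # V.x / V.w: lateral position of the marker origin

    if current_main == "left" and ndc_x > 0.5:     # marker near the right edge (FIG. 39)
        return "right"
    if current_main == "right" and ndc_x < -0.5:   # marker near the left edge (FIG. 40)
        return "left"
    return current_main
```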
  • (Display Mode Switching Processing)
  • Next, the details of the display mode switching processing in step S46 in the updating processing (FIG. 15) will be described with reference to the flowchart in FIG. 19.
  • In step S90 in FIG. 19, the CPU 311 determines whether or not the display mode is the synchronous display mode. If the display mode is the synchronous display mode, the CPU 311 proceeds to step S91, and if not (that is, if the display mode is the asynchronous display mode), the CPU 311 proceeds to step S96.
  • In step S91, the CPU 311 determines whether or not: the EyeWidth is larger than 0; and the marker recognition for the left real world image or the marker recognition for the right real world image has been successful. Then, if the result of the determination is positive, the CPU 311 proceeds to step S92, and if the result of the determination is negative (that is, if the EyeWidth is 0, or if both the marker recognition for the left real world image and the marker recognition for the right real world image have failed), the CPU 311 proceeds to step S93.
  • In step S92, the CPU 311 sets the value of SyncCount (see FIG. 12) at 20. The SyncCount is a variable for determining a timing of switching the display mode from the synchronous display mode to the asynchronous display mode. If processing in step S92 has been finished, the display mode switching processing is ended.
  • In step S93, the CPU 311 decrements the value of SyncCount.
  • In step S94, the CPU 311 determines whether or not the value of the SyncCount is larger than 0. If the value of the SyncCount is larger than 0, the CPU 311 ends the display mode switching processing, and if not (that is, the value of the SyncCount is 0), the CPU 311 proceeds to step S95.
  • In step S95, the CPU 311 sets the value of the LogIndex at 1, sets the value of the EyeWidth and the value of the EyeDepth at 0, changes the display mode 78 from the synchronous display mode to the asynchronous display mode, and sets the marker recognition mode 79 to the both-image recognition mode. If processing in step S95 has been finished, the display mode switching processing is ended.
  • In step S96, the CPU 311 determines whether or not: the EyeWidth is larger than 0; and the marker recognition for the left real world image or the marker recognition for the right real world image has been successful. Then, if the result of the determination is positive, the CPU 311 proceeds to step S97, and if the result of the determination is negative (that is, if the EyeWidth is 0, or if both the marker recognition for the left real world image and the marker recognition for the right real world image have failed), the CPU 311 ends the display mode switching processing.
  • In step S97, the CPU 311 sets the value of SyncCount at 20, and changes the display mode from the asynchronous display mode to the synchronous display mode. If processing in step S97 has been finished, the display mode switching processing is ended.
  • As described above, the display mode switching processing appropriately switches the display mode between the synchronous display mode and the asynchronous display mode. More specifically, just after execution of the image display program has been started, the display mode is the asynchronous display mode, and the latest real world image is constantly displayed on the upper LCD 22. Thereafter, at the time when the marker 61 has been recognized and it has become possible to display the virtual object 62 being superimposed onto the real world image displayed on the upper LCD 22, the display mode is changed from the asynchronous display mode to the synchronous display mode. In the synchronous display mode, in order to superimpose and display the virtual object 62 at a proper position in the real world image, a composite image obtained by superimposing the virtual object 62 onto the last real world image in which the marker 61 has been recognized (which is not always the latest real world image) is displayed on the upper LCD 22. In this way, the position on the real world image where the virtual object 62 is superimposed and displayed can be prevented from being out of a proper position. Thereafter, at the time when the marker 61 has continued to be unrecognized for a predetermined time period, the display mode is changed from the synchronous display mode to the asynchronous display mode, and the latest real world image is constantly displayed on the upper LCD 22. In this way, when the marker 61 continues to be unrecognized, it is possible to prevent an old image from continuing to be displayed on the upper LCD 22.
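  • The switching logic of steps S90 through S97 can be condensed into the following sketch; the 20-frame count comes from the description above, while the remaining names are illustrative assumptions.

```python
SYNC_COUNT_MAX = 20

def update_display_mode(mode, sync_count, eye_width, marker_recognized):
    """Return the updated (mode, sync_count) pair.

    mode is "synchronous" or "asynchronous"; marker_recognized is True when
    the marker recognition for the left or right real world image succeeded."""
    tracking_ok = eye_width > 0 and marker_recognized

    if mode == "synchronous":
        if tracking_ok:
            sync_count = SYNC_COUNT_MAX        # step S92: keep the mode alive
        else:
            sync_count -= 1                    # step S93
            if sync_count <= 0:                # steps S94/S95: fall back
                mode = "asynchronous"          # (EyeWidth, EyeDepth etc. are also reset)
    else:
        if tracking_ok:                        # steps S96/S97
            mode = "synchronous"
            sync_count = SYNC_COUNT_MAX
    return mode, sync_count
```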
  • (Effects of the Present Embodiment)
  • As described above, in the present embodiment, the position and the orientation of one of the virtual cameras are determined based on the position and the orientation of one of the outer imaging sections (the outer imaging section (left) 23 a or the outer imaging section (right) 23 b) in the marker coordinate system, which are calculated from the result of the marker recognition for the corresponding one of the left real world image and the right real world image. In addition, the position and the orientation of the other one of the virtual cameras are determined such that the relationship between the position and the orientation of the left virtual camera 63L and the position and the orientation of the right virtual camera 63R is ideal. Therefore, the virtual object 62 can be displayed so as to provide a normal stereoscopic view, on a display apparatus capable of providing a stereoscopic view.
  • In addition, the interval between the two outer imaging sections is calculated based on the position of the outer imaging section (left) 23 a calculated based on the result of the marker recognition for the left real world image, and the position of the outer imaging section (right) 23 b calculated based on the result of the marker recognition for the right real world image. The virtual cameras interval (EyeWidth) is determined based on the result of the calculation. A position separated, by the calculated interval in the direction perpendicular to the shooting direction in a coordinate system of one of the virtual cameras, from the position of the one of the virtual cameras, is calculated, and the other one of the virtual cameras is set at the calculated position. As a result, both the virtual cameras are set so as to be placed along the direction perpendicular to the shooting direction. In addition, even if the interval between the outer imaging section (left) 23 a and the outer imaging section (right) 23 b is not known, or even if the accuracy of the attachment of the outer imaging section (left) 23 a and the outer imaging section (right) 23 b to the game apparatus 10 is low, it is possible to place the left virtual camera 63L and the right virtual camera 63R with an ideal interval provided therebetween.
  • In addition, after the virtual cameras interval (EyeWidth) has been determined in the virtual cameras interval determination processing, the marker recognition processing needs to be performed only for the main real world image. Therefore, a load of processing can be reduced in comparison with the case where the marker recognition processing is always performed for both the left real world image and the right real world image.
  • It is noted that if the interval between the outer imaging section (left) 23 a and the outer imaging section (right) 23 b is known, the interval (EyeWidth), between the outer imaging section (left) 23 a and the outer imaging section (right) 23 b in the marker coordinate system, that corresponds to the interval (for example, 3.5 cm) between the outer imaging section (left) 23 a and the outer imaging section (right) 23 b in the real world, is figured out based on the result of the virtual cameras interval determination processing. Therefore, for example, it is possible to, in the virtual space, perform processing based on a scale of the real world, such as displaying a character (virtual object) having a height of 30 cm being superimposed onto the real world image, or moving a character (virtual object) displayed being superimposed onto the real world image at a velocity of 10 cm per second.
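  • For instance, a real-world length could be converted into virtual-space (marker-coordinate) units by scaling it with EyeWidth over the known 3.5 cm baseline, roughly as follows (an illustrative helper, not part of the embodiment).

```python
REAL_CAMERA_BASELINE_CM = 3.5   # known interval between the outer imaging sections

def cm_to_virtual_units(length_cm, eye_width):
    """Convert a real-world length in centimetres into marker-coordinate units."""
    return length_cm * eye_width / REAL_CAMERA_BASELINE_CM

# e.g. a character that should appear 30 cm tall, moving at 10 cm per second:
# height_in_virtual_units = cm_to_virtual_units(30.0, eye_width)
# speed_per_second        = cm_to_virtual_units(10.0, eye_width)
```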
  • (Variations)
  • It is noted that in the present embodiment, the position and the orientation of the marker 61 in the real world image are recognized, and the virtual object 62 is superimposed onto the real world image in accordance with the result of the recognition. However, in other embodiments, instead of the marker 61, the position and/or the orientation of any recognition target may be recognized, and the virtual object 62 may be superimposed onto the real world image in accordance with the result of the recognition. An example of a recognition target is the face of a person.
  • In addition, in the present embodiment, a stereoscopic image is displayed on the upper LCD 22, based on the real world image being shot in real time by the outer imaging section 23. However, in other embodiments, a stereoscopic image may be displayed on the upper LCD 22, based on data of moving images shot in the past by the outer imaging section 23, an external stereo camera, or the like.
  • In addition, in the present embodiment, the outer imaging section 23 is provided in the game apparatus 10 in advance. However, in other embodiments, an external camera attachable to and detachable from the game apparatus 10 may be used.
  • In addition, in the present embodiment, the upper LCD 22 is provided in the game apparatus 10 in advance. However, in other embodiments, an external stereoscopic display attachable to and detachable from the game apparatus 10 may be used.
  • In addition, in the present embodiment, the virtual object 62 is placed at the position of the origin of the marker coordinate system. However, in other embodiments, the virtual object 62 may be placed at a position away from the origin of the marker coordinate system.
  • In addition, in the present embodiment, one virtual object is placed in the virtual space. However, in other embodiments, a plurality of virtual objects may be placed in the virtual space.
  • In addition, in the present embodiment, in the virtual cameras interval determination processing, the interval (EyeWidth) between the outer imaging section (left) 23 a and the outer imaging section (right) 23 b in the marker coordinate system is calculated, and based on the interval, the position and the orientation of one of the left virtual camera 63L and the right virtual camera 63R are determined from the position and the orientation of the other one of the virtual cameras calculated based on the result of the marker recognition. However, in other embodiments, the position and the orientation of the outer imaging section 23 (for example, the average position of the position of the outer imaging section (left) 23 a and the position of the outer imaging section (right) 23 b, and the average orientation of the orientation of the outer imaging section (left) 23 a and the orientation of the outer imaging section (right) 23 b) may be calculated based on the position and the orientation of the outer imaging section (left) 23 a calculated based on the result of the marker recognition for the left real world image, and the position and the orientation of the outer imaging section (right) 23 b calculated based on the result of the marker recognition for the right real world image. Based on the result of the calculation, the positions and/or the orientations of the left virtual camera 63L and the right virtual camera 63R may be determined. For example, the orientations of the left virtual camera 63L and the right virtual camera 63R may be determined such that both the orientations are intermediate between the orientation of the outer imaging section (left) 23 a calculated based on the result of the marker recognition for the left real world image, and the orientation of the outer imaging section (right) 23 b calculated based on the result of the marker recognition for the right real world image. In addition, for example, after the interval (EyeWidth) between the outer imaging section (left) 23 a and the outer imaging section (right) 23 b is calculated in the virtual cameras interval determination processing, positions separated, by distances of EyeWidth/2 in the respective directions opposite to each other and perpendicular to the shooting direction of the virtual cameras, from a position in the virtual space corresponding to the average position of the position of the outer imaging section (left) 23 a calculated based on the result of the marker recognition for the left real world image, and the position of the outer imaging section (right) 23 b calculated based on the result of the marker recognition for the right real world image, may be calculated, and the positions of the left virtual camera 63L and the right virtual camera 63R may be determined so as to be the calculated positions.
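  • As one possible sketch of the last variation described above, both virtual cameras could be placed EyeWidth/2 to either side of the average of the two measured camera positions, along the x-axis of a shared (for example, intermediate) orientation. Representing the orientation as a 3×3 matrix whose first column is the camera's x-axis in marker coordinates is an assumption made for this illustration.

```python
import numpy as np

def place_cameras_symmetrically(left_pos, right_pos, shared_rotation, eye_width):
    """Place the left/right virtual cameras EyeWidth apart, centred on the
    average of the two positions obtained from the marker recognition."""
    center = (np.asarray(left_pos, dtype=float) + np.asarray(right_pos, dtype=float)) / 2.0
    x_axis = shared_rotation[:, 0]        # sideways direction of the shared orientation
    left_camera_pos = center - 0.5 * eye_width * x_axis
    right_camera_pos = center + 0.5 * eye_width * x_axis
    return left_camera_pos, right_camera_pos
```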
  • In addition, in the present embodiment, after the virtual cameras interval (EyeWidth) is calculated in the virtual cameras interval determination processing, the marker recognition processing is performed for only the main real world image. However, in other embodiments, the marker recognition processing may be always performed for both the left real world image and the right real world image.
  • In addition, in the present embodiment, the upper LCD 22 is a stereoscopic display apparatus of a parallax barrier type. However, in other embodiments, the upper LCD 22 may be a stereoscopic display apparatus of any other type, such as a lenticular lens type. For example, if a stereoscopic display apparatus of a lenticular lens type is used, an image for a left eye and an image for a right eye may be combined with each other by the CPU 311 or another processor, and the resultant composite image may be supplied to the stereoscopic display apparatus of a lenticular lens type.
  • In addition, in the present embodiment, the game apparatus 10 is used for superimposing a virtual object onto the real world image and displaying the resultant image. However, in another embodiment, any information processing apparatus or any information processing system (for example, a PDA (Personal Digital Assistant), a mobile phone, a personal computer, or a camera) may be used for superimposing a virtual object onto the real world image and displaying the resultant image.
  • In addition, in the present embodiment, the image display processing is executed by using only one information processing apparatus (game apparatus 10). However, in other embodiments, a plurality of information processing apparatuses, included in an image display system, that can communicate with each other may share the execution of the image display processing.
  • While the invention has been described in detail, the foregoing description is in all aspects illustrative and not restrictive. It will be understood that numerous other modifications and variations can be devised without departing from the scope of the invention.

Claims (23)

1. A computer-readable storage medium having stored therein an image display program which stereoscopically displays a real world image onto which a 3-dimensional virtual object is superimposed, on a screen of a stereoscopic display apparatus capable of providing a stereoscopic view, by using outputs from a real camera for a right eye and a real camera for a left eye, the image display program causing a computer to function as:
first position orientation calculation means for recognizing a predetermined shooting target in a piece of real world image data outputted from one of the real camera for a right eye and the real camera for a left eye, among pieces of real world image data respectively outputted from the real camera for a right eye and the real camera for a left eye, and calculating position orientation information indicating relative positions and relative orientations of the one of the real camera for a right eye and the real camera for a left eye, and the predetermined shooting target with respect to each other;
virtual camera setting means for determining the position and the orientation of a right virtual camera for generating an image for a right eye, and the position and the orientation of a left virtual camera for generating an image for a left eye, in a predetermined virtual space, by using the position orientation information calculated by the first position orientation calculation means;
right virtual space image generation means for generating a right virtual space image indicating the predetermined virtual space as it is looked at from the right virtual camera;
left virtual space image generation means for generating a left virtual space image indicating the predetermined virtual space as it is looked at from the left virtual camera; and
display control means for superimposing the right virtual space image onto the piece of real world image data outputted from the real camera for a right eye, and superimposing the left virtual space image onto the piece of real world image data outputted from the real camera for a left eye, to output images for a stereoscopic view to the stereoscopic display apparatus.
2. The computer-readable storage medium according to claim 1, wherein
the virtual camera setting means determines the positions and the orientations of the right virtual camera and the left virtual camera such that the relationship between the relative orientations of the right virtual camera and the left virtual camera with respect to each other is the same as the relationship between designed relative orientations of the real camera for a right eye and the real camera for a left eye with respect to each other.
3. The computer-readable storage medium according to claim 2, wherein
the virtual camera setting means includes:
a first virtual camera orientation determination section for determining the orientation of one of the right virtual camera and the left virtual camera that corresponds to the one of the real camera for a right eye and the real camera for a left eye in the first position orientation calculation means, by using the position orientation information calculated by the first position orientation calculation means; and
a second virtual camera orientation determination section for, based on the orientation of the one of the right virtual camera and the left virtual camera determined by the first virtual camera orientation determination section, determining the orientation of the other one of the right virtual camera and the left virtual camera such that the relationship between the relative orientations of the right virtual camera and the left virtual camera with respect to each other is the same as the relationship between designed relative orientations of the real camera for a right eye and the real camera for a left eye with respect to each other.
4. The computer-readable storage medium according to claim 2, the image display program further causing the computer to function as virtual camera relative position relationship determination means for determining a relative position relationship between the right virtual camera and the left virtual camera,
wherein the virtual camera setting means includes:
a first virtual camera position determination section for determining the position of one of the right virtual camera and the left virtual camera that corresponds to the one of the real camera for a right eye and the real camera for a left eye in the first position orientation calculation means, by using the position orientation information calculated by the first position orientation calculation means; and
a second virtual camera position determination section for determining the position of the other one of the right virtual camera and the left virtual camera so as to be separated, by the relative position determined by the virtual camera relative position relationship determination means, from the position of the one of the right virtual camera and the left virtual camera determined by the first virtual camera position determination section.
5. The computer-readable storage medium according to claim 2, wherein the virtual camera setting means includes:
a first virtual camera position determination section for determining the position of one of the right virtual camera and the left virtual camera that corresponds to the one of the real camera for a right eye and the real camera for a left eye in the first position orientation calculation means, by using the position orientation information calculated by the first position orientation calculation means;
a first virtual camera orientation determination section for determining the orientation of the one of the right virtual camera and the left virtual camera that corresponds to the one of the real camera for a right eye and the real camera for a left eye in the first position orientation calculation means, by using the position orientation information calculated by the first position orientation calculation means; and
a second virtual camera position determination section for determining the position of the other one of the right virtual camera and the left virtual camera so as to be present, in a direction based on the orientation of the one of the right virtual camera and the left virtual camera determined by the first virtual camera orientation determination section, from the position of the one of the right virtual camera and the left virtual camera determined by the first virtual camera position determination section.
6. The computer-readable storage medium according to claim 5, the image display program further causing the computer to function as virtual cameras distance determination means for determining the distance between the right virtual camera and the left virtual camera,
wherein the second virtual camera position determination section determines the position of the other one of the right virtual camera and the left virtual camera so as to be separated by the distance determined by the virtual cameras distance determination means, in the direction based on the orientation of the one of the right virtual camera and the left virtual camera determined by the first virtual camera orientation determination section, from the position of the one of the right virtual camera and the left virtual camera determined by the first virtual camera position determination section.
7. The computer-readable storage medium according to claim 4, wherein the virtual camera relative position relationship determination means determines the relative position relationship between the right virtual camera and the left virtual camera, based on a parallax between the pieces of real world image data respectively outputted from the real camera for a right eye and the real camera for a left eye.
8. The computer-readable storage medium according to claim 7, wherein the relative position relationship which is determined by the relative position relationship determination means is the interval between the right virtual camera and the left virtual camera.
9. The computer-readable storage medium according to claim 4, the image display program further causing the computer to function as second position orientation calculation means for recognizing the predetermined shooting target in the piece of real world image data outputted from the other one of the real camera for a right eye and the real camera for a left eye, which is different from the one of the real camera for a right eye and the real camera for a left eye in the first position orientation calculation means, among the two pieces of real world image data respectively outputted from the real camera for a right eye and the real camera for a left eye, and calculating position orientation information indicating relative positions and relative orientations of the other one of the real camera for a right eye and the real camera for a left eye, and the predetermined shooting target with respect to each other,
wherein the virtual camera relative position relationship determination means calculates the distance between the one and the other one of the real camera for a right eye and the real camera for a left eye, based on the information, about the relative positions of the one of the real camera for a right eye and the real camera for a left eye, and the predetermined shooting target, which is calculated by the first position orientation calculation means, and on the information, about the relative positions of the other one of the real camera for a right eye and the real camera for a left eye, and the predetermined shooting target, which is calculated by the second position orientation calculation means.
10. The computer-readable storage medium according to claim 9, wherein
the first position orientation calculation means includes first transform matrix generation means for, based on the piece of real world image data outputted from the one of the real camera for a right eye and the real camera for a left eye, generating a first transform matrix that transforms a coordinate value represented in a coordinate system whose origin is the position of the predetermined shooting target, into a coordinate value represented in a first shooting section coordinate system whose origin is the position of the one of the real camera for a right eye and the real camera for a left eye, and
the second position orientation calculation means includes second transform matrix generation means for, based on the piece of real world image data outputted from the other one of the real camera for a right eye and the real camera for a left eye, generating a second transform matrix that transforms a coordinate value represented in a coordinate system whose origin is the position of the predetermined shooting target, into a coordinate value represented in a second shooting section coordinate system whose origin is the position of the other one of the real camera for a right eye and the real camera for a left eye.
11. The computer-readable storage medium according to claim 7, wherein the virtual camera relative position relationship determination means executes processing of calculating the relative position relationship between the right virtual camera and the left virtual camera every time a new piece of real world image data is outputted from each of the real camera for a right eye and the real camera for a left eye, and determines the relative position relationship between the right virtual camera and the left virtual camera, based on results of a plurality of the calculations of the relative position relationship obtained by the processing being executed a plurality of times.
12. The computer-readable storage medium according to claim 11, wherein the virtual camera relative position relationship determination means determines the relative position relationship between the right virtual camera and the left virtual camera, based on the results of the plurality of calculations of the relative position relationship, only when all the results of the plurality of calculations of the relative position relationship are within a predetermined range.
13. The computer-readable storage medium according to claim 7, the image display program further causing the computer to function as depth variation determination means for determining whether or not the depth distance from each of the real camera for a right eye and the real camera for a left eye to the predetermined shooting target has varied beyond a predetermined range in comparison with the depth distance as it was when the virtual camera relative position relationship determination means determined the relative position relationship between the right virtual camera and the left virtual camera,
wherein the virtual camera relative position relationship determination means determines again the relative position relationship between the right virtual camera and the left virtual camera, when the result of the determination by the depth variation determination means is positive.
14. The computer-readable storage medium according to claim 13, the image display program further causing the computer to function as depth distance calculation means for calculating the depth distance from each of the real camera for a right eye and the real camera for a left eye to the predetermined shooting target, based on a piece of real world image data that includes the predetermined shooting target,
wherein the depth variation determination means determines whether or not the depth distance from each of the real camera for a right eye and the real camera for a left eye to the predetermined shooting target has varied beyond the predetermined range by comparing a reference depth distance calculated by the depth distance calculation means at a time when the virtual camera relative position relationship determination means determined the relative position relationship between the right virtual camera and the left virtual camera, with the latest depth distance calculated by the depth distance calculation means after the calculation of the reference depth distance.
15. The computer-readable storage medium according to claim 14, wherein the depth distance calculation means executes processing of calculating the depth distance from each of the real camera for a right eye and the real camera for a left eye to the predetermined shooting target every time a new piece of real world image data is outputted from each of the real camera for a right eye and the real camera for a left eye, and calculates the reference depth distance, based on results of a plurality of the calculations of the depth distance obtained by the processing being executed a plurality of times.
16. The computer-readable storage medium according to claim 15, wherein the depth distance calculation means determines the reference depth distance, based on the results of the plurality of calculations of the depth distance, only when all the results of the plurality of calculations of the depth distance are within a predetermined range.
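Claims 13 to 16 recite re-determining the virtual camera relationship when the camera-to-marker depth drifts beyond a range from the depth observed at determination time, with the reference depth itself taken from several consistent measurements. The sketch below is illustrative only; the class name and the agreement test are assumptions, not the patent's implementation.

    class DepthVariationMonitor:
        # Tracks a reference depth distance (camera-to-marker depth at the time
        # the virtual camera relationship was determined) and reports when the
        # latest depth drifts beyond the allowed range, the cue to redetermine
        # that relationship.

        def __init__(self, allowed_variation):
            self.allowed_variation = allowed_variation
            self.reference_depth = None

        def try_set_reference(self, depth_samples, sample_tolerance):
            # Adopt a reference depth only when repeated measurements agree.
            if not depth_samples:
                return False
            if max(depth_samples) - min(depth_samples) > sample_tolerance:
                return False
            self.reference_depth = sum(depth_samples) / len(depth_samples)
            return True

        def needs_redetermination(self, latest_depth):
            if self.reference_depth is None:
                return True
            return abs(latest_depth - self.reference_depth) > self.allowed_variation
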
17. The computer-readable storage medium according to claim 1, the image display program further causing the computer to function as:
second position orientation calculation means for recognizing the predetermined shooting target in the piece of real world image data outputted from the other one of the real camera for a right eye and the real camera for a left eye, which is different from the one of the real camera for a right eye and the real camera for a left eye in the first position orientation calculation means, among the two pieces of real world image data respectively outputted from the real camera for a right eye and the real camera for a left eye, and calculating position orientation information indicating relative positions and relative orientations of the other one of the real camera for a right eye and the real camera for a left eye, and the predetermined shooting target with respect to each other; and
real camera selection means for selecting one of the real camera for a right eye and the real camera for a left eye in accordance with the position of the predetermined shooting target in at least one of the two pieces of real world image data respectively outputted from the real camera for a right eye and the real camera for a left eye,
wherein the virtual camera setting means,
if the real camera selected by the real camera selection means is the one of the real camera for a right eye and the real camera for a left eye in the first position orientation calculation means, determines the position and the orientation of the right virtual camera, and the position and the orientation of the left virtual camera, by using the position orientation information indicating the relative positions and relative orientations of the one of the real camera for a right eye and the real camera for a left eye, and the predetermined shooting target, which is calculated by the first position orientation calculation means, and
if the real camera selected by the real camera selection means is the other one of the real camera for a right eye and the real camera for a left eye in the first position orientation calculation means, determines the position and the orientation of the right virtual camera, and the position and the orientation of the left virtual camera, by using the position orientation information indicating the relative positions and relative orientations of the other one of the real camera for a right eye and the real camera for a left eye, and the predetermined shooting target, which is calculated by the second position orientation calculation means.
18. The computer-readable storage medium according to claim 17, wherein the real camera selection means
switches the real camera to be selected, from the real camera for a left eye to the real camera for a right eye, in accordance with entrance of the position of the predetermined shooting target in the piece of real world image data outputted from the real camera for a left eye into a right end area of the piece of real world image data, and
switches the real camera to be selected, from the real camera for a right eye to the real camera for a left eye, in accordance with entrance of the position of the predetermined shooting target in the piece of real world image data outputted from the real camera for a right eye into a left end area of the piece of real world image data.
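Claim 18 recites switching the camera used for recognition when the shooting target enters an end area of the currently used image. A hedged sketch of such switching logic follows; the edge_ratio parameter and the string labels are illustrative assumptions, not terms of the claims.

    def select_real_camera(current, marker_x_in_left, marker_x_in_right,
                           image_width, edge_ratio=0.1):
        # Hysteresis-style switching: use the other camera's image for pose
        # recognition once the marker drifts into the end area where it may
        # soon leave the currently selected camera's field of view.
        right_end = image_width * (1.0 - edge_ratio)
        left_end = image_width * edge_ratio
        if current == "left" and marker_x_in_left is not None and marker_x_in_left > right_end:
            return "right"   # marker entered the right end area of the left image
        if current == "right" and marker_x_in_right is not None and marker_x_in_right < left_end:
            return "left"    # marker entered the left end area of the right image
        return current
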
19. The computer-readable storage medium according to claim 1, wherein the pieces of real world image data to be used by the first position orientation calculation means and the display control means are outputted in real time from the real camera for a right eye and the real camera for a left eye.
20. The computer-readable storage medium according to claim 1, wherein the computer is included in an information processing apparatus having the real camera for a right eye, the real camera for a left eye, and the stereoscopic display apparatus.
21. An image display apparatus which stereoscopically displays a real world image onto which a 3-dimensional virtual object is superimposed, on a screen of a stereoscopic display apparatus capable of providing a stereoscopic view, by using outputs from a real camera for a right eye and a real camera for a left eye, the image display apparatus comprising:
first position orientation calculation means for recognizing a predetermined shooting target in a piece of real world image data outputted from one of the real camera for a right eye and the real camera for a left eye, among pieces of real world image data respectively outputted from the real camera for a right eye and the real camera for a left eye, and calculating position orientation information indicating relative positions and relative orientations of the one of the real camera for a right eye and the real camera for a left eye, and the predetermined shooting target with respect to each other;
virtual camera setting means for determining the position and the orientation of a right virtual camera for generating an image for a right eye, and the position and the orientation of a left virtual camera for generating an image for a left eye, in a predetermined virtual space, by using the position orientation information calculated by the first position orientation calculation means;
right virtual space image generation means for generating a right virtual space image indicating the predetermined virtual space as it is looked at from the right virtual camera;
left virtual space image generation means for generating a left virtual space image indicating the predetermined virtual space as it is looked at from the left virtual camera; and
display control means for superimposing the right virtual space image onto the piece of real world image data outputted from the real camera for a right eye, and superimposing the left virtual space image onto the piece of real world image data outputted from the real camera for a left eye, to output images for a stereoscopic view to the stereoscopic display apparatus.
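Claims 21 and 22 set both virtual cameras from pose information recognised in one real image. The sketch below assumes parallel virtual cameras separated along the camera x axis by a known baseline (for example, one determined as in the earlier claims); this is one plausible reading offered for illustration, not the construction required by the claims.

    import numpy as np

    def set_virtual_cameras(marker_to_left_camera, baseline):
        # The left virtual camera adopts the recognised marker->camera transform
        # of the real left camera as its view matrix; the right virtual camera is
        # the same view shifted by `baseline` along the camera's x axis (moving
        # the camera right by `baseline` translates the scene left by `baseline`).
        left_view = marker_to_left_camera.copy()
        offset = np.eye(4)
        offset[0, 3] = -baseline
        right_view = offset @ left_view
        return left_view, right_view
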
22. An image display system which stereoscopically displays a real world image onto which a 3-dimensional virtual object is superimposed, on a screen of a stereoscopic display apparatus capable of providing a stereoscopic view, by using outputs from a real camera for a right eye and a real camera for a left eye, the image display system comprising:
a predetermined shooting target;
first position orientation calculation means for recognizing the predetermined shooting target in a piece of real world image data outputted from one of the real camera for a right eye and the real camera for a left eye, among pieces of real world image data respectively outputted from the real camera for a right eye and the real camera for a left eye, and calculating position orientation information indicating relative positions and relative orientations of the one of the real camera for a right eye and the real camera for a left eye, and the predetermined shooting target with respect to each other;
virtual camera setting means for determining the position and the orientation of a right virtual camera for generating an image for a right eye, and the position and the orientation of a left virtual camera for generating an image for a left eye, in a predetermined virtual space, by using the position orientation information calculated by the first position orientation calculation means;
right virtual space image generation means for generating a right virtual space image indicating the predetermined virtual space as it is looked at from the right virtual camera;
left virtual space image generation means for generating a left virtual space image indicating the predetermined virtual space as it is looked at from the left virtual camera; and
display control means for superimposing the right virtual space image onto the piece of real world image data outputted from the real camera for a right eye, and superimposing the left virtual space image onto the piece of real world image data outputted from the real camera for a left eye, to output images for a stereoscopic view to the stereoscopic display apparatus.
23. An image display method for stereoscopically displaying a real world image onto which a 3-dimensional virtual object is superimposed, on a screen of a stereoscopic display apparatus capable of providing a stereoscopic view, by using outputs from a real camera for a right eye and a real camera for a left eye, the image display method comprising:
a first position orientation calculation step of recognizing a predetermined shooting target in a piece of real world image data outputted from one of the real camera for a right eye and the real camera for a left eye, among pieces of real world image data respectively outputted from the real camera for a right eye and the real camera for a left eye, and calculating position orientation information indicating relative positions and relative orientations of the one of the real camera for a right eye and the real camera for a left eye, and the predetermined shooting target with respect to each other;
a virtual camera setting step of determining the position and the orientation of a right virtual camera for generating an image for a right eye, and the position and the orientation of a left virtual camera for generating an image for a left eye, in a predetermined virtual space, by using the position orientation information calculated in the first position orientation calculation step;
a right virtual space image generation step of generating a right virtual space image indicating the predetermined virtual space as it is looked at from the right virtual camera;
a left virtual space image generation step of generating a left virtual space image indicating the predetermined virtual space as it is looked at from the left virtual camera; and
a display control step of superimposing the right virtual space image onto the piece of real world image data outputted from the real camera for a right eye, and superimposing the left virtual space image onto the piece of real world image data outputted from the real camera for a left eye, to output images for a stereoscopic view to the stereoscopic display apparatus.
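The display control step of claim 23 superimposes each eye's virtual space image onto the corresponding real camera frame before output to the stereoscopic display. A minimal compositing sketch follows; representing the rendered virtual image with an alpha mask is an assumption made for illustration.

    import numpy as np

    def superimpose(real_image, virtual_image, virtual_alpha):
        # Composite a rendered virtual-space image over a real camera frame.
        # real_image / virtual_image: HxWx3 uint8; virtual_alpha: HxW floats in
        # [0, 1], zero wherever no virtual object was drawn.
        a = virtual_alpha[..., None]
        out = a * virtual_image.astype(np.float32) + (1.0 - a) * real_image.astype(np.float32)
        return out.astype(np.uint8)

    def compose_stereo_frame(real_left, real_right, virt_left, virt_right,
                             alpha_left, alpha_right):
        # One composited image per eye; the stereoscopic display then presents
        # the pair for a stereoscopic view.
        return (superimpose(real_left, virt_left, alpha_left),
                superimpose(real_right, virt_right, alpha_right))
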
US13/158,824 2010-06-11 2011-06-13 Computer-Readable Storage Medium, Image Display Apparatus, Image Display System, and Image Display Method Abandoned US20110304702A1 (en)

Applications Claiming Priority (6)

Application Number Priority Date Filing Date Title
JP2010-134575 2010-06-11
JP2010134575 2010-06-11
JP2010-134545 2010-06-11
JP2010134545 2010-06-11
JP2010135507 2010-06-14
JP2010-135507 2010-06-14

Publications (1)

Publication Number Publication Date
US20110304702A1 true US20110304702A1 (en) 2011-12-15

Family ID: 44358250

Family Applications (3)

Application Number Title Priority Date Filing Date
US13/158,736 Active 2031-11-28 US8780183B2 (en) 2010-06-11 2011-06-13 Computer-readable storage medium, image display apparatus, image display system, and image display method
US13/158,939 Active 2032-03-09 US10015473B2 (en) 2010-06-11 2011-06-13 Computer-readable storage medium, image display apparatus, image display system, and image display method
US13/158,824 Abandoned US20110304702A1 (en) 2010-06-11 2011-06-13 Computer-Readable Storage Medium, Image Display Apparatus, Image Display System, and Image Display Method

Family Applications Before (2)

Application Number Title Priority Date Filing Date
US13/158,736 Active 2031-11-28 US8780183B2 (en) 2010-06-11 2011-06-13 Computer-readable storage medium, image display apparatus, image display system, and image display method
US13/158,939 Active 2032-03-09 US10015473B2 (en) 2010-06-11 2011-06-13 Computer-readable storage medium, image display apparatus, image display system, and image display method

Country Status (3)

Country Link
US (3) US8780183B2 (en)
EP (3) EP2395769B1 (en)
CN (2) CN102281455B (en)

Families Citing this family (20)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP4869430B1 (en) * 2010-09-24 2012-02-08 任天堂株式会社 Image processing program, image processing apparatus, image processing system, and image processing method
US9563265B2 (en) * 2012-01-12 2017-02-07 Qualcomm Incorporated Augmented reality with sound and geometric analysis
JP6065370B2 (en) * 2012-02-03 2017-01-25 ソニー株式会社 Information processing apparatus, information processing method, and program
JP5950701B2 (en) * 2012-06-11 2016-07-13 任天堂株式会社 Image display system, puzzle game system, image display method, puzzle game method, image display device, puzzle game device, image display program, and puzzle game program
JP5808502B2 (en) 2012-11-21 2015-11-10 三菱電機株式会社 Image generation device
GB2510811A (en) * 2012-12-18 2014-08-20 Optricks Media Ltd Augmented reality systems
GB2527973B (en) * 2013-05-30 2020-06-10 Anthony Smith Charles HUD object design and display method
CN110234000B (en) * 2013-06-17 2021-07-13 瑞尔D斯帕克有限责任公司 Teleconferencing method and telecommunication system
JP6171671B2 (en) * 2013-07-24 2017-08-02 富士通株式会社 Information processing apparatus, position specifying method, and position specifying program
US10146331B2 (en) * 2014-11-28 2018-12-04 Ricoh Company, Ltd. Information processing system for transforming coordinates of a position designated by a pointer in a virtual image to world coordinates, information processing apparatus, and method of transforming coordinates
US20170061700A1 (en) * 2015-02-13 2017-03-02 Julian Michael Urbach Intercommunication between a head mounted display and a real world object
KR20190025534A (en) * 2016-07-15 2019-03-11 게이트박스 가부시키가이샤 Stereoscopic image display device
FR3058549B1 (en) 2016-11-04 2019-01-25 Stereolabs METHOD FOR DISPLAYING A MIXED REALITY IMAGE
US10424121B1 (en) * 2016-11-06 2019-09-24 Oded Melinek Generated offering exposure
CN110119195A (en) * 2018-02-06 2019-08-13 广东虚拟现实科技有限公司 Visual interactive device and marker
KR102530278B1 (en) 2018-10-18 2023-05-10 삼성디스플레이 주식회사 Electronic device having display module and image display method
US10997630B2 (en) * 2018-12-20 2021-05-04 Rovi Guides, Inc. Systems and methods for inserting contextual advertisements into a virtual environment
KR102198851B1 (en) 2019-11-12 2021-01-05 네이버랩스 주식회사 Method for generating three dimensional model data of an object
US11417020B2 (en) * 2020-04-07 2022-08-16 Zebra Technologies Corporation Detection of calibration errors
JP7257370B2 (en) * 2020-11-18 2023-04-13 任天堂株式会社 Information processing program, information processing device, information processing system, and information processing method

Family Cites Families (143)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JPH01160365A (en) 1987-12-16 1989-06-23 Hitachi Ltd Power converter
US5184830A (en) 1989-01-10 1993-02-09 Nintendo Company Limited Compact hand-held video game system
JP2627208B2 (en) 1990-06-14 1997-07-02 株式会社セガ・エンタープライゼス Game device and television tuner cartridge for game device
JP2758274B2 (en) 1991-02-15 1998-05-28 株式会社クボタ Engine lubricating oil pressure abnormality detection device
JP2799126B2 (en) 1993-03-26 1998-09-17 株式会社ナムコ Video game device and game input device
JPH06339155A (en) 1993-05-28 1994-12-06 Sharp Corp Three-dimensional image pickup system
KR0180026B1 (en) * 1994-05-13 1999-05-01 마츠모도 세이야 Stereoscopio display using movement direction detection
US6384859B1 (en) * 1995-03-29 2002-05-07 Sanyo Electric Co., Ltd. Methods for creating an image for a three-dimensional display, for calculating depth information and for image processing using the depth information
CN100524015C (en) 1995-06-22 2009-08-05 3Dv系统有限公司 Method and apparatus for generating range subject distance image
US6005607A (en) 1995-06-29 1999-12-21 Matsushita Electric Industrial Co., Ltd. Stereoscopic computer graphics image generating apparatus and stereoscopic TV apparatus
US5964830A (en) 1995-08-22 1999-10-12 Durrett; Charles M. User portal device for the world wide web to communicate with a website server
US5959663A (en) 1995-10-19 1999-09-28 Sony Corporation Stereoscopic image generation method and apparatus thereof
WO1998002223A1 (en) 1996-07-11 1998-01-22 Sega Enterprises, Ltd. Voice recognizer, voice recognizing method and game machine using them
JP3558104B2 (en) 1996-08-05 2004-08-25 ソニー株式会社 Three-dimensional virtual object display apparatus and method
US20020120925A1 (en) 2000-03-28 2002-08-29 Logan James D. Audio and video program recording, editing and playback systems using metadata
JP3962115B2 (en) * 1996-11-08 2007-08-22 オリンパス株式会社 Video observation device
US6342900B1 (en) 1996-12-06 2002-01-29 Nikon Corporation Information processing apparatus
US6042478A (en) 1997-02-10 2000-03-28 Tiger Electronics, Ltd. Hand held video game
JP3771989B2 (en) 1997-03-24 2006-05-10 オリンパス株式会社 Image / audio communication system and videophone transmission / reception method
US6057833A (en) 1997-04-07 2000-05-02 Shoreline Studios Method and apparatus for providing real time enhancements and animations over a video image
US6252624B1 (en) 1997-07-18 2001-06-26 Idemitsu Kosan Co., Ltd. Three dimensional display
US20020113865A1 (en) 1997-09-02 2002-08-22 Kotaro Yano Image processing method and apparatus
US8202094B2 (en) 1998-02-18 2012-06-19 Radmila Solutions, L.L.C. System and method for training users with audible answers to spoken questions
CN1244762A (en) * 1998-08-10 2000-02-16 袁灿伦 Realizing method and device stereo video image
JP2000069404A (en) 1998-08-25 2000-03-03 Konami Co Ltd Image print creating device
EP0984385B1 (en) 1998-08-31 2013-02-20 Sony Corporation Two-dimensional code recognition processing
JP3786167B2 (en) 1998-08-31 2006-06-14 ソニー株式会社 Two-dimensional code recognition processing method, two-dimensional code recognition processing apparatus, and two-dimensional code recognition processing program storage medium
US6504710B2 (en) * 1998-11-27 2003-01-07 Xplore Technologies Corp. Method of interconnecting of a hand-held auxiliary unit, a portable computer and a peripheral device
US6743104B1 (en) 1999-11-18 2004-06-01 Nintendo Co., Ltd. Portable game machine
JP2001165144A (en) 1999-12-07 2001-06-19 Fujitsu Ltd Folding cellphone
JP2001251399A (en) 2000-03-03 2001-09-14 Sanyo Electric Co Ltd Foldable mobile phone
KR100419927B1 (en) 2000-03-03 2004-02-25 산요덴키가부시키가이샤 Folder type portable phone
JP3392801B2 (en) 2000-03-03 2003-03-31 三洋電機株式会社 Foldable mobile phone
JP3459606B2 (en) 2000-03-03 2003-10-20 三洋電機株式会社 Foldable mobile phone
US6474819B2 (en) 2000-03-21 2002-11-05 Texas Instruments Incorporated Combination overhead projector and electronic display device
US6810463B2 (en) 2000-05-24 2004-10-26 Nintendo Co., Ltd. Gaming machine that is usable with different game cartridge types
US20010050756A1 (en) * 2000-06-07 2001-12-13 Lenny Lipton Software generated color organ for stereoscopic and planar applications
US7134960B1 (en) 2000-08-23 2006-11-14 Nintendo Co., Ltd. External interfaces for a 3D graphics system
JP3625184B2 (en) 2000-09-25 2005-03-02 コナミ株式会社 3D image processing method and apparatus for game, readable recording medium recording game 3D image processing program, and video game apparatus
JP2002159019A (en) 2000-11-16 2002-05-31 Canon Inc Display control device, imaging position estimating device, display system, image pickup system, image positioning method, imaging position estimating method, and recording medium recorded with process program
US6820056B1 (en) 2000-11-21 2004-11-16 International Business Machines Corporation Recognizing non-verbal sound commands in an interactive computer controlled speech word recognition display system
JP2002160365A (en) 2000-11-27 2002-06-04 Seiko Instruments Inc Head chip and its manufacturing method
JP3398796B2 (en) 2001-01-31 2003-04-21 飛島建設株式会社 Image system for 3D survey support using mixed reality
US6997387B1 (en) * 2001-03-28 2006-02-14 The Code Corporation Apparatus and method for calibration of projected target point within an image
US7371163B1 (en) 2001-05-10 2008-05-13 Best Robert M 3D portable game system
IL159537A0 (en) 2001-06-28 2004-06-01 Omnivee Inc Method and apparatus for control and processing of video images
US20040077404A1 (en) 2002-10-17 2004-04-22 Schlottmann Gregory A. Transparent objects on a gaming machine
US7901289B2 (en) 2001-08-09 2011-03-08 Igt Transparent objects on a gaming machine
US20030062675A1 (en) 2001-09-28 2003-04-03 Canon Kabushiki Kaisha Image experiencing system and information processing method
JP3584230B2 (en) 2001-09-28 2004-11-04 キヤノン株式会社 Video experience system, information processing method and program
JP2004007214A (en) 2002-05-31 2004-01-08 Canon Inc Image pickup unit
EP1531606A1 (en) 2002-06-07 2005-05-18 Matsushita Electric Industrial Co., Ltd. Collapsible mobile telephone
US20060203085A1 (en) * 2002-11-28 2006-09-14 Seijiro Tomita There dimensional image signal producing circuit and three-dimensional image display apparatus
JP2004310686A (en) 2003-04-10 2004-11-04 Canon Inc Image processing method and device
JP4378118B2 (en) 2003-06-27 2009-12-02 学校法人早稲田大学 3D image presentation device
GB2407635B (en) * 2003-10-31 2006-07-12 Hewlett Packard Development Co Improvements in and relating to camera control
JP4338500B2 (en) 2003-11-14 2009-10-07 富士フイルム株式会社 Imaging device
JP4069855B2 (en) * 2003-11-27 2008-04-02 ソニー株式会社 Image processing apparatus and method
JP2005165776A (en) 2003-12-03 2005-06-23 Canon Inc Image processing method and image processor
JP2005204754A (en) 2004-01-20 2005-08-04 Nintendo Co Ltd Game system employing touch panel input
JP3904562B2 (en) 2004-02-18 2007-04-11 株式会社ソニー・コンピュータエンタテインメント Image display system, recording medium, and program
JP2005250950A (en) 2004-03-05 2005-09-15 Nippon Telegr & Teleph Corp <Ntt> Marker presentation portable terminal, expanded sense of reality system, and its operation method
JP2005286714A (en) 2004-03-30 2005-10-13 Nec Corp Mobile phone
JP4537104B2 (en) * 2004-03-31 2010-09-01 キヤノン株式会社 Marker detection method, marker detection device, position and orientation estimation method, and mixed reality space presentation method
JP2005353047A (en) 2004-05-13 2005-12-22 Sanyo Electric Co Ltd Three-dimensional image processing method and three-dimensional image processor
US7671916B2 (en) 2004-06-04 2010-03-02 Electronic Arts Inc. Motion sensor using dual camera inputs
JP2006024175A (en) 2004-06-11 2006-01-26 Doshisha Mobile terminal and adapter
US20060244757A1 (en) 2004-07-26 2006-11-02 The Board Of Trustees Of The University Of Illinois Methods and systems for image modification
US8547401B2 (en) 2004-08-19 2013-10-01 Sony Computer Entertainment Inc. Portable augmented reality device and method
JP2006060516A (en) 2004-08-20 2006-03-02 Sharp Corp Foldable cellular phone
JP3841806B2 (en) 2004-09-01 2006-11-08 株式会社ソニー・コンピュータエンタテインメント Image processing apparatus and image processing method
WO2006025137A1 (en) 2004-09-01 2006-03-09 Sony Computer Entertainment Inc. Image processor, game machine, and image processing method
JP4587166B2 (en) 2004-09-14 2010-11-24 キヤノン株式会社 Moving body tracking system, photographing apparatus, and photographing method
JP4302024B2 (en) 2004-09-17 2009-07-22 株式会社ソニー・コンピュータエンタテインメント Button structure, portable electronic device
JP4708752B2 (en) * 2004-09-28 2011-06-22 キヤノン株式会社 Information processing method and apparatus
EP1814101A1 (en) 2004-11-19 2007-08-01 Daem Interactive, Sl Personal device with image-acquisition functions for the application of augmented reality resources and corresponding method
EP3471020B1 (en) 2005-01-27 2021-03-03 Leica Biosystems Imaging Inc. Systems and methods for viewing three dimensional virtual slides
US8009871B2 (en) 2005-02-08 2011-08-30 Microsoft Corporation Method and system to segment depth images and to detect shapes in three-dimensionally acquired data
JP4406615B2 (en) 2005-02-23 2010-02-03 任天堂株式会社 Command processing apparatus and command processing program
JP2006262980A (en) 2005-03-22 2006-10-05 Olympus Corp Information terminal and virtual pet display method
JP2006271663A (en) 2005-03-29 2006-10-12 Namco Bandai Games Inc Program, information storage medium, and image pickup and display device
JP4555722B2 (en) 2005-04-13 2010-10-06 株式会社 日立ディスプレイズ 3D image generator
BRPI0520196A2 (en) * 2005-04-25 2009-04-22 Yappa Corp 3d image generation and display system
KR100662845B1 (en) 2005-06-30 2007-01-02 삼성전자주식회사 Memory card slot door and digital mobile apparatus having the same
KR100722229B1 (en) 2005-12-02 2007-05-29 한국전자통신연구원 Apparatus and method for immediately creating and controlling virtual reality interaction human model for user centric interface
US8189038B2 (en) 2005-12-21 2012-05-29 International Business Machines Corporation Stereographic projection apparatus with passive eyewear utilizing a continuously variable polarizing element
JP5048249B2 (en) 2006-01-27 2012-10-17 任天堂株式会社 GAME DEVICE AND GAME PROGRAM
JP4798358B2 (en) 2006-03-14 2011-10-19 セイコーエプソン株式会社 Image processing system, display device, and image processing method
US20070236514A1 (en) 2006-03-29 2007-10-11 Bracco Imaging Spa Methods and Apparatuses for Stereoscopic Image Guided Surgical Navigation
JP4847195B2 (en) 2006-04-21 2011-12-28 キヤノン株式会社 How to get color information from an image
US8766983B2 (en) 2006-05-07 2014-07-01 Sony Computer Entertainment Inc. Methods and systems for processing an interchange of real time effects during video communication
CN101479765B (en) 2006-06-23 2012-05-23 图象公司 Methods and systems for converting 2d motion pictures for stereoscopic 3d exhibition
JP4804256B2 (en) 2006-07-27 2011-11-02 キヤノン株式会社 Information processing method
JP4757142B2 (en) 2006-08-10 2011-08-24 キヤノン株式会社 Imaging environment calibration method and information processing apparatus
US20080071559A1 (en) 2006-09-19 2008-03-20 Juha Arrasvuori Augmented reality assisted shopping
JP4796926B2 (en) 2006-09-21 2011-10-19 株式会社ソニー・コンピュータエンタテインメント Electronic device casing and electronic device
JP4896230B2 (en) 2006-11-17 2012-03-14 トムソン ライセンシング System and method of object model fitting and registration for transforming from 2D to 3D
JP2008146109A (en) 2006-12-05 2008-06-26 Canon Inc Image processing method and image processor
US8144148B2 (en) 2007-02-08 2012-03-27 Edge 3 Technologies Llc Method and system for vision-based interaction in a virtual environment
GB0703974D0 (en) 2007-03-01 2007-04-11 Sony Comp Entertainment Europe Entertainment device
US8305428B2 (en) 2007-04-18 2012-11-06 Inlife-Handnet Co., Ltd Stereo video shooting and viewing device
JP2008277903A (en) 2007-04-25 2008-11-13 Sony Corp Imaging apparatus and object to be focused determination method
EP2009868B1 (en) 2007-06-29 2016-09-07 Alcatel Lucent Method and system for improving the appearance of a person on the RTP stream coming from a media terminal
JP4947593B2 (en) 2007-07-31 2012-06-06 Kddi株式会社 Apparatus and program for generating free viewpoint image by local region segmentation
JP4260215B1 (en) 2007-08-29 2009-04-30 任天堂株式会社 Imaging device
US8177441B2 (en) 2007-08-29 2012-05-15 Nintendo Co., Ltd. Imaging apparatus
EP2039398B1 (en) 2007-08-29 2016-05-04 Nintendo Co., Ltd. Imaging apparatus
US8917985B2 (en) 2007-08-29 2014-12-23 Nintendo Co., Ltd. Imaging apparatus
DE102007044536A1 (en) 2007-09-18 2009-03-19 Bayerische Motoren Werke Aktiengesellschaft Device for monitoring the environment of a motor vehicle
JP2009151608A (en) 2007-12-21 2009-07-09 Nhn Corp Image formation method and image display method
JP2009205556A (en) 2008-02-28 2009-09-10 Panasonic Corp User interface device
CN101236653B (en) * 2008-03-03 2010-09-29 华为技术有限公司 Image correction method and system
JP2010134575A (en) 2008-12-03 2010-06-17 Sony Corp Apparatus, method, and program for information processing and recording medium
US8477139B2 (en) 2008-06-09 2013-07-02 Apple Inc. Touch screen device, method, and graphical user interface for manipulating three-dimensional virtual objects
US8237807B2 (en) 2008-07-24 2012-08-07 Apple Inc. Image capturing device with touch screen for adjusting camera settings
EP2157545A1 (en) * 2008-08-19 2010-02-24 Sony Computer Entertainment Europe Limited Entertainment device, system and method
US20100048290A1 (en) 2008-08-19 2010-02-25 Sony Computer Entertainment Europe Ltd. Image combining method, system and apparatus
WO2010038296A1 (en) 2008-10-01 2010-04-08 任天堂株式会社 Information processing device, information processing system, boot program and storage medium storing same
JP5358152B2 (en) 2008-10-03 2013-12-04 任天堂株式会社 Handheld information processing device
JP5156571B2 (en) 2008-10-10 2013-03-06 キヤノン株式会社 Image processing apparatus and image processing method
US9480919B2 (en) 2008-10-24 2016-11-01 Excalibur Ip, Llc Reconfiguring reality using a reality overlay device
KR20100053349A (en) * 2008-11-12 2010-05-20 엘지전자 주식회사 Touch module, fabaricating method thereof, and portable terminal having the same
JP5168580B2 (en) * 2008-12-01 2013-03-21 富士通株式会社 Driving simulation device, wide-angle camera image simulation device, and image deformation synthesis device
JP4668315B2 (en) 2008-12-02 2011-04-13 フェリカネットワークス株式会社 Information processing apparatus, communication control method, and program
JP5269568B2 (en) 2008-12-03 2013-08-21 アドヴァンスド・ディスプレイ・プロセス・エンジニアリング・コーポレーション・リミテッド Substrate processing apparatus and method for opening and closing process space inside substrate processing apparatus
JP4653212B2 (en) 2008-12-25 2011-03-16 富士通株式会社 Folding mobile phone
JP4834116B2 (en) 2009-01-22 2011-12-14 株式会社コナミデジタルエンタテインメント Augmented reality display device, augmented reality display method, and program
JP5320133B2 (en) 2009-03-31 2013-10-23 株式会社エヌ・ティ・ティ・ドコモ Information presentation system, information presentation server, communication terminal, and information presentation method
US20100257252A1 (en) 2009-04-01 2010-10-07 Microsoft Corporation Augmented Reality Cloud Computing
JP5247590B2 (en) 2009-05-21 2013-07-24 キヤノン株式会社 Information processing apparatus and calibration processing method
JP5421655B2 (en) 2009-05-28 2014-02-19 任天堂株式会社 GAME PROGRAM AND GAME DEVICE
US8715031B2 (en) 2009-08-06 2014-05-06 Peter Sui Lun Fong Interactive device with sound-based action synchronization
US8711204B2 (en) 2009-11-11 2014-04-29 Disney Enterprises, Inc. Stereoscopic editing for video production, post-production and display adaptation
US8922625B2 (en) 2009-11-19 2014-12-30 Lg Electronics Inc. Mobile terminal and controlling method thereof
US8730309B2 (en) 2010-02-23 2014-05-20 Microsoft Corporation Projectors and depth cameras for deviceless augmented reality and interaction
JP5649846B2 (en) * 2010-03-31 2015-01-07 株式会社バンダイナムコゲームス Program and image generation system
US8549126B2 (en) 2010-04-09 2013-10-01 Sony Computer Entertainment Inc. Application level broadcast in peer overlay network
US9693039B2 (en) 2010-05-27 2017-06-27 Nintendo Co., Ltd. Hand-held electronic device
US8384770B2 (en) 2010-06-02 2013-02-26 Nintendo Co., Ltd. Image display system, image display apparatus, and image display method
JP5602618B2 (en) 2010-06-10 2014-10-08 任天堂株式会社 Image processing program, image processing apparatus, image processing system, and image processing method
JP5186715B2 (en) 2010-06-14 2013-04-24 任天堂株式会社 Display control program, display control device, display control method, and display control system
KR101674956B1 (en) 2010-07-12 2016-11-10 엘지전자 주식회사 MOBILE TERMINAL AND METHOD FOR CONTROLLING A THREE DIMENSION IMAGE in thereof
JP4869430B1 (en) * 2010-09-24 2012-02-08 任天堂株式会社 Image processing program, image processing apparatus, image processing system, and image processing method

Patent Citations (19)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20050089212A1 (en) * 2002-03-27 2005-04-28 Sanyo Electric Co., Ltd. Method and apparatus for processing three-dimensional images
US20050175257A1 (en) * 2002-05-21 2005-08-11 Yoshihiko Kuroki Information processing apparatus, information processing system, and dialogist displaying method
US7321682B2 (en) * 2002-11-12 2008-01-22 Namco Bandai Games, Inc. Image generation system, image generation method, program, and information storage medium
US20050018045A1 (en) * 2003-03-14 2005-01-27 Thomas Graham Alexander Video processing
US20050270309A1 (en) * 2004-05-07 2005-12-08 Namco Ltd. Program product, image generation method and image generation system
US20100085423A1 (en) * 2004-09-30 2010-04-08 Eric Belk Lange Stereoscopic imaging
US7532224B2 (en) * 2005-04-08 2009-05-12 Canon Kabushiki Kaisha Information processing method and apparatus
US20070038944A1 (en) * 2005-05-03 2007-02-15 Seac02 S.R.I. Augmented reality system with real marker object identification
US20080266386A1 (en) * 2007-04-25 2008-10-30 Canon Kabushiki Kaisha System
JP2009025918A (en) * 2007-07-17 2009-02-05 Canon Inc Image processor and image processing method
US20090262108A1 (en) * 2008-01-18 2009-10-22 Sony Corporation Streaming geometery for use in displaying and editing 3d imagery
US20090262184A1 (en) * 2008-01-18 2009-10-22 Sony Corporation Method and apparatus for displaying and editing 3d imagery
WO2009127701A1 (en) * 2008-04-16 2009-10-22 Virtual Proteins B.V. Interactive virtual reality image generating system
US20110029903A1 (en) * 2008-04-16 2011-02-03 Virtual Proteins B.V. Interactive virtual reality image generating system
US20100026712A1 (en) * 2008-07-31 2010-02-04 Stmicroelectronics S.R.L. Method and system for video rendering, computer program product therefor
US20100142801A1 (en) * 2008-12-09 2010-06-10 Microsoft Corporation Stereo Movie Editing
US9025007B1 (en) * 2009-04-28 2015-05-05 Lucasfilm Entertainment Company Ltd. Configuring stereo cameras
US20110242286A1 (en) * 2010-03-31 2011-10-06 Vincent Pace Stereoscopic Camera With Automatic Obstruction Removal
US20120077582A1 (en) * 2010-09-24 2012-03-29 Hal Laboratory Inc. Computer-Readable Storage Medium Having Program Stored Therein, Apparatus, System, and Method, for Performing Game Processing

Non-Patent Citations (2)

* Cited by examiner, † Cited by third party
Title
Japanese Published Patent Application JP-08-047001A, published on February 16, 1996, referred to as Ishi throughout *
Matsui JP2009-025918A, referred to as Matsui throughout *

Cited By (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US10506218B2 (en) 2010-03-12 2019-12-10 Nintendo Co., Ltd. Computer-readable storage medium having stored therein display control program, display control apparatus, display control system, and display control method
US10764565B2 (en) 2010-03-12 2020-09-01 Nintendo Co., Ltd. Computer-readable storage medium having stored therein display control program, display control apparatus, display control system, and display control method
US9268410B2 (en) 2012-02-10 2016-02-23 Sony Corporation Image processing device, image processing method, and program
US20130215109A1 (en) * 2012-02-22 2013-08-22 Silka Miesnieks Designating Real World Locations for Virtual World Control
US11205305B2 (en) 2014-09-22 2021-12-21 Samsung Electronics Company, Ltd. Presentation of three-dimensional video
US10317221B2 (en) 2016-08-05 2019-06-11 Semiconductor Energy Laboratory Co., Ltd. Data processing system
US10216999B2 (en) 2016-09-16 2019-02-26 Semiconductor Energy Laboratory Co., Ltd. Display system, electronic device, and display method

Also Published As

Publication number Publication date
EP2395767A1 (en) 2011-12-14
CN102281455B (en) 2015-12-09
CN102274633B (en) 2015-06-03
EP2395769A1 (en) 2011-12-14
CN102274633A (en) 2011-12-14
EP2395768B1 (en) 2015-02-25
US10015473B2 (en) 2018-07-03
EP2395768A1 (en) 2011-12-14
EP2395767B1 (en) 2014-11-12
CN102281455A (en) 2011-12-14
US8780183B2 (en) 2014-07-15
US20110304701A1 (en) 2011-12-15
US20110304703A1 (en) 2011-12-15
EP2395769B1 (en) 2015-03-04

Similar Documents

Publication Publication Date Title
US8780183B2 (en) Computer-readable storage medium, image display apparatus, image display system, and image display method
US8970678B2 (en) Computer-readable storage medium, image display apparatus, system, and method
US9278281B2 (en) Computer-readable storage medium, information processing apparatus, information processing system, and information processing method
US8902298B2 (en) Storage medium having stored therein stereoscopic image display program, stereoscopic image display device, stereoscopic image display system, and stereoscopic image display method
US8854356B2 (en) Storage medium having stored therein image processing program, image processing apparatus, image processing system, and image processing method
US9530249B2 (en) Computer-readable storage medium having image processing program stored therein, image processing apparatus, image processing system, and image processing method
US8698902B2 (en) Computer-readable storage medium having image processing program stored therein, image processing apparatus, image processing system, and image processing method
US8963955B2 (en) Information processing program, information processing apparatus, information processing system, and information processing method
US20120293549A1 (en) Computer-readable storage medium having information processing program stored therein, information processing apparatus, information processing system, and information processing method
US8749571B2 (en) Storage medium having information processing program stored therein, information processing apparatus, information processing system, and information processing method
US9142058B2 (en) Storage medium, information processing apparatus, information processing method and information processing system
US8872891B2 (en) Storage medium, information processing apparatus, information processing method and information processing system
US9113144B2 (en) Image processing system, storage medium, image processing method, and image processing apparatus for correcting the degree of disparity of displayed objects
US20120306855A1 (en) Storage medium having stored therein display control program, display control apparatus, display control method, and display control system
JP5739673B2 (en) Image display program, apparatus, system and method
JP5739670B2 (en) Image display program, apparatus, system and method
JP5739672B2 (en) Image display program, apparatus, system and method

Legal Events

Date Code Title Description
AS Assignment

Owner name: NINTENDO CO., LTD., JAPAN

Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNOR:ITO, YUICHIRO;REEL/FRAME:026433/0243

Effective date: 20110525

STPP Information on status: patent application and granting procedure in general

Free format text: NON FINAL ACTION MAILED

STCB Information on status: application discontinuation

Free format text: ABANDONED -- FAILURE TO RESPOND TO AN OFFICE ACTION