WO2012077106A1 - System and method for direct retinal display with integrated 3d alignment and a virtual input device - Google Patents


Info

Publication number: WO2012077106A1
Authority: WIPO (PCT)
Application number: PCT/IL2011/000929
Other languages: French (fr)
Prior art keywords: user, image, display system, retinal, image data
Inventors: Amir Alon, Andrei Sharf, Yuval Sharon
Original Assignee: Amir Alon, Andrei Sharf, Yuval Sharon
Application filed by Amir Alon, Andrei Sharf, Yuval Sharon
Publication of WO2012077106A1


Classifications

    • G — PHYSICS
    • G02 — OPTICS
    • G02B — OPTICAL ELEMENTS, SYSTEMS OR APPARATUS
    • G02B30/00 — Optical systems or apparatus for producing three-dimensional [3D] effects, e.g. stereoscopic images
    • G02B30/20 — Optical systems or apparatus for producing three-dimensional [3D] effects, e.g. stereoscopic images, by providing first and second parallax images to an observer's left and right eyes
    • G02B30/34 — Stereoscopes providing a stereoscopic pair of separated images corresponding to parallactically displaced views of the same object, e.g. 3D slide viewers


Abstract

A direct-retinal-display system for providing a user with image data generated by a media generation device. The direct-retinal-display system includes a direct-retinal-eyewear device, having one or two lenses, each coupled to operate with a respective eye of the user, and a frame for positioning the lenses at a preconfigured distance from the user's eyes. The direct-retinal-display system includes a remote-projecting apparatus, including one or two image projectors coupled to respectively operate with the lenses, wherein each of the image projectors is facilitated to project a light beam, carrying the image data, onto at least a portion of a respective lens of the direct-retinal-eyewear device. The lenses are preconfigured to focus the image data onto the retina of the user's eyes, respectively. The direct-retinal-eyewear device and the remote-projecting apparatus are physically separated.

Description

SYSTEM AND METHOD FOR DIRECT RETINAL DISPLAY WITH INTEGRATED 3D ALIGNMENT AND A VIRTUAL INPUT DEVICE
CROSS REFERENCE TO RELATED APPLICATIONS
This application claims the benefit under 35 USC 119(e) from US provisional application 61/422,143, filed on December 11th, 2010, US provisional application 61/478,523, filed on April 24th, 2011, US provisional application 61/481,206, filed on May 1st, 2011, and US provisional application 61/538,869, filed on September 25th, 2011, the disclosures of which are incorporated by reference for all purposes as if fully set forth herein.
FIELD OF THE INVENTION
This invention relates to the field of direct retinal display systems and, more particularly, the present invention relates to a direct retinal display system having a direct-retinal-eyewear device and a remote-projecting apparatus that are physically separated.
BACKGROUND OF THE INVENTION
A large part of a computer's weight and size is consumed by I/O devices such as screens and keyboards - comfortable to interact with but heavy to carry.
Communication systems such as mobile phones, on the other hand, have much smaller I/O devices - easy to carry but hard to interact with.
Various interaction devices such as head mounted displays and touch screen keyboards have been suggested, but so far no I/O solution has emerged that is small, lightweight, and easy to use.
Direct retinal projection has many advantages in terms of a large field of view and low projection energy. Prior art head-mounted display systems require physically aligning a narrow projected beam (usually a few millimetres) with the user's pupil, resulting in heavy and cumbersome head mounted devices.
SUMMARY OF THE INVENTION
The intention of the present invention is to provide a direct retinal display system having a direct-retinal-eyewear device and a remote-projecting apparatus that are physically separated. According to the teachings of the present invention, there is provided a direct-retinal-display system for providing a user with image data generated by a media generation device. The direct-retinal-display system includes a direct-retinal-eyewear device, including one or two lenses coupled to operate with one or both eyes of the user, respectively, and a frame for positioning the one or two lenses at a preconfigured distance from the one or both eyes of the user, respectively.
The direct-retinal-display system includes a remote-projecting apparatus, including one or two image projectors coupled to respectively operate with the one or two lenses, wherein each of the image projectors is facilitated to project a light beam, carrying the image data generated by the media generation device, onto at least a portion of a respective lens of the direct-retinal-eyewear device. The one or two lenses are preconfigured to focus the image data onto the retina of the one or both eyes of the user, respectively. The direct-retinal-eyewear device and the remote-projecting apparatus are physically separated.
Preferably, the display system further includes an alignment element facilitating the user to spatially align the one or both eyes of the user with the remote-projecting apparatus.
Optionally, the alignment element is a light spot formed on a diffractive element by the light beam projected by a respective image projector, wherein the diffractive element is preconfigured to divert a portion of the light intensity of the light beam sideways, with respect to the optical axis of the respective image projector. The balance of the light intensity of the light beam proceeds substantially coaxially with the optical axis of the respective image projector, wherein the balance of the light intensity of the light beam is substantially invisible to the respective naked eye of the one or both eyes of the user, when the respective eye is not substantially coaxial with respect to the optical axis of the respective image projector.
Optionally, the alignment element is a preconfigured mark disposed at a preconfigured location such that when the user observes preconfigured mark features, the image projectors are spatially aligned with the one or both eyes of the user, respectively.
Preferably, the display system further includes at least one camera, for acquiring a sequence of image frames of a hand of the user, and a processor. The processor is facilitated to analyze movements of the hand of the user from the acquired sequence of image frames, thereby creating hand-motion data. The processor is facilitated to overlay, onto the image generated by the media generation device, an image of a keyboard. The processor is further facilitated to overlay a cursor, representing the acquired image of the hand of the user, onto the keyboard image, wherein the position of the cursor is directly correlated with the hand-motion data.
Optionally, the keyboard image is a 3D image.
Optionally, the cursor is the hand image.
Optionally, the hand-motion data is determined to be an input action selected from the group including a stroke of a keyboard key, a mouse right click, a mouse left click, a mouse click-and-drag, and a screen scroll. Optionally, the input action is emulated on the virtual input device.
Optionally, the hand-motion data is determined to be an input action selected from the group including pan, zoom and rotate, wherein the input action is emulated on the virtual input device.
Preferably, the image data projected by a first projector is different from the image data projected by a second projector, wherein the difference is preconfigured to cause the user to apprehend a 3D image.
Optionally, the projectors can be spatially adjusted laterally, vertically and pivotally.
Optionally, the one or two lenses are wavelength selective holographic lenses. An aspect of the present invention is to provide a method for providing a user with image data generated by a media generation device. The method includes the steps of providing a display system according to the present invention, projecting the image data by the one or two image projectors in a preconfigured direction, wearing the direct-retinal-eyewear device by the user, coarsely moving the head by the user until at least a portion of the projected image data is observed by the user, and finely moving the head by the user until the projected image data is fully observed by the user.
The light beam carrying the image data is projected onto at least a portion of a respective lens, and the one or two lenses focus the image data onto the retina of the one or both eyes of the user, respectively.
Optionally, the user wears the direct-retinal eyewear device only after coarsely moving the head. Preferably, the display system further includes an alignment element, and the method includes a procedure of aligning the user's eyes with the projected image data by aligning the eyes with the alignment element, wherein the aligning procedure replaces the step of coarsely moving the head by the user.
Optionally, the alignment element is a light spot formed on a diffractive element by the light beam, wherein the diffractive element is preconfigured to divert a portion of the light intensity of the light beam sideways, with respect to the optical axis of the respective image projector. The balance of the light intensity of the light beam proceeds substantially coaxially with the optical axis of the respective image projector.
The aligning procedure includes the steps of moving the head by the user until observing the light spot as formed by the diverted portion of the light intensity of the light beam on the alignment element, keeping focus on the spatial location of the observed light spot while moving the head by the user towards the optical axis of the one or two image projectors by a distance interval, and wearing the direct-retinal-eyewear device by the user. If at least a portion of the projected image data is observed by the user, the user performs fine movements of the head until the whole of the projected image data is observed; else, the user removes the direct-retinal-eyewear device and repeats the aligning procedure steps.
Optionally, the moving of the head by the user is at least partially replaced by moving the direction of the optical axis of the respective image projector, typically in a counter direction.
Optionally, the alignment element is a target-mark disposed in alignment with the projector, wherein the target-mark includes at least one pair of stripes, wherein the pair of stripes includes a stripe that is disposed on a straight virtual line, substantially symmetrically about the projecting end of at least one of the projectors, wherein the straight virtual line crosses the optical axis of the projector. The target-mark further includes a pair of bars symmetrically disposed on each side of each of the stripes, wherein the bars are elevated towards the user, with respect to the stripes.
When the respective eye of the user is not situated at or substantially proximal to the optical axis of the projector, at least one of the bars occludes at least a portion of an adjacent stripe. The aligning procedure includes the step of moving the head by the user until none of the bars occludes any of the stripes.
BRIEF DESCRIPTION OF THE DRAWINGS
The present invention will become fully understood from the detailed description given herein below and the accompanying drawings, which are given by way of illustration and example only and thus are not limitative of the present invention, and wherein:
Fig. 1 is a perspective view of a direct-retinal-display system, according to embodiments of the present invention;
Fig. 2 is a perspective view of a direct-retinal-display system, according to other embodiments of the present invention;
Fig. 3 is a view of a portion of an image frame, with a virtual keyboard overlay and a hand-image overlay, as projected to the user;
Fig. 4 shows a remote-projecting apparatus having a diffractive element disposed on the optical axis of each projector;
Fig. 5a illustrates a top view of an alignment mark, according to embodiments of the present invention;
Fig. 5b illustrates a top view of another alignment mark, according to variations of the present invention;
Fig. 6 illustrates a configuration of a remote-projecting apparatus, according to embodiments of the present invention, containing a pair of alignment marks;
Fig. 7 is a schematic flow chart showing an example alignment method, according to embodiments of the present invention;
Fig. 8 is a schematic flow chart showing an example alignment method, using a light spot as a target alignment mark, according to embodiments of the present invention.
DETAILED DESCRIPTION OF THE INVENTION
The present invention will now be described more fully hereinafter with reference to the accompanying drawings, in which preferred embodiments of the invention are shown. This invention may, however, be embodied in many different forms and should not be construed as limited to the embodiments set forth herein; rather, these embodiments are provided, so that this disclosure will be thorough and complete, and will fully convey the scope of the invention to those skilled in the art.
An embodiment is an example or implementation of the invention. The various appearances of "one embodiment," "an embodiment" or "some embodiments" do not necessarily all refer to the same embodiments. Although various features of the invention may be described in the context of a single embodiment, the features may also be provided separately or in any suitable combination. Conversely, although the invention may be described herein in the context of separate embodiments for clarity, the invention may also be implemented in a single embodiment.
Reference in the specification to "one embodiment", "an embodiment", "some embodiments" or "other embodiments" means that a particular feature, structure, or characteristic described in connection with the embodiments is included in at least one embodiment, but not necessarily all embodiments, of the invention. It is understood that the phraseology and terminology employed herein are not to be construed as limiting and are for descriptive purposes only.
Methods of the present invention may be implemented by performing or completing, manually, automatically, or a combination thereof, selected steps or tasks. The order of performing some method steps may vary. The descriptions, examples, methods and materials presented in the claims and the specification are not to be construed as limiting but rather as illustrative only.
Unless otherwise defined, technical and scientific terms used herein are to be understood as they are commonly understood in the art to which the invention belongs. The present invention can be implemented in testing or practice with methods and materials equivalent or similar to those described herein.
It should be noted that orientation related descriptions such as "bottom", "up", "upper", "down", "lower", "top" and the like assume that the associated item is operationally situated.
Reference is now made to the drawings. Fig. 1 is a perspective view of the basic geometry of a direct-retinal-display system 100, according to embodiments of the present invention. Direct-retinal-display system 100 includes a direct-retinal-eyewear device 170 and a remote-projecting apparatus 110. Direct-retinal-eyewear device 170 includes one or two lenses 172 coupled to operate with one or both eyes 20 of the user, respectively. Direct-retinal-eyewear device 170 includes a frame 174 for positioning lenses 172 at a preconfigured distance from eyes 20 of the user, respectively. Preferably, direct-retinal-display system 100 further includes one or more cameras 150 for acquiring sequences of image frames of hand movements performed by the user.
Remote-projecting apparatus 110 includes one or two image projectors 120 coupled to respectively operate with lenses 172, wherein each of image projectors 120 is facilitated to project a light beam 125, carrying the image data generated by a media generation device 50, onto at least a portion of a respective lens 172 of direct-retinal-eyewear device 170. Each of lenses 172 is preconfigured to focus the image data onto the retina 25 of the respective eye 20 of the user. Media generation device 50 is operatively connected to a processor 112 of remote-projecting apparatus 110, wherein processor 112 is facilitated to manipulate the image data provided by media generation device 50.
It should be noted that direct-retinal-eyewear device 170 and remote-projecting apparatus 110 are physically separated.
The embodiments of the optical elements, for projecting the image data by projectors 120 onto lenses 172, may vary. In the example shown in Fig. 1, light beam 125 is a diverging beam. In this example, the optical system is sensitive to the distance between projectors 120 and direct-retinal-eyewear device 170. In the example shown in Fig. 2, a collimating lens 160 is used, wherein collimating lens 160 collimates the light beam such that the optical system is substantially indifferent to the distance between projectors 120 and direct-retinal-eyewear device 170.
Optionally, the image data projected by a first projector 120a is different from the image data projected by a second projector 120b, wherein the difference between the two simultaneous image frames is preconfigured to cause the user to apprehend the two data images as a 3D image.
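The per-eye frame difference can be illustrated with a simple pinhole-model disparity computation. This is a hedged sketch only: the function names, the pinhole model, and the numeric values are illustrative assumptions, not details taken from this application.

```python
def pixel_disparity(baseline_mm, focal_px, depth_mm):
    """Horizontal shift (in pixels) between the left- and right-eye
    frames for a point at the given depth, under a pinhole model."""
    return baseline_mm * focal_px / depth_mm

def stereo_pair_offsets(baseline_mm, focal_px, depth_mm):
    """Split the disparity symmetrically between the two projectors
    (e.g. projectors 120a and 120b)."""
    d = pixel_disparity(baseline_mm, focal_px, depth_mm)
    return -d / 2, +d / 2

# Example: a typical interocular baseline of ~63 mm and an assumed
# 800 px focal length place a point at 500 mm depth at opposite
# half-disparity offsets in the two frames.
left, right = stereo_pair_offsets(baseline_mm=63.0, focal_px=800.0, depth_mm=500.0)
```

The symmetric split simply expresses that each projector's frame is shifted by half the total disparity in opposite directions, which is what causes the user to apprehend the two frames as one 3D image.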
Cameras 150 are facilitated to acquire a sequence of image frames of a hand of the user, wherein processor 112 is facilitated to analyze movements of the user's hand, thereby creating hand-motion data. Processor 112 is further facilitated to overlay, onto an image frame generated by media generation device 50, an image of an input device, such as a keyboard or mouse. Processor 112 is further facilitated to overlay an image of a cursor, representing the position of the user's hand, as acquired by cameras 150, onto the keyboard image, wherein the position of the cursor is directly correlated with the hand-motion data. Fig. 3 is a view of a portion of an image frame 30, with a virtual keyboard 200 overlay and a hand-image 210 overlay, as projected to the user. It should be noted that the cursor may be a commonly used cursor, an image of a hand (as shown by way of example in Fig. 3), an image of a finger, or any other shaped image.
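The direct correlation between hand-motion data and cursor position can be sketched as a proportional mapping from camera-frame coordinates to virtual-screen coordinates. The function and parameter names below are assumptions chosen for illustration; the application does not specify the mapping.

```python
def hand_to_cursor(hand_xy, cam_size, screen_size):
    """Map a hand position detected in the camera frame to a cursor
    position in the virtual screen, by proportional scaling of each axis."""
    hx, hy = hand_xy
    cam_w, cam_h = cam_size
    scr_w, scr_h = screen_size
    return (hx / cam_w * scr_w, hy / cam_h * scr_h)

# A hand at the center of a 640x480 camera frame lands at the center
# of a 1920x1080 virtual screen.
cursor = hand_to_cursor((320, 240), (640, 480), (1920, 1080))
```

Any monotone mapping would preserve the "directly correlated" property; proportional scaling is simply the minimal choice.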
In one embodiment of the present invention, the space in front of the user is (virtually) divided into three regions: virtual-keyboard, virtual-pad and virtual-screen. Preferably, the regions initially occupy the same typical spaces as physical devices do in the physical world; however, processor 112 facilitates selecting a space, such as the virtual-pad or the virtual-keyboard, and dragging the selected space to a different virtual position. A left-handed user may, for example, transfer the virtual-pad to the left side of the keyboard.
The virtual-pad space is a virtual mouse. Movement of the pointing finger or the hand is typically translated to motion of the virtual cursor on the virtual screen. A click in place of the pointing finger is translated to a virtual left mouse click, and a click of the middle finger is translated to a virtual right mouse click.
In the virtual-screen space, hand movements are translated to pan, zoom and rotate, correspondingly. The click movement of the pointing finger is translated to a virtual select. A double click is translated to a virtual open. Clicking and keeping the pointing finger down facilitates dragging. The hand image or a matched hand model, possibly semi-transparent, or a sign/cursor representing the hand, is overlaid on the console. The hand image appears only when the camera detects and identifies a hand in the virtual-screen space. Typically, the virtual-screen space is not movable.
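The finger-to-action translations described for the virtual-pad and virtual-screen spaces can be summarized in a small dispatch function. The region and finger labels are hypothetical names chosen for illustration; only the mapping itself follows the description above.

```python
def classify_click(finger, region, held_down=False, double=False):
    """Translate a detected click gesture into a virtual input action,
    per the virtual-pad / virtual-screen mappings described above."""
    if region == "virtual-pad":
        # pointing finger -> virtual left click; middle finger -> virtual right click
        return {"pointing": "left-click", "middle": "right-click"}.get(finger)
    if region == "virtual-screen":
        if finger != "pointing":
            return None
        if double:
            return "open"    # double click -> virtual open
        if held_down:
            return "drag"    # click and remain down -> dragging
        return "select"      # single click -> virtual select
    return None
```

A table-driven classifier like this keeps the gesture vocabulary in one place, which also makes it easy to support remapping (e.g. for the left-handed layout mentioned earlier).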
Typically, the virtual-keyboard is activated on a need basis. The virtual-keyboard is drawn on the lower part of the console, preferably in 3D. Optionally, the virtual-keyboard is drawn semi-transparent or in wire frame. Once drawn, the virtual-keyboard is operable and, preferably, can be dragged in the virtual-screen space. While being dragged, the virtual-keyboard changes to reflect the camera point of view and the relative position of the keyboard in the screen space.
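Updating the dragged keyboard's drawing can be sketched as translating its corner points by the drag vector and rescaling them about their centroid to suggest the changed apparent depth. This is a geometric sketch under assumed names, not the application's actual rendering method.

```python
def redraw_dragged_keyboard(corners, drag_vector, depth_scale):
    """Translate the keyboard's corner points by the drag vector, then
    rescale them about their centroid to reflect the new apparent depth."""
    dx, dy = drag_vector
    moved = [(x + dx, y + dy) for x, y in corners]
    cx = sum(x for x, _ in moved) / len(moved)
    cy = sum(y for _, y in moved) / len(moved)
    return [(cx + (x - cx) * depth_scale, cy + (y - cy) * depth_scale)
            for x, y in moved]

# Dragging a unit-ish quad by (1, 1) and doubling its apparent size.
quad = redraw_dragged_keyboard([(0, 0), (2, 0), (2, 1), (0, 1)], (1, 1), 2.0)
```

A full implementation would instead apply the camera's projection matrix to the keyboard's 3D model; the centroid rescale stands in for that perspective change.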
Projectors 120 are, for example, with no limitations, laser scanner projectors. Since typical laser scanners are designed with substantially more light intensity than needed for direct retinal projection, the optical elements typically include filters to reduce the light intensity to a safe level.
System Alignment
It should be noted that the alignment targets described herein are given by way of example only and other designs of alignment targets can be used within the scope of the present invention.
Since direct-retinal-eyewear device 170 and remote-projecting apparatus 110 are physically separated, the user needs to align his/her pupils with the respective optical axes of the respective projectors 120. Preferably, frame 174 is laterally adjustable, to facilitate alignment of lenses 172 with the eyes of the user. Once lenses 172 are respectively aligned with the pupils of the user, the user needs to align his/her eyes with the respective optical axes of the respective projectors 120. Optionally, frame 174 is preconfigured to fit the distance between the pupils of the user.
To achieve such alignment, the user may search for the projected image by occasionally wearing direct-retinal-eyewear device 170 and estimating the path of the optical axes of the respective projectors 120.
Preferably, to achieve alignment the user looks at an alignment mark, as illustrated in Figs. 5a and 5b. Fig. 5a illustrates a top view of an example alignment mark 180, according to embodiments of the present invention, the view illustrating the aligned state. Fig. 5b illustrates a top view of another example alignment mark 189, according to embodiments of the present invention, the view illustrating the aligned state.
In one variation of the present invention, alignment mark 180 is disposed symmetrically around the output lens of projector 120. The projected beam (not shown) is perpendicular to the mark. Alignment mark 180 includes vertical stripes 183 and horizontal stripes 181, which stripes are coloured or patterned to stand out against the background. Each of stripes 181 and 183 is disposed between two 3D bars 182 and 184, respectively. Bars 182 and 184 extend above the surface on which stripes 181 and 183, and bars 182 and 184, are disposed. Hence, only when viewing alignment mark 180 substantially from the optical axis of projector 120 does none of the bars (182 and 184) occlude either stripe 181 or 183.
Fig. 6 illustrates a configuration of remote-projecting apparatus 111, according to embodiments of the present invention, containing a pair of alignment marks 180. Remote-projecting apparatus 111 includes a housing 114, two projectors 120 having an optical axis that is tilted upwards towards the user's eyes, a pair of alignment marks 180, each coupled with a respective projector 120, and a pair of cameras 150. Projectors 120 are mounted on a respective adjustable part 116, wherein adjustable part 116 can be pivoted to provide the user a comfortable viewing angle. Cameras 150 are positioned so as to capture the space between remote-projecting apparatus 111 and the user, and to facilitate an analysis of the movements of the hands of the user. In the example shown in Fig. 6, camera 150b is also mounted on an adjustable part 118, wherein adjustable part 118 can be pivoted to provide the camera with a higher viewing position. Camera 150a is positioned at a lower viewing angle so as to facilitate easy detection of the user's input action on virtual keyboard 200.
The user can, for example, position remote-projecting apparatus 111 on a table, look at remote-projecting apparatus 111, find the alignment targets and position his eyes so as to make stripes 181 and 183 visible. Furthermore, the user can move his hands to provide input to the virtual input devices. Alternatively, the user can hold remote-projecting apparatus 111 in one hand, align remote-projecting apparatus 111 to his eyes and provide inputs by using his second hand.
It should be noted that stripes 181 and 183 and bars 182 and 184 may have any colour and may take any shape. The number of stripes (181 and 183) and corresponding bars (182 and 184) may also vary: for example, three sets disposed evenly apart, six sets disposed evenly apart, or two sets disposed on both sides of projector 120, wherein one set is in a traverse orientation with respect to the other, etc.
In other embodiments of the present invention, the target alignment is a light spot. A portion of the light may be diverted sideways, to form a light spot.
Fig. 4 shows remote-projecting apparatus 110 having a diffractive element 130 disposed on the optical axis 115 of each projector 120. Diffractive element 130 is preconfigured to divert a portion 142 of the light intensity of the light beam sideways, with respect to optical axis 115 of the respective image projector 120. The balance 140 of the light beam proceeds substantially coaxially with optical axis 115 of the respective image projector 120.
The user moves his/her head until observing light spot 135 as formed by the diverted portion (142) of the light beam on the alignment element. The user keeps focus on the spatial location of the observed light spot 135 while moving the head towards the optical axis of the image projectors 120. After moving the head by some distance interval, the user stops moving his/her head and wears direct-retinal-eyewear device 170. If the user observes at least a portion of the projected image data, he/she performs fine movements of the head, until observing the whole of the projected image data. Otherwise, the user may remove direct-retinal-eyewear device 170 and further move the head towards the optical axis of the image projectors 120. This is repeated until at least a portion of the projected image data is observed.
To ease the alignment process, a mechanism for adjusting the spatial direction of projectors 120 may be provided. Referring back to Fig. 1, projector 120a may be moved laterally in direction 121, vertically in direction 123 and pivotally in direction 127.
Reference is now made to Fig. 7, a schematic flow chart showing an example alignment method 300, according to embodiments of the present invention. A direct-retinal-display system 100, as shown in Figs. 1 and 2, is provided in step 301. Method 300 proceeds as follows:
Step 310: projecting the image data.
Activating remote-projecting apparatus 110, to thereby project at least one light beam, carrying the image data generated by media generation device 50.
Step 320: wearing the direct-retinal eyewear device by the user.
The user wears the direct-retinal eyewear device 170, to thereby facilitate seeing the image data when the user's eye 20 is aligned with optical axis 115 of the respective projector 120.
Step 330: coarsely moving the head towards the optical axis of the projector.
The user moves his/her head towards a predicted location of the optical axis of the image projectors 120.
Optionally, the user may wear direct-retinal eyewear device 170 only after coarsely moving the head towards the optical axis of the projectors 120.
Step 335: determining whether the user sees at least a portion of the image data.
If at least a portion of the image data is not observed by the user, preferably remove direct-retinal eyewear device 170 (step 336) and go to step 330.
Step 340: performing fine movements of the head.
The user makes fine adjustments to the head's spatial position until seeing the whole image data.
Step 350: proceed with desired activity.
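The coarse/fine loop of steps 330-350 can be sketched as an iterative procedure in which the user's observations are modeled as callback predicates. All names here are illustrative assumptions; the predicates stand in for what the user actually sees.

```python
def align(move_coarse, sees_partial_image, sees_full_image, move_fine,
          max_attempts=10):
    """Model of alignment method 300: coarse head movement (step 330)
    repeats until part of the image is seen (step 335), then fine
    movements (step 340) follow until the whole image is seen."""
    for _ in range(max_attempts):
        move_coarse()                 # step 330: coarse head movement
        if sees_partial_image():      # step 335: any of the image visible?
            while not sees_full_image():
                move_fine()           # step 340: fine head adjustments
            return True               # step 350: proceed with activity
    return False                      # gave up without alignment
```

Method 400 follows the same loop, with the initial coarse movement guided by eye contact with light spot 135 instead of a blind search.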
Reference is now made to Fig. 8, a schematic flow chart showing an example alignment method 400, using a light spot 135 as a target alignment mark, according to embodiments of the present invention. A direct-retinal-display system 100 including a diffractive element 130, as shown in Figs. 1, 2 and 4, is provided in step 401. Method 400 proceeds as follows:
Step 410: projecting the image data.
Activating remote-projecting apparatus 110, to thereby project at least one light beam, carrying the image data generated by media generation device 50, including a diverted portion of the light beam.
Step 420: making eye contact with the alignment light spot.
The user makes eye contact with the alignment light spot 135 formed on the diffractive element 130.
Step 430: keeping focus on the spatial location of light spot 135 and moving the head towards optical axis 115 of the respective projector 120.
The user moves his/her head towards a predicted location of the optical axis of the image projectors 120.
Step 440: wearing the direct-retinal eyewear device by the user.
The user wears the direct-retinal eyewear device 170, to thereby facilitate seeing the image data when the user's pupil is aligned with optical axis 115 of the respective projector 120.
Step 445: determining whether at least a portion of the image data is seen by the user.
If at least a portion of the image data is not observed by the user, direct-retinal eyewear device 170 is preferably removed (step 450) and the method returns to step 430.
Step 460: performing fine movements of the head.
The user makes fine adjustments to the spatial position of the head until the whole image data is seen.
Step 470: proceed with desired activity.
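The role of the diffractive element in method 400 (and in claim 4) can be sketched numerically: a small fraction of the beam's intensity is diverted sideways to form the always-visible alignment spot 135, while the balance of the beam stays coaxial and reaches the eye only when the eye is substantially on the optical axis. The fraction, tolerance and intensity values below are illustrative assumptions, not figures from the disclosure.

```python
def perceived_light(eye_offset_deg, divert_fraction=0.02,
                    coaxial_tolerance_deg=1.0, beam_intensity=1.0):
    """Illustrative model of what the naked eye perceives.

    The diverted portion forms the alignment spot on the diffractive
    element and is visible from any direction; the balance (main beam)
    is perceived only when the eye is substantially coaxial with the
    projector's optical axis.
    """
    spot = beam_intensity * divert_fraction          # alignment spot 135
    main = beam_intensity * (1.0 - divert_fraction)  # balance of the beam
    if abs(eye_offset_deg) <= coaxial_tolerance_deg:
        return {"spot": spot, "main_beam": main}     # eye on axis
    return {"spot": spot, "main_beam": 0.0}          # main beam invisible off axis
```

This is why the user can first fixate on the spot from an arbitrary position (step 420) and then walk the head onto the axis (step 430) until the image itself appears.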
In variations of the present invention, in order to reduce the cost, thickness and/or weight of direct-retinal eyewear device 170, lens 172 is a wavelength-selective holographic lens. A holographic lens is a volume hologram tailored to the three laser-scanner wavelengths used for projection, wherein the hologram functions as a lens in the visible region only for the three laser wavelengths used in a laser scanner projector. For other wavelengths in the visible range, the hologram serves as a window, so that there is no need for the user to repeatedly wear and remove the direct-retinal-eyewear device 170 as described in steps 320, 335, 440 and 445. It should be noted that the holographic lens is apochromatic, wherein the three focal points are combined.
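The lens-or-window behavior of such a wavelength-selective hologram can be sketched as a simple band test. The three line wavelengths below are typical red/green/blue laser-diode values and, together with the bandwidth, are illustrative assumptions rather than values taken from the disclosure.

```python
def lens_response(wavelength_nm, laser_lines_nm=(450.0, 520.0, 638.0),
                  bandwidth_nm=2.0):
    """Illustrative model of a wavelength-selective holographic lens.

    The volume hologram focuses light (acts as a lens) only within a
    narrow band around each of the three laser-scanner wavelengths;
    all other visible wavelengths pass through as if through a window.
    """
    for line in laser_lines_nm:
        if abs(wavelength_nm - line) <= bandwidth_nm:
            return "lens"    # diffracted and focused onto the retina
    return "window"          # transmitted without focusing
```

Because ambient light falls almost entirely outside the three narrow bands, the wearer sees the surroundings through an ordinary-looking window while the projected laser image is still focused onto the retina.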
The invention being thus described in terms of several embodiments and examples, it will be obvious that the same may be varied in many ways. Such variations are not to be regarded as a departure from the spirit and scope of the invention, and all such modifications as would be obvious to one skilled in the art.

Claims

WHAT IS CLAIMED IS:
1. A direct-retinal-display system for providing a user with image data generated by a media generation device, the display system comprising:
a) a direct-retinal-eyewear device comprising:
i. one or two lenses coupled to operate with one or both eyes of the user, respectively; and
ii. a frame for positioning said one or two lenses at a preconfigured distance from said one or both eyes of the user, respectively; and
b) a remote-projecting apparatus comprising one or two image projectors coupled to respectively operate with said one or two lenses,
wherein each of said image projectors is facilitated to project a light beam carrying said image data generated by said media generation device, onto at least a portion of a respective lens of said direct-retinal-eyewear device; and
wherein said one or two lenses are preconfigured to focus said image data onto the retina of said one or both eyes of the user, respectively.
2. The display system of claim 1, wherein said direct-retinal-eyewear device and said remote-projecting apparatus are physically separated.
3. The display system of claim 1 further comprising an alignment element facilitating the user to spatially align said one or both eyes of the user with said remote-projecting apparatus.
4. The display system of claim 3, wherein said alignment element is a light spot formed on a diffractive element by said light beam projected by a respective image projector,
wherein said diffractive element is preconfigured to divert a portion of the light intensity of said light beam sideways, with respect to the optical axis of said respective image projector;
wherein the balance of light intensity of said light beam proceeds substantially coaxially with said optical axis of said respective image projector; and
wherein said balance of light intensity of said light beam is substantially invisible to the respective naked eye of said one or both eyes of the user, when said respective eye is not substantially coaxial with respect to said optical axis of said respective image projector.
5. The display system of claim 3, wherein said alignment element is a preconfigured mark disposed at a preconfigured location such that when the user observes preconfigured mark features, said image projectors are spatially aligned with said one or both eyes of the user, respectively.
6. The display system of claim 1, wherein said display system further includes:
c) at least one camera for acquiring a sequence of image frames of a hand of the user; and
d) a processor,
wherein said processor is facilitated to analyze movements of said hand of the user from said acquired sequence of image frames, thereby creating hand-motion data;
wherein said processor is facilitated to overlay onto said image data generated by said media generation device, an image of a keyboard; and
wherein said processor is facilitated to overlay a cursor, representing said acquired image of said hand of the user, onto said keyboard image, wherein the position of said cursor is directly correlated with said hand-motion data.
7. The display system as in claim 6, wherein said keyboard image is a 3D image.
8. The display system as in claim 6, wherein said cursor is said hand image.
9. The display system as in claim 6, wherein said hand-motion data is determined to be an input action selected from the group including a stroke of a keyboard key, a mouse right click, a mouse left click, a mouse click and drag and a screen scroll.
10. The display system as in claim 9, wherein said input action is emulated on said virtual input device.
11. The display system as in claim 6, wherein said hand-motion data is determined to be an input action selected from the group including pan, zoom and rotate.
12. The display system as in claim 11, wherein said input action is emulated on said virtual input device.
13. The display system of claim 1, wherein said image data projected by a first projector is different from the image data projected by a second projector, wherein said difference is preconfigured to cause the user to apprehend a 3D image.
14. The display system of claim 1, wherein said projectors can be spatially adjusted laterally, vertically and pivotally.
15. The display system of claim 1, wherein said one or two lenses are wavelength selective holographic lenses.
16. A method for providing a user with image data generated by a media generation device, the method comprising the steps of:
a) providing a display system including:
i. a direct-retinal-eyewear device including:
1. one or two lenses coupled to operate with one or both eyes of the user, respectively; and
2. a frame for positioning said one or two lenses at a preconfigured distance from said one or both eyes of the user, respectively; and
ii. a remote-projecting apparatus comprising one or two image projectors coupled to respectively operate with said one or two lenses,
wherein said remote-projecting apparatus is facilitated to project a light beam carrying said image data generated by said media generation device, onto at least a portion of a respective lens of said direct-retinal-eyewear device; and
wherein said one or two lenses are preconfigured to focus said image data onto the retina of said one or both eyes of the user, respectively;
b) projecting said image data by said one or two image projectors, in a preconfigured direction;
c) wearing said direct-retinal-eyewear device by the user;
d) coarsely moving the head by the user until observing at least a portion of said projected image data by the user; and
e) fine moving of the head by the user until fully observing said projected image data by the user,
wherein said light beam carrying said image data is projected onto at least a portion of a respective lens; and
wherein said one or two lenses focus said image data onto the retina of said one or both eyes of the user, respectively.
17. The method of claim 16, wherein the user wears said direct-retinal eyewear device only after said coarsely moving the head by the user.
18. The method of claim 16, wherein said display system further includes an alignment element and wherein the method comprises a procedure of aligning said user eyes with said projected image data by aligning said eyes with said alignment element, wherein said aligning procedure replaces said step of coarsely moving the head by the user.
19. The method of claim 18, wherein said alignment element is a light spot formed on a diffractive element by said light beam,
wherein said diffractive element is preconfigured to divert a portion of the light intensity of said light beam sideways, with respect to the optical axis of said respective image projector;
wherein the balance of said light intensity of said light beam proceeds substantially coaxially with said optical axis of said respective image projector; and
wherein said aligning procedure comprises the steps of:
a) moving the head by the user until observing said light spot as formed by said diverted portion of light intensity of said light beam on said alignment element;
b) keeping focus on the spatial location of said observed light spot while moving the head by the user towards the optical axis of said one or two image projectors;
c) wearing said direct-retinal-eyewear device by the user; and
d) if observing at least a portion of said projected image data by the user, performing said fine moving of the head by the user, until observing the whole of said projected image data by the user; else, removing said direct-retinal-eyewear device and repeating steps (a)-(d).
20. The method of claim 19, wherein said moving of the head by the user is at least partially replaced by moving the direction of said optical axis of said respective image projector.
21. The method of claim 18, wherein said alignment element is a target-mark disposed with alignment to said projector, said target-mark comprising:
a) at least one pair of stripes, wherein said pair of stripes includes a stripe that is disposed on a straight virtual line, substantially symmetrically about the projecting end of at least one of said projectors, wherein said straight virtual line crosses said optical axis of said projector; and
b) a pair of bars symmetrically disposed on each side of each of said stripes, wherein said bars are elevated towards the user, with respect to said stripes; wherein when the respective eye of the user is not situated at or substantially proximal to said optical axis of said projector, at least one of said bars occludes at least a portion of an adjacent stripe; and
wherein said aligning procedure comprises the step of moving the head by the user until none of said bars occludes any of said stripes.
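The parallax geometry behind the target-mark of claim 21 can be sketched in a 2D cross-section: a bar elevated above the base plane hides an adjacent stripe whenever the sight line from the eye to the stripe passes below the bar's top edge, which happens only when the eye is off axis. All coordinates and dimensions below are hypothetical, chosen only to make the occlusion condition concrete.

```python
def stripe_occluded(eye_x, eye_dist, stripe_x, bar_x, bar_height):
    """Illustrative 2D occlusion test for the claim-21 target-mark.

    The eye sits at lateral offset eye_x and height eye_dist above the
    base plane; the stripe lies on the base plane at stripe_x; the bar
    stands at bar_x with its top at bar_height. Returns True when the
    bar blocks the eye's line of sight to the stripe.
    """
    if stripe_x == eye_x:
        return False  # looking straight down at the stripe
    # Parameter t along the sight line from eye (t=0) to stripe (t=1)
    t = (bar_x - eye_x) / (stripe_x - eye_x)
    if not 0.0 < t < 1.0:
        return False  # bar is not between the eye and the stripe
    z_at_bar = eye_dist * (1.0 - t)  # sight-line height above the bar
    return z_at_bar <= bar_height
```

With the eye on the optical axis the sight lines clear both bars, so every stripe is fully visible; as the eye drifts sideways, the nearer bar begins to occlude its stripe, telling the user which way to move the head.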
PCT/IL2011/000929 2010-12-11 2011-12-07 System and method for direct retinal display with integrated 3d alignment and a virtual input device WO2012077106A1 (en)

Applications Claiming Priority (8)

Application Number Priority Date Filing Date Title
US42214310P 2010-12-11 2010-12-11
US61/422,143 2010-12-11
US201161478523P 2011-04-24 2011-04-24
US61/478,523 2011-04-24
US201161481206P 2011-05-01 2011-05-01
US61/481,206 2011-05-01
US201161538869P 2011-09-25 2011-09-25
US61/538,869 2011-09-25

Publications (1)

Publication Number Publication Date
WO2012077106A1 (en) 2012-06-14

Family

ID=46206669

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/IL2011/000929 WO2012077106A1 (en) 2010-12-11 2011-12-07 System and method for direct retinal display with integrated 3d alignment and a virtual input device

Country Status (1)

Country Link
WO (1) WO2012077106A1 (en)

Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20100097580A1 (en) * 2007-11-21 2010-04-22 Panasonic Corporation Display apparatus
US20100138668A1 (en) * 2007-07-03 2010-06-03 Nds Limited Content delivery system
US20100253904A1 (en) * 2006-12-14 2010-10-07 James Jannard Wearable high resolution audio visual interface



Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application
Ref document number: 11846417; Country of ref document: EP; Kind code of ref document: A1
NENP Non-entry into the national phase
Ref country code: DE
122 Ep: pct application non-entry in european phase
Ref document number: 11846417; Country of ref document: EP; Kind code of ref document: A1