US20070030543A1: Depth and lateral size control of three-dimensional images in projection integral imaging


Info

Publication number
US20070030543A1
Authority: United States (US)
Prior art keywords: images, displaying, planar, micro, depth
Legal status: Abandoned (the status is an assumption, not a legal conclusion; Google has not performed a legal analysis)
Application number
US11/498,666
Inventor
Bahram Javidi
Ju-Seog Jang
Hyunju Ha
Current Assignee: University of Connecticut
Original Assignee
University of Connecticut
Application filed by University of Connecticut
Priority to US11/498,666 (US20070030543A1)
Assigned to The University of Connecticut. Assignors: Ju-Seog Jang (deceased; by Hyunju Ha, wife and legal representative) and Bahram Javidi
Publication of US20070030543A1
Priority to US12/939,647 (US8264772B2)
Priority to US13/596,715 (US20120320161A1)

Classifications

    • G PHYSICS
    • G02 OPTICS
    • G02B OPTICAL ELEMENTS, SYSTEMS OR APPARATUS
    • G02B 30/00 Optical systems or apparatus for producing three-dimensional [3D] effects, e.g. stereoscopic images
    • G02B 30/20 Optical systems or apparatus for producing three-dimensional [3D] effects, e.g. stereoscopic images by providing first and second parallax images to an observer's left and right eyes
    • G02B 30/26 Optical systems or apparatus for producing three-dimensional [3D] effects, e.g. stereoscopic images by providing first and second parallax images to an observer's left and right eyes of the autostereoscopic type
    • G02B 30/27 Optical systems or apparatus for producing three-dimensional [3D] effects, e.g. stereoscopic images by providing first and second parallax images to an observer's left and right eyes of the autostereoscopic type involving lenticular arrays
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N 13/00 Stereoscopic video systems; Multi-view video systems; Details thereof
    • H04N 13/10 Processing, recording or transmission of stereoscopic or multi-view image signals
    • H04N 13/106 Processing image signals
    • H04N 13/144 Processing image signals for flicker reduction
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N 13/00 Stereoscopic video systems; Multi-view video systems; Details thereof
    • H04N 13/30 Image reproducers
    • H04N 13/302 Image reproducers for viewing without the aid of special glasses, i.e. using autostereoscopic displays
    • H04N 13/305 Image reproducers for viewing without the aid of special glasses, i.e. using autostereoscopic displays using lenticular lenses, e.g. arrangements of cylindrical lenses
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N 2213/00 Details of stereoscopic systems
    • H04N 2213/002 Eyestrain reduction by processing stereoscopic signals or controlling stereoscopic devices

Definitions

  • Projection integral imaging (PII) is the novel subject of this invention; the inventors were the first to develop it.
  • Elemental images may be projected through relay optics 10 onto a lenslet array 3, as depicted in FIGS. 2(a) and 2(b).
  • A micro-convex/concave-mirror array 11, 21 may be used as a projection screen, as depicted in FIGS. 2(c) and 2(d).
  • 3-D orthoscopic virtual images 8 may be displayed without the P/O conversion.
  • The gap distance g becomes Liƒ/(Li+ƒ) ≡ gv.
  • PII allows for the following because of the use of a micro-convex-mirror array as a projection screen:
  • The full viewing angle ψ is limited and is determined approximately by ψ ≈ 2 arctan[0.5/(ƒ/#)], where ƒ/# is the ƒ-number of the lenslet, when the fill factor of the lenslet array is close to 1.
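As a rough numerical check of this relation, the viewing angle can be tabulated for a few lenslet speeds. The ƒ-numbers below are assumed for illustration only; the patent does not specify them.

```python
import math

def full_viewing_angle_deg(f_number):
    """Approximate full viewing angle of an II display whose lenslet
    array has a fill factor close to 1: psi = 2*arctan[0.5/(f/#)]."""
    return math.degrees(2.0 * math.atan(0.5 / f_number))

# Faster lenslets (smaller f/#) widen the viewing zone.
for f_num in (0.5, 1.0, 2.0, 4.0):
    print(f"f/{f_num}: {full_viewing_angle_deg(f_num):.1f} degrees")
```

An ƒ/1 lenslet array, for example, yields a full viewing angle of about 53 degrees under this approximation.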
  • the P/O conversion is unnecessary, if a positive lenslet array is used for direct camera pickup.
  • 3-D images reconstructed in II systems may have limited depth-of-focus δ. It has been shown that δ cannot be larger than 1/(λρ²), where λ is the display wavelength and ρ is the resolution of reconstructed 3-D images, defined as the inverse of the reconstructed image spot size. In PII, 3-D images with high resolution can be reconstructed only near the projection screen of micro-convex-mirror arrays (or the display lenslet array). Thus the depth-of-focus δ should be measured from the projection screen.
  • If the focal length of the pickup lenslet array ƒp is longer than that of the display micro-convex-mirror array ƒd, the longitudinal scale of the reconstructed image space is reduced linearly by a factor of ƒd/ƒp ≡ r, while the lateral scale does not change. So if (zo + T)r < δ, the 3-D reconstructed image is well focused.
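The depth-of-focus bound and the linear depth-reduction condition can be combined in a short sketch. The wavelength, spot size, focal lengths, and object coordinates below are assumed values for illustration only; zo and T denote the object distance and depth extent as used above.

```python
def max_depth_of_focus(wavelength, resolution):
    """Upper bound on depth-of-focus: delta <= 1/(lambda * rho**2),
    where rho is the inverse of the reconstructed image spot size."""
    return 1.0 / (wavelength * resolution**2)

def is_well_focused(z_o, T, f_pickup, f_display, delta):
    """Longitudinal scale shrinks by r = f_d/f_p (lateral scale unchanged);
    the reconstruction is well focused if (z_o + T)*r < delta."""
    r = f_display / f_pickup
    return (z_o + T) * r < delta

# Assumed: 550 nm display wavelength, 0.5 mm spot size (rho = 2000 m^-1).
delta = max_depth_of_focus(550e-9, 1.0 / 0.5e-3)   # ~0.45 m
print(f"depth-of-focus bound: {delta:.3f} m")

# Assumed: object 20 cm away with 5 cm depth, f_p = 3 mm, f_d = 1 mm.
print(is_well_focused(z_o=0.20, T=0.05, f_pickup=3e-3, f_display=1e-3,
                      delta=delta))
```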
  • Digital zoom-in may degrade the resolution of elemental images.
  • a nonlinear depth control method may be used.
  • Curved pickup devices (e.g., a curved lenslet array 17 and a curved 2-D image sensor 18) with a radius of curvature R may be used, and then 3-D images may be reconstructed using planar display devices, as depicted in FIGS. 3(a) and 3(b), respectively.
  • planar pickup devices (e.g., a planar image sensor 14 and a planar lenslet array 16 )
  • curved display devices (e.g., a curved display panel 19 and a curved lenslet array 20 )
  • R>0 when the center of the curvature is positioned at the same side of the object (observer 6 ) in the pickup (display) process
  • R<0 when it is positioned at the opposite side.
  • The effect of depth and size reduction using the negatively curved pickup lenslet array can be analyzed by introducing a hypothetical thin lens with a negative focal length −Rp, which is in contact with the planar pickup lenslet array 16, as depicted in FIG. 3(e). This is because the ray propagation behaviors for the two setups in FIGS. 3(a) and 3(e), and those in FIGS. 3(d) and 3(f), are the same, respectively.
  • This lens is referred to herein as an optical path-length-equalizing (OPLE) lens 15.
  • Typically Rp >> ƒp, and thus ƒp^e ≈ ƒp.
  • A planar lenslet array 16 with a focal length ƒp^e may be used, together with a flat image sensor 14 and the pickup OPLE lens 15 with a focal length of −Rp.
  • the OPLE lens 15 first produces images of objects, and then the images are actually picked up by the planar pickup devices 14 , 16 to produce elemental images with increased disparity.
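Under the thin-lens-in-contact approximation, the planar lenslet array plus OPLE lens combination has nearly the same power as the lenslet alone whenever Rp >> ƒp, which is why ƒp^e ≈ ƒp above. The focal lengths below are assumed for illustration; the patent does not give specific values.

```python
def combined_focal_length(f1, f2):
    """Two thin lenses in contact: 1/f = 1/f1 + 1/f2."""
    return 1.0 / (1.0 / f1 + 1.0 / f2)

# Assumed: lenslet focal length f_p = 3 mm, OPLE lens focal length -R_p
# with R_p = 600 mm (a weak, large-aperture negative lens).
f_p, R_p = 3.0, 600.0                      # in millimetres
f_eff = combined_focal_length(f_p, -R_p)
print(f"effective focal length: {f_eff:.4f} mm")  # barely changed from f_p
```

The weak negative lens shifts the effective focal length by only about half a percent here, while still increasing the disparity of the recorded elemental images.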
  • the effect of depth and size reduction can also be achieved by use of negatively curved display devices 19 , 20 .
  • Curved display devices 19, 20 with a radius of curvature −Rd are used, while elemental images are obtained by use of planar pickup devices 14, 16.
  • both linear and nonlinear depth control methods may be used together.
  • the position of the reconstructed image can be predicted from the equivalent planar pickup 14 , 16 and display 22 , 16 devices with OPLE lenses.
  • elemental images with increased disparity are obtained and then they are digitally zoomed-in.
  • a modified pickup system is usually used as depicted in FIG. 4 ( a ).
  • Elemental images formed by a planar lenslet array 3 are detected through a camera lens 25 with a large ƒ/#.
  • The use of such a camera lens 25 and the planar pickup lenslet array 3 produces the effect of a negatively curved pickup lenslet array, because the disparity of elemental images increases. This effect is taken into account by considering the modified pickup system as a curved pickup system with a curved lenslet array whose radius of curvature is −Rc.
  • R c equals approximately the distance between the planar pickup lenslet array and the camera lens.
  • The projection beam angle θ (e.g., in the azimuthal direction) may not be negligible.
  • the effect of negatively curved display devices naturally exists even if planar display devices are used.
  • the horizontal size of the overall projected elemental images on the screen is S.
  • The planar display devices may be treated as curved display devices with a radius of curvature −Rs ≈ −S/θ if the aperture size of the relay optics is much smaller than S.
  • R s is approximately equal to the distance between the planar projection screen and the relay optics.
  • The object to be imaged is composed of small cacti 35 and a large building 36, as shown in FIG. 6(a).
  • the distance between the pickup lenslet array and the cacti 35 is approximately 20 cm and that between the pickup lenslet array and the building is approximately 70 m.
  • elemental images were obtained by use of a planar 2-D image sensor and a planar lenslet array in contact with a large-aperture negative lens as an OPLE lens.
  • the planar pickup lenslet array used is made from acrylic, and has 53 ⁇ 53 plano-convex lenslets.
  • Each lenslet element is square-shaped and has a uniform base size of 1.09 mm ⁇ 1.09 mm, with less than 7.6 ⁇ m separating the lenslet elements.
  • a total of 48 ⁇ 36 elemental images are used in the experiments.
  • a digital camera 37 with 4500 ⁇ 3000 CMOS pixels was used for the 2-D image sensor.
  • the camera pickup system 37 is shown in FIG. 6 ( b ).
  • the linear depth reduction method was also used in combination with the nonlinear method. To avoid resolution degradation caused by digital zoom-in, the resolution of the zoom-in elemental images was kept higher than that of the LCD projector.
  • a planar micro-convex-mirror array for the projection screen was obtained by coating the convex surface of a lenslet array that is identical to the pickup lenslet array. Light intensity reflectance of the screen is more than 90%.
  • The setup for 3-D image reconstruction is depicted in FIG. 7.
  • a color LCD projector 40 that has 3 (red, green, and blue) panels was used for elemental image projection. Each panel has 1024 ⁇ 768 square pixels with a pixel pitch of 18 ⁇ m. Each elemental image has approximately 21 ⁇ 21 pixels on average.
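The per-elemental-image pixel count quoted above follows from dividing the panel resolution by the elemental-image grid (48 × 36 is the grid stated for the experiment):

```python
# 1024 x 768 projector panel shared by a 48 x 36 grid of elemental images.
panel_w, panel_h = 1024, 768
grid_w, grid_h = 48, 36

px_w, px_h = panel_w / grid_w, panel_h / grid_h
print(f"{px_w:.2f} x {px_h:.2f} pixels per elemental image")  # ~21 x 21
```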
  • Magnification of the relay optics 41 is 2.9.
  • The diverging angle of the projection beam θ is approximately 6 degrees in the azimuthal direction, so a slight curved-display effect exists.
  • The object distances are zoc = 20 cm for the cacti and zob = 70 m for the building.
  • Center parts of elemental images obtained without the OPLE lens and those obtained with the OPLE lens are shown in FIGS. 8(a) and 8(b), respectively.
  • the OPLE lens increases disparity between neighboring elemental images.
  • Left, center, and right views of reconstructed 3-D images for different depth control parameters are illustrated in FIGS. 9 and 10.
  • The observed positions of the reconstructed images agree qualitatively with the estimated positions given in Table 1. Comparing the images shown in FIGS. 9 and 10, one can see that smaller 3-D images are reconstructed for shorter Rp^e.
  • reconstructed 3-D images squeeze further in the longitudinal direction and thus disparity between left and right views reduces.
  • the lateral size of reconstructed 3-D images is independent of r.
  • Reconstructed 3-D images at deeper positions are more blurred because the depth-of-focus of the PII system is limited; it is estimated to be approximately 5 cm.
  • Binocular parallax is the most effective depth cue for viewing medium distances.
  • Our depth control method degrades the solidity of reconstructed 3-D images because it compresses their longitudinal depth more strongly than their lateral size for distant objects.
  • However, human vision also uses other depth cues, and binocular parallax may not be so effective for viewing long distances. Therefore, our nonlinear position control method can be used efficiently for large-scale 3-D display systems with limited depth-of-focus. Nevertheless, efforts to enhance the depth-of-focus of II systems should be pursued.
  • Some embodiments of the invention have the following advantages: imaging is performed with direct pickup to create true 3-D images, with full parallax and continuous viewing points, using incoherent light and two-dimensional display devices, resulting in orthoscopic images with wide viewing angles, large depth-of-focus, and high resolution. Additional advantages include the ability to project 3-D images to a large display screen.
  • a computer or other client or server device can be deployed as part of a computer network, or in a distributed computing environment.
  • the methods and apparatus described above and/or claimed herein pertain to any computer system having any number of memory or storage units, and any number of applications and processes occurring across any number of storage units or volumes, which may be used in connection with the methods and apparatus described above and/or claimed herein.
  • the same may apply to an environment with server computers and client computers deployed in a network environment or distributed computing environment, having remote or local storage.
  • the methods and apparatus described above and/or claimed herein may also be applied to standalone computing devices, having programming language functionality, interpretation and execution capabilities for generating, receiving and transmitting information in connection with remote or local services.
  • The methods and apparatus described above and/or claimed herein are operational with numerous other general purpose or special purpose computing system environments or configurations.
  • Examples of well known computing systems, environments, and/or configurations that may be suitable for use with the methods and apparatus described above and/or claimed herein include, but are not limited to, personal computers, server computers, hand-held or laptop devices, multiprocessor systems, microprocessor-based systems, network PCs, minicomputers, mainframe computers, and distributed computing environments that include any of the above systems or devices.
  • the methods described above and/or claimed herein may be described in the general context of computer-executable instructions, such as program modules, being executed by a computer.
  • Program modules typically include routines, programs, objects, components, data structures, etc. that perform particular tasks or implement particular abstract data types.
  • the methods and apparatus described above and/or claimed herein may also be practiced in distributed computing environments such as between different units where tasks are performed by remote processing devices that are linked through a communications network or other data transmission medium.
  • program modules and routines or data may be located in both local and remote computer storage media including memory storage devices.
  • Distributed computing facilitates sharing of computer resources and services by direct exchange between computing devices and systems. These resources and services may include the exchange of information, cache storage, and disk storage for files.
  • Distributed computing takes advantage of network connectivity, allowing clients to leverage their collective power to benefit the entire enterprise.
  • a variety of devices may have applications, objects or resources that may utilize the methods and apparatus described above and/or claimed herein.
  • Computer programs implementing the method described above will commonly be distributed to users on a distribution medium such as a CD-ROM.
  • the program could be copied to a hard disk or a similar intermediate storage medium.
  • the programs When the programs are to be run, they will be loaded either from their distribution medium or their intermediate storage medium into the execution memory of the computer, thus configuring a computer to act in accordance with the methods and apparatus described above.
  • computer-readable medium encompasses all distribution and storage media, memory of a computer, and any other medium or device capable of storing for reading by a computer a computer program implementing the method described above.
  • the various techniques described herein may be implemented in connection with hardware or software or, where appropriate, with a combination of both.
  • The methods and apparatus described above and/or claimed herein, or certain aspects or portions thereof, may take the form of program code or instructions embodied in tangible media, such as floppy diskettes, CD-ROMs, hard drives, or any other machine-readable storage medium, wherein, when the program code is loaded into and executed by a machine, such as a computer, the machine becomes an apparatus for practicing the methods and apparatus described above and/or claimed herein.
  • the computing device will generally include a processor, a storage medium readable by the processor, which may include volatile and non-volatile memory and/or storage elements, at least one input device, and at least one output device.
  • One or more programs that may utilize the techniques of the methods and apparatus described above and/or claimed herein, e.g., through the use of data processing, may be implemented in a high level procedural or object oriented programming language to communicate with a computer system.
  • the program(s) can be implemented in assembly or machine language, if desired.
  • the language may be a compiled or interpreted language, and combined with hardware implementations.
  • The methods and apparatus described above and/or claimed herein may also be practiced via communications embodied in the form of program code that is transmitted over some transmission medium, such as over electrical wiring or cabling, through fiber optics, or via any other form of transmission, wherein, when the program code is received and loaded into and executed by a machine, such as an EPROM, a gate array, a programmable logic device (PLD), a client computer, or a receiving machine having the signal processing capabilities described in the exemplary embodiments above, the machine becomes an apparatus for practicing the method described above and/or claimed herein.

Abstract

A method disclosed herein relates to displaying three-dimensional images. The method comprises projecting integral images to a display device and displaying three-dimensional images with the display device. Further disclosed herein is an apparatus for displaying orthoscopic 3-D images. The apparatus comprises a projector for projecting integral images and a micro-convex-mirror array for displaying the projected images.

Description

    CROSS REFERENCE TO RELATED APPLICATIONS
  • This application claims priority to U.S. provisional application 60/706,281 filed on Aug. 8, 2005, which is incorporated herein in its entirety by reference.
  • BACKGROUND OF THE INVENTION
  • Most 3-D display techniques developed to date are stereoscopic. A stereoscopic system may display large images with high resolution; however, stereoscopic techniques may require supplementary glasses to evoke 3-D visual effects. Additionally, stereoscopic techniques may provide observers with only horizontal parallax and a small number of viewpoints. Observation of stereoscopic images may also cause visual fatigue due to convergence-accommodation conflict.
  • Convergence-accommodation conflict may be avoided by a true 3-D image formation in space with full parallax and continuous viewing points. Holography is one way to form 3-D images in space, but recording full-color holograms for an outdoor scene may be difficult. For example, when computer-generated holograms are prepared, a large amount of computation time and capacity may be required to obtain proper gratings. Because coherent light is often used in holography, speckle may also occur.
  • To produce true 3-D images in space with incoherent light using two-dimensional (2-D) display devices, techniques based on ray optics have also been studied. One technique may be referred to as integral imaging (II).
  • In II, 3-D images may be formed by crossing the rays coming from 2-D elemental images using a lenslet array. Each microlens in a lenslet array may act as a directional pixel in a pinhole fashion. The pinholes create directional views which when viewed with two eyes for example, appear as a 3D image in space. II may provide observers with true 3-D images with full parallax and continuous viewing points. However, the viewing angle, depth-of-focus, and resolution of 3-D images may be limited.
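The pinhole model above can be sketched numerically: each lenslet, treated as a pinhole, maps a 3-D point to a position in its elemental image, and the rays from two elemental images cross back at the original depth. The geometry and numbers below are illustrative assumptions, not values from the patent.

```python
def project(point_x, point_z, pinhole_x, gap):
    """Project a 3-D point through a lenslet (modelled as a pinhole at
    (pinhole_x, 0)) onto its elemental image plane at z = -gap."""
    # Similar triangles along the ray from the point through the pinhole.
    return pinhole_x + (pinhole_x - point_x) * gap / point_z

def depth_from_disparity(u1, x1, u2, x2, gap):
    """Recover the point's depth from two elemental-image samples:
    the two rays cross where the 3-D image forms."""
    return gap * (x1 - x2) / ((u1 - x1) - (u2 - x2))

# Assumed toy numbers: 1 mm lenslet pitch, 3 mm gap, point 50 mm away.
gap, X, Z = 3.0, 0.4, 50.0
x1, x2 = 0.0, 1.0                # two neighbouring pinholes
u1 = project(X, Z, x1, gap)
u2 = project(X, Z, x2, gap)
print(depth_from_disparity(u1, x1, u2, x2, gap))  # recovers ~50.0
```

Reversing this projection for every pixel of every elemental image is what the display stage does optically when the crossed rays reconstruct the 3-D image.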
  • In addition, 3-D images produced in direct camera pickup II are pseudoscopic (depth-reversed) images, which may make II systems more complex and less practical.
  • Advancements in the art are needed to increase viewing angles and improve image quality. Also needed are ways to display images of large objects that are far from the pickup device. Additionally needed advancements include the ability to project 3-D images to a large display screen.
  • BRIEF DESCRIPTION OF THE INVENTION
  • A method disclosed herein relates to displaying three-dimensional images. The method comprises projecting integral images to a display device and displaying three-dimensional images with the display device.
  • Further disclosed herein is a method that relates to controlling the depth of 3-D images when recording and displaying 3-D images. The method comprises magnifying elemental images during pickup, projecting the magnified elemental images via an optics relay to a display device, and displaying 3-D images within the depth-of-focus of the display device while maintaining lateral image sizes.
  • Further disclosed herein is a method that relates to controlling the depth of 3-D images when recording and displaying 3-D images with planar pickup and planar display devices. The method comprises positioning an optical path-length-equalizing (OPLE) lens adjacent to a planar lenslet array, projecting 3-D images via an optics relay to a planar display device, and displaying 3-D images within the depth-of-focus of the display device.
  • Further disclosed herein is a method that relates to recording and displaying 3-D images. The method comprises generating elemental images with a micro-lenslet array, increasing the disparity of the elemental images with an optical path-length-equalizing (OPLE) lens, and recording the elemental images on an imaging sensor of a recording device. The method further comprises projecting 3-D images through an optical relay to a display device and displaying the 3-D images within the depth-of-focus of the display device.
  • Further disclosed herein is an apparatus for displaying orthoscopic 3-D images. The apparatus comprises a projector for projecting integral images and a micro-convex-mirror array for displaying the projected images.
  • BRIEF DESCRIPTION OF THE DRAWINGS
  • Embodiments will now be described, by way of example only, with reference to the accompanying drawings which are meant to be exemplary, not limiting, and wherein like elements are numbered alike in several Figures, in which:
  • FIGS. 1 a, 1 b, and 1 c are side views of integral imaging (II) arrangements using planar devices;
  • FIGS. 2 a, 2 b, and 2 c are side views of projection integral imaging (PII) arrangements using planar devices;
  • FIGS. 3 a, 3 b, 3 c, 3 d, 3 e, and 3 f are side views of non-linear depth control arrangements using curved devices;
  • FIGS. 4 a and 4 b are side views of modified pick up systems;
  • FIGS. 5 a, 5 b, and 5 c show divergent projection methods;
  • FIG. 6 a shows examples of objects to be imaged;
  • FIG. 6 b shows a modified pick up lens system attached to a digital camera;
  • FIG. 7 shows a top view of optical set up for 3D image display which includes a micro-convex mirror array;
  • FIGS. 8 a, 8 b, 8 c and 8 d show center parts of elemental images;
  • FIG. 9 shows reconstructed orthoscopic virtual 3-D images when an optical path-length-equalizing (OPLE) lens was not used; and
  • FIG. 10 shows reconstructed orthoscopic virtual 3-D images when an OPLE lens was used.
  • DETAILED DESCRIPTION
  • Methods and devices to control the depth and lateral size of reconstructed 3-D images are disclosed. These methods and devices may be used with a novel “Projection” Integral Imaging (PII) system for example.
  • One described technique allows pick up of large 3-D objects that may be far away, and also allows the display of their demagnified 3-D images within the depth-of-focus of II systems. It is shown that curved pickup devices (i.e., a curved 2-D image sensor and a curved lenslet array) or curved display devices or both may be used for this purpose. When the lenslets in the curved array have a zooming capability, a linear depth control is additionally possible.
  • Two exemplary methods are discussed below, both alone and in combination. In the experiments demonstrating the feasibility of the method, planar devices were used (lenslet array, sensor, and display). An additional large-aperture negative lens, also referred to herein as an optical path-length-equalizing (OPLE) lens, is placed in contact with the pickup lenslet array.
  • It should be noted that in this disclosure the term “recording” is used interchangeably with “pickup” and the term “reconstruction” is used interchangeably with “display.”
  • Review of Integral Imaging
  • Conventional Integral Imaging (CII)
  • In CII, planar lenslet arrays with positive focal lengths have been used as depicted in FIG. 1.
  • As depicted in FIG. 1(a), a set of elemental images 1 of a 3-D object 2 (i.e., direction and intensity information of the spatially sampled rays coming from the object) may be obtained by use of a lenslet array 3 and a 2-D image sensor 4 such as a charged coupled device (CCD) or a complementary metal oxide semiconductor (CMOS) image sensor. As depicted in FIG. 1(b), to reconstruct a 3-D image 7 of the object 2, the set of 2-D elemental images 1 may be displayed in front of a lenslet array 3 using a 2-D display panel 5, such as a liquid crystal display (LCD) panel.
  • Further, with reference to FIGS. 1 a and 1 b, in one example, the lenslet array 3 with focal length ƒ may be positioned at z=0, and the display panel at z=−g. From the Gauss lens law: 1/g+1/Li=1/ƒ, (1)
    it is shown that the gap distance g should be Liƒ/(Li−ƒ)≡gr, where it may be assumed that 3-D real images 7 are formed around z=Li. The rays coming from elemental images converge through the lenslet array 3 to form a 3-D real image. The reconstructed 3-D image may be a pseudoscopic (depth-reversed) real image 7 of the object 2. To convert the pseudoscopic image to an orthoscopic image, a process that rotates every elemental image by 180 degrees around its own central optic axis may be used. The orthoscopic image becomes a virtual image 8 by this pseudoscopic-to-orthoscopic (P/O) conversion process. Also, as shown in FIG. 1 c, when the 3-D virtual image 8 is formed around z=−Li, the gap distance g should be Liƒ/(Li+ƒ)≡gv for optimal focusing from Eq. (1).
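As a quick numerical check of Eq. (1), the sketch below computes both gap distances. The focal length and image distance are assumed example values, not taken from the disclosure.

```python
# Gap distances from the Gauss lens law, 1/g + 1/L_i = 1/f (Eq. 1).
# g_r is the gap for a real image at z = L_i; g_v for a virtual image at z = -L_i.

def gap_real(f, L_i):
    # g_r = L_i * f / (L_i - f)
    return L_i * f / (L_i - f)

def gap_virtual(f, L_i):
    # g_v = L_i * f / (L_i + f)
    return L_i * f / (L_i + f)

# Example (assumed) values: f = 3 mm lenslets, image plane at L_i = 50 mm.
f, L_i = 3.0, 50.0
g_r, g_v = gap_real(f, L_i), gap_virtual(f, L_i)
# Because L_i >> f, both gaps are close to f, as noted for Eq. (1).
print(g_r, g_v)
```

Note that gr is slightly longer than ƒ and gv slightly shorter, consistent with gr≈gv≈ƒ when Li>>ƒ.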
  • Projection Integral Imaging (PII)
  • Projection integral imaging (PII) is the novel subject of this invention and was first developed by the present inventors.
  • As shown in FIG. 2 a, in PII, the process to obtain elemental images is not substantially different from that in CII. However, elemental images may be projected through relay optics 10 onto a lenslet array 3, as depicted in FIGS. 2(a) and (b). A micro-convex/concave-mirror array 11, 21 may be used as a projection screen, as depicted in FIGS. 2(c) and (d). When a lenslet array 3 with a positive focal length is used or a micro-concave-mirror array 11 is used, the in-focus plane of projected elemental images 12 may be positioned at z=−gr, as depicted in FIGS. 2(a) and (c). If P/O-converted elemental images are used to display 3-D orthoscopic virtual images 8, which are formed around z=−Li, the in-focus plane of projected elemental images 12 should be positioned at z=−gv.
  • When a lenslet array 9 with a negative focal length or a micro-convex-mirror array 21 is used, 3-D orthoscopic virtual images 8 may be displayed without the P/O conversion. For example, suppose that 3-D images 8 are formed around z=−Li and the focal length of the lenslet array 9 (or the micro-convex-mirror array 21) is −ƒ. Then, the gap distance g becomes Liƒ/(ƒ−Li)≡−gr from Eq. (1). Thus the in-focus plane of projected elemental images 12 may be positioned at z=+gr, as depicted in FIGS. 2(b) and (d). On the other hand, when 3-D real images 7 are displayed around z=Li, the in-focus plane of projected elemental images 12 may be positioned at z=+gv. Because Li>>ƒ in both PII and CII, gr≈gv≈ƒ.
  • Advantages of PII Over CII
  • PII allows for the following because of the use of a micro-convex-mirror array as a projection screen:
  • First, viewing angle is enhanced. In II, the full viewing angle ψ is limited and determined approximately by 2×arctan[0.5/(ƒ/#)], where ƒ/# is the ƒ number of the lenslet, when the fill factor of the lenslet array is close to 1.
  • Also, it is easier to make diffraction-limited (or aberration-free) convex mirrors with a small ƒ/# than it is to make similar lenslets. A convex mirror element may have an ƒ/# smaller than 1. For example, if ƒ/#=0.5, the viewing angle ψ becomes 90 degrees, which is acceptable for many practical applications.
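The viewing-angle formula above can be evaluated directly; a minimal sketch:

```python
import math

# Full viewing angle psi ~ 2 * arctan(0.5 / (f/#)), valid when the
# fill factor of the lenslet (or mirror) array is close to 1.
def full_viewing_angle_deg(f_number):
    return math.degrees(2.0 * math.atan(0.5 / f_number))

# A convex mirror with f/# = 0.5 yields the 90-degree angle cited above;
# an f/1 element yields roughly 53 degrees.
print(full_viewing_angle_deg(0.5))  # 90.0
print(full_viewing_angle_deg(1.0))
```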
  • Second, the P/O conversion is unnecessary, if a positive lenslet array is used for direct camera pickup.
  • Third, it is easy to realize 3-D movies with large screens even if a small size of display panels or film is used. This is because the display panel and the screen are separated, and thus the size of elemental images that are projected onto the screen can be controlled easily by use of the relay optics.
  • Fourth, flipping-free observations of 3-D images are possible even if optical barriers are not used. This is because each elemental image can be projected only onto its corresponding micro-convex mirror.
  • Fifth, it is easy to implement spatial multiplexing or temporal multiplexing or both in PII. To display 3-D images with high resolution and large depth-of-focus, the number of pixels in the entire set of elemental images should be sufficiently large. Because display panels that are currently available or expected in the near future cannot meet such a requirement, spatial multiplexing or temporal multiplexing or both may be needed to display the entire set of high-resolution elemental images.
  • In the experiments below, PII was used using a micro-convex-mirror array screen. However, this disclosure is not limited only to use of the structures used in these exemplary embodiments and experiments below.
  • Longitudinal Depth Control of 3-D Images
  • 3-D images reconstructed in II systems may have a limited depth-of-focus δ. It has been shown that δ cannot be larger than 1/(λρ²), where λ is the display wavelength and ρ is the resolution of reconstructed 3-D images, defined as the inverse of the reconstructed image spot size. In PII, 3-D images with high resolution can be reconstructed only near the projection screen of micro-convex-mirror arrays (or the display lenslet array). Thus the depth-of-focus δ should be measured from the projection screen.
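A minimal sketch of the depth-of-focus bound; the wavelength and spot size below are assumed illustrative values, not figures from the disclosure.

```python
# Depth-of-focus bound: delta <= 1 / (lambda * rho**2), with rho the
# resolution of the reconstructed image (inverse of the image spot size).
def depth_of_focus_bound_mm(wavelength_mm, spot_size_mm):
    rho = 1.0 / spot_size_mm            # resolution in 1/mm
    return 1.0 / (wavelength_mm * rho ** 2)

# Illustrative (assumed) numbers: green light, 0.17 mm reconstructed spot.
delta = depth_of_focus_bound_mm(550e-6, 0.17)
print(delta)  # a few tens of millimetres
```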
  • Suppose that one is trying to pick up an object positioned beyond the depth-of-focus range. Specifically, the front surface of the object, whose longitudinal thickness is T, is positioned at z=zo>δ. When the focal lengths of the pickup lenslets and the micro-convex mirrors in the projection screen are equal in magnitude, a 3-D image is reconstructed either at z=zo for real image display or at z=−zo for virtual image display. Thus, in this example, a focused 3-D image cannot be displayed because the image position is beyond the range of depth-of-focus. Therefore, control of the depth (and thus position) of the reconstructed 3-D integral images is needed so that they can be reconstructed near the screen, i.e., within the depth-of-focus.
  • Linear Depth Control by Zooming the Elemental Images
  • If the focal length of the pickup lenslet array ƒp is longer than that of the display micro-convex-mirror array ƒd, the longitudinal scale of reconstructed image space is reduced linearly by a factor of ƒd/ƒp≡r while the lateral scale does not change. So if (zo+T)r<δ, the 3-D reconstructed image is well focused.
  • One solution to pick up objects at various longitudinal positions and display their images within the depth-of-focus of II systems, therefore, is to use a pickup lenslet array with a variable focal length ƒp or an array of micro-zoom lenses. If ƒp is increased by a factor of α, every elemental image is also magnified by that factor, according to geometrical optics. Therefore, digital zoom-in can be used, even if ƒp is fixed. In other words, by digitally magnifying every elemental image in a computer by a factor of α, r can be changed as r=ƒd/(αƒp). (2)
  • Then, an orthoscopic virtual image is reconstructed at z=−rzo for the object positioned at z=zo in the pickup process.
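As a numeric sketch of Eq. (2): the focal lengths below are those used in the experimental section, while the object distance is an assumed value for illustration.

```python
# Linear depth control, Eq. (2): r = f_d / (alpha * f_p).
# An object picked up at z = z_o reconstructs (orthoscopic, virtual) at z = -r*z_o.
def depth_scale(f_d, f_p, alpha=1.0):
    return f_d / (alpha * f_p)

# Focal lengths from the experimental section: f_p = 3 mm, f_d = 0.75 mm.
f_p, f_d = 3.0, 0.75
for alpha in (1.0, 1.5, 2.0, 2.5):
    r = depth_scale(f_d, f_p, alpha)
    z_o = 200.0                      # mm, assumed object distance
    print(alpha, r, -r * z_o)        # r = 1/4, 1/6, 1/8, 1/10
```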
  • Digital zoom-in may degrade the resolution of elemental images. When zo→∞ and the object is very large, a nonlinear depth control method may be used.
  • Nonlinear Depth Control using Curved Pickup Devices
  • For a large object that is far away, elemental images are almost identical because the parallax of the object is small for neighboring pickup lenslets. When such elemental images are displayed in the II system, the reconstructed image may be seriously blurred and not easily seen. Curved pickup devices (e.g., a curved lenslet array 17 and a curved 2-D image sensor 18) with a radius of curvature R may be used, and 3-D images may then be reconstructed using planar display devices, as depicted in FIGS. 3(a) and (b), respectively. Similarly, planar pickup devices (e.g., a planar image sensor 14 and a planar lenslet array 16) and curved display devices (e.g., a curved display panel 19 and a curved lenslet array 20) may be used, as depicted in FIGS. 3(c) and (d), respectively. The following sign convention is used: R>0 when the center of curvature is positioned on the same side as the object (observer 6) in the pickup (display) process; and R<0 when it is positioned on the opposite side.
  • The use of a negatively curved pickup lenslet array increases disparity of neighboring elemental images. This is because pickup directions of the lenslets in a curved array are not parallel and thus their fields of view are more separated than those for a planar array. Such elemental images may also be obtained if the object of a reduced size near the pickup lenslet array is picked up. Therefore, when elemental images with increased disparity are displayed on a planar display screen (a micro-convex-mirror array), an integral image with a reduced size is reconstructed near the screen. By controlling R, 3-D images of large objects that are far away can be displayed within the depth-of-focus of the II system.
  • The effect of depth and size reduction using the negatively curved pickup lenslet array can be analyzed by introducing a hypothetical thin lens with a negative focal length −Rp, which is in contact with the planar pickup lenslet array 16, as depicted in FIG. 3(e). This is because the ray propagation behaviors for the two setups in FIGS. 3(a) and 3(e), and those in FIGS. 3(d) and 3(f), are the same, respectively. We call this lens an optical path-length-equalizing (OPLE) lens 15. When two thin lenses with focal lengths ƒ1 and ƒ2 are in contact, the effective focal length becomes ƒ1ƒ2/(ƒ1+ƒ2). To get complete equivalence between the two setups, the focal length of the lenslet array 16 that is in contact with the OPLE lens 15 should be ƒp e=Rpƒp/(Rp+ƒp), where ƒp is the focal length of the curved pickup lenslets 17. In general, Rp>>ƒp and thus ƒp e≈ƒp. Therefore, instead of using the curved pickup lenslet array 17 with a radius of curvature −Rp and a focal length ƒp, and a curved image sensor 18 in the analysis, a planar lenslet array 16 with a focal length ƒp e, a flat image sensor 14, and the pickup OPLE lens 15 with a focal length −Rp may be used.
  • The OPLE lens 15 first produces images of objects, and then the images are actually picked up by the planar pickup devices 14, 16 to produce elemental images with increased disparity. For an object positioned at z=zo(>0), the OPLE lens 15 produces its image according to Eq. (1) at z=Rpzo/(Rp+zo)≡zi. (3)
  • As zo varies from ∞ to 0, zi changes from Rp to 0. When the elemental images with increased disparity are projected onto a planar micro-convex-mirror array screen, a virtual image is reconstructed at z=−zi if ƒd=ƒp. Therefore, Rp should be shorter than the depth-of-focus of the II system. The lateral magnification of the OPLE lens is given by zi/zo(<1) according to geometrical optics.
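Equation (3) can be evaluated directly. The sketch below uses the 33 cm OPLE focal-length magnitude and the 20 cm / 70 m object distances reported in the experimental section.

```python
# OPLE-lens image position, Eq. (3): z_i = R_p * z_o / (R_p + z_o).
def ople_image_position(R_p, z_o):
    return R_p * z_o / (R_p + z_o)

R_p = 33.0  # cm, focal-length magnitude of the OPLE lens used in the experiments
# Nearby object (20 cm) vs. very distant object (70 m = 7000 cm):
# z_i saturates below R_p, which is why R_p must not exceed the depth-of-focus.
print(ople_image_position(R_p, 20.0))     # ~12.45 cm
print(ople_image_position(R_p, 7000.0))   # ~32.85 cm, approaching R_p
```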
  • As shown in FIG. 3(d), the effect of depth and size reduction can also be achieved by use of negatively curved display devices 19, 20. Suppose that curved display devices 19, 20 with a radius of curvature −Rd are used, while elemental images are obtained by use of planar pickup devices 14, 16. As before, a hypothetical display OPLE lens 15 is introduced to planar display devices (e.g., planar lenslet array 16 and planar display panel 22). Then, an orthoscopic virtual image of the object is reconstructed at z=−Rdzo/(Rd+zo) (4)
  • for the object positioned at z=zo(>0) in the pickup process, if ƒd=ƒp.
  • Combination of Linear and Nonlinear Depth Control Methods
  • In general, both linear and nonlinear depth control methods may be used together. For an object positioned at z=zo, the position of the reconstructed image can be predicted from the equivalent planar pickup 14, 16 and display 22, 16 devices with OPLE lenses. The pickup OPLE lens produces an image of the object at z=zi, where zi is given in Eq. (3). From this image, elemental images with increased disparity are obtained and then digitally zoomed-in. Then, the planar display lenslet array 16 produces an intermediate reconstructed image at z=−rzi, where r is given in Eq. (2). Because of the display OPLE lens 15, from the Gauss lens law the final reconstructed image is obtained at z=−zr, where zr=rRpRdzo/[(rRp+Rd)zo+RpRd]. (5)
  • As zo varies from ∞ to 0, zr changes from rRpRd/(rRp+Rd) to 0.
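A minimal numerical sketch of Eq. (5) confirms the limiting behavior described above; the parameter values here are assumed samples, not experimental data.

```python
# Combined linear + nonlinear depth control, Eq. (5):
# z_r = r*R_p*R_d*z_o / ((r*R_p + R_d)*z_o + R_p*R_d)
def reconstructed_position(r, R_p, R_d, z_o):
    return r * R_p * R_d * z_o / ((r * R_p + R_d) * z_o + R_p * R_d)

# As z_o -> infinity, z_r -> r*R_p*R_d / (r*R_p + R_d).  Assumed sample values:
r, R_p, R_d = 0.25, 20.0, 50.0   # cm
limit = r * R_p * R_d / (r * R_p + R_d)
print(reconstructed_position(r, R_p, R_d, 1e9), limit)
```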
  • Other System Factors that Influence 3-D Image Depth and Size
  • The use of a Modified Pickup System
  • Because the physical size of the 2-D image sensor 14 is smaller than that of the pickup lenslet array 3, a modified pickup system is usually used, as depicted in FIG. 4(a). Here, elemental images formed by a planar lenslet array 3 are detected through a camera lens 25 with a large ƒ/#. The use of such a camera lens 25 with the planar pickup lenslet array 3 produces the effect of a negatively curved pickup lenslet array, because the disparity of elemental images increases. This effect is taken into account by considering the modified pickup system as a curved pickup system with a curved lenslet array whose radius of curvature is −Rc. Rc approximately equals the distance between the planar pickup lenslet array and the camera lens.
  • Therefore, if elemental images are detected through a camera lens 25 when a curved pickup lenslet array 26 with the radius of curvature Rp is used, as depicted in FIG. 4(b), the actual radius of curvature of the pickup lenslet array 26 is considered to be Rp e=RcRp/(Rc+Rp). (6)
  • This is treated in this experiment as the equivalent of planar pickup devices (14, 16) with two OPLE lenses (27, 28). In this case, we replace Rp with Rp e in Eq. (5).
  • Diverging Projection of Elemental Images
  • As depicted in FIG. 5(a), when elemental images are projected onto a lenslet array 3 screen, the projection beam angle θ (e.g., in the azimuthal direction) may not be negligible. In this case, the effect of negatively curved display devices naturally exists even if planar display devices are used. Suppose that the horizontal size of the overall projected elemental images on the screen is S. Then, one can consider the planar display devices as curved display devices with a radius of curvature −Rs≈−S/θ if the aperture size of the relay optics is much smaller than S. In fact, Rs is approximately equal to the distance between the planar projection screen and the relay optics.
  • Suppose that such a diverging projection system is used with a negatively curved lenslet array 30 with the radius of curvature −Rd, as depicted in FIG. 5(b), or with a negatively curved micro-convex-mirror array 31, as in FIG. 5(c). The actual radius of curvature of the display screen in the equivalent non-diverging system is: Rd e=RsRd/(Rs+Rd). (7)
  • In this case, one would have to replace Rd with Rd e in Eq. (5).
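Equations (6) and (7) share the same combination form; a small sketch, using values from the experimental section, illustrates both (with R=∞ standing for a planar device):

```python
# Eqs. (6) and (7) both combine radii in "parallel":
# R_eff = R_a * R_b / (R_a + R_b), where R = inf denotes a planar device.
def equivalent_radius(R_a, R_b):
    if R_a == float('inf'):
        return R_b
    if R_b == float('inf'):
        return R_a
    return R_a * R_b / (R_a + R_b)

# Experimental values: R_c ~ 20 cm (camera lens distance), R_p = 33 cm (OPLE lens),
# and a planar screen (R_d = inf) with divergence-induced R_s ~ 50 cm.
print(equivalent_radius(20.0, 33.0))          # R_p^e ~ 12.5 cm
print(equivalent_radius(float('inf'), 50.0))  # R_d^e = 50 cm
```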
  • Experiments
  • System Description
  • The object to be imaged is composed of small cacti 35 and a large building 36, as shown in FIG. 6(a). The distance between the pickup lenslet array and the cacti 35 is approximately 20 cm, and that between the pickup lenslet array and the building is approximately 70 m. Because curved pickup devices were not available for this experiment, elemental images were obtained by use of a planar 2-D image sensor and a planar lenslet array in contact with a large-aperture negative lens as an OPLE lens. The focal length and the diameter of the negative lens are 33 cm (=Rp) and 7 cm, respectively. The planar pickup lenslet array used is made from acrylic and has 53×53 plano-convex lenslets. Each lenslet element is square-shaped and has a uniform base size of 1.09 mm×1.09 mm, with less than 7.6 μm separating the lenslet elements. The focal length of the lenslets is approximately 3 mm (=ƒp). A total of 48×36 elemental images are used in the experiments.
  • A digital camera 37 with 4500×3000 CMOS pixels was used for the 2-D image sensor. The camera pickup system 37 is shown in FIG. 6(b). In this modified pickup system, Rc≈20 cm. From Eq. (6), Rp e=Rc=20 cm, when the OPLE lens is not used; and Rp e=12.5 cm, when the OPLE lens is used.
  • The linear depth reduction method was also used in combination with the nonlinear method. To avoid resolution degradation caused by digital zoom-in, the resolution of the zoomed-in elemental images was kept higher than that of the LCD projector. Four different α's are used: α1=1, α2=1.5, α3=2, and α4=2.5. A planar micro-convex-mirror array for the projection screen was obtained by coating the convex surface of a lenslet array that is identical to the pickup lenslet array. The light intensity reflectance of the screen is more than 90%. The focal length of each micro-convex mirror is 0.75 mm (=ƒd) in magnitude. Because ƒp=3 mm, the linear depth squeezing rates are r1=1/4, r2=1/6, r3=1/8, and r4=1/10 from Eq. (2) for α1, α2, α3, and α4, respectively.
  • The setup for 3-D image reconstruction is depicted in FIG. 7. A color LCD projector 40 that has 3 (red, green, and blue) panels was used for elemental image projection. Each panel has 1024×768 square pixels with a pixel pitch of 18 μm. Each elemental image has approximately 21×21 pixels on average. The magnification of the relay optics 41 is 2.9. The diverging angle of the projection beam θ is approximately 6 degrees in the azimuthal direction, so a slight curved-display effect exists even though planar display devices are used. The distance between the screen and the relay optics is approximately 48 cm. Because S=52.3 mm, Rs≈50 cm. From Eq. (7), Rd e=50 cm, because Rd=∞ in the experiments.
  • The position of the cacti 35 is denoted by zoc (=20 cm) and that of the building 36 by zob (=70 m). For different r's and Rp e's, one can estimate the position of the reconstructed image for the cacti, z=−zrc, and that for the building, z=−zrb, from Eq. (5). They are listed in Table 1.
    TABLE 1
    Estimated Positions of Reconstructed Imagesa

    Rp e (cm)        20                        12.5
    r           1/4   1/6   1/8   1/10    1/4   1/6   1/8   1/10
    zrc (cm)    2.38  1.61  1.22  0.98    1.85  1.25  0.94  0.76
    zrb (cm)    4.53  3.12  2.37  1.92    2.94  2.00  1.51  1.22

    aOther parameters: Rd e = 50 cm; zoc = 20 cm; and zob = 70 m.
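As a cross-check, the entries of Table 1 follow directly from Eq. (5) with the footnoted parameters:

```python
# Reproduce Table 1 from Eq. (5) with R_d^e = 50 cm, z_oc = 20 cm, z_ob = 7000 cm.
def z_r(r, R_p, R_d, z_o):
    return r * R_p * R_d * z_o / ((r * R_p + R_d) * z_o + R_p * R_d)

R_d = 50.0
for R_p in (20.0, 12.5):
    for denom in (4, 6, 8, 10):
        r = 1.0 / denom
        print(R_p, denom,
              round(z_r(r, R_p, R_d, 20.0), 2),     # cactus position z_rc
              round(z_r(r, R_p, R_d, 7000.0), 2))   # building position z_rb
```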
  • Experimental Results
  • Center parts of elemental images that were obtained without the OPLE lens and those obtained with the OPLE lens are shown in FIGS. 8(a) and 8(b), respectively. When α=2.5, digitally zoomed-in elemental images for those in FIGS. 8(a) and 8(b) are illustrated in FIGS. 8(c) and 8(d), respectively. One can see that the OPLE lens increases disparity between neighboring elemental images.
  • When elemental images are projected onto the planar micro-convex-mirror array, 3-D orthoscopic virtual images are reconstructed. The measured viewing angle was 60˜70 degrees, which agrees well with the predicted value. To observers who move beyond the viewing angle range, the entire reconstructed image disappears. Higher-order reconstructed images were hardly observed for a well-aligned system. Left, center, and right views of reconstructed 3-D images for different depth control parameters are illustrated in FIGS. 9 and 10. The observed positions of the reconstructed images agree qualitatively with the estimated positions given in Table 1. Comparing the images shown in FIGS. 9 and 10, one can see that smaller 3-D images are reconstructed for shorter Rp e. As r decreases, reconstructed 3-D images squeeze further in the longitudinal direction and thus disparity between left and right views reduces. The lateral size of reconstructed 3-D images is independent of r. Reconstructed 3-D images at deeper positions are more blurred because the depth-of-focus of the PII system is limited, which is estimated to be 5 cm approximately.
  • Binocular parallax is the most effective depth cue at medium viewing distances. In general, our depth control method degrades the solidity of reconstructed 3-D images because, for distant objects, it squeezes their longitudinal depth more than their lateral size. However, human vision also uses other depth cues, and binocular parallax is less effective at long viewing distances. Therefore, our nonlinear position control method can be used efficiently in large-scale 3-D display systems with limited depth-of-focus. Nevertheless, efforts to enhance the depth-of-focus of II systems should be pursued.
  • In conclusion, at least a method, apparatus, and system to control the depth and lateral size of reconstructed 3-D images in II have been presented, in which a curved pickup lenslet array or a curved micro-convex-mirror (display lenslet) array or both may be used. When lenslets in the curved array have a zooming capability, linear depth control is additionally possible. Using both control methods, it has been shown that large objects at far distances can be reconstructed efficiently by an II system with limited depth-of-focus. This control will be useful for the realization of 3-D television, video, and movies based on II.
  • Some embodiments of the invention have the following advantages: imaging is performed with direct pickup to create true 3-D image formations with full parallax and continuous viewing points with incoherent light using two-dimensional display devices resulting in orthoscopic images with wide viewing angles, large depth of focus and high resolution. Additional advantages include the ability to project 3-D images to a large display screen.
  • One of ordinary skill in the art can appreciate that a computer or other client or server device can be deployed as part of a computer network, or in a distributed computing environment. In this regard, the methods and apparatus described above and/or claimed herein pertain to any computer system having any number of memory or storage units, and any number of applications and processes occurring across any number of storage units or volumes, which may be used in connection with the methods and apparatus described above and/or claimed herein. Thus, the same may apply to an environment with server computers and client computers deployed in a network environment or distributed computing environment, having remote or local storage. The methods and apparatus described above and/or claimed herein may also be applied to standalone computing devices, having programming language functionality, interpretation and execution capabilities for generating, receiving and transmitting information in connection with remote or local services.
  • The methods and apparatus described above and/or claimed herein is operational with numerous other general purpose or special purpose computing system environments or configurations. Examples of well known computing systems, environments, and/or configurations that may be suitable for use with the methods and apparatus described above and/or claimed herein include, but are not limited to, personal computers, server computers, hand-held or laptop devices, multiprocessor systems, microprocessor-based systems, network PCs, minicomputers, mainframe computers, distributed computing environments that include any of the above systems or devices.
  • The methods described above and/or claimed herein may be described in the general context of computer-executable instructions, such as program modules, being executed by a computer. Program modules typically include routines, programs, objects, components, data structures, etc. that perform particular tasks or implement particular abstract data types. Thus, the methods and apparatus described above and/or claimed herein may also be practiced in distributed computing environments such as between different units where tasks are performed by remote processing devices that are linked through a communications network or other data transmission medium. In a typical distributed computing environment, program modules and routines or data may be located in both local and remote computer storage media including memory storage devices. Distributed computing facilitates sharing of computer resources and services by direct exchange between computing devices and systems. These resources and services may include the exchange of information, cache storage, and disk storage for files. Distributed computing takes advantage of network connectivity, allowing clients to leverage their collective power to benefit the entire enterprise. In this regard, a variety of devices may have applications, objects or resources that may utilize the methods and apparatus described above and/or claimed herein.
  • Computer programs implementing the method described above will commonly be distributed to users on a distribution medium such as a CD-ROM. The program could be copied to a hard disk or a similar intermediate storage medium. When the programs are to be run, they will be loaded either from their distribution medium or their intermediate storage medium into the execution memory of the computer, thus configuring a computer to act in accordance with the methods and apparatus described above.
  • The term “computer-readable medium” encompasses all distribution and storage media, memory of a computer, and any other medium or device capable of storing for reading by a computer a computer program implementing the method described above.
  • Thus, the various techniques described herein may be implemented in connection with hardware or software or, where appropriate, with a combination of both. Thus, the methods and apparatus described above and/or claimed herein, or certain aspects or portions thereof, may take the form of program code or instructions embodied in tangible media, such as floppy diskettes, CD-ROMs, hard drives, or any other machine-readable storage medium, wherein, when the program code is loaded into and executed by a machine, such as a computer, the machine becomes an apparatus for practicing the methods and apparatus described above and/or claimed herein. In the case of program code execution on programmable computers, the computing device will generally include a processor, a storage medium readable by the processor, which may include volatile and non-volatile memory and/or storage elements, at least one input device, and at least one output device. One or more programs that may utilize the techniques of the methods and apparatus described above and/or claimed herein, e.g., through the use of data processing, may be implemented in a high-level procedural or object-oriented programming language to communicate with a computer system. However, the program(s) can be implemented in assembly or machine language, if desired. In any case, the language may be a compiled or interpreted language, and combined with hardware implementations.
  • The methods and apparatus described above and/or claimed herein may also be practiced via communications embodied in the form of program code that is transmitted over some transmission medium, such as over electrical wiring or cabling, through fiber optics, or via any other form of transmission, wherein, when the program code is received and loaded into and executed by a machine, such as an EPROM, a gate array, a programmable logic device (PLD), a client computer, or a receiving machine having the signal processing capabilities as described in exemplary embodiments above becomes an apparatus for practicing the method described above and/or claimed herein. When implemented on a general-purpose processor, the program code combines with the processor to provide a unique apparatus that operates to invoke the functionality of the methods and apparatus described above and/or claimed herein. Further, any storage techniques used in connection with the methods and apparatus described above and/or claimed herein may invariably be a combination of hardware and software.
  • The operations and methods described herein may be capable of or configured to be or otherwise adapted to be performed in or by the disclosed or described structures.
  • While the methods and apparatus described above and/or claimed herein have been described in connection with the preferred embodiments and the figures, it is to be understood that other similar embodiments may be used or modifications and additions may be made to the described embodiment for performing the same function of the methods and apparatus described above and/or claimed herein without deviating therefrom. Furthermore, it should be emphasized that a variety of computer platforms, including handheld device operating systems and other application specific operating systems are contemplated, especially given the number of wireless networked devices in use.
  • While the description above refers to particular embodiments, it will be understood that many modifications may be made without departing from the spirit thereof. The accompanying claims are intended to cover such modifications as would fall within the true scope and spirit of the present invention.

Claims (28)

1. A method of displaying three-dimensional images, the method comprising:
projecting integral images to a display device; and
displaying three-dimensional images with the display device.
2. The method of claim 1, further comprising:
relaying the images from a projector to the display device through relay optics.
3. The method of claim 2, further comprising:
converting pseudoscopic images to orthoscopic images with a lenslet array.
4. The method of claim 3, further comprising:
focusing orthoscopic images with a lenslet array with a positive focal length.
5. The method of claim 2, further comprising:
focusing orthoscopic images with a lenslet array with a negative focal length.
6. The method of claim 2, further comprising:
converting pseudoscopic images to orthoscopic images with a micro-concave-mirror array.
7. The method of claim 6, further comprising:
focusing orthoscopic images with a micro-concave-mirror array.
8. The method of claim 2, further comprising:
focusing orthoscopic images with a micro-convex-mirror array.
9. The method of claim 8, further comprising:
increasing a viewing angle by using the micro-convex-mirror array as a display screen.
10. The method of claim 8, further comprising:
diverging the projected images with the relay optics to a large micro-convex-mirror array.
11. The method of claim 8, further comprising:
projecting each elemental image to a unique micro-convex mirror.
12. The method of claim 8, further comprising:
multiplexing the projected images temporally.
13. The method of claim 8, further comprising:
multiplexing the projected images spatially.
14. The method of claim 8, further comprising:
multiplexing the projected images both temporally and spatially.
15. The method of claim 1, wherein the display device is planar.
16. A method of controlling the depth of 3-D images when recording and displaying 3-D images, comprising:
magnifying elemental images during pickup;
projecting the magnified elemental images through relay optics to a display device; and
displaying 3-D images within the depth-of-focus of the display device while maintaining lateral image sizes.
17. The method of claim 16, wherein the magnifying is performed digitally.
18. A method of controlling the depth of 3-D images when recording and displaying 3-D images with planar pickup and planar display devices, comprising:
positioning an optical path-length-equalizing (OPLE) lens adjacent to a planar lenslet array;
projecting 3-D images through relay optics to a planar display device; and
displaying 3-D images within the depth-of-focus of the display device.
19. The method of claim 18, further comprising:
positioning the OPLE lens adjacent to a planar lenslet array of a planar pickup device during pickup.
20. The method of claim 18, further comprising:
positioning the OPLE lens adjacent to a planar lenslet array of a planar display device during display.
21. The method of claim 18, further comprising:
magnifying elemental images during pickup.
22. A method of recording and displaying 3-D images, comprising:
generating elemental images with a micro-lenslet array;
increasing disparity of elemental images with an optical path-length-equalizing (OPLE) lens;
recording the elemental images on an imaging sensor of a recording device;
projecting 3-D images through an optical relay to a display device; and
displaying the 3-D images within the depth-of-focus of the display device.
23. The method of claim 22, further comprising:
controlling image depth linearly by digitally magnifying the elemental images while leaving their lateral size unaltered.
24. An apparatus for displaying orthoscopic 3-D images, comprising:
a projector for projecting integral images; and
a micro-convex-mirror array for displaying the projected images.
25. The apparatus of claim 24, further comprising:
relay optics for relaying the projected integral images therethrough to the micro-convex-mirror array.
26. The apparatus of claim 24, wherein the projector is a two-dimensional (2-D) projector.
27. The apparatus of claim 24, further comprising:
a planar micro-convex-mirror array for displaying the projected images.
28. The apparatus of claim 24, further comprising:
a curved micro-convex-mirror array for displaying the projected images.
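Claims 17 and 23 describe controlling displayed depth by digitally magnifying the elemental images while the lateral image size stays fixed. As a rough illustration only, not the patent's actual implementation, the sketch below magnifies each elemental image about its center and crops back to the original pixel grid, so the elemental-image (and hence lateral) size is unchanged; the array shapes, magnification factor, and nearest-neighbor resampling are all assumptions for this toy example.

```python
import numpy as np

def magnify_elemental_image(img, k):
    """Digitally magnify one elemental image by factor k about its
    center, then crop back to the original pixel grid so the lateral
    size of the displayed image is unchanged (cf. claims 17 and 23).
    Nearest-neighbor resampling keeps the sketch dependency-free."""
    h, w = img.shape
    # For each output pixel, sample the source at 1/k of its distance
    # from the image center; magnification k > 1 pushes content outward,
    # and the crop back to h x w happens implicitly via index clipping.
    ys = np.clip(np.rint((np.arange(h) - h / 2) / k + h / 2).astype(int), 0, h - 1)
    xs = np.clip(np.rint((np.arange(w) - w / 2) / k + w / 2).astype(int), 0, w - 1)
    return img[np.ix_(ys, xs)]

# Toy 3 x 3 array of 32 x 32-pixel elemental images.
rng = np.random.default_rng(0)
elemental = rng.random((3, 3, 32, 32))
k = 1.5  # displayed depth scales roughly linearly with k (claim 23)
magnified = np.array(
    [[magnify_elemental_image(e, k) for e in row] for row in elemental]
)
assert magnified.shape == elemental.shape  # lateral size unaltered
```

Because only pixel content is resampled and the elemental-image grid is untouched, the same relay-optics projection path can display the magnified set, shifting the reconstructed depth without rescaling the image laterally.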
US11/498,666 2005-08-08 2006-08-03 Depth and lateral size control of three-dimensional images in projection integral imaging Abandoned US20070030543A1 (en)

Priority Applications (3)

Application Number Priority Date Filing Date Title
US11/498,666 US20070030543A1 (en) 2005-08-08 2006-08-03 Depth and lateral size control of three-dimensional images in projection integral imaging
US12/939,647 US8264772B2 (en) 2005-08-08 2010-11-04 Depth and lateral size control of three-dimensional images in projection integral imaging
US13/596,715 US20120320161A1 (en) 2005-08-08 2012-08-28 Depth and lateral size control of three-dimensional images in projection integral imaging

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
US70628105P 2005-08-08 2005-08-08
US11/498,666 US20070030543A1 (en) 2005-08-08 2006-08-03 Depth and lateral size control of three-dimensional images in projection integral imaging

Related Child Applications (1)

Application Number Title Priority Date Filing Date
US12/939,647 Division US8264772B2 (en) 2005-08-08 2010-11-04 Depth and lateral size control of three-dimensional images in projection integral imaging

Publications (1)

Publication Number Publication Date
US20070030543A1 true US20070030543A1 (en) 2007-02-08

Family

ID=37727940

Family Applications (3)

Application Number Title Priority Date Filing Date
US11/498,666 Abandoned US20070030543A1 (en) 2005-08-08 2006-08-03 Depth and lateral size control of three-dimensional images in projection integral imaging
US12/939,647 Active US8264772B2 (en) 2005-08-08 2010-11-04 Depth and lateral size control of three-dimensional images in projection integral imaging
US13/596,715 Abandoned US20120320161A1 (en) 2005-08-08 2012-08-28 Depth and lateral size control of three-dimensional images in projection integral imaging

Family Applications After (2)

Application Number Title Priority Date Filing Date
US12/939,647 Active US8264772B2 (en) 2005-08-08 2010-11-04 Depth and lateral size control of three-dimensional images in projection integral imaging
US13/596,715 Abandoned US20120320161A1 (en) 2005-08-08 2012-08-28 Depth and lateral size control of three-dimensional images in projection integral imaging

Country Status (4)

Country Link
US (3) US20070030543A1 (en)
CN (1) CN101278565A (en)
DE (1) DE112006002095T5 (en)
WO (1) WO2007019347A2 (en)


Families Citing this family (20)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
ES2562924T3 (en) 2008-10-10 2016-03-09 Koninklijke Philips N.V. A parallax information processing procedure comprised in a signal
JP2013502617A (en) * 2009-08-25 2013-01-24 ドルビー ラボラトリーズ ライセンシング コーポレイション 3D display system
US20110075257A1 (en) 2009-09-14 2011-03-31 The Arizona Board Of Regents On Behalf Of The University Of Arizona 3-Dimensional electro-optical see-through displays
DE102012212088A1 (en) * 2012-07-11 2014-05-22 Bayerische Motoren Werke Aktiengesellschaft Image-forming unit
US9341853B2 (en) * 2012-08-14 2016-05-17 Young Optics Inc. Stereo image displaying system and stereo image capturing system
KR101984701B1 (en) * 2012-11-13 2019-05-31 삼성전자주식회사 3D image dispaly apparatus including electrowetting lens array and 3D image pickup apparatus including electrowetting lens array
KR101294261B1 (en) * 2013-01-08 2013-08-06 동서대학교산학협력단 Three dimensional interal imagine display-using mask and time-multiplexing
US8699868B1 (en) 2013-03-14 2014-04-15 Microsoft Corporation Anti-shake correction system for curved optical sensor
KR20150066901A (en) 2013-12-09 2015-06-17 삼성전자주식회사 Driving apparatus and method of a display panel
US9182605B2 (en) * 2014-01-29 2015-11-10 Emine Goulanian Front-projection autostereoscopic 3D display system
KR101617514B1 (en) 2014-04-16 2016-05-13 광운대학교 산학협력단 Multi-projection integral imaging method
CN105025284B (en) 2014-04-18 2019-02-05 北京三星通信技术研究有限公司 Demarcate the method and apparatus that integration imaging shows the display error of equipment
CN104407442A (en) * 2014-05-31 2015-03-11 福州大学 Integrated imaging 3D display micro-lens array and 3D manufacturing method thereof
CN104113750B (en) * 2014-07-04 2015-11-11 四川大学 A kind of integration imaging 3D projection display equipment without degree of depth reversion
DE102016113669A1 (en) 2016-07-25 2018-01-25 Osram Opto Semiconductors Gmbh Method for autostereoscopic imaging and autostereoscopic illumination unit
DE102016224162A1 (en) * 2016-12-05 2018-06-07 Continental Automotive Gmbh Head-Up Display
CN107301620B (en) * 2017-06-02 2019-08-13 西安电子科技大学 Method for panoramic imaging based on camera array
CN111869204B (en) 2018-03-22 2023-10-03 亚利桑那大学评议会 Method for rendering light field images for light field display based on integral imaging
WO2020170246A1 (en) 2019-02-18 2020-08-27 Rnvtech Ltd High resolution 3d display
WO2022192054A1 (en) * 2021-03-09 2022-09-15 Arizona Boards Of Regents On Behalf Of The University Of Arizona Devices and methods for enhancing the performance of integral imaging based light field displays using time-multiplexing schemes

Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20020114077A1 (en) * 2001-01-23 2002-08-22 Bahram Javidi Integral three-dimensional imaging with digital reconstruction
US20030020809A1 (en) * 2000-03-15 2003-01-30 Gibbon Michael A Methods and apparatuses for superimposition of images
US20040061934A1 (en) * 2000-12-18 2004-04-01 Byoungho Lee Reflecting three-dimensional display system
US20040184145A1 (en) * 2002-01-23 2004-09-23 Sergey Fridman Autostereoscopic display and method

Family Cites Families (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
DE3921061A1 (en) * 1989-06-23 1991-01-03 Hertz Inst Heinrich DISPLAY DEVICE FOR THREE-DIMENSIONAL PERCEPTION OF IMAGES
JP2005070255A (en) * 2003-08-22 2005-03-17 Denso Corp Virtual image display device
US7261417B2 (en) * 2004-02-13 2007-08-28 Angstrom, Inc. Three-dimensional integral imaging and display system using variable focal length lens

Cited By (32)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US8174464B2 (en) 2005-12-02 2012-05-08 Samsung Mobile Display Co., Ltd. Two-dimensional and three-dimensional image selectable display device
US20070126967A1 (en) * 2005-12-02 2007-06-07 Choi Kyung H Two-dimensional and three-dimensional image selectable display device
US20080169997A1 (en) * 2007-01-16 2008-07-17 Kyung-Ho Choi Multi-dimensional image selectable display device
JP2021152661A (en) * 2010-06-16 2021-09-30 株式会社ニコン Image display device
US9987875B2 (en) * 2010-07-21 2018-06-05 Giesecke+Devrient Mobile Security Gmbh Optically variable security element with tilt image
US20130193679A1 (en) * 2010-07-21 2013-08-01 Giesecke & Devrient Optically variable security element with tilt image
KR20140121529A (en) * 2013-04-05 2014-10-16 삼성전자주식회사 Method and apparatus for formating light field image
JP2014203462A (en) * 2013-04-05 2014-10-27 三星電子株式会社Samsung Electronics Co.,Ltd. Apparatus and method for forming light field image
EP2787734A3 (en) * 2013-04-05 2014-11-05 Samsung Electronics Co., Ltd. Apparatus and method for forming light field image
US9536347B2 (en) 2013-04-05 2017-01-03 Samsung Electronics Co., Ltd. Apparatus and method for forming light field image
KR102049456B1 (en) 2013-04-05 2019-11-27 삼성전자주식회사 Method and apparatus for formating light field image
JP2015212795A (en) * 2014-05-07 2015-11-26 日本放送協会 Stereoscopic image display device
US10296098B2 (en) * 2014-09-30 2019-05-21 Mirama Service Inc. Input/output device, input/output program, and input/output method
CN107430277A (en) * 2015-01-21 2017-12-01 特塞兰德有限责任公司 Advanced diffractive optical devices for immersive VR
US10663626B2 (en) * 2015-01-21 2020-05-26 Tesseland, Llc Advanced refractive optics for immersive virtual reality
JP2017003688A (en) * 2015-06-08 2017-01-05 日本放送協会 Light beam control element and stereoscopic display device
US10379388B2 (en) 2016-08-12 2019-08-13 Avegant Corp. Digital light path length modulation systems
US10944904B2 (en) 2016-08-12 2021-03-09 Avegant Corp. Image capture with digital light path length modulation
US10401639B2 (en) 2016-08-12 2019-09-03 Avegant Corp. Method and apparatus for an optical path length extender
WO2018031963A1 (en) * 2016-08-12 2018-02-15 Avegant Corp. A near-eye display system including a modulation stack
US10516879B2 (en) 2016-08-12 2019-12-24 Avegant Corp. Binocular display with digital light path length modulation
US10185153B2 (en) 2016-08-12 2019-01-22 Avegant Corp. Orthogonal optical path length extender
US10809546B2 (en) 2016-08-12 2020-10-20 Avegant Corp. Digital light path length modulation
US10057488B2 (en) 2016-08-12 2018-08-21 Avegant Corp. Image capture with digital light path length modulation
US11016307B2 (en) 2016-08-12 2021-05-25 Avegant Corp. Method and apparatus for a shaped optical path length extender
US11025893B2 (en) 2016-08-12 2021-06-01 Avegant Corp. Near-eye display system including a modulation stack
US11042048B2 (en) 2016-08-12 2021-06-22 Avegant Corp. Digital light path length modulation systems
US10187634B2 (en) 2016-08-12 2019-01-22 Avegant Corp. Near-eye display system including a modulation stack
US11852839B2 (en) 2016-08-12 2023-12-26 Avegant Corp. Optical path length extender
US11480784B2 (en) 2016-08-12 2022-10-25 Avegant Corp. Binocular display with digital light path length modulation
US11852890B2 (en) 2016-08-12 2023-12-26 Avegant Corp. Near-eye display system
US11303878B2 (en) * 2017-06-27 2022-04-12 Boe Technology Group Co., Ltd. Three-dimensional display panel, display method thereof, and display device

Also Published As

Publication number Publication date
DE112006002095T5 (en) 2008-10-30
US8264772B2 (en) 2012-09-11
WO2007019347A3 (en) 2007-06-07
US20110043611A1 (en) 2011-02-24
US20120320161A1 (en) 2012-12-20
WO2007019347A2 (en) 2007-02-15
CN101278565A (en) 2008-10-01

Similar Documents

Publication Publication Date Title
US8264772B2 (en) Depth and lateral size control of three-dimensional images in projection integral imaging
JP5752823B2 (en) Coarse integral holographic display
Min et al. Three-dimensional display system based on computer-generated integral photography
Balram et al. Light‐field imaging and display systems
CN108803053B (en) Three-dimensional light field display system
US20120127570A1 (en) Auto-stereoscopic display
US9740015B2 (en) Three-dimensional imaging system based on stereo hologram having nine-to-one microlens-to-prism arrangement
US20020114077A1 (en) Integral three-dimensional imaging with digital reconstruction
RU2625815C2 (en) Display device
Gao et al. 360 light field 3D display system based on a triplet lenses array and holographic functional screen
CN107092096A (en) A kind of bore hole 3D ground sand table shows system and method
Javidi et al. Breakthroughs in photonics 2014: recent advances in 3-D integral imaging sensing and display
TWI489149B (en) Autostereoscopic display apparatus and storage media
Jang et al. Depth and lateral size control of three-dimensional images in projection integral imaging
JP4741395B2 (en) 3D image display device
CN1598690A (en) Screen division stereoscopic photography projection instrument
Dorado et al. Toward 3D integral-imaging broadcast with increased viewing angle and parallax
Javidi et al. Enhanced 3D color integral imaging using multiple display devices
JP2012177756A (en) Stereoscopic image acquisition device
Kim et al. Integral imaging with reduced color moiré pattern by using a slanted lens array
Okaichi et al. Integral three-dimensional display with high image quality using multiple flat-panel displays
Yang et al. Projection-type integral imaging using a pico-projector
Moon et al. Compensation of image distortion in Fresnel lens-based 3D projection display system using a curved screen
Arai Three-dimensional television system based on integral photography
Mishina Three-dimensional television system based on integral photography

Legal Events

Date Code Title Description
AS Assignment

Owner name: THE UNIVERSITY OF CONNECTICUT, CONNECTICUT

Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNORS:JAVIDI, BAHRAM;JANG (DEC'D) BY HYUNJU HA, WIFE AND LEGAL REPRESENTATIVE, JU-SEOG;REEL/FRAME:018452/0239;SIGNING DATES FROM 20061020 TO 20061023

STCB Information on status: application discontinuation

Free format text: ABANDONED -- FAILURE TO RESPOND TO AN OFFICE ACTION