US20100225783A1 - Temporally Aligned Exposure Bracketing for High Dynamic Range Imaging - Google Patents


Info

Publication number: US20100225783A1 (US 2010/0225783 A1)
Authority: US (United States)
Prior art keywords: image, exposure, prism, temporally aligned, aligned images
Legal status: Abandoned
Application number: US12/717,765
Inventor: Paul A. Wagner
Original Assignee / Current Assignee: Individual
Related applications: US14/514,077 (granted as US10511785B2); US16/682,663 (published as US20200084363A1)

Classifications

    • G02B27/1066 (Physics; Optics; Optical elements, systems or apparatus): Beam splitting or combining systems for enhancing image performance, like resolution, pixel numbers, dual magnifications or dynamic range, by tiling, slicing or overlapping fields of view
    • G02B27/144 (Physics; Optics): Beam splitting or combining systems operating by reflection only, using partially transparent surfaces without spectral selectivity
    • G03B17/02 (Physics; Photography; Details of cameras or camera bodies): Bodies
    • G03B7/093 (Physics; Photography; Control of exposure): Digital circuits for control of exposure time
    • H04N23/70 (Electricity; Pictorial communication, e.g. television; Cameras or camera modules comprising electronic image sensors): Circuitry for compensating brightness variation in the scene
    • H04N23/741: Circuitry for compensating brightness variation in the scene by increasing the dynamic range of the image compared to the dynamic range of the electronic image sensors
    • H04N23/743: Bracketing, i.e. taking a series of images with varying exposure conditions

Definitions

  • United States Patent Application No. 20080149812 discloses an electronic camera comprising two or more image sensor arrays. At least one of the image sensor arrays has a high dynamic range.
  • the camera also comprises a shutter for selectively allowing light to reach the two or more image sensor arrays, readout circuitry for selectively reading out pixel data from the image sensor arrays, and a controller configured to control the shutter and the readout circuitry.
  • the controller comprises a processor and a memory having computer-readable code embodied therein which, when executed by the processor, causes the controller to open the shutter for an image capture period to allow the two or more image sensor arrays to capture pixel data, and to read out pixel data from the two or more image sensor arrays. This is essentially an entirely digital solution to the problem of controlling exposure levels for different images for high dynamic range processing.
  • United States Patent Application No. 20070177004 to Kolehmainen, et al., published Aug. 2, 2007, is directed to an image creating method and imaging device comprising at least two image capturing apparatus, each apparatus being arranged to produce an image.
  • the apparatus is configured to combine at least portions of the images produced by the different image capturing apparatus to produce an image with enhanced image quality.
  • Multiple lenses are required to implement this method, which is expensive and creates parallax and optic imagery distortions with each lens addition.
  • None of the prior approaches have been able to provide a simple means for capturing multiple images that overcome the difficulties of temporal misalignment, and that are simply and quickly resolved into a high dynamic range image.
  • an optical imaging system for temporally aligning bracketed exposures of a single image, the system comprising a light aperture, a prism and an image capturing device, where the prism is capable of splitting an incoming image from the light aperture into at least two temporally aligned images, and where the image capturing device captures the temporally aligned images at different levels of exposure.
  • the prism splits the intensity of said incoming image to achieve a desired EV output interval between temporally aligned images.
  • the capturing device further comprises image detection sensors, and the ISO of the sensors is adjusted to achieve a desired EV output interval between said images.
  • the system comprises an image processing device connected to said image capturing device.
  • the image processing device comprises a computer processor.
  • the device further comprises a tone-mapping processor.
  • the system comprises an eyepiece for viewing the image to be captured by the lens.
  • the system comprises a digital readout monitor.
  • the prism is capable of splitting the image into three or more levels of exposure.
  • the three levels of exposure are about 14%, about 29% and about 57%, respectively, of the exposure level of the original image.
  • the three levels of exposure are about 5%, about 19% and about 76%, respectively, of the exposure level of the original image.
  • the three levels of exposure are about 1%, about 11% and about 88%, respectively, of the exposure level of the original image.
  • the prism is capable of splitting the image into four or more levels of exposure.
  • the prism is capable of splitting the image into five or more levels of exposure.
  • the invention provides a method for temporally aligning bracketed exposures of a single image, the method comprising the steps of a) using a prism to split an incoming image from a light aperture into at least two temporally aligned images, and b) using an image capturing device to capture the temporally aligned images at different levels of exposure.
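Taken together, the split ratios recited above imply fixed EV spacings between the temporally aligned channels. As an illustrative check (not part of the filing; the helper name is hypothetical), the spacing of each channel relative to the middle exposure is the base-2 logarithm of the light ratio:

```python
import math

def ev_offsets(fractions):
    """EV offset of each split relative to the middle exposure.

    `fractions` are the shares of incoming light reaching each sensor;
    each doubling of light corresponds to a 1 EV difference.
    """
    ordered = sorted(fractions)
    mid = ordered[len(ordered) // 2]
    return [round(math.log2(f / mid), 2) for f in ordered]

# The three example splits quoted above:
print(ev_offsets([0.142857, 0.285714, 0.571429]))  # [-1.0, 0.0, 1.0]
print(ev_offsets([0.047619, 0.190476, 0.761905]))  # [-2.0, 0.0, 2.0]
print(ev_offsets([0.013699, 0.109589, 0.876712]))  # [-3.0, 0.0, 3.0]
```

So the 14%/29%/57% split brackets at roughly ±1 EV, the 5%/19%/76% split at ±2 EV, and the 1%/11%/88% split at ±3 EV.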
  • FIG. 1 shows a diagrammatic view of the system produced according to the invention, in which variations to exposure intervals are shown using different combinations of prism splits and sensor sensitivity settings.
  • FIG. 2 shows a diagrammatic view of a system of FIG. 1 and further showing additional components of the system for processing the images.
  • FIG. 3 shows a perspective drawing of a two-way prism that could be utilized with the invention.
  • FIG. 4 shows a perspective drawing of a three-way prism that could be utilized with the invention.
  • FIG. 5 shows a perspective drawing of a four-way prism that could be utilized with the invention.
  • FIG. 6 shows a perspective drawing of a five-way prism that could be utilized with the invention.
  • the optical imaging system of the present invention provides an improvement to high dynamic range imaging, and assemblies therefor, that allows temporally aligned exposure bracketing.
  • the system is simple, elegant, leverages existing technologies, allows for motion capture with no temporal distortion, and is relatively inexpensive to implement.
  • the present optical imaging system allows the user to capture light with confidence that the under and over exposed regions in the image will be imaged properly. The user simply captures all the available light with an image capturing device, and determines later how to map that information to the output device. With the optical imaging system the user can create stunning imagery that is otherwise impossible to capture, even with the most sophisticated of the current generation of normal photography equipment.
  • an imaging apparatus without a primary lens (i.e., a pinhole camera or a slit scanner) can also be used; such configurations are more likely found in industrial or scientific settings.
  • the invention can easily be adapted for designs that don't include a front end lens, but rather a simple aperture or the like.
  • the systems and methods utilize a prism to split the full spectrum into exposure brackets delivered to several image detecting sensors of an image capturing device.
  • the system eliminates exotic image sensors as a necessary feature.
  • the system allows multiple exposures from existing commodity sensors simultaneously by simply dividing the incoming light for an image into multiple and different levels of exposure for the same image.
  • the temporally aligned imaging system can be analogized to Technicolor. Before color film stock was developed, Hollywood was in search of a way to shoot films in color. Technicolor, Inc. was the first company to develop a way to create color pictures from black and white film stock. It utilized three rolls of black and white film exposed simultaneously through a special set of beam splitters with red, green, and blue filters on them.
  • each black and white film negative recorded just the red, green, or blue information. This process was done in reverse with a projector that ran all three rolls of film simultaneously with the correct color filter in front of each. When the images are aligned properly, a full color picture is realized.
  • This technique is used to this day in professional-level video cameras, sometimes referred to as 3CCD sensor systems.
  • the three red, green and blue sensors not only allow for sharper, more saturated colors but also help enhance the dynamic range of the images they help create. But just as better color film stocks helped to usher out the era of the Technicolor process, better CMOS and CCD sensors are ushering out the era of 3CCD sensor systems in favor of full color single sensor systems. In fact, some of the highest end professional cameras, like the lineup from RED Digital Cinema Camera Company as well as every professional digital SLR, use only one full color sensor. It is quite apparent that sensor technology has progressed to the point where a single color sensor can replace and even outperform 3CCD sensor systems.
  • the temporally aligned exposure bracketing system employs trichroic prisms adapted to split the entire spectrum to each of multiple full color sensors, at different exposure levels, rather than splitting out the spectrum into different colors.
  • by “color neutral” it is meant that while the temporally aligned images created by the prism may vary in intensity between themselves, or between themselves and the incoming image, they are not substantially different from one another in color spectrum, i.e., the prism creates split images that are similar in color spectrum, or spectrally neutral, even if differing substantially in intensity.
  • the system 10 comprises an optical imaging system having an aperture 20 for capturing incoming light 30 .
  • Internal to the system is a neutral prism 100 that is used to reflect the captured light to generate a color-neutral separation.
  • the neutral film prism 100 is depicted as a three-way prism that splits the light to three separate full color image sensors 101, 102 and 103.
  • Various means can be employed to adjust the EV (Exposure Value, commonly referred to as a “stop”) up and down within the intensity spectrum, and a camera can then capture the images simultaneously.
  • two consecutive neutral films 104 and 105 are used, the first reflecting 57.1429% (4/7) of the incoming light and the second 33.33% (1/3) of the remaining light.
  • the neutral prism thus fractionates a captured image into three temporally aligned exposures 106 , 107 and 108 , that have relative light intensities of 1/7, 2/7 and 4/7 of the incoming light.
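The two-film arithmetic just described can be verified with exact fractions. This sketch is illustrative only (the `cascade` helper is an assumption, not part of the filing); it reproduces the 4/7, 1/7 and 2/7 channel intensities:

```python
from fractions import Fraction

def cascade(film_reflectances):
    """Light delivered to each sensor by consecutive neutral films.

    Each film reflects its stated fraction of the light reaching it and
    transmits the remainder to the next film; whatever passes the last
    film reaches the final sensor.
    """
    outputs, remaining = [], Fraction(1)
    for r in film_reflectances:
        outputs.append(remaining * r)
        remaining = remaining * (1 - r)
    outputs.append(remaining)
    return outputs

# Film 104 reflects 4/7 of the light; film 105 reflects 1/3 of the rest:
print(cascade([Fraction(4, 7), Fraction(1, 3)]))
# [Fraction(4, 7), Fraction(1, 7), Fraction(2, 7)]
```

Listed in film order, the channels carry 4/7, 1/7 and 2/7 of the light, i.e. the same three intensities as exposures 106, 107 and 108, differing only in order.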
  • the film coatings 104 and 105 for the prism 100 may be of any of numerous coatings known to the art and capable of achieving a color neutral split, or separation, of the image, by reflection of the incoming light 30 .
  • Two examples of such spectrally neutral films include a thin film metallic coating, typically aluminum or silver, with or without a set of dielectric layers, and a set of dielectric layers consisting of high and low refractive index materials with the thin film stack designed to reflect a certain percentage of the incident light over the visible wavelength range.
  • These and related types of thin film coatings 104 and 105 shall be termed “spectrally neutral film” or, alternatively, “neutral film.”
  • the following table provides a demonstration for calculating the percentages for such a system, using a prism for splitting a captured image into temporally aligned exposures 106 , 107 and 108 at levels of 14.2857%, 28.5714% and 57.1429%, respectively.
  • the prism is harnessed for the purpose of splitting out different exposures of the same image, that are temporally aligned (taken at the same moment).
  • the system could split the light intensity in the prism 100 into equal amounts of roughly 33% each and then adjust the ISO of the sensors 101 , 102 and 103 respectively to achieve different EV output intervals.
  • the system could split the light intensity within the prism 100 into the desired EV intervals for the light 106 , 107 and 108 .
  • either way, the desired different EV output intervals are achieved for the recorded images; any combination between these two extremes may be more or less desirable for various applications.
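As a sketch of the ISO-only extreme described above (equal thirds at the prism, sensitivity doing all the work), doubling a sensor's ISO makes it record one stop brighter. The helper name and sign convention here are assumptions, not the patent's specification:

```python
def sensor_isos(base_iso, ev_offsets):
    """ISO setting per sensor so that equal light splits record at the
    desired EV offsets: each +1 EV of desired brightening doubles the
    sensitivity, each -1 EV halves it."""
    return [base_iso * 2 ** ev for ev in ev_offsets]

# Equal 1/3-1/3-1/3 split, bracketing at -1, 0 and +1 EV via ISO alone:
print(sensor_isos(100, [-1, 0, 1]))  # [50.0, 100, 200]
```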
  • FIG. 2 illustrates some additional components of a system 10 .
  • a tone mapping processor 110 and an HDRI processor 120 are used for combining the images.
  • the HDRI processing chip is used to combine the three images in real time into an HDRI image, and another chip is used to complete the tone mapping. These functions can also be combined into a single processing chip.
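As an illustration of what those two processing stages do (a simplified per-pixel sketch, not the patent's actual chip design), a common merge weights each sensor reading by how well exposed it is and divides by its light fraction, and a simple Reinhard-style operator then compresses the result:

```python
def merge_hdr(readings, fractions):
    """Merge one pixel's temporally aligned readings (each 0..1) into a
    linear radiance value, given the light fraction each sensor saw.
    A hat-shaped weight favors well-exposed readings."""
    num = den = 0.0
    for p, frac in zip(readings, fractions):
        w = 1.0 - abs(p - 0.5) * 2.0  # 1.0 at mid-gray, 0.0 at clipping
        num += w * (p / frac)
        den += w
    # Fall back to the first channel if every reading clipped:
    return num / den if den else readings[0] / fractions[0]

def tone_map(radiance):
    """Reinhard-style global operator compressing radiance back to 0..1."""
    return radiance / (1.0 + radiance)

# A pixel read under the 1/7, 2/7, 4/7 split; all three channels agree:
print(round(merge_hdr([0.08, 0.16, 0.32], [1/7, 2/7, 4/7]), 4))  # 0.56
```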
  • the individual sensors could benefit from some tuning for their respective exposure levels to reduce noise and other artifacts associated with under and over exposure, in ways known to the art.
  • a high quality standard camera lens 140 can be used with the system 10 to gather and focus light from the light aperture.
  • the system 10 also will typically include an eyepiece and/or monitor 150 for aligning the images for capture from the lens onto the sensors.
  • Additional features of the system typically would include mass storage for either the 8-bit tone mapped data 160 or the raw 32-bit HDRI data 170.
  • Other HDRI formats are known, for instance 16-bit and 14-bit formats, though the standard is evolving toward the higher 32-bit format.
  • the ISO is a function of how sensitive the sensor/film is to light.
  • the exposure generated by a particular aperture, shutter speed, and sensitivity combination can be represented by its exposure value “EV”.
  • Zero EV is defined by the combination of an aperture of f/1 and a shutter speed of 1s at ISO 100.
  • Exposure value is used to represent shutter speed and aperture combinations only.
  • An exposure value which takes into account the ISO sensitivity is called “Light Value” or LV and represents the luminance of the scene.
  • Light Value is often referred to as “exposure value”, grouping aperture, shutter speed and sensitivity in one familiar variable. This is because in a digital camera it is as easy to change sensitivity as it is to change aperture and shutter speed.
  • each time the amount of light admitted is halved, the EV will increase by 1. For instance, 6 EV represents half the amount of light as 5 EV.
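Those definitions can be written out directly; EV 0 at f/1 and 1 s follows from the standard formula EV = log2(N²/t). This is a sketch of the standard formulation, not text quoted from the filing:

```python
import math

def exposure_value(f_number, shutter_s):
    """EV = log2(N^2 / t); f/1 at 1 s gives EV 0, per the definition above."""
    return math.log2(f_number ** 2 / shutter_s)

def light_value(f_number, shutter_s, iso):
    """Light Value folds sensitivity in: LV = EV + log2(ISO / 100)."""
    return exposure_value(f_number, shutter_s) + math.log2(iso / 100)

print(exposure_value(1, 1))                  # 0.0
print(round(exposure_value(8, 1 / 125), 2))  # f/8 at 1/125 s -> 12.97
```

Note the direction: stopping down or shortening the shutter raises EV, which is why 6 EV admits half the light of 5 EV.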
  • Table 2 shows the additional variations possible for adjusting output intervals on top of the prismatic split, for ±3 EV, ±2 EV and ±1 EV.
  • the various exposure intervals can be modified or enhanced by using different combinations of prism splits with sensor sensitivity settings. This is accomplished by using differential exposure values (EV) to amplify the differences created by the prismatic split at the level of the sensors.
  • Table 3 shows results for a system produced according to the invention as shown in FIGS. 1 and 2, but deploying a prism with two splits of light 104 and 105 corresponding to 76.1905% (16/21) followed by 20.00% (1/5) of the remaining light. This is used for splitting a captured image into temporally aligned exposures 106, 107 and 108 at levels of 76.1905%, 19.0476% and 4.7619%, respectively.
  • Table 4 shows the results where variations to exposure intervals are shown using different combinations of prism splits and sensor sensitivity settings of ±3 EV, ±2 EV and ±1 EV.
  • Table 4 shows the various ISO settings for each sensor used to produce alternative EV output intervals from each sensor (these settings are for ±1 EV input values only), as applied to the split of Table 3 (whose settings are for ±2 EV input values only).
  • Table 5 shows the results for a system produced according to the invention as depicted in FIGS. 1 and 2, but with a prism having two splits of light 104 and 105 corresponding to 87.6712% (64/73) followed by 11.11% (1/9) of the remaining light. This is used for splitting a captured image into temporally aligned exposures 106, 107 and 108 at levels of 87.6712%, 10.9589% and 1.3699%, respectively.
  • Table 6 shows the settings for a system as would be configured for the Table 5 percentages, where variations to exposure intervals are shown using different combinations of prism splits and sensor sensitivity settings of ±3 EV, ±2 EV and ±1 EV.
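The table progression above reduces to simple addition in EV space: the prism's contribution (the log2 light ratio against the middle channel) plus each sensor's ISO-derived offset. A hypothetical helper illustrating the Table 3/Table 4 combination:

```python
import math

def total_ev_intervals(split_percentages, sensor_ev_offsets):
    """Total EV offset of each channel relative to the middle channel:
    prism contribution plus the sensor's own EV offset, pairing the
    dimmest split with the most negative sensor offset."""
    fracs = sorted(split_percentages)
    mid = fracs[len(fracs) // 2]
    return [round(math.log2(f / mid) + ev, 2)
            for f, ev in zip(fracs, sorted(sensor_ev_offsets))]

# The roughly +/-2 EV split of Table 3 widened by +/-1 EV at the sensors:
print(total_ev_intervals([4.7619, 19.0476, 76.1905], [-1, 0, 1]))  # [-3.0, 0.0, 3.0]
```

Pairing the dimmest channel with the lowest ISO is an assumption that matches the widening shown in the tables; the opposite pairing would instead narrow the intervals.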
  • FIGS. 3 through 6 demonstrate two-way, three-way, four-way and five-way neutral prism configurations, respectively.
  • a two-way configuration could work (FIG. 3), although not as well for some applications.
  • a two-way neutral prism likely represents the least expensive implementation of the device, and may well be used in consumer versions of many products for the cost savings.

Abstract

The invention provides an optical imaging system for temporally aligning bracketed exposures of a single image, the system comprising a light aperture, a prism and an image capturing device, where the prism is capable of splitting an incoming image from the light aperture into at least two temporally aligned images, and where the image capturing device captures the temporally aligned images at different levels of exposure.

Description

    REFERENCE TO RELATED APPLICATION
  • This application claims the benefit of U.S. Provisional Patent Application No. 61/157,494, filed Mar. 4, 2009, the complete disclosure of which is incorporated herein, in the entirety.
  • COPYRIGHT NOTICE
  • A portion of the disclosure of this patent document contains material which is subject to copyright protection. The copyright owner has no objection to the facsimile reproduction by anyone of the patent document or the patent disclosure, as it appears in the United States Patent and Trademark Office patent files and records, but otherwise reserves all other copyright rights.
  • BACKGROUND OF THE INVENTION
  • 1. Field of the Invention
  • This invention relates generally to imaging systems, and more particularly, to imaging systems that provide varying exposures for production of high dynamic range images.
  • 2. Description of Related Art
High dynamic range imaging (HDRI) is a term applied in image processing, computer graphics and photography, and generally relates to systems or techniques for providing a greater dynamic range of exposures. HDRI is most commonly employed in situations where the range between light and dark areas is great, and consequently a normal exposure, or even a digitally enhanced exposure, is not adequate to resolve all of the image area.
  • HDRI manipulates images and exposures to accurately represent the wide range of intensity levels found in real scenes, from direct sunlight to shadows. With HDRI, the user employs multiple exposures and bracketing with photo merging, to get greater detail throughout the tonal range.
More particularly, HDRI processing involves merging several exposures of a given scene into a (typically 32-bit) HDRI source file, which is then “tone mapped” to produce an image in which adjustments of qualities of light and contrast are applied locally to the HDRI source image.
HDRI images are best captured originally in a digital format with a much higher bit depth than the current generation of digital imaging devices provides. Current devices are built around an 8-bit per channel architecture. That means that both the cameras and output displays have a maximum tonal range of 8 bits per RGB color channel.
  • HDRI formats are typically 32-bits per channel. A few next generation cameras and displays are capable of handling this kind of imagery natively. It will probably be quite a few years until HDRI displays become common but HDRI cameras and acquisition techniques are already emerging.
  • HDRI images are typically tone-mapped back to 8-bits per channel, essentially compressing the extended information into the smaller dynamic range. This is typically done automatically with a variety of existing software algorithms, or manually with artistic input through programs like Adobe Photoshop.
So in a typical workflow for HDRI the artist first captures the HDRI image, and then the image is tone-mapped for the desired output device, such as ink on paper, an 8-bit RGB monitor, or even a 32-bit HDRI monitor (requiring no tone mapping).
The real challenge with HDRI is not the file formats or the computer algorithms to tone map them to 8-bit displays. Those challenges have already been largely met; for example, OpenEXR is a robust open source HDRI format developed by Industrial Light and Magic. The hardest part of capturing HDR images is the physical devices used to capture the imagery. So far only two ways of capturing HDR images are available.
The first is to use exotic high end cameras with special imaging chips (CMOS or CCD) like the Spheron HDR. Both CCD (charge-coupled device) and CMOS (complementary metal-oxide semiconductor) image sensors convert light into electrons, though CMOS sensors are much less expensive to manufacture than CCD sensors. These types of cameras are typically used by professionals in controlled environments for the primary purpose of creating spherical photos to illuminate computer generated images (another important use of HDRI). They are not point and shoot cameras and are not capable of motion photography.
The second is shooting multiple varying exposures in rapid succession (known as exposure bracketing), then combining those images, taking the highlights from the underexposed images, mid-tones from the normally exposed images, and shadows from the overexposed images, to create a composite HDR image that retains massive detail in the highlights and shadows where normal cameras would lose detail.
  • Both of these techniques have substantial disadvantages. The second technique can be done with conventional hardware, but it is time consuming and takes substantial expertise to pull off. In addition, because the images are not temporally aligned, meaning they were taken one after another at different moments in time, there can be changes in the scene that produce artifacts when the HDRI software attempts to eliminate or synthesize the objects in motion across the frame. An example would be a car moving through the frame.
  • Even a slight movement of the camera between exposures will be noticeable in the resulting combined image. Moving objects will be “ghosted” in the HDRI image. As such this technique is totally useless for motion photography and can only be used with substantial success in still photography applications.
  • For this reason, exposure bracketed HDRI is typically restricted to still subjects, and any animals, cars, pedestrians, moving leaves or litter, clouds, etc., in fact anything that is shifting within the frame will preclude HDRI, or at the very least lead to unhappy results.
  • Further, producing HDRI from multiple images can be a time-consuming and frustrating task. HDRI requires multiple huge files, multiple steps, and typically specialized and complicated software.
  • The first technique is very expensive and requires exotic hardware or sophisticated electronic and software systems. While imaging chips are moving ever forward in sensitivity and dynamic range, they still do not produce the dramatic results that the technique of changing exposures does. In addition, these special cameras are not capable of shooting the higher frame rates required to shoot motion pictures. These products are used for narrow specialized purposes.
  • Proposed solutions to the problems associated with the second technique are reflected in various published patents at the United States Patent and Trademark Office. For example, United States Patent Application No. 20060221209, to McGuire, et al., published Oct. 5, 2006, teaches an apparatus and method for acquiring and combining images of a scene with multiple optical characteristics at multiple resolutions. Disclosed therein is a camera system that acquires multiple optical characteristics at multiple resolutions of a scene. The camera system includes multiple optical elements arranged as a tree having a multiple of nodes connected by edges. The system employs filters at the end of the chain, and lenses are placed in front of each of the sensors, creating additional sources of optical distortion.
  • United States Patent Application No. 20070126918, to Lee, published Jun. 7, 2007, discloses cameras that can provide improved images by combining several shots of a scene taken with different exposure and focus levels. It also discloses cameras that have pixel-wise exposure control means, so that high quality images are obtained for a scene with a high level of contrast. The system is complicated, and employs light-reducing filters to create exposures of varying intensity. Much of the light is lost, reducing clarity and introducing sources of distortion and noise to the images.
  • United States Patent Application No. 20080149812, to Ward, et al., published Jun. 26, 2008, discloses an electronic camera comprising two or more image sensor arrays. At least one of the image sensor arrays has a high dynamic range. The camera also comprises a shutter for selectively allowing light to reach the two or more image sensor arrays, readout circuitry for selectively reading out pixel data from the image sensor arrays, and, a controller configured to control the shutter and the readout circuitry. The controller comprises a processor and a memory having computer-readable code embodied therein which, when executed by the processor, causes the controller to open the shutter for an image capture period to allow the two or more image sensor arrays to capture pixel data, and, read out pixel data from the two or more image sensor arrays. This is essentially a total digital solution to the problem of controlling exposure levels for different images for high dynamic range processing.
  • Finally, United States Patent Application No. 20070177004, to Kolehmainen, et al., published Aug. 2, 2007, is directed to an image creating method and imaging device comprising at least two image capturing apparatus, each apparatus being arranged to produce an image. The apparatus is configured to utilize at least a portion of the images produced with different image capturing apparatus with each other to produce an image with an enhanced image quality. Multiple lenses are required to implement this method, which is expensive and creates parallax and optic imagery distortions with each lens addition.
  • None of the prior approaches have been able to provide a simple means for capturing multiple images that overcomes the difficulties of temporal misalignment, and that can be simply and quickly resolved into a high dynamic range image.
  • What is needed is an inexpensive solution that can be easily integrated into products with conventional form factors. This solution would ideally be easy to use, compact, and able to shoot at high frame rates with no introduction of temporal alignment problems and associated artifacts.
  • SUMMARY OF THE INVENTION
  • By this invention is provided an optical imaging system for temporally aligning bracketed exposures of a single image, the system comprising a light aperture, a prism and an image capturing device, where the prism is capable of splitting an incoming image from the light aperture into at least two temporally aligned images, and where the image capturing device captures the temporally aligned images at different levels of exposure.
  • In one embodiment of the invention, the prism splits the intensity of said incoming image to achieve a desired EV output interval between temporally aligned images.
  • In a different embodiment, the capturing device further comprises image detection sensors, and the ISO of the sensors is adjusted to achieve a desired EV output interval between said images.
  • In another aspect of the invention, the system comprises an image processing device connected to said image capturing device.
  • In one embodiment, the image processing device comprises a computer processor.
  • In a different embodiment, the device further comprises a tone-mapping processor.
  • In a different aspect, the system comprises an eyepiece for viewing the image to be captured by the lens.
  • In a still further aspect, the system comprises a digital readout monitor.
  • In another embodiment, the prism is capable of splitting the image into three or more levels of exposure.
  • In a different embodiment, the three levels of exposure are about 14%, about 29% and about 57%, respectively, of the exposure level of the original image.
  • In a different embodiment, the three levels of exposure are about 5%, about 19% and about 76%, respectively, of the exposure level of the original image.
  • In a different embodiment, the three levels of exposure are about 1%, about 11% and about 88%, respectively, of the exposure level of the original image.
  • In a still different embodiment, the prism is capable of splitting the image into four or more levels of exposure.
  • In another embodiment, the prism is capable of splitting the image into five or more levels of exposure.
  • In a different aspect, the invention provides a method for temporally aligning bracketed exposures of a single image, the method comprising the steps of a) using a prism to split an incoming image from a light aperture into at least two temporally aligned images, and b) using an image capturing device to capture the temporally aligned images at different levels of exposure.
  • These and other features and advantages of this invention are described in, or are apparent from, the following detailed description of various exemplary embodiments of the apparatus and methods according to this invention.
  • BRIEF DESCRIPTION OF THE DRAWINGS
  • A more complete understanding of the present invention and the attendant features and advantages thereof may be had by reference to the following detailed description when considered in conjunction with the accompanying drawings wherein:
  • FIG. 1 shows a diagrammatic view of the system produced according to the invention, demonstrating how variations to exposure intervals can be achieved using different combinations of prism splits and sensor sensitivity settings.
  • FIG. 2 shows a diagrammatic view of a system of FIG. 1 and further showing additional components of the system for processing the images.
  • FIG. 3 shows a perspective drawing of a two-way prism that could be utilized with the invention.
  • FIG. 4 shows a perspective drawing of a three-way prism that could be utilized with the invention.
  • FIG. 5 shows a perspective drawing of a four-way prism that could be utilized with the invention.
  • FIG. 6 shows a perspective drawing of a five-way prism that could be utilized with the invention.
  • DETAILED DESCRIPTION OF THE INVENTION
  • The optical imaging system of the present invention provides an improvement to high dynamic range imaging, and assemblies therefor, that allows temporally aligned exposure bracketing. The system is simple, elegant, leverages existing technologies, allows for motion capture with no temporal distortion, and is relatively inexpensive to implement.
  • The present optical imaging system allows the user to capture light with confidence that the under and over exposed regions in the image will be imaged properly. The user simply captures all the available light with an image capturing device, and determines later how to map that information to the output device. With the optical imaging system the user can create stunning imagery that is otherwise impossible to capture, even with the most sophisticated of the current generation of normal photography equipment.
  • Before the present invention is described in greater detail, it is to be understood that this invention is not limited to particular embodiments described, as such may, of course, vary. It is also to be understood that the terminology used herein is for the purpose of describing particular embodiments only, and is not intended to be limiting, since the scope of the present invention will be limited only by the appended claims.
  • Unless defined otherwise, all technical and scientific terms used herein have the same meaning as commonly understood by one of ordinary skill in the art to which this invention belongs. Although any materials similar or equivalent to those described herein can also be used in the practice or testing of the present invention, the preferred materials are now described.
  • All publications and patents cited in this specification are herein incorporated by reference as if each individual publication or patent were specifically and individually indicated to be incorporated by reference and are incorporated herein by reference to disclose and describe the materials in connection with which the publications are cited. The citation of any publication is for its disclosure prior to the filing date and should not be construed as an admission that the present invention is not entitled to antedate such publication by virtue of prior invention. Further, the dates of publication provided may be different from the actual publication dates which may need to be independently confirmed.
  • It must be noted that as used herein and in the appended claims, the singular forms “a,” “an”, and “the” include plural referents unless the context clearly dictates otherwise. It is further noted that the claims may be drafted to exclude any optional element. As such, this statement is intended to serve as antecedent basis for use of such exclusive terminology as “solely,” “only” and the like in connection with the recitation of claim elements, or use of a “negative” limitation.
  • As will be apparent to those of skill in the art upon reading this disclosure, each of the individual embodiments described and illustrated herein has discrete components and features which may be readily separated from or combined with the features of any of the other several embodiments without departing from the scope or spirit of the present invention.
  • For example, although the foregoing drawings and references refer to color images and processors, the system and methods work equally well for black and white (grayscale) images and sensors. For instance, some applications for scientific or industrial use may prefer grayscale imagery.
  • Further, while unusual in present day camera art, it is possible to build an imaging apparatus without a primary lens (i.e., a pinhole camera or a slit scanner). These applications are more likely in industrial or scientific applications. The invention can easily be adapted for designs that don't include a front end lens, but rather a simple aperture or the like.
  • Generally speaking, the systems and methods utilize prism splitting by full spectrum brackets to several image detecting sensors of an image capturing device. The system eliminates exotic image sensors as a necessary feature. The system allows multiple exposures from existing commodity sensors simultaneously by simply dividing the incoming light for an image into multiple and different levels of exposure for the same image.
  • The temporally aligned imaging system can be analogized to Technicolor. Before color film stock was developed, Hollywood was in search of a way to shoot films in color. Technicolor, Inc. was the first company to develop a way to create color pictures from black and white film stock. It utilized three rolls of black and white film exposed simultaneously through a special set of beam splitters with red, green, and blue filters on them.
  • Simply put, each black and white film negative recorded just the red, green, or blue information. This process was done in reverse with a projector that ran all three rolls of film simultaneously with the correct color filter in front of each. When the images are aligned properly, a full color picture is realized.
  • As better color film stocks emerged, this process fell out of favor, until video cameras emerged. In the early days of video, color sensors were not very sharp, and had difficulty producing high resolution images, or good color saturation and reproduction. Black and white sensors were far sharper and had a higher dynamic range. So the Technicolor principle of using three image sensors and a beam splitter to feed each an identical simultaneous image was dusted off and put into use for a new generation of imaging products. Three black and white CCDs were used with a new and vastly improved beam splitter called a trichroic prism.
  • This technique is used to this day in professional level video cameras, sometimes referred to as a 3CCD sensor. The three red, green and blue sensors not only allow for sharper, more saturated colors but also help enhance the dynamic range of the images they help create. But just as better color film stocks helped to usher out the era of the Technicolor process, better CMOS and CCD sensors are ushering out the era of 3CCD sensor systems in favor of full color single sensor systems. In fact, some of the highest end professional cameras, like the lineup from RED Digital Cinema Camera Company as well as every professional Digital SLR, use only one full color sensor. It is quite apparent that sensor technology has progressed to the point where a single color sensor can replace and even outperform 3CCD sensor systems.
  • In one aspect, the temporally aligned exposure bracketing system employs trichroic prisms adapted to split the entire spectrum to each of multiple full color sensors, at different exposure levels, rather than splitting out the spectrum into different colors.
  • The system allows a color neutral change in the amount, rather than the spectrum, of light going to each sensor, by the application of such prisms for the temporal alignment of images for HDRI. By “color neutral”, it is meant that while the temporally aligned images created by the prism may vary in intensity between themselves, or between themselves and the incoming image, they are not substantially different from one another in color spectrum, i.e., the prism creates split images that are similar in color spectrum, or spectrally neutral, even if differing substantially in intensity.
  • All of the commonly understood color separation prism layouts may also be used for neutral separation. In reference now to FIG. 1, the system 10 comprises an optical imaging system having an aperture 20 for capturing incoming light 30. Internal to the system is a neutral prism 100 that is used to reflect the captured light to generate a color-neutral separation.
  • In FIG. 1, the neutral film prism 100 is depicted as a three-way prism that splits the light to three separate full color image sensors 101, 102 and 103. Various means can be employed to adjust the EV (Exposure Value, commonly referred to as a “stop”) up and down with the intensity spectrum, and a camera can then capture the images simultaneously. In FIG. 1, two consecutive neutral films 104 and 105 are used, the first reflecting 57.1429% (4/7) of the incoming light, followed by a second neutral film reflecting 33.33% (⅓) of the remaining light. The neutral prism thus fractionates a captured image into three temporally aligned exposures 106, 107 and 108 that have relative light intensities of 1/7, 2/7 and 4/7 of the incoming light.
  • The film coatings 104 and 105 for the prism 100 may be of any of numerous coatings known to the art and capable of achieving a color neutral split, or separation, of the image, by reflection of the incoming light 30. Two examples of such spectrally neutral films include a thin film metallic coating, typically aluminum or silver, with or without a set of dielectric layers, and a set of dielectric layers consisting of high and low refractive index materials with the thin film stack designed to reflect a certain percentage of the incident light over the visible wavelength range. These and related types of thin film coatings 104 and 105 shall be termed “spectrally neutral film” or, alternatively, “neutral film.”
  • The following table provides a demonstration for calculating the percentages for such a system, using a prism for splitting a captured image into temporally aligned exposures 106, 107 and 108 at levels of 14.2857%, 28.5714% and 57.1429%, respectively.
  • TABLE 1
    sensor percent light ratio light
    +1 EV 14.2857% 1/7
    standard EV 28.5714% 2/7
    −1 EV 57.1429% 4/7
    neutral film
    percent neutral film ratio
    57.1429% 4/7
    33.3333% 1/3
  • Thus, with color image sensors that do not need the RGB color split, the prism is harnessed for the purpose of splitting out different exposures of the same image, that are temporally aligned (taken at the same moment).
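  • The fractionation in Table 1 follows directly from applying the neutral film reflectances in sequence: each film diverts its share of whatever light is still travelling through the prism, and the last sensor receives the remainder. The following sketch works this arithmetic out exactly; the function name and the sequential-reflectance model are illustrative assumptions, not language from the specification.

```python
from fractions import Fraction

def split_fractions(reflectances):
    """Light fraction delivered to each sensor of a neutral prism.

    Each neutral film in turn reflects `r` of the light still travelling
    through the prism out to a sensor; whatever passes the final film
    reaches the last sensor.  Returns the reflected fractions in order,
    followed by the transmitted remainder.
    """
    remaining = Fraction(1)
    out = []
    for r in reflectances:
        reflected = remaining * r
        out.append(reflected)
        remaining -= reflected
    return out + [remaining]

# Table 1: films of 4/7 then 1/3 yield sensor shares 4/7, 1/7 and 2/7.
shares = split_fractions([Fraction(4, 7), Fraction(1, 3)])
```

The same function reproduces Tables 3 and 5: films of 16/21 then 1/5 give shares of 16/21, 1/21 and 4/21, and films of 64/73 then 1/9 give 64/73, 1/73 and 8/73.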
  • Various means can be employed to adjust the EV (Exposure Value, commonly referred to as a “stop”) up and down with the intensity spectrum that would allow a camera to capture the images simultaneously. For instance, this can be accomplished by splitting the incoming light into different intensities directly in the prism, adjusting the ISO sensitivity in the sensors or some combination of the two.
  • At one extreme, the system could split the light intensity in the prism 100 into equal amounts of roughly 33% each and then adjust the ISO of the sensors 101, 102 and 103 respectively to achieve different EV output intervals. At another extreme, the system could split the light intensity within the prism 100 into the desired EV intervals for the light 106, 107 and 108. Thus, even while leaving the ISO of the sensors the same, the desired different EV output intervals are achieved for the recorded images. Any combination between these two extremes may be more or less desirable for various applications.
  • FIG. 2 illustrates some additional components of a system 10. In FIG. 2 is seen the deployment of a tone mapping processor 110 and an HDRI processor 120 that are used for combining the images. The processing chip is used to combine the three images in real time to an HDRI image, and another chip is used to complete the tone mapping. These functions can also be combined into a single processing chip.
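  • The specification does not commit to a particular tone-mapping operator for processor 110. Purely as a common example of the kind of computation such a chip performs, the global Reinhard operator L/(1+L) compresses unbounded HDR radiance into a displayable range; the function name is an assumption for illustration.

```python
def reinhard_tonemap(radiance):
    """Map a non-negative HDR radiance value into [0, 1) for display.

    Global Reinhard operator L / (1 + L): low radiances pass through
    nearly unchanged, while highlights are smoothly compressed.
    Shown only as a representative tone-mapping choice; the patent
    leaves the operator unspecified.
    """
    return radiance / (1.0 + radiance)
```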
  • Systems for controlling the action of the lens and associated hardware, including light responsive software controllers, are well known to the art.
  • In addition, the individual sensors could benefit from some tuning for their respective exposure levels to reduce noise and other artifacts associated with under and over exposure, in ways known to the art.
  • A high quality standard camera lens 140 can be used with the system 10 to gather and focus light from the light aperture.
  • The system 10 also will typically include an eyepiece and/or monitor 150 for aligning the images for capture from the lens onto the sensors.
  • Additional features of the system typically would include mass storage for either the 8 bit tone mapped data 160, or the raw 32 bit HDRI data 170. Other HDRI formats are known, for instance 16 bit and 14 bit formats, though the standard is evolving toward the higher 32 bit format.
  • The ISO is a function of how sensitive the sensor/film is to light. The exposure generated by a particular aperture, shutter speed, and sensitivity combination can be represented by its exposure value “EV”. Zero EV is defined by the combination of an aperture of f/1 and a shutter speed of 1s at ISO 100.
  • The term “exposure value” is used to represent shutter speed and aperture combinations only. An exposure value which takes into account the ISO sensitivity is called “Light Value” or LV and represents the luminance of the scene. For the sake of simplicity, as is the case in this patent, Light Value is often referred to as “exposure value”, grouping aperture, shutter speed and sensitivity in one familiar variable. This is because in a digital camera it is as easy to change sensitivity as it is to change aperture and shutter speed.
  • Each time the amount of light collected by the sensor is halved (e.g., by doubling the shutter speed or by halving the aperture area), the EV will increase by 1. For instance, 6 EV represents half the amount of light of 5 EV.
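  • In these terms, the EV offset implied by each sensor's light share can be computed directly from the ratios of Table 1, since every halving of the light adds one stop. The function name and sign convention below are illustrative assumptions consistent with the table.

```python
import math

def ev_from_fraction(fraction, reference):
    """EV offset of a sensor receiving `fraction` of the light, relative
    to a sensor receiving `reference`.  Halving the light adds one stop,
    so less light yields a positive EV offset.
    """
    return -math.log2(fraction / reference)

# Table 1 shares: the 1/7 sensor sits one stop above the 2/7 standard,
# and the 4/7 sensor sits one stop below it.
```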
  • Table 2 shows the additional variations possible for adjusting output intervals on top of the prismatic split, for +/−3EV, +/−2EV and +/−1EV.
  • TABLE 2
    output sensor 1 Sensor 2 sensor 3
    interval (+1 EV in) (Standard EV in) (−1 EV in)
    +/−3 EV 25 ISO 100 ISO 400 ISO
    +/−2 EV 50 ISO 100 ISO 200 ISO
    +/−1 EV 100 ISO 100 ISO 100 ISO
  • The various exposure intervals can be modified or enhanced by using different combinations of prism splits with sensor sensitivity settings. This is accomplished by using differential exposure values (EV) to amplify the differences created by the prismatic split at the level of the sensors.
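  • The amplification described above reduces to a power-of-two ISO scaling: each extra stop of darkening halves the required ISO, and each extra stop of brightening doubles it. Assuming a base ISO of 100, the sketch below reproduces the entries of Tables 2, 4 and 6; the function name is an illustrative assumption.

```python
def iso_for_output(input_ev, output_ev, base_iso=100):
    """ISO setting that converts a sensor's prismatic input offset
    (in EV relative to the standard exposure) into the desired
    output offset.

    ISO = base * 2 ** (input_ev - output_ev): widening the interval
    beyond what the prism provides lowers the ISO on the dark-leaning
    sensor and raises it on the bright-leaning one.
    """
    return int(base_iso * 2 ** (input_ev - output_ev))

# Table 2 row "+/-3 EV": a +1 EV input needs ISO 25, a -1 EV input ISO 400.
```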
  • Table 3 shows the results for a system produced according to the invention as shown in FIGS. 1 and 2, but deploying a prism with two splits of light 104 and 105 corresponding to 76.1905% (16/21) of the incoming light followed by 20.00% (⅕) of the remaining light. This is used for splitting a captured image into temporally aligned exposures 106, 107 and 108 at levels of 76.1905%, 19.0476% and 4.7619%, respectively.
  • TABLE 3
    sensor percent light ratio light
    +2 EV 4.7619% 1/21
    standard EV 19.0476% 4/21
    −2 EV 76.1905% 16/21 
    neutral film
    percent neutral film ratio
    76.1905% 16/21
    20.0000% 1/5
  • Table 4 shows the results where variations to exposure intervals are achieved using different combinations of prism splits and sensor sensitivity settings of +/−3EV, +/−2EV and +/−1EV. Table 4 lists the ISO setting for each sensor that is used to produce alternative EV output intervals from the +/−2EV input values of Table 3 (Table 2 gives the corresponding settings for +/−1EV input values).
  • TABLE 4
    output sensor 1 sensor 2 sensor 3
    interval (+2 EV in) (Standard EV in) (−2 EV in)
    +/−3 EV  50 ISO 100 ISO 200 ISO
    +/−2 EV 100 ISO 100 ISO 100 ISO
    +/−1 EV 200 ISO 100 ISO  50 ISO
  • Table 5 shows the results for a system produced according to the invention as depicted in FIGS. 1 and 2, but showing a prism with two splits of light 104 and 105 corresponding to 87.6712% (64/73) of the incoming light followed by 11.11% (1/9) of the remaining light. This is used for splitting a captured image into temporally aligned exposures 106, 107 and 108 at levels of 87.6712%, 10.9589% and 1.3699%, respectively.
  • TABLE 5
    sensor percent light ratio light
    +3 EV 1.3699% 1/73
    standard EV 10.9589% 8/73
    −3 EV 87.6712% 64/73 
    neutral film
    percent neutral film ratio
    87.6712% 64/73
    11.1111% 1/9
  • Table 6 shows the settings for a system as would be configured for the Table 5 percentages, where variations to exposure intervals are achieved using different combinations of prism splits and sensor sensitivity settings of +/−3EV, +/−2EV and +/−1EV.
  • TABLE 6
    output sensor 1 Sensor 2 sensor 3
    interval (+3 EV in) (Standard EV in) (−3 EV in)
    +/−3 EV 100 ISO 100 ISO 100 ISO 
    +/−2 EV 200 ISO 100 ISO 50 ISO
    +/−1 EV 400 ISO 100 ISO 25 ISO
  • The system depicted in FIGS. 1 and 2, and through Tables 1 through 6, exemplifies a wide range of exposure levels that can be achieved, but is not exhaustive by any means. These are intended as examples only, and even more possibilities exist, including narrower or greater exposure ranges and other configurations and settings of the prism splits with sensor sensitivity settings.
  • Further, while the use of a three-way prism is demonstrated in FIGS. 1 and 2, other neutral prism configurations could be utilized. FIGS. 3 through 6 demonstrate configurations for two-way, three-way, four-way and five-way neutral prism configurations, respectively.
  • Use of different prism splits will be desirable for different applications. In a very minimal configuration a two-way configuration could work (FIG. 3), although not as well for some applications. However, a two-way neutral prism likely represents the least expensive implementation of the device, and is likely to be used in consumer versions of many products for the cost savings.
  • On the other hand, in some scientific or professional applications, the greater control from more elaborate splits possible from the four-way and five-way neutral prism splits shown in FIGS. 5 and 6 may be desired.
  • While this invention has been described in conjunction with the specific embodiments outlined above, it is evident that many alternatives, modifications and variations will be apparent to those skilled in the art. Accordingly, the preferred embodiments of the invention, as set forth above, are intended to be illustrative, not limiting. Various changes may be made without departing from the spirit and scope of this invention.

Claims (28)

1. An optical imaging system for temporally aligning bracketed exposures of a single image, said system comprising a light aperture, a prism and an image capturing device,
wherein said prism is capable of splitting an incoming image from said light aperture into at least two color neutral, temporally aligned images,
whereby said image capturing device captures said temporally aligned images at different levels of exposure.
2. The system of claim 1 wherein said prism splits the intensity of said incoming image to achieve a desired EV output interval between said temporally aligned images.
3. The system of claim 1 wherein said image capturing device further comprises image detection sensors for said temporally aligned images.
4. The system of claim 1 wherein the ISO of said sensors is adjusted to achieve a desired EV output interval between said images.
5. The system of claim 1 wherein said prism further comprises at least one neutral film coating.
6. The system of claim 1 further comprising an image processing device connected to said image capturing device.
7. The system of claim 1 wherein said image processing device comprises a computer processor.
8. The system of claim 7 further comprising a tone-mapping processor.
9. The system of claim 8 wherein the image processing device and tone-mapping processor are contained on a single integrated circuit.
10. The system of claim 1 further comprising a digital readout monitor.
11. The system of claim 1 further comprising a lens associated with said aperture.
12. The system of claim 1 further comprising an eyepiece for viewing said incoming image.
13. The system of claim 1 wherein said prism is capable of splitting said image into at least three temporally aligned images having different levels of exposure.
14. The system of claim 13 wherein said three levels of exposure are about 14%, about 29% and about 57%, respectively, of the intensity of said incoming image.
15. The system of claim 13 wherein said three levels of exposure are about 5%, about 19% and about 76%, respectively, of the intensity of said incoming image.
16. The system of claim 13 wherein said three levels of exposure are about 1%, about 11% and about 88%, respectively, of the intensity of said incoming image.
17. The system of claim 1 wherein said prism is capable of splitting said image into at least four temporally aligned images having different levels of exposure.
18. The system of claim 1 wherein said prism is capable of splitting said image into at least five temporally aligned images having different levels of exposure.
19. A method for temporally aligning bracketed exposures of a single image, said method comprising the steps of
a) using a prism to split an incoming image from a light aperture into at least two temporally aligned images, and
b) using an image capturing device to capture said temporally aligned images at different levels of exposure,
wherein said prism produces a color neutral split of said temporally aligned images.
20. The method of claim 19 wherein said prism splits the intensity of said incoming image to achieve a desired EV output interval between said temporally aligned images.
21. The method of claim 19 wherein said image capturing device further comprises image detection sensors for said temporally aligned images.
22. The method of claim 19 wherein the ISO of said sensors is adjusted to achieve a desired EV output interval between said images.
23. The method of claim 19 wherein said prism is capable of splitting said image into at least three temporally aligned images having different levels of exposure.
24. The method of claim 23 wherein said three levels of exposure are about 14%, about 29% and about 57%, respectively, of the intensity of said incoming image.
25. The method of claim 23 wherein said three levels of exposure are about 5%, about 19% and about 76%, respectively, of the intensity of said incoming image.
26. The method of claim 23 wherein said three levels of exposure are about 1%, about 11% and about 88%, respectively, of the intensity of said incoming image.
27. The method of claim 19 wherein said prism is capable of splitting said image into at least four temporally aligned images having different levels of exposure.
28. The method of claim 19 wherein said prism is capable of splitting said image into at least five temporally aligned images having different levels of exposure.
US12/717,765 2009-03-04 2010-03-04 Temporally Aligned Exposure Bracketing for High Dynamic Range Imaging Abandoned US20100225783A1 (en)

Priority Applications (3)

Application Number Priority Date Filing Date Title
US12/717,765 US20100225783A1 (en) 2009-03-04 2010-03-04 Temporally Aligned Exposure Bracketing for High Dynamic Range Imaging
US14/514,077 US10511785B2 (en) 2009-03-04 2014-10-14 Temporally aligned exposure bracketing for high dynamic range imaging
US16/682,663 US20200084363A1 (en) 2009-03-04 2019-11-13 Temporally aligned exposure bracketing for high dynamic range imaging

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
US15749409P 2009-03-04 2009-03-04
US12/717,765 US20100225783A1 (en) 2009-03-04 2010-03-04 Temporally Aligned Exposure Bracketing for High Dynamic Range Imaging

Related Child Applications (1)

Application Number Title Priority Date Filing Date
US14/514,077 Continuation US10511785B2 (en) 2009-03-04 2014-10-14 Temporally aligned exposure bracketing for high dynamic range imaging

Publications (1)

Publication Number Publication Date
US20100225783A1 true US20100225783A1 (en) 2010-09-09

Family

ID=42677914

Family Applications (3)

Application Number Title Priority Date Filing Date
US12/717,765 Abandoned US20100225783A1 (en) 2009-03-04 2010-03-04 Temporally Aligned Exposure Bracketing for High Dynamic Range Imaging
US14/514,077 Expired - Fee Related US10511785B2 (en) 2009-03-04 2014-10-14 Temporally aligned exposure bracketing for high dynamic range imaging
US16/682,663 Abandoned US20200084363A1 (en) 2009-03-04 2019-11-13 Temporally aligned exposure bracketing for high dynamic range imaging

Family Applications After (2)

Application Number Title Priority Date Filing Date
US14/514,077 Expired - Fee Related US10511785B2 (en) 2009-03-04 2014-10-14 Temporally aligned exposure bracketing for high dynamic range imaging
US16/682,663 Abandoned US20200084363A1 (en) 2009-03-04 2019-11-13 Temporally aligned exposure bracketing for high dynamic range imaging

Country Status (5)

Country Link
US (3) US20100225783A1 (en)
EP (1) EP2404209A4 (en)
KR (1) KR20120073159A (en)
AU (1) AU2010221241A1 (en)
WO (1) WO2010102135A1 (en)

Family Cites Families (12)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JPH1198418A (en) * 1997-09-24 1999-04-09 Toyota Central Res & Dev Lab Inc Image pickup device
US6720997B1 (en) * 1997-12-26 2004-04-13 Minolta Co., Ltd. Image generating apparatus
US6529640B1 (en) * 1998-06-09 2003-03-04 Nikon Corporation Image processing apparatus
US7057659B1 (en) * 1999-07-08 2006-06-06 Olympus Corporation Image pickup device and image pickup optical system
JP2001157109A (en) * 1999-11-24 2001-06-08 Nikon Corp Electronic camera and recording medium for image data processing
EP1271935A1 (en) * 2001-06-29 2003-01-02 Kappa opto-electronics GmbH Apparatus for taking digital images with two simultaneously controlled image sensors
JP4687492B2 (en) * 2006-02-14 2011-05-25 株式会社ニコン Camera, imaging method, exposure calculation device, and program
US7949182B2 (en) * 2007-01-25 2011-05-24 Hewlett-Packard Development Company, L.P. Combining differently exposed images of the same object
US7978239B2 (en) * 2007-03-01 2011-07-12 Eastman Kodak Company Digital camera using multiple image sensors to provide improved temporal sampling
DE102007026337B4 (en) * 2007-06-06 2016-11-03 Arnold & Richter Cine Technik Gmbh & Co. Betriebs Kg Digital camera
JP4960907B2 (en) * 2008-03-11 2012-06-27 富士フイルム株式会社 Imaging apparatus and imaging method
US8441732B2 (en) * 2008-03-28 2013-05-14 Michael D. Tocci Whole beam image splitting system

Patent Citations (29)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US4672196A (en) * 1984-02-02 1987-06-09 Canino Lawrence S Method and apparatus for measuring properties of thin materials using polarized light
US5315384A (en) * 1990-10-30 1994-05-24 Simco/Ramic Corporation Color line scan video camera for inspection system
US5309243A (en) * 1992-06-10 1994-05-03 Eastman Kodak Company Method and apparatus for extending the dynamic range of an electronic imaging system
US6204881B1 (en) * 1993-10-10 2001-03-20 Canon Kabushiki Kaisha Image data processing apparatus which can combine a plurality of images at different exposures into an image with a wider dynamic range
US5801773A (en) * 1993-10-29 1998-09-01 Canon Kabushiki Kaisha Image data processing apparatus for processing combined image signals in order to extend dynamic range
US6864916B1 (en) * 1999-06-04 2005-03-08 The Trustees Of Columbia University In The City Of New York Apparatus and method for high dynamic range imaging using spatially varying exposures
US7084905B1 (en) * 2000-02-23 2006-08-01 The Trustees Of Columbia University In The City Of New York Method and apparatus for obtaining high dynamic range images
US20050041113A1 (en) * 2001-04-13 2005-02-24 Nayar Shree K. Method and apparatus for recording a sequence of images using a moving optical element
US20030174413A1 (en) * 2002-03-13 2003-09-18 Satoshi Yahagi Autofocus system
US20050275747A1 (en) * 2002-03-27 2005-12-15 Nayar Shree K Imaging method and system
US7158687B2 (en) * 2002-04-23 2007-01-02 Olympus Corporation Image combination device
US20080158400A1 (en) * 2002-06-26 2008-07-03 Pixim, Incorporated Digital Image Capture Having an Ultra-High Dynamic Range
US20040008267A1 (en) * 2002-07-11 2004-01-15 Eastman Kodak Company Method and apparatus for generating images used in extended range image composition
US20070126920A1 (en) * 2003-01-03 2007-06-07 Chulhee Lee Cameras capable of focus adjusting
US20040130649A1 (en) * 2003-01-03 2004-07-08 Chulhee Lee Cameras
US20070126918A1 (en) * 2003-01-03 2007-06-07 Chulhee Lee Cameras with multiple sensors
US20070126919A1 (en) * 2003-01-03 2007-06-07 Chulhee Lee Cameras capable of providing multiple focus levels
US7010174B2 (en) * 2003-04-29 2006-03-07 Microsoft Corporation System and process for generating high dynamic range video
US20060158462A1 (en) * 2003-11-14 2006-07-20 Microsoft Corporation High dynamic range image viewing on low dynamic range displays
US20050104900A1 (en) * 2003-11-14 2005-05-19 Microsoft Corporation High dynamic range image viewing on low dynamic range displays
US20060055894A1 (en) * 2004-09-08 2006-03-16 Seiko Epson Corporation Projector
US20060082692A1 (en) * 2004-10-15 2006-04-20 Seiko Epson Corporation Image display device and projector
US20060221209A1 (en) * 2005-03-29 2006-10-05 Mcguire Morgan Apparatus and method for acquiring and combining images of a scene with multiple optical characteristics at multiple resolutions
US20070229766A1 (en) * 2006-03-29 2007-10-04 Seiko Epson Corporation Modulation Apparatus and Projector
US20070177004A1 (en) * 2006-06-08 2007-08-02 Timo Kolehmainen Image creating method and imaging device
US20090213466A1 (en) * 2006-09-14 2009-08-27 3M Innovative Properties Company Beam splitter apparatus and system
US20080149812A1 (en) * 2006-12-12 2008-06-26 Brightside Technologies Inc. Hdr camera with multiple sensors
US20080158245A1 (en) * 2006-12-29 2008-07-03 Texas Instruments Incorporated High dynamic range display systems
US20080291289A1 (en) * 2007-02-20 2008-11-27 Seiko Epson Corporation Image pickup device, image pickup system, image pickup method, and image processing device

Cited By (49)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US9077910B2 (en) 2011-04-06 2015-07-07 Dolby Laboratories Licensing Corporation Multi-field CCD capture for HDR imaging
US9549123B2 (en) 2011-04-06 2017-01-17 Dolby Laboratories Licensing Corporation Multi-field CCD capture for HDR imaging
DE102012009151B4 (en) * 2012-05-08 2018-01-18 Steinbichler Optotechnik Gmbh Method and apparatus for detecting an intensity-modulated optical radiation field
DE102012009151A1 (en) * 2012-05-08 2013-11-14 Steinbichler Optotechnik Gmbh Method for detecting intensity-modulated optical radiation field by device, involves irradiating surface of object with optical radiation and generating intensity-modulated optical radiation field
US9245348B2 (en) 2012-06-15 2016-01-26 Microsoft Technology Licensing, Llc Determining a maximum inscribed size of a rectangle
US9615040B2 (en) 2012-06-15 2017-04-04 Microsoft Technology Licensing, Llc Determining a maximum inscribed size of a rectangle
US10620128B2 (en) * 2014-02-17 2020-04-14 Eaton Intelligent Power Limited Oxygen sensor having a tip coated large diameter optical fiber utilizing a trichroic prism or tricolor sensor
US20150369565A1 (en) * 2014-06-20 2015-12-24 Matthew Flint Kepler Optical Device Having a Light Separation Element
WO2015196178A3 (en) * 2014-06-20 2016-02-25 Trackingpoint, Inc. Optical device having a light separation element
US11290612B1 (en) * 2014-08-21 2022-03-29 Oliver Markus Haynold Long-exposure camera
US20160205291A1 (en) * 2015-01-09 2016-07-14 PathPartner Technology Consulting Pvt. Ltd. System and Method for Minimizing Motion Artifacts During the Fusion of an Image Bracket Based On Preview Frame Analysis
TWI594635B (en) * 2016-01-14 2017-08-01 瑞昱半導體股份有限公司 Method for generating target gain value of wide dynamic range operation
US10536612B2 (en) * 2016-02-12 2020-01-14 Contrast, Inc. Color matching across multiple sensors in an optical system
US10819925B2 (en) 2016-02-12 2020-10-27 Contrast, Inc. Devices and methods for high dynamic range imaging with co-planar sensors
US11805218B2 (en) * 2016-02-12 2023-10-31 Contrast, Inc Devices and methods for high dynamic range video
US9948829B2 (en) * 2016-02-12 2018-04-17 Contrast, Inc. Color matching across multiple sensors in an optical system
US11785170B2 (en) 2016-02-12 2023-10-10 Contrast, Inc. Combined HDR/LDR video streaming
US10200569B2 (en) * 2016-02-12 2019-02-05 Contrast, Inc. Color matching across multiple sensors in an optical system
JP2019506821A (en) * 2016-02-12 2019-03-07 コントラスト オプティカル デザイン アンド エンジニアリング, インコーポレイテッド Devices and methods for high dynamic range video
US10257393B2 (en) * 2016-02-12 2019-04-09 Contrast, Inc. Devices and methods for high dynamic range video
US10264196B2 (en) 2016-02-12 2019-04-16 Contrast, Inc. Systems and methods for HDR video capture with a mobile device
US20190166283A1 (en) * 2016-02-12 2019-05-30 Contrast, Inc. Color matching across multiple sensors in an optical system
US20190238725A1 (en) * 2016-02-12 2019-08-01 Contrast, Inc. Devices and methods for high dynamic range video
US11637974B2 (en) 2016-02-12 2023-04-25 Contrast, Inc. Systems and methods for HDR video capture with a mobile device
US20170237890A1 (en) * 2016-02-12 2017-08-17 Contrast Optical Design & Engineering, Inc. Devices and Methods for High Dynamic Range Video
US20230079875A1 (en) * 2016-02-12 2023-03-16 Contrast, Inc. Devices and methods for high dynamic range video
US11463605B2 (en) * 2016-02-12 2022-10-04 Contrast, Inc. Devices and methods for high dynamic range video
US11368604B2 (en) 2016-02-12 2022-06-21 Contrast, Inc. Combined HDR/LDR video streaming
US20170237879A1 (en) * 2016-02-12 2017-08-17 Contrast Optical Design & Engineering, Inc. Color matching across multiple sensors in an optical system
US10742847B2 (en) * 2016-02-12 2020-08-11 Contrast, Inc. Devices and methods for high dynamic range video
US10805505B2 (en) 2016-02-12 2020-10-13 Contrast, Inc. Combined HDR/LDR video streaming
JP6997461B2 (en) 2016-02-12 2022-01-17 コントラスト, インコーポレイテッド Devices and methods for high dynamic range video
CN106060351A (en) * 2016-06-29 2016-10-26 联想(北京)有限公司 Image processing device and image processing method
EP3494434A4 (en) * 2016-08-03 2020-04-01 Waymo LLC Beam split extended dynamic range image capture system
WO2018026599A1 (en) 2016-08-03 2018-02-08 Waymo Llc Beam split extended dynamic range image capture system
IL263876A (en) * 2016-08-03 2019-01-31 Waymo Llc Beam split extended dynamic range image capture system
JP7081835B2 (en) 2016-08-09 2022-06-07 コントラスト, インコーポレイテッド Real-time HDR video for vehicle control
EP3497925A4 (en) * 2016-08-09 2020-03-11 Contrast, Inc. Real-time hdr video for vehicle control
US11910099B2 (en) 2016-08-09 2024-02-20 Contrast, Inc. Real-time HDR video for vehicle control
WO2018031441A1 (en) 2016-08-09 2018-02-15 Contrast, Inc. Real-time hdr video for vehicle control
US10554901B2 (en) 2016-08-09 2020-02-04 Contrast Inc. Real-time HDR video for vehicle control
JP2019525688A (en) * 2016-08-09 2019-09-05 コントラスト, インコーポレイテッド Real-time HDR video for vehicle control
US11265530B2 (en) 2017-07-10 2022-03-01 Contrast, Inc. Stereoscopic camera
US10951888B2 (en) 2018-06-04 2021-03-16 Contrast, Inc. Compressed high dynamic range video
US11736784B2 (en) * 2020-04-17 2023-08-22 i-PRO Co., Ltd. Three-plate camera and four-plate camera
US20220159167A1 (en) * 2020-11-17 2022-05-19 Axis Ab Method and electronic device for increased dynamic range of an image
US11956552B2 (en) * 2020-11-17 2024-04-09 Axis Ab Method and electronic device for increased dynamic range of an image
WO2023006542A1 (en) * 2021-07-29 2023-02-02 Koninklijke Philips N.V. An image sensing system
EP4125267A1 (en) * 2021-07-29 2023-02-01 Koninklijke Philips N.V. An image sensing system

Also Published As

Publication number Publication date
US20150029361A1 (en) 2015-01-29
WO2010102135A1 (en) 2010-09-10
AU2010221241A1 (en) 2011-10-27
US20200084363A1 (en) 2020-03-12
US10511785B2 (en) 2019-12-17
KR20120073159A (en) 2012-07-04
EP2404209A4 (en) 2012-10-17
EP2404209A1 (en) 2012-01-11

Similar Documents

Publication Publication Date Title
US20200084363A1 (en) Temporally aligned exposure bracketing for high dynamic range imaging
US11785170B2 (en) Combined HDR/LDR video streaming
US9681026B2 (en) System and method for lens shading compensation
JP4826028B2 (en) Electronic camera
EP3349433B1 (en) Control system, imaging device, and program
US8432466B2 (en) Multiple image high dynamic range imaging from a single sensor array
WO2018070100A1 (en) Image processing device, image processing method, and photographing apparatus
KR20070121717A (en) Method of controlling an action, such as a sharpness modification, using a colour digital image
AU2017217833B2 (en) Devices and methods for high dynamic range video
KR20150109177A (en) Photographing apparatus, method for controlling the same, and computer-readable recording medium
JP2018023077A (en) Video camera imaging device
JP2015119436A (en) Imaging apparatus
Stump Camera Sensors
JPH07131799A (en) Image pickup device

Legal Events

Date Code Title Description
STCB Information on status: application discontinuation

Free format text: ABANDONED -- FAILURE TO RESPOND TO AN OFFICE ACTION