US9077917B2 - Image sensor having HDR capture capability - Google Patents

Image sensor having HDR capture capability

Info

Publication number
US9077917B2
Authority
US
United States
Prior art keywords
image
data
image data
image sensor
read
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active, expires
Application number
US13/157,090
Other versions
US20120314100A1
Inventor
Michael Frank
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Apple Inc
Original Assignee
Apple Inc
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Apple Inc filed Critical Apple Inc
Priority to US 13/157,090
Assigned to Apple Inc. (assignor: Michael Frank)
Priority to PCT/US2012/041398 (published as WO2012170717A1)
Priority to EP12171184.0A (published as EP2533520B1)
Priority to CN201210269110.9A (published as CN102857712B)
Priority to TW101120795A (published as TWI511558B)
Priority to KR1020120062155A (published as KR101361331B1)
Priority to JP2012132345A (published as JP5531054B2)
Publication of US20120314100A1
Publication of US9077917B2
Application granted
Legal status: Active (expiration adjusted)

Classifications

    • H04N 5/35554
    • H04N 25/57: Control of the dynamic range
    • H04N 23/741: Circuitry for compensating brightness variation in the scene by increasing the dynamic range of the image compared to the dynamic range of the electronic image sensors
    • H04N 25/583: Control of the dynamic range involving two or more exposures acquired simultaneously with different integration times
    • H04N 23/00: Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N 25/44: Extracting pixel data from image sensors by controlling scanning circuits, e.g. by partially reading an SSIS array
    • H04N 25/531: Control of the integration time by controlling rolling shutters in CMOS SSIS
    • H04N 25/58: Control of the dynamic range involving two or more exposures
    • H04N 5/2355
    • H04N 5/3532
    • H04N 5/35536

Definitions

  • the present disclosure relates generally to image capture systems and techniques.
  • the handheld device 50 may include various audio input and output elements.
  • the audio input/output elements 70 may include an input receiver, such as a microphone.
  • the input receivers may receive user audio input, such as a user's voice.
  • the audio input/output elements 70 may include one or more output transmitters, which may include one or more speakers that function to transmit audio signals to a user, such as during the playback of music data using a media player application 72 .
  • an additional audio output transmitter 74 may be provided, as shown in FIG. 3 .
  • the output transmitter 74 may also include one or more speakers that transmit audio signals to a user, such as voice data received during a telephone call.
  • the audio input/output elements 70 and 74 may collectively function as the audio receiving and transmitting elements of a telephone.
  • the image processing circuitry 32 may perform image merging of captured images to generate a composite HDR image.
  • the camera 30 may acquire multiple images during a single exposure, including one or more images at a low exposure level (underexposed) and one or more images at a high exposure level (overexposed), which may be utilized by the image processing circuitry 32 to generate a single composite HDR image.
  • the camera 30 may acquire at least one image at a low exposure level (underexposed), at least one image at a normal exposure level, and at least one image at a high exposure level (overexposed).
  • the image processing circuitry 32 may process these images to generate a composite HDR image.
  • FIG. 5 illustrates an example of an image 76 to be captured by the image sensor of the camera 30 .
  • the image 76 may be captured by the camera 30 via a process utilizing a rolling shutter method of image acquisition (i.e., a line-scan reset of pixels to start the exposure followed by a line-scan readout). That is, the camera 30 may record the image not from a single snapshot at a single point in time, but by rolling (i.e., moving) the shutter across the exposable area of the image sensor rather than exposing the entire image area (frame) at one time.
  • a frame of information is captured by pixels in the image sensor of the camera 30 in rows (either vertically or horizontally oriented), such that all parts of the image 76 are not recorded at exactly the same time, though the entirety of the image 76 may be reconstituted for display as the single image 76 during playback.
  • a global shutter method of image acquisition may be employed in which an entire frame is exposed for the image sensor at the same time.
  • the scan direction for the image 76 may be represented by line 78 . That is, the rolling shutter reset 80 may move from the uppermost portion 82 of the image 76 to the lowermost portion 84 of the image 76 to reset pixels in the image sensor. As the pixels of the image sensor are reset by the rolling shutter reset 80 , the reset pixels may begin to collect light that corresponds to a portion of the image 76 . That is, the pixels of the image sensor in the camera 30 may be exposed to light, which will be collected and utilized to generate the image 76 .
  • the rolling shutter reset 80 may move from portion 82 to portion 84 of the image 76 (i.e., across a full frame) in a fixed amount of time “t” (i.e., the frame time).
  • this fixed amount of time t may be 1/15 of a second, 1/30 of a second, 1/60 of a second, 1/125 of a second, 1/250 of a second, 1/500 of a second, 1/1000 of a second, 1/5000 of a second, 1/10000 of a second, or another amount of time.
  • This fixed amount of time t may also be adjusted automatically by, for example, the processor(s) 16 executing an imaging application on the electronic device 10, such as Photo Booth®, Aperture®, iPhoto®, or Preview®, available from Apple Inc., or the “Camera” and/or “Photo” applications provided by Apple Inc. and available on models of the iPhone®, or adjusted by a user during interaction with one of the above-mentioned imaging applications.
  • a first data read 86 of the data stored in a row of pixels may be undertaken at a time n, where n is a fixed fraction of the frame time t.
  • This time n may be, for example, 1/2, 1/3, 1/4, 1/5, 1/10, 1/20, or another fraction of the frame time t.
  • This time n may be represented as line 88 in FIG. 5 .
  • the first data read 86 may occur at a time n subsequent to the reset of a row of pixels by the rolling shutter reset 80 . Accordingly, as the rolling shutter reset 80 passes downwards along line 78 , the first data read 86 may trail the rolling shutter reset 80 by time n. In this manner, data stored in the pixels for each row of the frame may be read at a time n after the rolling shutter reset 80 of that row of pixels. Thus, each row of pixels read as the first data read 86 passes across the image sensor will have been exposed to light for the same time n, which may be referred to as an exposure time or integration time.
  • this first data read 86 might correspond to generation of a picture at a low exposure level (underexposed). That is, the first data read 86 may be useful in generating a picture in which shadowed areas of the image 76 are poorly rendered but bright areas of the picture are rendered clearly. Accordingly, the data corresponding to the first data read 86 may be useful in rendering the bright portions of a HDR image.
  • a second data read 90 may be performed on the data stored in, for example, a row of pixels of the image sensor.
  • This second data read 90 may occur at a time m, represented by line 92 in FIG. 5.
  • This time m may be a multiple of time n.
  • time m may be equal to n, 1.5 times n, two times n, 2.5 times n, three times n, 3.5 times n, four times n, or another multiple of time n.
  • in the present example, the time m represented along line 92 may be three times n, such that the second data read 90 may be performed at an overall time of four times n (i.e., 4n) after the rolling shutter reset 80.
  • the second data read 90 may occur at a time 4n subsequent to the reset of a row of pixels by the rolling shutter reset 80. Accordingly, as the rolling shutter reset 80 passes downwards along line 78, the second data read 90 may trail the rolling shutter reset 80 by time 4n. In this manner, data stored in the pixels for each row of the frame may be read at a time 4n after the rolling shutter reset 80 of that row of pixels. Thus, each row of pixels read as the second data read 90 passes across the image sensor will have been exposed to light for the same time 4n. In this manner, multiple exposures (e.g., one with exposure time n and one with exposure time 4n) may be accomplished during a single frame capture, as sketched below.
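To make the timing concrete, the following sketch computes the per-row reset and read schedule implied by the description above. The specific frame time, row count, and fraction n are illustrative assumptions, not values from the disclosure.

```python
# Per-row timing for a rolling shutter with two staggered reads (illustrative).
# Each row is reset as the rolling shutter reset 80 sweeps the frame, then read
# once at time n (underexposed) and again at time 4n (overexposed).

FRAME_TIME_T = 1 / 30      # assumed time t for the reset to sweep the frame (s)
NUM_ROWS = 8               # tiny sensor for illustration
N = FRAME_TIME_T / 10      # assumed exposure time n, a fixed fraction of t

for row in range(NUM_ROWS):
    reset = row * FRAME_TIME_T / NUM_ROWS    # rolling reset reaches this row
    read1 = reset + N                        # first data read trails reset by n
    read2 = reset + 4 * N                    # second data read trails reset by 4n
    print(f"row {row}: reset {reset*1e3:6.2f} ms, "
          f"read1 {read1*1e3:6.2f} ms, read2 {read2*1e3:6.2f} ms")
```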
  • the second data read 90 might correspond to generation of a picture at a high exposure level (overexposed). That is, the second data read 90 may be useful in generating a picture in which shadowed areas of the image 76 are rendered clearly but bright areas of the picture may be washed out. Accordingly, the data corresponding to the second data read 90 may be useful in rendering the dark portions of a HDR image.
  • data from the first data read 86 may be used to generate bright portions of a HDR image and data from the second data read 90 may be used to generate dark portions of a HDR image so that the composite HDR image may have an improved dynamic range and, thus, be more visually appealing than a picture rendered from data of either of the first data read 86 or the second data read 90 .
  • a third data read may be undertaken at a time between the first data read 86 and the second data read 90 such that the third data read corresponds to a “normal” exposure (for example, at time 2n if the first data read 86 was at time n and the second data read was at time 4n).
  • This third data read may be combined with the data from the first data read 86 and the second data read 90 to generate a composite HDR image.
  • the processor(s) 16 executing an imaging application on the electronic device 10 may also alter the readout time n and any subsequent multiple thereof as well as the overall number of data reads. This alteration may be performed based on feedback regarding such factors as brightness of the subject to be photographed, the exposure index of the camera, noise, or other factors. For example, more data reads may occur at slower frame rates and delayed readout times n may occur at lower brightness levels.
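One way to picture this feedback loop is the toy controller below. The thresholds and scaling are invented for illustration; the disclosure only states the qualitative relationship (more reads at slower frame rates, delayed readouts at lower brightness).

```python
# Toy controller for the feedback described above (all thresholds invented).
# Darker scenes get a longer base exposure n; slower frame rates leave room
# in each frame for additional staggered reads.

def choose_read_schedule(mean_brightness: float, frame_rate_hz: float):
    """Return (exposure time n in seconds, multiples of n at which to read)."""
    frame_time = 1.0 / frame_rate_hz
    # Lower brightness -> longer integration time n, clamped to half the frame.
    n = min(frame_time / 2, frame_time / max(20 * mean_brightness, 2))
    # Slower frame rates -> more data reads fit within the frame time.
    read_multiples = [1, 4] if frame_rate_hz >= 30 else [1, 2, 4]
    return n, read_multiples

n, reads = choose_read_schedule(mean_brightness=0.2, frame_rate_hz=24)
print(f"n = {n * 1e3:.2f} ms, reads at {', '.join(f'{m}n' for m in reads)}")
```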
  • the exposure times may be adjusted to allow for modifications to the HDR image to be generated.
  • FIG. 6 illustrates a block diagram of elements of the camera 30 and the image processing circuitry 32 that may be utilized to generate a HDR image.
  • the camera 30 includes an image sensor 94 to capture and convert light into electrical signals.
  • the image sensor 94 may be, for example, a CMOS image sensor (e.g., a CMOS active-pixel sensor (APS)) or a CCD (charge-coupled device) sensor.
  • the image sensor 94 in the camera 30 includes an integrated circuit having an array of pixels, wherein each pixel includes a photodetector for sensing light from an image 76 .
  • the pixels may be reset via the rolling shutter reset 80 as described above with respect to FIG. 5.
  • a first data read 86 may occur at time n, as discussed above, to generate data corresponding to an underexposed picture.
  • a second data read 90 may occur at time 4n to generate data corresponding to an overexposed picture.
  • a third data read may occur at a time 2n to generate data corresponding to an average (i.e., normally) exposed picture.
  • the rolling shutter reset 80 , the first data read 86 , the second data read 90 , and any other data reads may be performed by the scan circuit 95 .
  • the scan circuit 95 may receive one or more activation signals, for example, from processor(s) 16 and may operate to transmit data read signals to various pixels of the image sensor 94 .
  • the scan circuit 95 may transmit activation signals to a row of pixels during the first data read 86 (e.g., at time n) to cause data to be transmitted from the activated row of pixels.
  • the scan circuit 95 may subsequently transmit activation signals to that same row of pixels during the second data read 90 (e.g., at time 4n) to cause data to be transmitted from the activated row of pixels. In this manner, the scan circuit 95 may allow for data to be read out of the image sensor 94 multiple times prior to a shutter reset.
  • the data read out of the image sensor 94 may be transmitted to an analog to digital (A/D) converter 96 .
  • the A/D converter 96 may, for example, be on the same chip or circuit as the image sensor 94 or the A/D converter 96 may be electrically coupled to the image sensor.
  • the A/D converter 96 may be, for example, a pipelined A/D converter or a column A/D converter.
  • the A/D converter 96 may receive the data read out from the image sensor 94 during, for example, the first data read 86 and may convert the received data into digital signals (digital data) that correspond to the received data values. This digital data may then be transmitted to the image processing circuitry 32 along path 98 .
  • Path 98 may include a Mobile Industry Processor Interface Camera Serial Interface (MIPI CSI), a Standard Mobile Imaging Architecture (SMIA) interface, or any other suitable parallel or serial interface. Accordingly, in one embodiment, the data may be transmitted to the image processing circuitry 32 in a serial fashion.
  • multiple data reads of the image sensor 94 may occur during a single exposure (e.g., a first data read 86 , a second data read 90 , and/or other data reads).
  • the data received via each of these reads must share the path 98 .
  • staggered reads of the image sensor 94 may be performed. That is, data from the first data read 86 may be transmitted to the A/D converter until such time as the second read 90 commences. At this time, data may be read from the image sensor 94 first along the row corresponding to the first data read 86 and then along the row corresponding to the second data read 90 .
  • This process may be repeated (e.g., in an interleaved manner) for as long as two reads overlap in time. Moreover, if a third or further read is introduced, it may likewise be staggered with the other reads such that each read transmits data to the A/D converter 96 in a staggered fashion. In this manner, as the A/D converter 96 converts the data provided to it into digital form as it is received, the digital data transmitted along path 98 may include, for example, data from the first data read 86, data from the second data read 90, as well as data from any additional data readouts, interleaved as sketched below.
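The following sketch shows how rows from two overlapping reads might interleave onto the single path. The row offset, read names, and framing are assumptions for illustration.

```python
# Staggered readout of one frame: two reads of the same sensor overlap in
# time, so their rows alternate on the single output path (illustrative).

def staggered_readout(num_rows: int, row_offset: int):
    """Yield (read_id, row) in path order.

    `row_offset` is how many rows the first read leads the second, e.g. 3
    when reads occur at n and 4n and one row is reset per row time.
    """
    for step in range(num_rows + row_offset):
        if step < num_rows:
            yield ("read1", step)               # short-exposure row goes first
        if 0 <= step - row_offset < num_rows:
            yield ("read2", step - row_offset)  # long-exposure row trails

print(list(staggered_readout(num_rows=6, row_offset=3)))
```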
  • a buffer 97 may be implemented as a line buffer (i.e., memory that stores the data corresponding to a read-out row) on the same chip or circuit as the image sensor 94.
  • This buffer 97 may store the data received from a data read (e.g., the first data read 86) so that the data may be binned (e.g., 4×4 binning) prior to conversion by the A/D converter 96. That is, data from a cluster of pixels may be combined into a single pixel value, thus reducing the impact of read noise imparted to the data read from the image sensor 94.
  • data related to both overexposed and underexposed pictures may be binned to reduce the amount of data transferred on the interface.
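A minimal sketch of such binning follows; it simply averages 4×4 blocks and, for brevity, ignores the color filter pattern that a real sensor pipeline would have to respect.

```python
# 4x4 binning sketch: combine each 4x4 cluster of pixel values into a single
# value (here, the mean), reducing read noise and interface traffic.
import numpy as np

def bin_pixels(raw: np.ndarray, factor: int = 4) -> np.ndarray:
    """Average `factor` x `factor` blocks of a 2-D pixel array."""
    h = raw.shape[0] - raw.shape[0] % factor   # trim to multiples of `factor`
    w = raw.shape[1] - raw.shape[1] % factor
    blocks = raw[:h, :w].reshape(h // factor, factor, w // factor, factor)
    return blocks.mean(axis=(1, 3))

raw = np.random.default_rng(0).integers(0, 1024, size=(8, 8)).astype(float)
print(bin_pixels(raw).shape)   # (2, 2): each output value covers 16 pixels
```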
  • buffer 97 may alternatively be utilized to align the streams of data from multiple data reads. That is, buffer 97 may be a frame buffer that stores the entirety of one data read (e.g., the second data read 90) for later transmission while another data read (e.g., the first data read 86) is transmitted without storage. Thus, while the data reads themselves may be staggered, the data transmitted along path 98 may include all of the data read out from the first data read 86 followed by all of the data read out from the second data read 90 (transmitted from the buffer 97). Use of the buffer 97 in this manner may allow for reduced complexity in separating data from data path 98 (i.e., the data on data path 98 will not be interleaved), as sketched below.
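A sketch of this alignment strategy, reusing the illustrative stream format from the staggered-readout sketch above:

```python
# Frame-buffer alignment sketch: rows from the first read are transmitted as
# they arrive, while rows from the overlapping second read are held back and
# drained afterwards, so the path carries the exposures back to back.

def align_streams(interleaved):
    """Reorder an interleaved (read_id, row) stream so read1 precedes read2."""
    frame_buffer = []                      # plays the role of buffer 97
    output = []
    for read_id, row in interleaved:
        if read_id == "read1":
            output.append((read_id, row))  # transmit immediately
        else:
            frame_buffer.append((read_id, row))
    output.extend(frame_buffer)            # then send the buffered second read
    return output

mixed = [("read1", 0), ("read1", 1), ("read2", 0), ("read1", 2), ("read2", 1)]
print(align_streams(mixed))
```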
  • the digital data passed along path 98 may be received at the image processing circuitry 32 .
  • this digital data may include data from more than one read of the image sensor 94 (i.e., data related to different exposure times).
  • the digital data corresponding to the first data read 86 , the second data read 90 , or any other data reads may include an identifier, such as one or more identification bits, so that the data received may be identified as belonging to a particular data read (e.g., the first data read 86 ).
  • the digital data may be tagged so that it may be correctly categorized, for example by an image signal processor (ISP) 100, as relating to a particular data read.
  • the ISP 100 might be utilized to adjust the readout times (e.g., when data reads are performed).
  • the ISP 100 may be utilized to generate statistical data relating to such factors as exposure, white balance, and focus of the imaging device 30 . This data may be utilized to adjust the readout times of the imaging device 30 and, thus, the HDR image.
  • the ISP 100 may receive the digital data and may operate to separate the data received from path 98 into respective exposures. For example, the ISP 100 may separate data read from the first data read 86 and store that separated data in buffer 102 , which may be a memory location. Similarly, the ISP 100 may separate data read from the second data read 90 from the received data along path 98 and store the separated data in buffer 104 , which may be a memory location. In some embodiments, buffer 102 and 104 may be separate locations of a single physical memory circuit. In other embodiments, buffer 102 and 104 may be located in distinct memory circuits. Additionally, the ISP 100 may operate to separate as many data portions as are read from the image sensor 94 , and, for example, may include additional buffers that correspond to each additional data read performed.
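The separation step might look like the following sketch, in which each unit of data carries an identifier for its read. The packet layout and tag values are assumptions; the disclosure only requires that the data be identifiable per read.

```python
# Deinterleaving sketch: route tagged packets from the single path into
# per-exposure buffers (buffers 102 and 104 in the description above).
from collections import defaultdict

def separate_exposures(stream):
    """Split (read_id, row, pixels) packets into per-read, row-ordered buffers."""
    buffers = defaultdict(list)
    for read_id, row, pixels in stream:
        buffers[read_id].append((row, pixels))
    # Rows of one exposure may arrive interleaved with the other exposure's
    # rows; sort each buffer back into frame order before use.
    return {rid: [px for _, px in sorted(rows)] for rid, rows in buffers.items()}

stream = [("read1", 0, [10, 12]), ("read2", 0, [40, 44]),
          ("read1", 1, [11, 13]), ("read2", 1, [41, 46])]
print(separate_exposures(stream))
```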
  • each of the buffers 102 and 104 may include a full set of data that may be utilized to form an HDR image. Accordingly, the ISP 100 may combine the data stored in buffer 102 and 104 to generate a composite HDR image, which may be transmitted to buffer 106 for retrieval by the processor(s) 16 .
  • the HDR image may contain a higher bit depth than the data stored in buffer 102 and 104 .
  • the HDR image stored in buffer 106 may include one, two, three, or more bits of extra information relative to the data stored in buffer 102 or 104 . This higher bit depth of the HDR image may allow for more clarity and/or a more visually appealing picture to be generated.
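As a sketch of the final combination, the function below merges a short and a long exposure into an image with roughly two extra bits of depth. The saturation-based selection rule is an assumption for illustration; the disclosure specifies only that the stored reads are combined into a higher-bit-depth HDR image.

```python
# HDR combination sketch: merge a 1x (short) and 4x (long) exposure of the
# same frame into a deeper-bit-depth image (weighting scheme assumed).
import numpy as np

def merge_hdr(short_exp, long_exp, ratio=4, in_bits=10):
    """Combine two exposures; output spans ~`in_bits` + 2 bits for ratio=4."""
    max_val = (1 << in_bits) - 1
    # Bring the short exposure onto the long exposure's radiometric scale.
    scaled_short = short_exp.astype(np.uint32) * ratio
    # Prefer the long exposure (better shadow detail) except where it has
    # saturated, i.e., in the washed-out bright areas.
    return np.where(long_exp >= max_val, scaled_short, long_exp.astype(np.uint32))

short = np.array([[100, 1023], [5, 300]], dtype=np.uint16)    # read at time n
long_ = np.array([[400, 1023], [20, 1023]], dtype=np.uint16)  # read at 4n
print(merge_hdr(short, long_))
```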
  • FIG. 7 illustrates a flow chart 108 that details one embodiment of the operation of the ISP 100 .
  • the ISP 100 may receive the digital data from path 98 .
  • the ISP 100 may separate the data received from path 98 into respective exposures in step 112. This separation may be accomplished by determining which data read a given set of data corresponds to, by analyzing an identifier (i.e., one or more identification bits, for example, appended to the data) associated with the received data, so that the received data may be classified as belonging to a particular data read (e.g., the first data read 86).
  • the ISP 100 may store the separated data in step 114 .
  • data from the first data read 86 may be stored in buffer 102 , which may be a memory location.
  • data from the second data read 90 may be stored in buffer 104 , which may be a memory location separate from buffer 102 .
  • data from any additional data reads may be stored in additional buffers by the ISP 100.
  • the ISP 100 may generate a HDR image. That is, each of the buffers 102 and 104 , as well as any additional buffers utilized, may include a full set of data that may be utilized to form an HDR image.
  • the ISP 100 may combine the data stored in the buffers to generate a composite HDR image, which may be transmitted to buffer 106 for transmission to the processor(s) 16 in step 118 . In this manner, the ISP 100 may be utilized to generate a HDR image from a single rolling shutter reset 80 of the image capture device 30 .
  • HDR imaging techniques may be implemented in any suitable manner, including hardware (suitably configured circuitry), software (e.g., via a computer program including executable code stored on one or more tangible computer-readable media), or a combination of both hardware and software elements.

Abstract

A system and method of generating a high dynamic range image. A single reset of an image sensor may be executed followed by two or more reads of the sensor to retrieve data. These reads of the image sensor may be done prior to a subsequent reset of the sensor. These reads may also be accomplished at predetermined times relative to one another. Data read out during these scans may be deinterleaved by an image signal processor and combined into a high dynamic range image.

Description

BACKGROUND
The present disclosure relates generally to image capture systems and techniques.
This section is intended to introduce the reader to various aspects of art that may be related to various aspects of the present disclosure, which are described and/or claimed below. This discussion is believed to be helpful in providing the reader with background information to facilitate a better understanding of the various aspects of the present disclosure. Accordingly, it should be understood that these statements are to be read in this light, and not as admissions of prior art.
High dynamic range (HDR) imaging generally relates to a set of imaging techniques that allows for the capture and representation of a greater dynamic range of luminances between the lightest and darkest areas of an image than standard digital imaging techniques. A wider dynamic range allows HDR images to more accurately represent the wide range of intensity levels found in real-world scenes. One method for capturing HDR images includes the merging of multiple independently captured photographs. For instance, this process may include capturing multiple images at different exposures in succession, and then processing them to generate a composite HDR image.
However, there exist disadvantages to the process of generating a HDR image from multiple independently captured images. For example, the scene may change between successively captured images such that a composite HDR image generated therefrom may not be completely aligned. This may generate motion artifacts in the composite HDR image. Further, the images may be affected by local motion in the image scene, e.g., trees swaying in the wind, people and faces shifting slightly, etc. Additionally, generation of a HDR image may be delayed by the time required to capture the multiple constituent images. Accordingly, techniques and systems for increasing the speed and continuity with which HDR images may be generated are desirable.
SUMMARY
A summary of certain embodiments disclosed herein is set forth below. It should be understood that these aspects are presented merely to provide the reader with a brief summary of these certain embodiments and that these aspects are not intended to limit the scope of this disclosure. Indeed, this disclosure may encompass a variety of aspects that may not be set forth below.
Embodiments of the present invention relate to the generation of images, particularly in an HDR imaging application. For instance, in one embodiment, a single image may be captured by an image capture device, such as a camera. This captured image may be the result of light energy that is converted into electrical signals (e.g., a voltage) by an image sensor of the image capture device. Multiple scans (i.e., reads) of the image sensor may be made such that one read may correspond to an underexposed representation of the image to be captured while a second read may correspond to an overexposed representation of the image to be captured. This read data may be transmitted along a single path to an image processing circuit, where an image signal processor separates data from the first and second scan. This separated data may be independently stored and recombined by the image signal processor to generate a HDR image, which may be transmitted for display on a display of an electronic device.
Various refinements of the features noted above may exist in relation to various aspects of the present disclosure. Further features may also be incorporated in these various aspects as well. These refinements and additional features may exist individually or in any combination. For instance, various features discussed below in relation to one or more of the illustrated embodiments may be incorporated into any of the above-described aspects of the present disclosure alone or in any combination. Again, the brief summary presented above is intended only to familiarize the reader with certain aspects and contexts of embodiments of the present disclosure without limitation to the claimed subject matter.
BRIEF DESCRIPTION OF THE DRAWINGS
Various aspects of this disclosure may be better understood upon reading the following detailed description and upon reference to the drawings in which:
FIG. 1 is a simplified block diagram depicting components of an example of an electronic device that includes an image capture device and an image signal processing subsystem configured to implement an HDR image processing technique in accordance with aspects set forth in the present disclosure;
FIG. 2 is a graphical representation of a 2×2 pixel block of a Bayer color filter array that may be implemented in the image capture device of the electronic device of FIG. 1;
FIG. 3 is a front view of the electronic device of FIG. 1 in the form of a handheld portable electronic device in accordance with aspects of the present disclosure;
FIG. 4 is a rear view of the handheld electronic device shown in FIG. 3;
FIG. 5 shows an example of an image to be captured via the electronic device of FIG. 1;
FIG. 6 is a block diagram of the image capture device and the image signal processing subsystem of the electronic device of FIG. 1; and
FIG. 7 is a flow chart illustrating the operation of an image signal processor of the image signal processing subsystem of FIG. 6.
DETAILED DESCRIPTION OF SPECIFIC EMBODIMENTS
One or more specific embodiments of the present disclosure will be described below. These described embodiments are only examples of the presently disclosed techniques. Additionally, in an effort to provide a concise description of these embodiments, all features of an actual implementation may not be described in the specification. It should be appreciated that in the development of any such actual implementation, as in any engineering or design project, numerous implementation-specific decisions must be made to achieve the developers' specific goals, such as compliance with system-related and business-related constraints, which may vary from one implementation to another. Moreover, it should be appreciated that such a development effort might be complex and time consuming, but would nevertheless be a routine undertaking of design, fabrication, and manufacture for those of ordinary skill having the benefit of this disclosure.
When introducing elements of various embodiments of the present disclosure, the articles “a,” “an,” and “the” are intended to mean that there are one or more of the elements. The terms “comprising,” “including,” and “having” are intended to be inclusive and mean that there may be additional elements other than the listed elements. Additionally, it should be understood that references to “one embodiment” or “an embodiment” of the present disclosure are not intended to be interpreted as excluding the existence of additional embodiments that also incorporate the recited features.
As will be discussed below, the present disclosure generally provides various techniques for HDR image generation using a digital image sensor and for merging images captured during a single exposure. FIG. 1 is a block diagram illustrating an example of an electronic device 10 that may provide for the generation of digital images in accordance with embodiments of the present disclosure. The electronic device 10 may be any type of electronic device, such as a laptop or desktop computer, a mobile phone, a digital media player, or the like, that may receive and process image data, such as data acquired using one or more image sensing devices. By way of example only, the electronic device 10 may be a portable electronic device, such as a model of an iPod® or iPhone®, available from Apple Inc. of Cupertino, Calif. Additionally, the electronic device 10 may be a desktop, laptop, or tablet computer, such as a model of a MacBook®, MacBook® Pro, MacBook Air®, iMac®, Mac® Mini, Mac Pro®, or iPad® available from Apple Inc. In other embodiments, electronic device 10 may also be a model of an electronic device from another manufacturer that is capable of acquiring and processing image data.
The electronic device 10 may include various internal and/or external components, which contribute to the function of the device 10. Those of ordinary skill in the art will appreciate that the various functional blocks shown in FIG. 1 may comprise hardware elements (including circuitry), software elements (including computer code stored on a computer-readable medium) or a combination of both hardware and software elements. For example, in the presently illustrated embodiment, the electronic device 10 may include input/output (I/O) ports 12, input structures 14, one or more processors 16, memory device 18, non-volatile storage 20, expansion card(s) 22, networking device 24, power source 26, and display 28. Additionally, the electronic device 10 may include one or more imaging devices 30, such as a digital camera, and image processing circuitry (ISP sub-system) 32. As will be discussed further below, the image processing circuitry 32 may implement image processing techniques for processing image data to generate composite HDR images.
The processor(s) 16 may control the general operation of the device 10. For instance, the processor(s) 16 may provide the processing capability to execute an operating system, programs, user and application interfaces, and any other functions of the electronic device 10. The processor(s) 16 may include one or more microprocessors, such as one or more “general-purpose” microprocessors, one or more special-purpose microprocessors and/or application-specific microprocessors (ASICs), or a combination of such processing components. For example, the processor(s) 16 may include one or more processing engines (e.g., RISC or CISC processors, graphics processors (GPU), video processors, and/or related chip sets). As will be appreciated, the processor(s) 16 may be coupled to one or more data buses for transferring data and instructions between various components of the device 10. In certain embodiments, the processor(s) 16 may provide the processing capability to execute imaging applications on the electronic device 10, such as Photo Booth®, Aperture®, iPhoto®, or Preview®, available from Apple Inc., or the “Camera” and/or “Photo” applications provided by Apple Inc. and available on models of the iPhone®. In one embodiment, the processor(s) 16 may also provide for the capability to execute a video conferencing application on the device 10, such as FaceTime®, available from Apple Inc.
The instructions or data to be processed by the processor(s) 16 may be stored in a computer-readable medium, such as a memory device 18. The memory device 18 may be provided as a volatile memory, such as random access memory (RAM) or as a non-volatile memory, such as read-only memory (ROM), or as a combination of one or more RAM and ROM devices. The memory 18 may store a variety of information and may be used for various purposes. For example, the memory 18 may store firmware for the electronic device 10, such as a basic input/output system (BIOS), an operating system, various programs, applications, or any other routines that may be executed on the electronic device 10, including user interface functions, processor functions, and so forth. In addition, the memory 18 may be used for buffering or caching during operation of the electronic device 10. For instance, in one embodiment, the memory 18 may include one or more frame buffers for buffering video data as it is being output to the display 28.
In addition to the memory device 18, the electronic device 10 may further include a non-volatile storage 20 for persistent storage of data and/or instructions. The non-volatile storage 20 may include flash memory, a hard drive, or any other optical, magnetic, and/or solid-state storage media, or some combination thereof. Thus, although depicted as a single device in FIG. 1 for purposes of clarity, it should be understood that the non-volatile storage device(s) 20 may include a combination of one or more of the above-listed storage devices operating in conjunction with the processor(s) 16. The non-volatile storage 20 may be used to store firmware, data files, image data, software programs and applications, wireless connection information, personal information, user preferences, and any other suitable data. In accordance with aspects of the present disclosure, image data stored in the non-volatile storage 20 and/or the memory device 18 may be processed by the image processing circuitry 32 prior to being output on a display.
The display 28 may be used to display various images generated by device 10, such as a GUI for an operating system, or image data (including still images and video data) processed by the image processing circuitry 32, as will be discussed further below. As mentioned above, the image data may include image data acquired using the imaging device 30 or image data retrieved from the memory 18 and/or non-volatile storage 20. The display 28 may be any suitable type of display, such as a liquid crystal display (LCD), plasma display, or an organic light emitting diode (OLED) display, for example. In one embodiment, the display may be a high-resolution LCD display having 300 or more pixels per inch, such as a Retina Display®, available from Apple Inc. Further, in some embodiments, the display 28 may be provided in conjunction with the above-discussed touch-sensitive mechanism (e.g., a touch screen) that may function as an input structure (14) for the electronic device 10.
As discussed above, the electronic device 10 may include imaging device(s) 30, which may be provided as a digital camera configured to acquire both still images and moving images (e.g., video). The camera 30 may include a lens and one or more image sensors to capture and convert light into electrical signals. By way of example only, the image sensor may include a CMOS image sensor (e.g., a CMOS active-pixel sensor (APS)) or a CCD (charge-coupled device) sensor. Generally, the image sensor in the camera 30 includes an integrated circuit having an array of pixels, wherein each pixel includes a photodetector for sensing light from an image scene. The image sensor may be coupled to the ISP sub-system 32 via a sensor interface, which may utilize a Standard Mobile Imaging Architecture (SMIA) interface or any other suitable serial or parallel image sensor interface, or a combination of such interfaces.
As those skilled in the art will appreciate, the photodetectors in the imaging pixels of the sensor generally detect the intensity of light captured via the camera lenses. However, photodetectors, by themselves, are generally unable to detect the wavelength of the captured light and, thus, are unable to determine color information. Accordingly, the image sensor may further include a color filter array (CFA) that may overlay or be disposed over the pixel array of the image sensor to capture color information. The color filter array may include an array of small color filters, each of which may overlap a respective pixel of the image sensor and may filter the captured light by wavelength. Thus, when used in conjunction, the color filter array and the image sensor may provide both wavelength and intensity information with regard to light captured through the camera, which may be representative of a captured image.
In one embodiment, the color filter array may include a Bayer color filter array, which provides a color pattern that is 50% green elements, 25% red elements, and 25% blue elements. FIG. 2 shows a 2×2 pixel block of a Bayer CFA that includes 2 green elements (referred to as Gr and Gb), 1 red element (R), and 1 blue element (B). Thus, an image sensor utilizing a Bayer color filter array may provide information regarding the intensity of the light received by the camera 30 at the green, red, and blue wavelengths, whereby each image pixel records only one of the three colors (red, green or blue). This information, which may be referred to as “raw image data,” may then be processed using one or more demosaicing techniques to convert the raw image data into a full color image, generally by interpolating a set of red, green, and blue values for each pixel. Such demosaicing techniques may be performed by the ISP sub-system 32.
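To illustrate the idea, here is a minimal bilinear-style demosaic for the RGGB layout of FIG. 2. Real ISP pipelines, presumably including the one described here, use considerably more sophisticated interpolation; edge handling below simply wraps around for brevity.

```python
# Naive demosaicing sketch: reconstruct full RGB from an RGGB Bayer mosaic by
# averaging each missing color from the known samples in a 3x3 neighborhood.
import numpy as np

def demosaic_bilinear(raw: np.ndarray) -> np.ndarray:
    """Interpolate an RGGB Bayer mosaic (H x W) into an H x W x 3 RGB image."""
    h, w = raw.shape
    masks = np.zeros((h, w, 3), dtype=bool)
    masks[0::2, 0::2, 0] = True                          # R sites
    masks[0::2, 1::2, 1] = masks[1::2, 0::2, 1] = True   # G sites (Gr and Gb)
    masks[1::2, 1::2, 2] = True                          # B sites
    rgb = np.zeros((h, w, 3))
    for c in range(3):
        known = np.where(masks[..., c], raw, 0.0)
        count = masks[..., c].astype(float)
        acc = np.zeros((h, w))
        cnt = np.zeros((h, w))
        for dy in (-1, 0, 1):                 # sum known samples (and their
            for dx in (-1, 0, 1):             # counts) over a 3x3 window
                acc += np.roll(np.roll(known, dy, axis=0), dx, axis=1)
                cnt += np.roll(np.roll(count, dy, axis=0), dx, axis=1)
        rgb[..., c] = acc / np.maximum(cnt, 1.0)
    return rgb

raw = np.random.default_rng(1).integers(0, 1024, size=(4, 4)).astype(float)
print(demosaic_bilinear(raw).shape)           # (4, 4, 3)
```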
Continuing to FIGS. 3 and 4, the electronic device 10 is depicted in the form of a portable handheld electronic device 50, which may be a model of an iPod®, such as an iPod Touch®, or iPhone® available from Apple Inc. The handheld device 50 includes an enclosure 52, which may function to protect the interior components from physical damage and to shield them from electromagnetic interference (EMI). The enclosure 52 may be formed from any suitable material or combination of materials, such as plastic, metal, alloy, or a composite material, and may allow certain frequencies of electromagnetic radiation, such as wireless networking (e.g., 802.11 a/b/g/n networking) and/or telecommunication signals (e.g., GPRS, EDGE, 3G, LTE, etc.), to pass through to wireless communication circuitry (e.g., network device 24), which may be disposed within the enclosure 52, as shown in FIG. 3.
The enclosure 52 also includes various user input structures 14 through which a user may interface with the handheld device 50. For instance, each input structure 14 may control one or more respective device functions when pressed or actuated. By way of example, one or more of the input structures 14 may cause a "home" screen or menu to be displayed, toggle between sleep, wake, or powered on/off modes, silence a ringer for a cellular phone application, increase or decrease a volume output, and so forth. It should be understood that the illustrated input structures 14 are merely exemplary, and that the handheld device 50 may include any number of suitable user input structures in various forms, including buttons, switches, keys, knobs, scroll wheels, and so forth.
The handheld device 50 may include various I/O ports 12. For instance, the depicted I/O ports 12 may include a proprietary connection port 12 a (e.g., a 30-pin dock-connector available from Apple Inc.) for transmitting and receiving data and for charging a power source 26, which may include one or more removable, rechargeable, and/or replaceable batteries. The I/O ports may also include an audio connection port 12 b for connecting the device 50 to an audio output device (e.g., headphones or speakers). Further, in embodiments where the handheld device 50 provides mobile phone functionality, the I/O port 12 c may be provided for receiving a subscriber identity module (SIM) card (e.g., an expansion card 22).
The display 28, which may be an LCD, OLED, or any suitable type of display, may display various images generated by the handheld device 50. For example, the display 28 may display various system indicators 54 for providing feedback to a user with regard to one or more states of handheld device 50, such as power status, signal strength, external device connections, and so forth. The display 28 may also display a graphical user interface (GUI) 56 that allows a user to interact with the device 50. In certain embodiments, the presently displayed screen image of the GUI 56 may represent a home-screen of an operating system running on the device 50, which may be a version of the Mac OS® or iOS® (previously iPhone OS®) operating systems, available from Apple Inc.
The GUI 56 may include various graphical elements, such as icons 58 that may correspond to various applications that may be opened or executed upon user selection (e.g., receiving a user input corresponding to the selection of a particular icon 58). In some embodiments, the selection of an icon 58 may lead to a hierarchical navigation process, such that selection of an icon 58 leads to a screen or opens another graphical window that includes one or more additional icons or other GUI elements. In the illustrated embodiment, one of the icons 58 may represent a camera application 66 that may be used in conjunction with one or both of a first front-facing camera 30 a located on the front side of the device 50 and a second rear-facing camera 30 b (shown in phantom lines in FIG. 3) on the rear of the device 50 for acquiring images. Referring briefly to FIG. 4, a rear view of the handheld device 50 is illustrated showing the rear-facing camera 30 b as being integrated with the enclosure 52 and positioned on the rear of the handheld device 50. In the illustrated embodiment, the rear of the handheld device 50 may include a flash module (also referred to as a strobe) 64, which may be used to illuminate an image scene being captured using the rear-facing camera 30 b. By way of example, the flash module 64 may include a xenon lighting device and/or a light emitting diode (LED). In one embodiment, the front- and rear-facing cameras 30 a and 30 b may be utilized to provide video-conferencing capabilities, such as through the use of a video-conferencing application based upon FaceTime®, available from Apple Inc.
Additionally, the handheld device 50 may include various audio input and output elements. For example, the audio input/output elements 70 may include an input receiver, such as a microphone. Thus, in embodiments where the handheld device 50 includes mobile phone functionality, the input receivers may receive user audio input, such as a user's voice. Additionally, the audio input/output elements 70 may include one or more output transmitters, which may include one or more speakers that function to transmit audio signals to a user, such as during the playback of music data using a media player application 72. In a mobile phone embodiment, an additional audio output transmitter 74 may be provided, as shown in FIG. 3. Like the output transmitters of the audio input/output elements 70, the output transmitter 74 may also include one or more speakers that transmit audio signals to a user, such as voice data received during a telephone call. Thus, in a mobile phone embodiment, the audio input/output elements 70 and 74 may collectively function as the audio receiving and transmitting elements of a telephone.
Having now provided some context with regard to some form factors that the electronic device 10 may take, certain HDR imaging techniques that may be implemented on the electronic device 10 in accordance with embodiments set forth in the present disclosure will now be discussed in further detail. For example, the image processing circuitry 32 may perform image merging of captured images to generate a composite HDR image. In one embodiment, for HDR imaging, the camera 30 may acquire multiple images during a single exposure, including one or more images at a low exposure level (underexposed) and one or more images at a high exposure level (overexposed), which may be combined by the image processing circuitry 32 into a single composite HDR image. Alternatively, the camera 30 may acquire at least one image at a low exposure level (underexposed), at least one image at a normal exposure level, and at least one image at a high exposure level (overexposed). The image processing circuitry 32 may process these images to generate a composite HDR image.
FIG. 5 illustrates an example of an image 76 to be captured by the image sensor of the camera 30. The image 76 may be captured by the camera 30 via a process utilizing a rolling shutter method of image acquisition (i.e., a line scan acquisition and line scan reset for pixels to start the exposure). That is, the camera 30 may record an image not as a single snapshot at a single point in time, but rather by rolling (i.e., moving) the shutter across the exposable area of the image sensor rather than exposing the entire image area (frame) at the same time. Accordingly, a frame of information is captured by pixels in the image sensor of the camera 30 in rows (either vertically or horizontally oriented), such that all parts of the image 76 are not recorded at exactly the same time, though the entirety of the image 76 may be reconstituted for display as the single image 76 during playback. In other embodiments, a global shutter method of image acquisition may be employed, in which the entire frame of the image sensor is exposed at the same time.
As illustrated in FIG. 5, the scan direction for the image 76 may be represented by line 78. That is, the rolling shutter reset 80 may move from the uppermost portion 82 of the image 76 to the lowermost portion 84 of the image 76 to reset pixels in the image sensor. As the pixels of the image sensor are reset by the rolling shutter reset 80, the reset pixels may begin to collect light that corresponds to a portion of the image 76. That is, the pixels of the image sensor in the camera 30 may be exposed to light, which will be collected and utilized to generate the image 76. In one embodiment, the rolling shutter reset 80 may move from portion 82 to portion 84 of the image 76 (i.e., across a full frame) in a fixed amount of time "t" (i.e., the frame time). For example, this fixed amount of time t may be 1/15 of a second, 1/30 of a second, 1/60 of a second, 1/125 of a second, 1/250 of a second, 1/500 of a second, 1/1000 of a second, 1/5000 of a second, 1/10000 of a second, or another amount of time. This fixed amount of time t may also automatically be adjusted by, for example, the processor(s) 16 executing an imaging application on the electronic device 10, such as Photo Booth®, Aperture®, iPhoto®, or Preview®, available from Apple Inc., or the "Camera" and/or "Photo" applications provided by Apple Inc. and available on models of the iPhone®, or adjusted by a user during interaction with, for example, one of the above-mentioned imaging applications.
To generate a HDR image during a single exposure of the frame (i.e., the fixed amount of time t during which the rolling shutter reset 80 moves across a frame), multiple reads of the same row of pixels of the image sensor may occur. For example, a first data read 86 of the data stored in a row of pixels may be undertaken at a time n, where n is a fixed fraction of the frame time t. This time n may be, for example, ½, ⅓, ¼, ⅕, 1/10, 1/20, or another fraction of the frame time t. This time n may be represented as line 88 in FIG. 5. That is, the first data read 86 may occur at a time n subsequent to the reset of a row of pixels by the rolling shutter reset 80. Accordingly, as the rolling shutter reset 80 passes downwards along line 78, the first data read 86 may trail the rolling shutter reset 80 by time n. In this manner, data stored in the pixels for each row of the frame may be read at a time n after the rolling shutter reset 80 of that row of pixels. Thus, each row of pixels read as the first data read 86 passes across the image sensor will have been exposed to light for the same time n, which may be referred to as an exposure time or integration time.
It should be noted that this first data read 86 may correspond to generation of a picture at a low exposure level (underexposed). That is, the first data read 86 may be useful in generating a picture in which shadowed areas of the image 76 are poorly rendered but bright areas of the picture are rendered clearly. Accordingly, the data corresponding to the first data read 86 may be useful in rendering the bright portions of a HDR image.
Subsequent to the first data read 86, a second data read 90 may be performed on the data stored in, for example, a row of pixels of the image sensor. This second data read 90 may occur at a time m, represented by line 92 in FIG. 5. This time m may be a multiple of time n. For example, time m may be equal to n, 1.5 times n, two times n, 2.5 times n, three times n, 3.5 times n, four times n, or another multiple of time n. In one embodiment, time m represented along line 92 may be three times n, such that the second data read 90 may be performed at an overall time of four times n (i.e., 4n). That is, the second data read 90 may occur at a time 4n subsequent to the reset of a row of pixels by the rolling shutter reset 80. Accordingly, as the rolling shutter reset 80 passes downwards along line 78, the second data read 90 may trail the rolling shutter reset 80 by time 4n. In this manner, data stored in the pixels for each row of the frame may be read at a time 4n after the rolling shutter reset 80 of that row of pixels. Thus, each row of pixels read as the second data read 90 passes across the image sensor will have been exposed to light for the same time 4n. In this manner, multiple exposures (e.g., one with exposure time n and one with exposure time 4n) may be accomplished during a single frame capture.
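To make the timing concrete, the per-row reset and read times implied by this scheme can be tabulated. The sketch below is illustrative only; the one-row-per-line-time reset advance and the parameter names are assumptions, with the first data read 86 trailing each reset by n and the second data read 90 by 4n:

```python
def row_schedule(num_rows, line_time, n):
    """Return (reset, first_read, second_read) times for each row,
    assuming the rolling shutter reset advances one row per line_time,
    the first read trails the reset by n (short exposure), and the
    second read trails it by 4 * n (long exposure)."""
    return [(row * line_time,
             row * line_time + n,
             row * line_time + 4 * n) for row in range(num_rows)]

# Example: 4 rows, a 10 microsecond line time, and a short exposure n of 1 ms.
for reset, read1, read2 in row_schedule(4, 10e-6, 1e-3):
    print(f"reset={reset * 1e6:7.1f} us  read1={read1 * 1e6:7.1f} us  read2={read2 * 1e6:7.1f} us")
```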
It should be noted that the second data read 90 may correspond to generation of a picture at a high exposure level (overexposed). That is, the second data read 90 may be useful in generating a picture in which shadowed areas of the image 76 are rendered clearly but bright areas of the picture may be washed out. Accordingly, the data corresponding to the second data read 90 may be useful in rendering the dark portions of a HDR image. In this manner, data from the first data read 86 may be used to generate bright portions of a HDR image and data from the second data read 90 may be used to generate dark portions of a HDR image, so that the composite HDR image may have an improved dynamic range and, thus, be more visually appealing than a picture rendered from the data of either the first data read 86 or the second data read 90 alone. Additionally, other data reads may be undertaken in addition to the first data read 86 and the second data read 90. For example, a third data read may be undertaken at a time between the first data read 86 and the second data read 90 such that the third data read corresponds to a "normal" exposure (for example, at time 2n if the first data read 86 was at time n and the second data read was at time 4n). This third data read may be combined with the data from the first data read 86 and the second data read 90 to generate a composite HDR image.
Additionally, the processor(s) 16 executing an imaging application on the electronic device 10 may also alter the readout time n and any subsequent multiple thereof as well as the overall number of data reads. This alteration may be performed based on feedback regarding such factors as brightness of the subject to be photographed, the exposure index of the camera, noise, or other factors. For example, more data reads may occur at slower frame rates and delayed readout times n may occur at lower brightness levels. Through alteration of the readout times of the data reads (e.g., data readouts 86 and 90), the exposure times may be adjusted to allow for modifications to the HDR image to be generated.
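A feedback-driven controller of this kind might be approximated as follows. The thresholds, names, and policy below are invented purely for the sketch and are not taken from the disclosure:

```python
def choose_read_plan(mean_brightness, frame_time):
    """Pick a base exposure n and a number of data reads from simple feedback:
    dim scenes get a longer base exposure, and slower frame rates leave room
    for an extra intermediate read. All thresholds are arbitrary."""
    n = frame_time / (10 if mean_brightness > 0.5 else 4)
    num_reads = 3 if frame_time >= 1 / 30 else 2
    return n, num_reads
```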
FIG. 6 illustrates a block diagram of elements of the camera 30 and the image processing circuitry 32 that may be utilized to generate a HDR image. The camera 30 includes an image sensor 94 to capture and convert light into electrical signals. The image sensor 94 may be, for example, a CMOS image sensor (e.g., a CMOS active-pixel sensor (APS)) or a CCD (charge-coupled device) sensor. Generally, the image sensor 94 in the camera 30 includes an integrated circuit having an array of pixels, wherein each pixel includes a photodetector for sensing light from an image 76. The pixels may be reset via the rolling shutter reset 80 as described above with respect to FIG. 5 and may operate to read out the captured light values as electrical signals during a first data read 86, a second data read 90, or other data reads. For example, a first data read 86 may occur at the time n discussed above to generate data corresponding to an underexposed picture, a second data read 90 may occur at time 4n to generate data corresponding to an overexposed picture, and a third data read may occur at a time 2n to generate data corresponding to a picture at an average (i.e., normal) exposure.
The rolling shutter reset 80, the first data read 86, the second data read 90, and any other data reads may be performed by the scan circuit 95. The scan circuit 95 may receive one or more activation signals, for example, from processor(s) 16 and may operate to transmit data read signals to various pixels of the image sensor 94. For example, the scan circuit 95 may transmit activation signals to a row of pixels during the first data read 86 (e.g., at time n) to cause data to be transmitted from the activated row of pixels. The scan circuit 95 may subsequently transmit activation signals to that same row of pixels during the second data read 90 (e.g., at time 4n) to cause data to be transmitted from the activated row of pixels. In this manner, the scan circuit 95 may allow for data to be read out of the image sensor 94 multiple times prior to a shutter reset.
The data read out of the image sensor 94 may be transmitted to an analog to digital (A/D) converter 96. The A/D converter 96 may, for example, be on the same chip or circuit as the image sensor 94, or the A/D converter 96 may be electrically coupled to the image sensor 94. The A/D converter 96 may be, for example, a pipelined A/D converter or a column A/D converter. The A/D converter 96 may receive the data read out from the image sensor 94 during, for example, the first data read 86 and may convert the received data into digital signals (digital data) that correspond to the received data values. This digital data may then be transmitted to the image processing circuitry 32 along path 98. Path 98 may include a Mobile Industry Processor Interface Camera Serial Interface (MIPI CSI), a Standard Mobile Imaging Architecture (SMIA) interface, or any other suitable parallel or serial interface. Accordingly, in one embodiment, the data may be transmitted to the image processing circuitry 32 in a serial fashion.
However, as noted above, multiple data reads of the image sensor 94 may occur during a single exposure (e.g., a first data read 86, a second data read 90, and/or other data reads). Thus, the data received via each of these reads must share the path 98. To accomplish this, staggered reads of the image sensor 94 may be performed. That is, data from the first data read 86 may be transmitted to the A/D converter 96 until such time as the second data read 90 commences. At this time, data may be read from the image sensor 94 first along the row corresponding to the first data read 86 and then along the row corresponding to the second data read 90. This process may be repeated (e.g., in an interleaved manner) for as long as the two reads overlap in time. Moreover, if a third or additional read is introduced, it may likewise be staggered with all other reads such that each read transmits data to the A/D converter 96 in a staggered fashion. In this manner, because the A/D converter 96 converts the data provided to it into digital form as it is received, the digital data transmitted along path 98 may include, for example, data read from the first data read 86, data read from the second data read 90, as well as data read from any additional data readouts.
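The resulting ordering on the shared path can be modeled with a small generator. This is a sketch under assumptions (two equal-length reads, a known row offset between them, list-like row containers), not a description of the actual sensor logic:

```python
def interleave_reads(read1_rows, read2_rows, overlap_start):
    """Yield (read_id, row_index, row_data) in the order rows would appear
    on the shared path: read 1 alone until read 2 begins at row
    overlap_start, then the two reads alternating row by row."""
    for idx, row in enumerate(read1_rows):
        yield (1, idx, row)
        if idx >= overlap_start:
            # the second read trails the first by overlap_start rows
            yield (2, idx - overlap_start, read2_rows[idx - overlap_start])
    # rows of the second read remaining after the first read completes
    for idx in range(len(read1_rows) - overlap_start, len(read2_rows)):
        yield (2, idx, read2_rows[idx])
```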
In one embodiment, a buffer 97 may be implemented as a line buffer (i.e., memory that stores data corresponding to a read-out row of data) on the same chip or circuit as the image sensor 94. This buffer 97 may store the data received from a data read (e.g., the first data read 86) so that the data may be binned (e.g., 4×4 binning) prior to conversion by the A/D converter 96. That is, data from a cluster of pixels may be combined into a single pixel value, thus reducing the impact of read noise imparted to the data read from the image sensor 94. In one embodiment, data related to both overexposed and underexposed pictures may be binned to reduce the amount of data transferred on the interface.
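Binning of this kind amounts to a block average over pixel clusters. The following is a minimal sketch, assuming a 2-D NumPy array that is cropped to a multiple of the binning factor; the function name and defaults are invented for illustration:

```python
import numpy as np

def bin_pixels(data, factor=4):
    """Average factor x factor clusters into single pixel values,
    trading spatial resolution for a reduction in effective read noise."""
    h, w = data.shape
    h, w = h - h % factor, w - w % factor            # crop to a clean multiple
    blocks = data[:h, :w].reshape(h // factor, factor, w // factor, factor)
    return blocks.mean(axis=(1, 3))
```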
Additionally or alternatively, the buffer 97 may be utilized to align streams of data for multiple data reads. For example, buffer 97 may be a frame buffer that stores the entirety of one data read (e.g., the second data read 90) for later transmission while another data read (e.g., the first data read 86) is transmitted without storage. Thus, while the data reads may be staggered, the data transmitted along path 98 may include all of the data read out from the first data read 86 and subsequently all of the data read out from the second data read 90 (and transmitted from the buffer 97). Use of the buffer 97 in this manner may allow for reduced complexity in separating data from data path 98 (i.e., the data on data path 98 will not be interleaved).
The digital data passed along path 98 may be received at the image processing circuitry 32. As noted above, this digital data may include data from more than one read of the image sensor 94 (i.e., data related to different exposure times). Accordingly, in one embodiment, the digital data corresponding to the first data read 86, the second data read 90, or any other data reads may include an identifier, such as one or more identification bits, so that the data received may be identified as belonging to a particular data read (e.g., the first data read 86). In this manner, the digital data may be tagged so that correct categorization of the received digital data as related to a particular data read, for example, by an image signal processor (ISP) 100, may occur. Moreover, it should be noted that the ISP 100 might be utilized to adjust the readout times (e.g., when data reads are performed). For example, the ISP 100 may be utilized to generate statistical data relating to such factors as exposure, white balance, and focus of the imaging device 30. This data may be utilized to adjust the readout times of the imaging device 30 and, thus, the HDR image.
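The identifier-based routing might look like the following sketch, in which each transferred word is assumed to carry a 2-bit read identifier in its low bits and the sample value in its remaining bits; the tag width and field layout are assumptions for illustration only:

```python
def demux_by_tag(transfers, num_reads=2):
    """Route tagged transfer words into per-read lists.

    Assumed word format: bits [1:0] hold the read identifier and the
    remaining high bits hold the pixel sample."""
    buffers = [[] for _ in range(num_reads)]
    for word in transfers:
        read_id = word & 0b11        # identification bits
        sample = word >> 2           # pixel payload
        buffers[read_id].append(sample)
    return buffers
```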
As noted above, the ISP 100 may receive the digital data and may operate to separate the data received from path 98 into respective exposures. For example, the ISP 100 may separate data read from the first data read 86 and store that separated data in buffer 102, which may be a memory location. Similarly, the ISP 100 may separate data read from the second data read 90 from the received data along path 98 and store the separated data in buffer 104, which may be a memory location. In some embodiments, buffers 102 and 104 may be separate locations of a single physical memory circuit. In other embodiments, buffers 102 and 104 may be located in distinct memory circuits. Additionally, the ISP 100 may operate to separate as many data portions as are read from the image sensor 94, and, for example, may include additional buffers that correspond to each additional data read performed.
Once all the data related to, for example, a first data read 86 and a second data read 90 is separated, each of the buffers 102 and 104 may include a full set of data that may be utilized to form an HDR image. Accordingly, the ISP 100 may combine the data stored in buffers 102 and 104 to generate a composite HDR image, which may be transmitted to buffer 106 for retrieval by the processor(s) 16. In one embodiment, the HDR image may have a higher bit depth than the data stored in buffers 102 and 104. For example, the HDR image stored in buffer 106 may include one, two, three, or more bits of extra information relative to the data stored in buffers 102 or 104. This higher bit depth of the HDR image may allow for more clarity and/or a more visually appealing picture to be generated.
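One common way to realize such a combination (not necessarily the method of the disclosed ISP 100) is to keep the long exposure where it is valid and substitute the short exposure, scaled by the exposure ratio, where the long exposure has saturated. With assumed 10-bit inputs and a 4:1 exposure ratio, the result naturally spans about two extra bits, consistent with the higher bit depth described above:

```python
import numpy as np

def merge_hdr(short_exp, long_exp, ratio=4, sat=1023):
    """Merge a short (time n) and long (time 4n) exposure, both 10-bit.

    Where the long exposure is saturated, fall back to the short exposure
    scaled by the exposure ratio; elsewhere keep the long exposure. The
    output spans ratio * sat, i.e. roughly two extra bits of range."""
    short_scaled = short_exp.astype(np.uint32) * ratio
    return np.where(long_exp >= sat, short_scaled, long_exp.astype(np.uint32))
```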
FIG. 7 illustrates a flow chart 108 that details one embodiment of the operation of the ISP 100. In step 110, the ISP 100 may receive the digital data from path 98. The ISP 100 may separate the data received from path 98 into respective exposures in step 112. This separation may be accomplished by analyzing an identifier (i.e., one or more identification bits, for example, appended to the data) associated with the received data to determine which data read a given set of data corresponds to, so that the received data may be classified as belonging to a particular data read (e.g., the first data read 86).
Once the received image data has been separated, the ISP 100 may store the separated data in step 114. For example, data from the first data read 86 may be stored in buffer 102, which may be a memory location. Similarly, data from the second data read 90 may be stored in buffer 104, which may be a memory location separate from buffer 102. Additionally, data from any additional data reads may be stored in additional buffers by the ISP 100.
Once all the data related to these data reads has been stored, the ISP 100, in step 116, may generate a HDR image. That is, each of the buffers 102 and 104, as well as any additional buffers utilized, may include a full set of data that may be utilized to form an HDR image. The ISP 100 may combine the data stored in the buffers to generate a composite HDR image, which may be transmitted to buffer 106 for transmission to the processor(s) 16 in step 118. In this manner, the ISP 100 may be utilized to generate a HDR image from a single rolling shutter reset 80 of the image capture device 30.
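Putting steps 110 through 118 together, the whole loop can be rendered in miniature. The sketch below is self-contained but entirely illustrative: the 2-bit tag format and the saturation-substitution merge repeat the assumptions made in the earlier sketches and are not taken from the disclosure:

```python
import numpy as np

def isp_generate_hdr(transfers, sat=1023, ratio=4):
    """Flow chart 108 in miniature: receive tagged words (step 110),
    separate them by read identifier (step 112), buffer each exposure
    (step 114), and merge into a composite HDR frame (steps 116-118).

    Assumes exactly two reads, tagged 0 (short, time n) and 1 (long, 4n),
    in words whose low 2 bits are the tag and whose high bits are the sample."""
    buffers = {0: [], 1: []}
    for word in transfers:
        buffers[word & 0b11].append(word >> 2)       # separate and store
    short_exp = np.array(buffers[0], dtype=np.uint32)
    long_exp = np.array(buffers[1], dtype=np.uint32)
    # keep the long exposure where valid; substitute the scaled short
    # exposure where the long exposure saturated
    return np.where(long_exp >= sat, short_exp * ratio, long_exp)
```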
As will be understood, the various techniques described above relating to HDR imaging are provided herein by way of example only. Accordingly, it should be understood that the present disclosure should not be construed as being limited to only the examples provided. Further, it should be appreciated that the HDR imaging techniques may be implemented in any suitable manner, including hardware (suitably configured circuitry), software (e.g., via a computer program including executable code stored on one or more tangible computer-readable media), or a combination of both hardware and software elements.
The specific embodiments described above have been shown by way of example, and it should be understood that these embodiments may be susceptible to various modifications and alternative forms. It should be further understood that the claims are not intended to be limited to the particular forms disclosed, but rather to cover all modifications, equivalents, and alternatives falling within the spirit and scope of this disclosure.

Claims (25)

What is claimed is:
1. An image processing circuit comprising:
an image sensor having an array of pixels;
a data path configured to transmit image data related to multiple exposures of a single image;
a scan circuit configured to receive a plurality of activation signals from a processor to allow for the image data to be read out of the image sensor multiple times for each row of pixels in the array of pixels prior to a shutter reset, wherein a number of read outs and a frequency of read outs are determined by the processor prior to a first of the plurality of activation signals and based on feedback factors, the feedback factors associated with capture metrics for the single image;
an image signal processor coupled to the data path, wherein the image signal processor is configured to receive the image data and to separate the image data into at least two categories based on at least one criteria; and
a first buffer coupled to the image signal processor and configured to store a first portion of the image data, the first portion corresponding to at least one of the at least two categories.
2. The image processing circuit of claim 1, comprising a second buffer coupled to the image signal processor and configured to store a second portion of the image data, the second portion corresponding to a second one of the at least two categories.
3. The image processing circuit of claim 2, wherein the image signal processor is configured to retrieve the first and the second portions of the image data stored in each of the first and the second buffer and to utilize the retrieved first and second portions of the image data to generate a high dynamic range image.
4. The image processing circuit of claim 3, comprising a third buffer coupled to the image signal processor and configured to store the high dynamic range image.
5. The image processing circuit of claim 1, wherein the at least one criteria comprises a value corresponding to a time at which the image data was transmitted along the data path.
6. The image processing circuit of claim 1, wherein the at least two categories comprise an underexposed image data category and an overexposed image data category.
7. A method of generating a high dynamic range image comprising:
determining feedback factors associated with image capture metrics for a single image;
determining a number of times and frequency of a plurality of read requests to send to each row of pixels in an image sensor prior to a shutter reset for the single image, the number of times and the frequency being based on the feedback factors, wherein the determining a number of times and frequency is performed by a first processor prior to a first of the plurality of read requests;
receiving at an image signal processor image data related to the plurality of read requests;
separating at the image signal processor the image data into at least two categories based on at least one criteria; and storing, in a first buffer, a first portion of the separated image data, the first portion corresponding to at least one of the at least two categories.
8. The method of claim 7, further comprising storing in a second buffer a second portion of the separated image data, the second portion corresponding to a second one of the at least two categories.
9. The method of claim 8, comprising retrieving the first and the second portion of the separated image data stored in each of the first and the second buffers and generating a high dynamic range image based on the retrieved data.
10. The method of claim 8, comprising storing in a third buffer a third portion of the separated image data, the third portion corresponding to a third one of the at least two categories and retrieving the first, the second and the third portions of the separated image data stored in each of the first, the second, and the third buffer and generating a high dynamic range image based on the retrieved data.
11. The method of claim 7, wherein separating the image data into at least two categories based on at least one criteria comprises separating the image data into at least two categories based on a time at which the image data was transmitted to the image signal processor.
12. The method of claim 7, wherein separating the image data into at least two categories comprises separating the image data into an underexposed image data category and an overexposed image data category.
13. An image capture device comprising:
a lens configured to receive light corresponding to capture of a single image;
an image sensor coupled to the lens, wherein the image sensor includes an array of pixel locations and is configured to receive the light corresponding to the single image from the lens and generate data values corresponding to the received light at the pixel locations in the image sensor;
a processor configured to determine a number of times and a frequency of a plurality of read requests of each row of pixel locations in the array prior to a shutter reset to send to an image sensor for the single image based on feedback factors, wherein the determined number of times and frequency are determined prior to a first of the plurality of read requests and based on the feedback factors; and
a scan circuit coupled to the image sensor, wherein the scan circuit is configured to read out a plurality of sets of data values corresponding to the received light from a row of pixel locations in the image sensor for each of the plurality of read requests,
wherein each of the plurality of read requests occurs prior to a single shutter reset of the pixel locations.
14. The image capture device of claim 13, further comprising an analog to digital converter coupled to the image sensor, wherein the analog to digital converter receives the plurality of sets of data values and converts the plurality of sets of data values from analog signals to digital signals.
15. The image capture device of claim 13, further comprising a line buffer configured to receive and store a first set of the plurality of sets of data values prior to transmission to an analog to digital converter.
16. The image capture device of claim 13, further comprising a line buffer configured to receive and store all data signals read out from all pixel locations of the image sensor during a first read by the scan circuit.
17. An electronic device comprising:
an image sensor configured to receive light corresponding to a single image and generate data values corresponding to the received light at pixel locations for an array of pixels in the image sensor;
a processor configured to determine a number of times and frequency of a plurality of read requests of each row of pixel locations prior to a shutter reset to send to the image sensor for the single image, wherein the number of times and frequency are determined prior to a first of the plurality of read requests and based on feedback factors, the feedback factors being based on the received light;
a scan circuit coupled to the image sensor, wherein the scan circuit is configured to read out a plurality of sets of data values corresponding to the received light from each of the rows of pixel locations in the image sensor for each of the plurality of read requests, wherein each of the plurality of read requests occurs prior to a common reset of the pixel locations;
an analog to digital converter coupled to the image sensor, wherein the analog to digital converter receives the plurality of sets of data values and converts the plurality of sets of data values into image data;
an image signal processor configured to receive the image data and to separate the image data into at least two categories based on at least one criteria; and
a first buffer coupled to the image signal processor, wherein the first buffer is configured to store separated data corresponding to at least one of the at least two categories.
18. The electronic device of claim 17, further comprising a data path coupling the analog to digital converter and the image sensor and the image signal processor, wherein the data path comprises a serial data line.
19. The electronic device of claim 17, further comprising a second buffer coupled to the image sensor and configured to store data signals read out from pixel locations of the image sensor during a first read by the scan circuit.
20. The electronic device of claim 17, further comprising a second buffer coupled to the image signal processor and configured to store separated data corresponding to a second one of the at least two categories.
21. The electronic device of claim 20, wherein the image signal processor is configured to retrieve the data stored in each of the first and second buffers and to utilize the retrieved data to generate a high dynamic range image.
22. A non-transitory computer-readable storage medium storing instructions which, when executed by a computing device, cause the computing device to:
read, in response to a plurality of activation signals, image data from each row of pixels from an image sensor's array of pixels multiple times prior to a shutter reset, wherein a number of times each row of pixels from the image sensor's pixel array is read and a frequency at which each row of pixels from the image sensor's pixel array is read are determined prior to a first of the plurality of activation signals being read and are both based on feedback factors that are associated with capture metrics for a single image;
separate the image data into at least two categories based on at least one criteria; and
store a first portion of the image data in a first buffer, wherein the first portion corresponds to at least one of the two categories.
23. The non-transitory computer-readable medium of claim 22, storing further instructions which, when executed by a computing device, cause the computing device to store a second portion of the image data in a second buffer, wherein the second portion corresponds to a second one of the at least two categories.
24. The non-transitory computer-readable medium of claim 23, storing further instructions which, when executed by a computing device, cause the computing device to retrieve the first and the second portions of the image data and to utilize the retrieved first and second portions of the image data to generate a high dynamic range image.
25. The non-transitory computer-readable medium of claim 22, wherein the at least two categories comprise an underexposed image data category and an overexposed image data category.
US13/157,090 2011-06-09 2011-06-09 Image sensor having HDR capture capability Active 2031-08-07 US9077917B2 (en)

Priority Applications (7)

Application Number Priority Date Filing Date Title
US13/157,090 US9077917B2 (en) 2011-06-09 2011-06-09 Image sensor having HDR capture capability
PCT/US2012/041398 WO2012170717A1 (en) 2011-06-09 2012-06-07 Image sensor having hdr capture capability
EP12171184.0A EP2533520B1 (en) 2011-06-09 2012-06-07 Image sensor having HDR capture capability
TW101120795A TWI511558B (en) 2011-06-09 2012-06-08 Image sensor having hdr capture capability
CN201210269110.9A CN102857712B (en) 2011-06-09 2012-06-08 There is the image sensor of HDR capture capability
KR1020120062155A KR101361331B1 (en) 2011-06-09 2012-06-11 Image sensor having hdr capture capability
JP2012132345A JP5531054B2 (en) 2011-06-09 2012-06-11 Image sensor having HDR imaging function

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
US13/157,090 US9077917B2 (en) 2011-06-09 2011-06-09 Image sensor having HDR capture capability

Publications (2)

Publication Number Publication Date
US20120314100A1 US20120314100A1 (en) 2012-12-13
US9077917B2 (en) 2015-07-07

Family

ID=46798978

Family Applications (1)

Application Number Title Priority Date Filing Date
US13/157,090 Active 2031-08-07 US9077917B2 (en) 2011-06-09 2011-06-09 Image sensor having HDR capture capability

Country Status (7)

Country Link
US (1) US9077917B2 (en)
EP (1) EP2533520B1 (en)
JP (1) JP5531054B2 (en)
KR (1) KR101361331B1 (en)
CN (1) CN102857712B (en)
TW (1) TWI511558B (en)
WO (1) WO2012170717A1 (en)

Families Citing this family (54)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US8788348B2 (en) 2011-12-20 2014-07-22 Wikipad, Inc. Combination game controller and point of sale input device
KR101938381B1 (en) * 2012-07-10 2019-01-14 삼성전자주식회사 Imaging apparatus and imaging method
US9179085B1 (en) 2014-11-06 2015-11-03 Duelight Llc Image sensor apparatus and method for obtaining low-noise, high-speed captures of a photographic scene
US9531961B2 (en) 2015-05-01 2016-12-27 Duelight Llc Systems and methods for generating a digital image using separate color and intensity data
US9167174B1 (en) 2014-11-05 2015-10-20 Duelight Llc Systems and methods for high-dynamic range images
US8976264B2 (en) 2012-09-04 2015-03-10 Duelight Llc Color balance in digital photography
US9137455B1 (en) 2014-11-05 2015-09-15 Duelight Llc Image sensor apparatus and method for obtaining multiple exposures with zero interframe time
US9918017B2 (en) 2012-09-04 2018-03-13 Duelight Llc Image sensor apparatus and method for obtaining multiple exposures with zero interframe time
US9167169B1 (en) 2014-11-05 2015-10-20 Duelight Llc Image sensor apparatus and method for simultaneously capturing multiple images
US9179062B1 (en) 2014-11-06 2015-11-03 Duelight Llc Systems and methods for performing operations on pixel data
US9154708B1 (en) 2014-11-06 2015-10-06 Duelight Llc Image sensor apparatus and method for simultaneously capturing flash and ambient illuminated images
US9160936B1 (en) 2014-11-07 2015-10-13 Duelight Llc Systems and methods for generating a high-dynamic range (HDR) pixel stream
US8446481B1 (en) * 2012-09-11 2013-05-21 Google Inc. Interleaved capture for high dynamic range image acquisition and synthesis
KR101948692B1 (en) 2012-10-09 2019-04-25 삼성전자주식회사 Phtographing apparatus and method for blending images
US8866927B2 (en) 2012-12-13 2014-10-21 Google Inc. Determining an image capture payload burst structure based on a metering image capture sweep
US9087391B2 (en) 2012-12-13 2015-07-21 Google Inc. Determining an image capture payload burst structure
US8866928B2 (en) 2012-12-18 2014-10-21 Google Inc. Determining exposure times using split paxels
US9247152B2 (en) 2012-12-20 2016-01-26 Google Inc. Determining image alignment failure
US8995784B2 (en) 2013-01-17 2015-03-31 Google Inc. Structure descriptors for image processing
US9686537B2 (en) 2013-02-05 2017-06-20 Google Inc. Noise models for image processing
US9807322B2 (en) 2013-03-15 2017-10-31 Duelight Llc Systems and methods for a digital image sensor
US10558848B2 (en) 2017-10-05 2020-02-11 Duelight Llc System, method, and computer program for capturing an image with correct skin tone exposure
US9819849B1 (en) 2016-07-01 2017-11-14 Duelight Llc Systems and methods for capturing digital images
US20140240469A1 (en) * 2013-02-28 2014-08-28 Motorola Mobility Llc Electronic Device with Multiview Image Capture and Depth Sensing
WO2014133557A1 (en) * 2013-02-28 2014-09-04 Wikipad, Inc. Combination game controller and point of sale input device
US9117134B1 (en) 2013-03-19 2015-08-25 Google Inc. Image merging with blending
US9066017B2 (en) 2013-03-25 2015-06-23 Google Inc. Viewfinder display based on metering images
US9955084B1 (en) 2013-05-23 2018-04-24 Oliver Markus Haynold HDR video camera
US9131201B1 (en) 2013-05-24 2015-09-08 Google Inc. Color correcting virtual long exposures with true long exposures
US9077913B2 (en) 2013-05-24 2015-07-07 Google Inc. Simulating high dynamic range imaging with virtual long-exposure images
TWI502990B (en) * 2013-06-27 2015-10-01 Altek Semiconductor Corp Method for generating high dynamic range image and image sensor thereof
US9615012B2 (en) 2013-09-30 2017-04-04 Google Inc. Using a second camera to adjust settings of first camera
US10104318B2 (en) 2013-12-04 2018-10-16 Rambus Inc. High dynamic-range image sensor
WO2015196456A1 (en) * 2014-06-27 2015-12-30 深圳市大疆创新科技有限公司 High dynamic range video record method and apparatus based on bayer color filter array
US10277771B1 (en) 2014-08-21 2019-04-30 Oliver Markus Haynold Floating-point camera
US10225485B1 (en) 2014-10-12 2019-03-05 Oliver Markus Haynold Method and apparatus for accelerated tonemapping
US10924688B2 (en) 2014-11-06 2021-02-16 Duelight Llc Image sensor apparatus and method for obtaining low-noise, high-speed captures of a photographic scene
US11463630B2 (en) 2014-11-07 2022-10-04 Duelight Llc Systems and methods for generating a high-dynamic range (HDR) pixel stream
TWI558204B (en) * 2014-11-25 2016-11-11 Trans Electric Co Ltd Image capture system
TWI558193B (en) * 2015-05-11 2016-11-11 Trans Electric Co Ltd Film production method
US20170142313A1 (en) * 2015-11-16 2017-05-18 Microsoft Corporation Image sensor system
US9743025B2 (en) * 2015-12-30 2017-08-22 Omnivision Technologies, Inc. Method and system of implementing an uneven timing gap between each image capture in an image sensor
CN106961557B (en) 2016-01-08 2020-01-31 中强光电股份有限公司 Light field camera and image processing method thereof
US10742390B2 (en) * 2016-07-13 2020-08-11 Novatek Microelectronics Corp. Method of improving clock recovery and related device
EP3507765A4 (en) 2016-09-01 2020-01-01 Duelight LLC Systems and methods for adjusting focus based on focus target information
US10187584B2 (en) * 2016-12-20 2019-01-22 Microsoft Technology Licensing, Llc Dynamic range extension to produce high dynamic range images
TW201822709A (en) * 2016-12-30 2018-07-01 曦威科技股份有限公司 Real-time heart rate detection method and real-time heart rate detection system therefor
CN107066987B (en) * 2017-04-26 2020-12-15 胡明建 Method for directly transmitting camera data to GPU for processing
CN109104584B (en) * 2017-06-21 2020-07-10 比亚迪股份有限公司 Image sensor and method for acquiring high dynamic range image thereof
TW201915818A (en) * 2017-10-05 2019-04-16 香港商印芯科技股份有限公司 Optical identification module
CA3079400C (en) * 2017-10-18 2023-09-05 Perkinelmer Health Sciences, Inc. Rapid, high dynamic range image acquisition with a charge-coupled device (ccd) camera
JP7278750B2 (en) * 2018-11-14 2023-05-22 キヤノン株式会社 Imaging device
TWI739431B (en) * 2019-12-09 2021-09-11 大陸商廣州印芯半導體技術有限公司 Data transmission system and data transmission method thereof
CN115379092A (en) * 2022-08-17 2022-11-22 中南大学 High dynamic range video acquisition method, system and terminal

Patent Citations (28)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US5309243A (en) * 1992-06-10 1994-05-03 Eastman Kodak Company Method and apparatus for extending the dynamic range of an electronic imaging system
US6115065A (en) 1995-11-07 2000-09-05 California Institute Of Technology Image sensor producing at least two integration times from each sensing pixel
US7190398B1 (en) 1995-11-07 2007-03-13 California Institute Of Technology Image sensor with high dynamic range linear output
US6707499B1 (en) 1998-12-08 2004-03-16 Industrial Technology Research Institute Technique to increase dynamic range of a CCD image sensor
US7030923B2 (en) 2000-03-27 2006-04-18 Sanyo Electric Co., Ltd. Digital camera having overlapped exposure
US6927796B2 (en) 2001-09-24 2005-08-09 The Board Of Trustees Of The Leland Stanford Junior University CMOS image sensor system with self-reset digital pixel architecture for improving SNR and dynamic range
US20030210345A1 (en) * 2002-04-22 2003-11-13 Satoru Nakamura Image pickup device and method for capturing subject with wide range of brightness
US7382407B2 (en) 2002-08-29 2008-06-03 Micron Technology, Inc. High intrascene dynamic range NTSC and PAL imager
US7067791B2 (en) 2002-10-15 2006-06-27 Applera Corporation System and methods for dynamic range extension using variable length integration time sampling
US20080291313A1 (en) * 2004-08-27 2008-11-27 Alexander Krymski High dynamic range imager with a rolling shutter
KR20060022804A (en) 2004-09-08 2006-03-13 매그나칩 반도체 유한회사 Image sensor readout circuit
US20070273785A1 (en) * 2004-11-01 2007-11-29 Masahiro Ogawa Image Sensor
US20060170662A1 (en) 2004-12-07 2006-08-03 Haruhisa Kurane Image pickup device
US20100026868A1 (en) * 2006-08-29 2010-02-04 Shimon Pertsel Wide Dynamic Range Image Capturing System Method and Apparatus
US20100026838A1 (en) * 2006-11-20 2010-02-04 Ben Gurion University Of The Negev Research And Development Authority Optical pixel and image sensor
US20100165135A1 (en) * 2006-12-20 2010-07-01 Nokia Corporation Exposure control based on image sensor cost function
US20080174685A1 (en) * 2007-01-22 2008-07-24 Omnivision Technologies, Inc. Image sensors with blooming reduction mechanisms
US7616243B2 (en) * 2007-03-07 2009-11-10 Altasens, Inc. Method and apparatus for improving and controlling dynamic range in an image sensor
US20090091645A1 (en) 2007-10-03 2009-04-09 Nokia Corporation Multi-exposure pattern for enhancing dynamic range of images
US7940311B2 (en) 2007-10-03 2011-05-10 Nokia Corporation Multi-exposure pattern for enhancing dynamic range of images
US20090135263A1 (en) * 2007-11-27 2009-05-28 Noam Sorek Method and Apparatus for Expanded Dynamic Range Imaging
US20090316031A1 (en) * 2008-06-19 2009-12-24 Yamaha Corporation CMOS solid state imaging device
US20100328490A1 (en) * 2009-06-26 2010-12-30 Seiko Epson Corporation Imaging control apparatus, imaging apparatus and imaging control method
JP2011010108A (en) 2009-06-26 2011-01-13 Seiko Epson Corp Imaging control apparatus, imaging apparatus, and imaging control method
US20110013064A1 (en) * 2009-07-15 2011-01-20 Tower Semiconductor Ltd. CMOS Image Sensor With Wide (Intra-Scene) Dynamic Range
US20110122287A1 (en) * 2009-11-25 2011-05-26 Keiji Kunishige Imaging device and imaging device control method
US20110222793A1 (en) * 2010-03-09 2011-09-15 Sony Corporation Image processing apparatus, image processing method, and program
US8483452B2 (en) * 2010-03-09 2013-07-09 Sony Corporation Image processing apparatus, image processing method, and program

Non-Patent Citations (5)

* Cited by examiner, † Cited by third party
Title
Extended European Search Report received in corresponding EP Application No. 12171184.0, dated Jun. 20, 2013.
International Search Report regarding application No. PCT/US12/41398 dated Nov. 5, 2012.
Jinwei Gu, et al., "Coded Rolling Shutter Photography: Flexible Space-Time Sampling," Computational Photography (ICCP) 2010 IEEE International Conference on, IEEE, Piscataway, NJ, USA, Mar. 29, 2010, pp. 1-8, XP031763024, ISBN: 978-1-4244-7022-8.
Office Action received in JP Application No. 2012-132345, dated Aug. 21, 2013.
Office Action received in KR Application No. 10-2012-62155, dated Jul. 22, 2013.

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20140146188A1 (en) * 2012-11-23 2014-05-29 Mediatek Inc. Data processing apparatus with adaptive compression algorithm selection for data communication based on sensor input/sensor configuration/display configuration over camera interface and related data processing method
US10200603B2 (en) 2012-11-23 2019-02-05 Mediatek Inc. Data processing system for transmitting compressed multimedia data over camera interface

Also Published As

Publication number Publication date
EP2533520B1 (en) 2016-10-12
JP5531054B2 (en) 2014-06-25
CN102857712B (en) 2016-06-15
EP2533520A3 (en) 2013-07-24
EP2533520A2 (en) 2012-12-12
CN102857712A (en) 2013-01-02
TW201301883A (en) 2013-01-01
WO2012170717A1 (en) 2012-12-13
TWI511558B (en) 2015-12-01
JP2013059017A (en) 2013-03-28
US20120314100A1 (en) 2012-12-13
KR101361331B1 (en) 2014-02-10
KR20130000325A (en) 2013-01-02

Similar Documents

Publication Publication Date Title
US9077917B2 (en) Image sensor having HDR capture capability
US11206353B2 (en) Electronic apparatus, method for controlling electronic apparatus, and control program for setting image-capture conditions of image sensor
JP6803982B2 (en) Optical imaging method and equipment
RU2542928C2 (en) System and method for processing image data using image signal processor having final processing logic
EP3018893A1 (en) Electronic apparatus, electronic apparatus control method, and control program
US20120081553A1 (en) Spatial filtering for image signal processing
WO2012044434A1 (en) Overflow control techniques for image signal processing
WO2012047426A1 (en) Techniques for synchronizing audio and video data in an image signal processing system
JP2006191622A (en) Isp built-in image sensor and dual-camera system
JPWO2015045829A1 (en) Imaging apparatus and imaging method
CN112785510B (en) Image processing method and related product
CN113810593B (en) Image processing method, device, storage medium and electronic equipment
US11503223B2 (en) Method for image-processing and electronic device
US20200204722A1 (en) Imaging apparatus, imaging method, and program
CN110278375B (en) Image processing method, image processing device, storage medium and electronic equipment
CN116055890A (en) Method and electronic device for generating high dynamic range video
US8934042B2 (en) Candidate image presenting method using thumbnail image and image signal processing device and imaging device performing the same
CN110266965B (en) Image processing method, image processing device, storage medium and electronic equipment
JP2008072501A (en) Imaging apparatus, imaging method, program, and recording medium

Legal Events

Date Code Title Description
AS Assignment

Owner name: APPLE INC., CALIFORNIA

Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNOR:FRANK, MICHAEL;REEL/FRAME:026436/0751

Effective date: 20110607

FEPP Fee payment procedure

Free format text: PAYOR NUMBER ASSIGNED (ORIGINAL EVENT CODE: ASPN); ENTITY STATUS OF PATENT OWNER: LARGE ENTITY

STCF Information on status: patent grant

Free format text: PATENTED CASE

CC Certificate of correction
MAFP Maintenance fee payment

Free format text: PAYMENT OF MAINTENANCE FEE, 4TH YEAR, LARGE ENTITY (ORIGINAL EVENT CODE: M1551); ENTITY STATUS OF PATENT OWNER: LARGE ENTITY

Year of fee payment: 4

MAFP Maintenance fee payment

Free format text: PAYMENT OF MAINTENANCE FEE, 8TH YEAR, LARGE ENTITY (ORIGINAL EVENT CODE: M1552); ENTITY STATUS OF PATENT OWNER: LARGE ENTITY

Year of fee payment: 8