US20080226274A1 - Systems For Improved Autofocus in Digital Imaging Systems - Google Patents


Info

Publication number
US20080226274A1
Authority
US
United States
Prior art keywords
focus
autofocus
attitude
lens
digital imaging
Prior art date
Legal status
Abandoned
Application number
US12/127,139
Inventor
Anthony C. Spielberg
Current Assignee
Individual
Original Assignee
Individual
Application filed by Individual
Priority to US12/127,139
Publication of US20080226274A1

Classifications

    • GPHYSICS
    • G03PHOTOGRAPHY; CINEMATOGRAPHY; ANALOGOUS TECHNIQUES USING WAVES OTHER THAN OPTICAL WAVES; ELECTROGRAPHY; HOLOGRAPHY
    • G03BAPPARATUS OR ARRANGEMENTS FOR TAKING PHOTOGRAPHS OR FOR PROJECTING OR VIEWING THEM; APPARATUS OR ARRANGEMENTS EMPLOYING ANALOGOUS TECHNIQUES USING WAVES OTHER THAN OPTICAL WAVES; ACCESSORIES THEREFOR
    • G03B13/00Viewfinders; Focusing aids for cameras; Means for focusing for cameras; Autofocus systems for cameras
    • G03B13/32Means for focusing
    • G03B13/34Power focusing
    • G03B13/36Autofocus systems
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N23/00Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N23/60Control of cameras or camera modules
    • H04N23/67Focus control based on electronic image sensor signals
    • H04N23/671Focus control based on electronic image sensor signals in combination with active ranging signals, e.g. using light or sound signals emitted toward objects
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N23/00Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N23/60Control of cameras or camera modules
    • H04N23/68Control of cameras or camera modules for stable pick-up of the scene, e.g. compensating for camera body vibrations
    • H04N23/681Motion detection
    • H04N23/6812Motion detection based on additional sensors, e.g. acceleration sensors
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N23/00Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N23/80Camera processing pipelines; Components thereof
    • H04N23/81Camera processing pipelines; Components thereof for suppressing or minimising disturbance in the image signal generation

Definitions

  • the present invention relates to the field of digital imaging systems and, in particular, to systems and methods for improved autofocus in digital imaging systems such as digital cameras.
  • Digital imaging systems such as digital cameras continue to increase in popularity, providing users with the ability to capture images (i.e., take photographs) with relative ease.
  • Digital imaging systems typically include a lens for directing the light comprising a digital image through a light path to an optical sensor array.
  • Autofocus systems (as well as other automations such as automatic exposure or flash) are often an important part of digital imaging systems as they improve the user experience by making such systems easier to use.
  • Whether an object in an image is ‘in focus’ (i.e., at the sharpest possible setting) for a digital imaging system depends on a number of factors, including the distance between the lens and the sensor array, the lens focal length, exposure aperture, and the distance to the subject.
  • Autofocus systems typically include a focusing motor that moves a portion of the lens of the digital imaging system in and out until the sharpest possible image of the desired subject is projected onto the optical sensor array.
  • a user would turn a focusing ring on the lens until the image (or portion of an image) in the viewfinder appeared in focus.
  • Autofocus systems typically rely on active autofocus, passive autofocus, or a combination of the two, and utilize one or more autofocus sensors within the field of view.
  • Active autofocus systems measure the distance to the subject (using, for example, sound or infrared signals) and adjust focus of the optical system accordingly.
  • Passive systems analyze the incoming image itself and drive the lens back and forth searching for the best focus and can include both phase detection systems and contrast measurement systems.
  • Complicated autofocus systems with many sensors can add significant cost and complexity to a digital imaging system, as autofocus sensors are relatively expensive and more accurate sensors (e.g., horizontal and vertical capability) are more expensive still.
  • the solution of locking the autofocus and recomposing often provides unacceptable results, however, when the depth-of-focus (DOF) is small compared to the difference in subject distance between the scene composed as desired and the scene composed during focus lock.
  • a user taking a portrait might lock focus on the subject's eyes but when the user recomposes, the plane of focus would be behind the eyes.
  • the DOF may vary depending on imaging sensor size, imaging lens focal length, exposure aperture, and subject distance. For close subject distances and/or large exposure apertures with small DOF, the problem is exacerbated and unacceptable focus shifts are introduced. There is, therefore, a need for an effective system to provide improved autofocus for digital imaging systems.
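The dependence of DOF on aperture, focal length, and subject distance can be illustrated numerically. The following sketch is not part of the patent disclosure; it uses the standard thin-lens depth-of-field approximation, and the function names and the 0.03 mm circle of confusion (a common full-frame value) are assumptions.

```python
import math

def hyperfocal_mm(f_mm, n, c_mm=0.03):
    """Hyperfocal distance for focal length f_mm, f-number n, and
    circle of confusion c_mm (0.03 mm is a common full-frame value)."""
    return f_mm ** 2 / (n * c_mm) + f_mm

def depth_of_field_mm(f_mm, n, d_mm, c_mm=0.03):
    """Approximate total depth of field at subject distance d_mm."""
    h = hyperfocal_mm(f_mm, n, c_mm)
    near = h * d_mm / (h + (d_mm - f_mm))
    far = h * d_mm / (h - (d_mm - f_mm)) if d_mm < h else float("inf")
    return far - near

# An 85 mm portrait lens at f/1.8, subject 2 m away: DOF of only a few
# centimeters, so a few centimeters of focus shift from recomposing can
# leave the intended subject out of focus.
print(round(depth_of_field_mm(85, 1.8, 2000), 1))
```

Stopping down to f/8 at the same distance yields several times more depth of field, which is why the recomposition error matters most at large apertures.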
  • Embodiments may include, in response to locking of lens focus on a subject image, determining an initial subject focus distance and an initial attitude, and in response to a request for an exposure, determining a final attitude. Embodiments may also include determining a final target subject distance based on the initial subject focus distance and a focus correction distance, where the focus correction distance is based on the difference between the initial attitude and the final attitude. Embodiments may also include focusing the lens at the final target subject distance. Further embodiments may include after focusing the lens at the final target subject distance, exposing an image.
  • a further embodiment provides a digital imaging system having a user interface module to receive focus lock commands and exposure commands from a user and an image capture module to generate a digital image.
  • the system may also include an autofocus system module to automatically focus a lens, where the autofocus system module has a primary autofocus system to determine whether a lens is in focus.
  • the autofocus system module may also have an attitude sensor interface to receive an indication of an initial attitude in response to a focus lock command and a final attitude in response to an exposure command.
  • the autofocus system module may further include an autofocus correction module to determine a focus correction distance based on the initial attitude and the final attitude, where the primary autofocus system corrects lens focus based on the focus correction distance before an exposure is made.
  • a further embodiment provides a digital imaging system having a housing, a processor within the housing to control operation of the digital imaging system, an autofocus system, and an optical sensor array within the housing and in communication with the processor, where the optical sensor array generates an image in response to light.
  • the system further may include an attitude sensor to determine an initial attitude in response to a focus lock command and a final attitude in response to a shutter release command.
  • the processor of the system may determine a focus correction distance based on the difference between the initial attitude and the final attitude and may modify focus based on the focus correction distance.
  • the attitude sensor may be a MEMS-based sensor.
  • FIG. 1 depicts an environment for a digital imaging system with an autofocus system and an attitude sensor according to one embodiment
  • FIG. 2 depicts a block diagram of a digital camera suitable for use as the digital imaging system of FIG. 1 according to one embodiment
  • FIG. 3 depicts a conceptual illustration of software components of a digital imaging system such as a digital camera according to one embodiment
  • FIG. 4 depicts an example of a flow chart for correcting focus based on a change in attitude according to one embodiment.
  • Embodiments may include, in response to locking of lens focus on a subject image, determining an initial subject focus distance and an initial attitude, and in response to a request for an exposure, determining a final attitude.
  • Attitude, as described in more detail subsequently, may represent the orientation in space of an object with respect to defined axes.
  • Embodiments may also include determining a final target subject distance based on the initial subject focus distance and a focus correction distance, where the focus correction distance is based on the difference between the initial attitude and the final attitude.
  • Embodiments may also include focusing the lens at the final target subject distance. Further embodiments may include after focusing the lens at the final target subject distance, exposing an image.
  • Other embodiments may include determining the initial and final attitude with a micro-electro-mechanical (MEMS)-based sensor or other sensor.
  • the system and methodology of the disclosed embodiments provide an improved method of determining autofocus for a digital imaging system such as a digital camera.
  • using a sensor such as a MEMS-based sensor, the system may determine an initial attitude when a user selects focus lock and a final attitude after the user recomposes an image and attempts to take an exposure.
  • the attitude of an object may represent its orientation in space comprising the yaw, pitch, and roll of that object with respect to defined axes.
  • the change in attitude of an object may represent its combined angular rotation in yaw, pitch, and roll with respect to defined axes.
  • the disclosed system may advantageously provide a correction to the autofocus based on the difference between the initial attitude and the final attitude (i.e., the change in attitude) to correct for a change in distance resulting from rotation of the digital imaging system.
  • a user focusing on one part of an image, such as an important object like a portrait subject's eyes, may select focus lock and recompose the image.
  • Previous systems resulted in the potential for the initial subject to be out of focus after recomposition because of the change in distance resulting from rotation of the imaging system.
  • the disclosed system advantageously provides for a corrected focus distance so that the initial subject (i.e., the subject's eyes) will be in focus.
  • the disclosed system may be particularly useful where the depth-of-focus is small or shallow, such as with large apertures.
  • FIG. 1 depicts an environment for a digital imaging system with an autofocus system and an attitude sensor according to one embodiment.
  • the digital imaging system 100 includes a camera housing 102 , a lens 104 with an optional manual focus ring 106 , an attitude sensor 108 , and an autofocus system 110 .
  • the digital imaging system 100 of the depicted embodiment is pointing towards an initial subject 112 .
  • the digital imaging system 100 may in some embodiments be a digital camera such as a digital single-lens-reflex (DSLR) (either fixed-lens DSLR or interchangeable-lens DSLR), digital rangefinder camera, digital point-and-shoot (P&S) camera, or fixed-lens digital camera.
  • the digital imaging system 100 may alternatively be an imaging device integrated with another system, such as a mobile phone, personal digital assistant (PDA), wearable device, or mobile computer.
  • the digital imaging system 100 may be a digital camcorder, digital video camera, or other digital video-recording device.
  • a user may point the digital imaging system 100 in the direction of a subject in order to take a picture of that subject.
  • light enters through lens 104 and is directed through a light path until it strikes an optical sensor array (described in relation to FIG. 2 ), whereupon the optical sensor array captures the image.
  • Some digital imaging systems 100 have a single lens while others may have separate lenses for composition and taking.
  • a single lens 104 is used for both viewing an object of interest and for capturing and directing light towards the optical sensor array.
  • Lenses 104 may be permanently attached to the camera housing 102 (such as in the case of a P&S camera) or may be detachable and interchangeable (such as in the case of interchangeable-lens DSLRs).
  • Lens 104 may also include a manual focus ring 106 to provide an alternative to the autofocus system 110 of the digital imaging system 100 or to provide fine tuning of the focus.
  • Autofocus systems 110 typically include a focusing motor that moves a portion of the lens 104 of the digital imaging system 100 in and out until the sharpest possible image of the desired subject is projected onto the optical sensor array. Autofocus systems 110 rely on one or more autofocus sensors within the user's field of view in order to determine whether or not objects ‘under’ the sensors are in focus. Autofocus systems 110 often determine a distance to the subject as part of their internal algorithms.
  • the digital imaging system 100 may also include one or more attitude sensors 108 , which may be located within camera housing 102 in some embodiments.
  • An attitude sensor 108 may determine the attitude of the digital imaging system 100 with respect to defined yaw, pitch, and roll axes.
  • the attitude sensor 108 may be a micro-electro-mechanical (MEMS)-based attitude sensor which measures the orientation of the digital imaging system 100 with respect to the earth's gravimetric field to determine the digital imaging system's 100 three-dimensional attitude in space.
  • MEMS-based attitude sensors may provide a relatively low-cost, reliable, and compact methodology of determining rotational position when compared to other sensors.
  • attitude sensor 108 may also have the capability of distinguishing landscape and portrait modes, eliminating the need for two sensors.
  • the attitude sensor 108 may be another type of sensor capable of detecting three-dimensional attitude in space, such as a gyroscope or inertial measurement unit (IMU).
  • the autofocus system 110 may utilize active autofocus, passive autofocus, or a combination of the two.
  • Active autofocus systems measure the distance to the subject (using, for example, sound or infrared signals) and adjust focus of the optical system accordingly.
  • Passive systems analyze the incoming image itself and drive the lens 104 back and forth searching for the best focus.
  • Passive autofocus systems can include both phase detection systems and contrast measurement systems. Phase detection autofocus systems divide the incoming light to particular autofocus sensors into pairs of images and then compare the resulting images to determine the proper focus.
  • Some contrast-based passive autofocus systems utilize an autofocus sensor such as a charge-coupled device (CCD) that provides input to algorithms to compute the contrast of the actual image elements.
  • a CCD sensor typically has a group of pixels and an on-board processor analyzes the light hitting the sensor by looking at the difference in intensity among the adjacent pixels. As an out-of-focus scene has adjacent pixels with similar intensities, the processor may move the lens 104 and attempt to find the maximum intensity difference between adjacent pixels. The point of maximum intensity difference between pixels is then the point of best focus.
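As an illustration of the contrast-measurement principle described above, the following sketch (not from the patent; function name and sample data are hypothetical) scores a small pixel region by summing squared intensity differences between adjacent pixels, so a sharper region yields a larger score.

```python
def contrast_score(region):
    """Sum of squared intensity differences between horizontally and
    vertically adjacent pixels; sharper regions score higher."""
    score = 0
    for row in region:                      # horizontal neighbours
        score += sum((a - b) ** 2 for a, b in zip(row, row[1:]))
    for r1, r2 in zip(region, region[1:]):  # vertical neighbours
        score += sum((a - b) ** 2 for a, b in zip(r1, r2))
    return score

# A hard edge scores higher than a smoothed version of the same edge.
sharp = [[0, 0, 255, 255]] * 4
blurred = [[0, 85, 170, 255]] * 4
print(contrast_score(sharp) > contrast_score(blurred))  # True
```

An autofocus loop would step the focusing motor, recompute such a score under each autofocus sensor, and stop at the lens position giving the maximum.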
  • Autofocus systems may use a combination of different autofocus sensors and methods, such as a camera that utilizes a passive phase detection system with an autofocus ‘assist’ from an active autofocus system. In some embodiments, the autofocus system 110 may determine distance to the subject as part of its determination of proper focus.
  • a user may point the lens 104 of the digital imaging system 100 at an initial subject 112 such as a Christmas tree with a star on top as depicted in FIG. 1 .
  • the user may wish to center their focus (but not their composition) on the star of the tree, pointing the lens 104 of the digital imaging system 100 along the line ‘A 2 ’ as depicted in FIG. 1 .
  • the autofocus system 110 of the digital imaging system 100 may then automatically focus on the star.
  • the user may then actuate focus lock by, for example, partially depressing the shutter-release button or actuating a focus lock button on the outside of the camera housing 102 to lock the lens focus in its current position.
  • the user is attempting to place the center of the plane of focus along line ‘A 1 ’ of FIG. 1 , centered on the star.
  • the user may then recompose their image by rotating the digital imaging system 100 and pointing the lens 104 along line ‘B 2 ’ as depicted in FIG. 1 while the lens focus remains locked, and then taking the photograph once recomposition is complete.
  • the plane of focus for the lens 104 will no longer be centered on the star because of the angular rotation of the lens 104 and digital imaging system 100 .
  • the plane of focus for the lens 104 will instead be centered along line ‘B 1 ’ of FIG. 1 , behind the star, because of the locked lens focus, instead of along ‘A 1 ’ where the user desires.
  • the star may then become out of focus when the user takes the photograph after recomposing.
  • the DOF may vary depending on imaging sensor size, imaging lens focal length, exposure aperture, and subject distance.
  • the autofocus error may be exacerbated when the DOF is small, such as when the subject distance is relatively short or the exposure aperture is relatively large.
  • the autofocus performance of digital imaging system 100 in certain situations may be improved by correcting for the error caused by rotation (i.e., change in attitude) of the digital imaging system 100 after focus lock.
  • the attitude sensor 108 may determine the attitude of the digital imaging system 100 both at the time of focus lock (digital imaging system 100 oriented along line ‘A 2 ’) and when a photograph is taken (digital imaging system 100 oriented along line ‘B 2 ’). The difference between these two attitudes is the angle ‘θ’ depicted in FIG. 1 . Using the calculated angle ‘θ’ and a measurement of the distance along line ‘A 2 ’ when focus was locked (as described in more detail in relation to FIG. 4 ), the autofocus system 110 may calculate a focus correction distance and correct the focus before the exposure is taken.
  • the focus correction distance may effectively be the distance between lines ‘A 1 ’ and ‘B 1 ’.
  • the digital imaging system 100 provides more accurate focus that is consistent with a user's intentions.
  • the digital imaging system 100 may thus change the plane of focus from line ‘B 1 ’ to line ‘A 1 ’ so that the subject selected at focus lock (i.e., the star) is ‘in focus’ in the final exposure.
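The magnitude of such a correction can be illustrated numerically. Assuming the plane of focus stays perpendicular to the optical axis, a rotation by angle θ leaves the locked focus plane d·(1 − cos θ) beyond the subject, where d is the distance measured along line ‘A 2 ’. The sketch below is not part of the patent, and the function name is hypothetical.

```python
import math

def focus_correction_mm(d_initial_mm, theta_deg):
    """Distance by which the locked focus plane overshoots the subject
    after the camera rotates by theta_deg."""
    return d_initial_mm * (1 - math.cos(math.radians(theta_deg)))

# Locking focus on a subject 2 m away and recomposing by 10 degrees
# shifts the plane of focus roughly 30 mm behind the subject.
print(round(focus_correction_mm(2000, 10), 1))  # → 30.4
```

With a shallow DOF on the order of a few centimeters, an uncorrected shift of this size is enough to move the original subject out of focus.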
  • FIG. 2 depicts a block diagram of a digital camera 200 suitable for use as the digital imaging system 100 of FIG. 1 according to one embodiment.
  • the digital camera 200 of the depicted embodiment includes a processor 202 connected to storage 204 , memory 206 , an optical sensor array 208 , an I/O driver/controller 210 , and an autofocus system 110 .
  • Processor 202 may include one or more system central processing units (CPUs) or processors to execute instructions.
  • Storage 204 may include storage devices for storing digital images captured by the digital camera 200 , such as removable media such as a microdrive or flash media devices such as a Secure Digital (SD)TM card (as defined by the SD Card Association), a CompactFlash® (CF) card, or a Memory Stick.
  • Storage 204 may also include non-removable media such as hard drives or on-board non-volatile memory.
  • Memory 206 may include read-only memory (ROM), random access memory (RAM), or other types of memory (or combinations thereof) containing a plurality of executable instructions which, when executed on processor 202 , control the operation of the digital camera 200 .
  • the optical sensor array 208 may capture a digital image when exposed to light through lens 104 and store the image in storage 204 .
  • the I/O driver/controller 210 may facilitate communications between processor 202 and other components of the digital camera 200 , including user input devices and other hardware items.
  • the digital camera 200 includes two input devices, the shutter release button 212 and the focus lock button 214 .
  • a user may actuate the shutter release button 212 to take a photograph (i.e., to request the digital camera 200 to expose the optical sensor array 208 to light for a specified timeframe).
  • a user may actuate the optional focus lock button 214 whenever they wish to lock focus at its current position. By locking focus with the focus lock button 214 , a user may then recompose without having the autofocus modify their desired point of focus.
  • the user may achieve focus lock (and/or exposure lock) by partially depressing the shutter release button 212 , eliminating the need for the focus lock button 214 . In these embodiments, fully depressing the shutter release button 212 will take an exposure.
  • digital camera 200 may contain other user-actuated switches or buttons, such as playback buttons or manual focus rings 106 .
  • the I/O driver/controller 210 may also facilitate communications between processor 202 and other hardware items, such as a focusing motor 216 , autofocus sensors 218 , and the attitude sensor 108 .
  • the focusing motor 216 may be an electromechanical or other type of motor that drives the lens forward or backward or causes other changes in the physical state of the lens to adjust the focus of the lens, such as by changing the distance between optical elements within the lens.
  • the processor 202 , based on commands from the autofocus system 110 , may command the direction and speed of the focusing motor 216 .
  • the autofocus sensors 218 may be, for example, CCD sensors that look at the difference in intensity among the adjacent pixels to provide input to algorithms to compute the contrast of the actual image elements.
  • the I/O driver/controller 210 may also facilitate communication with other hardware items, such as the lens 104 , a shutter (if utilized), LCD display, external ports, mirrors, etc.
  • the autofocus system 110 may be in communication with the processor 202 to receive input from the user input devices and other hardware as well as to send commands to the hardware devices.
  • the autofocus system 110 may, for example, receive input from the shutter release button 212 and focus lock button 214 in order to respond to user commands.
  • the autofocus system 110 may also receive information from the autofocus sensors 218 and attitude sensor 108 that it uses to determine the proper focus.
  • the autofocus system 110 may also receive status information (e.g., current location) from the focusing motor 216 and send commands to the focusing motor 216 to extend, retract, etc.
  • FIG. 3 depicts a conceptual illustration of software components of a digital imaging system 100 such as a digital camera 200 according to one embodiment.
  • the digital imaging system 100 of the depicted embodiment includes a user interface module 302 , an image capture module 304 , and an autofocus system module 306 .
  • the user interface module 302 may receive input from a user, such as actuations of a shutter release button 212 or focus lock button 214 , as well as provide output to a user via an LCD display or other output device (e.g., audio device).
  • the image capture module 304 may process the image recorded by the optical sensor array 208 , including any noise reduction, sharpening, changes to color or saturation, changing image formats, saving the image to the storage 204 , or any other task.
  • the autofocus system module 306 may control the autofocus system 110 and may include sub-modules such as a primary autofocus system 308 , a focusing motor controller 310 , an autofocus correction module 312 , and an attitude sensor interface 314 .
  • the primary autofocus system 308 may receive inputs from the autofocus sensors 218 to determine whether or not the image underneath the autofocus sensors is in focus, as is known in the art.
  • the primary autofocus system 308 may also produce commands to the focusing motor controller 310 for transmittal to the focusing motor 216 . Feedback from the focusing motor 216 may also be received by the focusing motor controller 310 and utilized by the primary autofocus system 308 .
  • the autofocus correction module 312 may determine a focus correction factor based on attitude sensor 108 information received from the attitude sensor interface 314 and based on distance information determined by the primary autofocus system 308 . As will be described in more detail in relation to FIG. 4 , the autofocus correction module 312 may correct for focusing errors resulting from the rotation of a digital imaging system 100 after focus lock has been requested.
  • the attitude sensor interface 314 may provide for interaction between the autofocus system module 306 and the attitude sensor 108 and may optionally perform processing on data received from the attitude sensor 108 .
  • FIG. 4 depicts an example of a flow chart 400 for correcting focus based on a change in attitude according to one embodiment.
  • the method of flow chart 400 may be performed, in one embodiment, by components of a digital imaging system 100 and, in particular, the autofocus system module 306 and its sub-modules.
  • Flow chart 400 begins with optional element 402 , where the autofocus system module 306 activates autofocus correction according to embodiments of the present invention.
  • the autofocus system module 306 may activate autofocus correction in response to a user request, input via a button or an entry in a control menu on an LCD display.
  • the autofocus system module 306 may activate the autofocus correction based on default settings or other means (e.g., autofocus correction automatically on).
  • the primary autofocus system 308 of the autofocus system module 306 may determine at decision block 404 when a user has locked the focus, such as by actuating the focus lock button 214 or partially depressing the shutter release button 212 .
  • Focus lock may be requested during use of the digital imaging system, such as when a user utilizes the autofocus system 110 to focus on a desired subject. Once focus lock is detected, the method of flow chart 400 continues to element 406 ; otherwise, the method awaits a determination that focus has been locked.
  • the primary autofocus system 308 may determine the initial subject focus distance at element 406 .
  • the initial subject focus distance is the distance between the digital imaging system 100 and the point the user selects as the point of focus when locking focus, as represented by distance ‘A 2 ’ in FIG. 1 .
  • the primary autofocus system 308 may determine the initial subject focus distance using any methodology, including as part of an active autofocus system (where distance measurement is part of the autofocus procedure) or based on a passive autofocus system (where distance may be measured or determined as part of the autofocus algorithm).
  • primary autofocus system 308 may determine the initial subject focus distance another way, such as by receiving input from a different distance-determining device or by calculating the distance based on the lens 104 position when focused.
  • the autofocus correction module 312 of the autofocus system module 306 may also determine an initial attitude at element 408 .
  • the initial attitude may be the attitude of digital imaging system 100 at the time when focus lock is selected.
  • the initial attitude may, in one example, be the attitude of digital imaging system 100 when it is pointing along line ‘A 2 ’ of FIG. 1 .
  • the initial attitude may be measured in relation to the earth's gravimetric field, such as when measured by a MEMS-based attitude sensor 108 .
  • the reference frame of the initial attitude may be any frame, provided that it is substantially the same as the reference frame of the final attitude determined later, as the relevant quantity is the difference between the two attitudes rather than the particular value of either attitude (as described subsequently in relation to element 414 ).
  • the autofocus correction module 312 may determine that an exposure has been triggered, such as by receiving an indication directly or indirectly from the actuation of a shutter release button 212 by a user. If no indication of an exposure is received, the method of flow chart 400 may wait for such indication. This may occur when the user is recomposing their image after locking focus. Once an indication of the exposure has occurred, the method of flow chart 400 continues to element 412 , where the autofocus correction module 312 may determine a final attitude, similarly to the determination of element 408 .
  • the final attitude represents the attitude of digital imaging system 100 at the time when the exposure is triggered.
  • the final attitude may, in one example, be the attitude of digital imaging system 100 when it is pointing along line ‘B 2 ’ of FIG. 1 .
  • the autofocus correction module 312 may next determine the change in angle between the initial attitude and the final attitude at element 414 .
  • the change in angle (represented by angle ‘θ’ in FIG. 1 ) may represent the total angular rotation of the digital imaging system 100 in roll, pitch, and yaw (i.e., the change in attitude) between the time when focus lock was initiated and the time when the exposure was initiated.
  • the change in angle may be the difference between the measured initial and final attitudes.
  • the autofocus correction module 312 may determine a focus correction distance based on the change in angle. Geometrically, rotating the optical axis by angle θ moves the locked plane of focus beyond the subject by the product of the initial subject focus distance and (1−cos θ).
  • the focus correction distance may be calculated by the following equation: d_correction = d_initial × (1 − cos θ), where d_initial is the initial subject focus distance determined at element 406 and θ is the change in angle determined at element 414 .
  • the autofocus correction module 312 may feed back the focus correction distance to the primary autofocus system 308 at element 418 .
  • the primary autofocus system 308 may modify the focus point based on the determined focus correction distance at element 420 , such as by modifying the initial target subject distance by the focus correction distance to generate a final target subject distance before an exposure is made.
  • the focus correction distance may be subtracted from the initial target subject distance to determine the final target subject distance.
  • the primary autofocus system 308 may then at element 422 command the focusing motor controller 310 to move the focusing motor 216 the appropriate amount based on the final target subject distance.
  • the digital imaging system 100 may then expose the digital image at element 424 , after which the method of flow chart 400 terminates.
  • routines executed to implement the embodiments of the invention may be part of an operating system or a specific application, component, program, module, object, or sequence of instructions.
  • the computer program of the present invention typically is comprised of a multitude of instructions that will be translated by the native computer into a machine-readable format and hence executable instructions.
  • programs are comprised of variables and data structures that either reside locally to the program or are found in memory or on storage devices.
  • various programs described hereinafter may be identified based upon the application for which they are implemented in a specific embodiment of the invention. However, it should be appreciated that any particular program nomenclature that follows is used merely for convenience, and thus the invention should not be limited to use solely in any specific application identified and/or implied by such nomenclature.

Abstract

Systems to improve autofocus in digital imaging systems, such as digital cameras, are disclosed. Embodiments may include systems for, in response to locking of lens focus on a subject image, determining an initial subject focus distance and an initial attitude, and in response to a request for an exposure, determining a final attitude. System embodiments may also include determining a final target subject distance based on the initial subject focus distance and a focus correction distance, where the focus correction distance is based on the difference between the initial attitude and the final attitude. System embodiments may also include focusing the lens at the final target subject distance. Further system embodiments may include, after focusing the lens at the final target subject distance, exposing an image. Other system embodiments may include determining the initial and final attitude with a micro-electro-mechanical (MEMS)-based sensor or other sensor.

Description

    CROSS-REFERENCES TO RELATED APPLICATION(S)
  • Pursuant to 35 USC § 120, this continuation application claims priority to and benefit of U.S. patent application Ser. No. 11/266,668, entitled “SYSTEMS AND METHODS FOR IMPROVED AUTOFOCUS IN DIGITAL IMAGING SYSTEMS”, attorney docket number AUS920050630US1(4137), filed on Nov. 3, 2005, the disclosure of which is incorporated herein in its entirety for all purposes.
  • FIELD OF INVENTION
  • The present invention is in the field of digital imaging systems and, in particular, relates to systems and methods for improved autofocus in digital imaging systems such as digital cameras.
  • BACKGROUND
  • Digital imaging systems such as digital cameras continue to increase in popularity, providing users with the ability to capture images (i.e., take photographs) with relative ease. Digital imaging systems typically include a lens for directing the light comprising a digital image through a light path to an optical sensor array. Autofocus systems (as well as other automations such as automatic exposure or flash) are often an important part of digital imaging systems as they improve the user experience by making such systems easier to use. Whether an object in an image is ‘in focus’ (i.e., at the sharpest possible setting) for a digital imaging system depends on a number of factors, including the distance between the lens and the sensor array, the lens focal length, exposure aperture, and the distance to the subject. As subject distance can affect whether an object is in focus, some objects in an image may be ‘in focus’ while other objects may be ‘out of focus’. Autofocus systems typically include a focusing motor that moves a portion of the lens of the digital imaging system in and out until the sharpest possible image of the desired subject is projected onto the optical sensor array. In manual focus systems, by contrast, a user turns a focusing ring on the lens until the image (or portion of an image) in the viewfinder appears in focus.
  • Autofocus systems typically rely on active autofocus, passive autofocus, or a combination of the two, and utilize one or more autofocus sensors within the field of view. Active autofocus systems measure the distance to the subject (using, for example, sound or infrared signals) and adjust focus of the optical system accordingly. Passive systems analyze the incoming image itself and drive the lens back and forth searching for the best focus and can include both phase detection systems and contrast measurement systems. Complicated autofocus systems with many sensors can add significant cost and complexity to a digital imaging system, as autofocus sensors are relatively expensive and more accurate sensors (e.g., those with both horizontal and vertical capability) are more expensive still. In all but the most expensive digital single-lens-reflex (DSLR) cameras, there will typically be only a few autofocus sensors within the user's field of view. Because autofocus sensors do not completely cover the field of view, the subject that the user desires to be in focus may not lie beneath an autofocus sensor, making it difficult to focus on the subject. In this case, users typically rotate the camera until an autofocus point (typically the center autofocus sensor, as it is usually the most accurate) falls over the area of interest and then lock the focus. After locking the focus, the user may then recompose with the subject at the desired location in the frame and then take the exposure.
  • The solution of locking the autofocus and recomposing often provides unacceptable results, however, when the depth-of-focus (DOF) is small compared to the difference in subject distance between the scene composed as desired and the scene as composed during focus lock. A user taking a portrait (where a very small DOF created by large apertures is often aesthetically desirable), for example, might lock focus on the subject's eyes, but when the user recomposes, the plane of focus would be behind the eyes. Thus, if the DOF is too small, the subject's eyes become out of focus and an undesirable photograph results. The DOF may vary depending on imaging sensor size, imaging lens focal length, exposure aperture, and subject distance. For close subject distances and/or large exposure apertures with small DOF, the problem is exacerbated and unacceptable focus shifts are introduced. There is, therefore, a need for an effective system to provide improved autofocus for digital imaging systems.
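The magnitude of this focus-and-recompose error can be illustrated with a simplified model. Assuming a pure rotation of the camera by an angle θ after locking focus at subject distance d, the locked plane of focus falls behind the subject by approximately d(1 − cos θ). The sketch below is illustrative only (the helper name and the pure-rotation assumption are not part of the disclosure):

```python
import math

def focus_shift(subject_distance_m: float, rotation_deg: float) -> float:
    """Approximate distance (in meters) by which the locked plane of
    focus falls behind the original subject after the camera is rotated,
    assuming a pure rotation and a flat plane of focus."""
    theta = math.radians(rotation_deg)
    return subject_distance_m * (1.0 - math.cos(theta))

# A portrait locked at 1.5 m, then recomposed by rotating 10 degrees:
shift_mm = focus_shift(1.5, 10.0) * 1000.0
print(f"plane of focus shifts ~{shift_mm:.0f} mm")  # ~23 mm
```

A shift of roughly 23 mm can exceed the total depth-of-focus of a fast lens at portrait distances, which is why the locked subject (e.g., the eyes) can end up out of focus after recomposition.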
  • SUMMARY OF THE INVENTION
  • The problems identified above are in large part addressed by the disclosed systems and methods for improved autofocus in digital imaging systems. Embodiments may include, in response to locking of lens focus on a subject image, determining an initial subject focus distance and an initial attitude, and in response to a request for an exposure, determining a final attitude. Embodiments may also include determining a final target subject distance based on the initial subject focus distance and a focus correction distance, where the focus correction distance is based on the difference between the initial attitude and the final attitude. Embodiments may also include focusing the lens at the final target subject distance. Further embodiments may include, after focusing the lens at the final target subject distance, exposing an image.
  • A further embodiment provides a digital imaging system having a user interface module to receive focus lock commands and exposure commands from a user and an image capture module to generate a digital image. The system may also include an autofocus system module to automatically focus a lens, where the autofocus system module has a primary autofocus system to determine whether a lens is in focus. The autofocus system module may also have an attitude sensor interface to receive an indication of an initial attitude in response to a focus lock command and a final attitude in response to an exposure command. The autofocus system module may further include an autofocus correction module to determine a focus correction distance based on the initial attitude and the final attitude, where the primary autofocus system corrects lens focus based on the focus correction distance before an exposure is made.
  • A further embodiment provides a digital imaging system having a housing, a processor within the housing to control operation of the digital imaging system, an autofocus system, and an optical sensor array within the housing and in communication with the processor, where the optical sensor array generates an image in response to light. The system further may include an attitude sensor to determine an initial attitude in response to a focus lock command and a final attitude in response to a shutter release command. The processor of the system may determine a focus correction distance based on the difference between the initial attitude and the final attitude and may modify focus based on the focus correction distance. In a further embodiment, the attitude sensor may be a MEMS-based sensor.
  • BRIEF DESCRIPTION OF THE DRAWINGS
  • Advantages of the invention will become apparent upon reading the following detailed description and upon reference to the accompanying drawings, in which like references may indicate similar elements:
  • FIG. 1 depicts an environment for a digital imaging system with an autofocus system and an attitude sensor according to one embodiment;
  • FIG. 2 depicts a block diagram of a digital camera suitable for use as the digital imaging system of FIG. 1 according to one embodiment;
  • FIG. 3 depicts a conceptual illustration of software components of a digital imaging system such as a digital camera according to one embodiment; and
  • FIG. 4 depicts an example of a flow chart for correcting focus based on a change in attitude according to one embodiment.
  • DETAILED DESCRIPTION OF EMBODIMENTS
  • The following is a detailed description of example embodiments of the invention depicted in the accompanying drawings. The example embodiments are in such detail as to clearly communicate the invention. However, the amount of detail offered is not intended to limit the anticipated variations of embodiments; on the contrary, the intention is to cover all modifications, equivalents, and alternatives falling within the spirit and scope of the present invention as defined by the appended claims. The descriptions below are designed to make such embodiments obvious to a person of ordinary skill in the art.
  • Generally speaking, systems and methods to improve autofocus in digital imaging systems, such as digital cameras, are disclosed. Embodiments may include, in response to locking of lens focus on a subject image, determining an initial subject focus distance and an initial attitude, and in response to a request for an exposure, determining a final attitude. Attitude, as described in more detail subsequently, may represent the orientation in space of an object with respect to defined axes. Embodiments may also include determining a final target subject distance based on the initial subject focus distance and a focus correction distance, where the focus correction distance is based on the difference between the initial attitude and the final attitude. Embodiments may also include focusing the lens at the final target subject distance. Further embodiments may include, after focusing the lens at the final target subject distance, exposing an image. Other embodiments may include determining the initial and final attitude with a micro-electro-mechanical (MEMS)-based sensor or other sensor.
  • The system and methodology of the disclosed embodiments provide an improved method of determining autofocus of a digital imaging system such as a digital camera. Using a sensor such as a MEMS-based sensor, the system may determine an initial attitude when a user selects focus lock and a final attitude after a user recomposes an image and attempts to take an exposure. The attitude of an object may represent its orientation in space comprising the yaw, pitch, and roll of that object with respect to defined axes. The change in attitude of an object may represent its combined angular rotation in yaw, pitch, and roll with respect to defined axes. The disclosed system may advantageously provide a correction to the autofocus based on the difference between the initial attitude and the final attitude (i.e., the change in attitude) to correct for a change in distance resulting from rotation of the digital imaging system. A user focusing on one part of an image, such as an important object like a portrait subject's eyes, may select focus lock and recompose the image. Previous systems resulted in the potential for the initial subject to be out of focus after recomposition because of the change in distance resulting from rotation of the imaging system. The disclosed system advantageously provides for a corrected focus distance so that the initial subject (i.e., the subject's eyes) will be in focus. The disclosed system may be particularly useful where the depth-of-focus is small or shallow, such as with large apertures.
  • While specific embodiments will be described below with reference to particular configurations of hardware and/or software, those of skill in the art will realize that embodiments of the present invention may advantageously be implemented with other substantially equivalent hardware and/or software systems. Aspects of the invention described herein may be stored or distributed on computer-readable media, including magnetic and optically readable and removable computer disks, as well as distributed electronically over the Internet or over other networks, including wireless networks. Data structures and transmission of data (including wireless transmission) particular to aspects of the invention are also encompassed within the scope of the invention.
  • Turning now to the drawings, FIG. 1 depicts an environment for a digital imaging system with an autofocus system and an attitude sensor according to one embodiment. In the depicted embodiment, the digital imaging system 100 includes a camera housing 102, a lens 104 with an optional manual focus ring 106, an attitude sensor 108, and an autofocus system 110. The digital imaging system 100 of the depicted embodiment is pointing towards an initial subject 112. The digital imaging system 100 may in some embodiments be a digital camera such as a digital single-lens-reflex (DSLR) (either fixed-lens DSLR or interchangeable-lens DSLR), digital rangefinder camera, digital point-and-shoot (P&S) camera, or fixed-lens digital camera. The digital imaging system 100 may alternatively be an imaging device integrated with another system, such as a mobile phone, personal digital assistant (PDA), wearable device, or mobile computer. In another alternative embodiment, the digital imaging system 100 may be a digital camcorder, digital video camera, or other digital video-recording device.
  • A user may point the digital imaging system 100 in the direction of a subject in order to take a picture of that subject. To take a picture, light enters through lens 104 and is directed through a light path until it strikes an optical sensor array (described in relation to FIG. 2), whereupon the optical sensor array captures the image. Some digital imaging systems 100 have a single lens while others may have separate lenses for composition and taking. For example, in a DSLR digital imaging system 100, a single lens 104 is used both for viewing an object of interest and for capturing and directing light towards the optical sensor array. A lens 104 may be permanently attached to the camera housing 102 (such as in the case of a P&S camera) or may be detachable and interchangeable (such as in the case of interchangeable-lens DSLRs). Lens 104 may also include a manual focus ring 106 to provide an alternative to the autofocus system 110 of the digital imaging system 100 or to provide fine tuning of the focus.
  • As described previously, whether an object in an image is ‘in focus’ (i.e., at the sharpest possible setting) depends on a number of factors, including the distance between the lens 104 and the optical sensor array, the lens focal length, exposure aperture, and the distance to the subject. As subject distance can affect whether an object is in focus, some objects in an image may be in focus while other objects may be out of focus. Autofocus systems 110 typically include a focusing motor that moves a portion of the lens 104 of the digital imaging system 100 in and out until the sharpest possible image of the desired subject is projected onto the optical sensor array. Autofocus systems 110 rely on one or more autofocus sensors within the user's field of view in order to determine whether or not objects ‘under’ the sensors are in focus. Autofocus systems 110 often determine a distance to the subject as part of their internal algorithms.
  • The digital imaging system 100 may also include one or more attitude sensors 108, which may be located within camera housing 102 in some embodiments. An attitude sensor 108 may determine the attitude of the digital imaging system 100 with respect to defined yaw, pitch, and roll axes. In some embodiments, the attitude sensor 108 may be a micro-electro-mechanical (MEMS)-based attitude sensor which measures the orientation of the digital imaging system 100 with respect to the earth's gravimetric field to determine the digital imaging system's 100 three-dimensional attitude in space. MEMS-based attitude sensors may provide a relatively low-cost, reliable, and compact methodology of determining rotational position when compared to other sensors. Many existing digital cameras include a MEMS-based orientation sensor or other orientation sensor to determine whether a photograph was taken in a landscape mode (i.e., horizontal) or portrait mode (i.e., vertical). In some embodiments, these existing orientation sensors are insufficient for a digital imaging system 100 according to the present invention. Existing orientation sensors, for example, are limited to measuring rotation of the camera about an axis parallel to the lens so that a determination may be made with respect to landscape or portrait orientation. Moreover, existing orientation sensors may not have the accuracy required of the disclosed embodiments as they only require accuracy sufficient to distinguish between landscape and portrait modes. In other embodiments, however, attitude sensor 108 may also have the capability of distinguishing landscape and portrait modes, eliminating the need for two sensors. In another alternative embodiment, the attitude sensor 108 may be another type of sensor capable of detecting three-dimensional attitude in space, such as a gyroscope or inertial measurement unit (IMU).
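The change in attitude combining yaw, pitch, and roll can be collapsed into a single rotation angle. One conventional way to do this (a sketch only; the patent does not prescribe an attitude representation, so the yaw/pitch/roll tuples and helper names here are assumptions) is to form rotation matrices for the two attitudes and take the angle of the relative rotation:

```python
import math

def rot(yaw, pitch, roll):
    """Rotation matrix (Z-Y-X convention) for yaw, pitch, roll in radians."""
    cy, sy = math.cos(yaw), math.sin(yaw)
    cp, sp = math.cos(pitch), math.sin(pitch)
    cr, sr = math.cos(roll), math.sin(roll)
    return [
        [cy * cp, cy * sp * sr - sy * cr, cy * sp * cr + sy * sr],
        [sy * cp, sy * sp * sr + cy * cr, sy * sp * cr - cy * sr],
        [-sp, cp * sr, cp * cr],
    ]

def attitude_change(a, b):
    """Total rotation angle (radians) between two attitudes, each a
    (yaw, pitch, roll) tuple in radians, via the identity
    theta = arccos((trace(Ra^T Rb) - 1) / 2)."""
    ra, rb = rot(*a), rot(*b)
    # trace(Ra^T Rb) equals the element-wise (Frobenius) product of Ra and Rb
    trace = sum(ra[i][j] * rb[i][j] for i in range(3) for j in range(3))
    return math.acos(max(-1.0, min(1.0, (trace - 1.0) / 2.0)))

# A 10-degree yaw with no pitch or roll is a 10-degree change in attitude:
theta = attitude_change((0.0, 0.0, 0.0), (math.radians(10.0), 0.0, 0.0))
```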
  • The autofocus system 110 may utilize active autofocus, passive autofocus, or a combination of the two. Active autofocus systems measure the distance to the subject (using, for example, sound or infrared signals) and adjust focus of the optical system accordingly. Passive systems analyze the incoming image itself and drive the lens 104 back and forth searching for the best focus. Passive autofocus systems can include both phase detection systems and contrast measurement systems. Phase detection autofocus systems divide the incoming light to particular autofocus sensors into pairs of images and then compare the resulting images to determine the proper focus. Some contrast-based passive autofocus systems utilize an autofocus sensor such as a charge-coupled device (CCD) that provides input to algorithms to compute the contrast of the actual image elements. A CCD sensor typically has a group of pixels, and an on-board processor analyzes the light hitting the sensor by looking at the difference in intensity among the adjacent pixels. As an out-of-focus scene has adjacent pixels with similar intensities, the processor may move the lens 104 and attempt to find the maximum intensity difference between adjacent pixels. The point of maximum intensity difference between pixels is then the point of best focus. Autofocus systems may use a combination of different autofocus sensors and methods, such as a camera that utilizes a passive phase detection system with an autofocus ‘assist’ from an active autofocus system. In some embodiments, the autofocus system 110 may determine distance to the subject as part of its determination of proper focus.
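The contrast-measurement behavior described above (scoring a sensor readout by intensity differences between adjacent pixels and seeking the lens position that maximizes that score) can be sketched as follows; the sampling callback stands in for reading the CCD autofocus sensor and is not an actual camera API:

```python
def contrast_score(pixels):
    """Sum of absolute intensity differences between adjacent pixels;
    higher means sharper (more contrast) under this metric."""
    return sum(abs(a - b) for a, b in zip(pixels, pixels[1:]))

def best_focus_position(sample_at_position, positions):
    """Sweep candidate lens positions and return the one whose sampled
    sensor readout has the highest contrast score."""
    return max(positions, key=lambda p: contrast_score(sample_at_position(p)))

# Toy example: position 2 yields the sharpest (highest-contrast) readout.
readouts = {
    0: [10, 11, 10, 11],   # blurred: adjacent pixels have similar intensity
    1: [10, 14, 9, 13],
    2: [0, 50, 0, 50],     # sharp: large differences between neighbors
}
print(best_focus_position(lambda p: readouts[p], [0, 1, 2]))  # 2
```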
  • In previous systems, locking focus on one subject and then recomposing could result in undesirable focusing errors in certain situations. For example, a user may point the lens 104 of the digital imaging system 100 at an initial subject 112 such as a Christmas tree with a star on top as depicted in FIG. 1. The user may wish to center their focus (but not their composition) on the star of the tree, pointing the lens 104 of the digital imaging system 100 along the line ‘A2’ as depicted in FIG. 1. Once the star is ‘underneath’ an autofocus sensor, the autofocus system 110 of the digital imaging system 100 may then automatically focus on the star. The user may then actuate focus lock by, for example, partially depressing the shutter-release button or actuating a focus lock button on the outside of the camera housing 102 to lock the lens focus in its current position. By doing so, the user is attempting to place the center of the plane of focus along line ‘A1’ of FIG. 1, centered on the star. The user may then recompose their image by rotating the digital imaging system 100 and pointing the lens 104 along line ‘B2’ as depicted in FIG. 1 while the lens focus remains locked, and then taking the photograph once recomposition is complete. Because of the angular rotation of the lens 104 and digital imaging system 100, however, the plane of focus will no longer be centered on the star. The plane of focus for the lens 104 will instead be centered along line ‘B1’ of FIG. 1, behind the star, instead of along line ‘A1’ where the user desires. With a sufficiently small DOF resulting from the digital imaging system 100 configuration and subject distance, the star may then become out of focus when the user takes the photograph after recomposing. As described previously, the DOF may vary depending on imaging sensor size, imaging lens focal length, exposure aperture, and subject distance. The autofocus error may be exacerbated when the DOF is small, such as when the subject distance is relatively short or the exposure aperture is relatively large.
  • Using the disclosed system, the autofocus performance of digital imaging system 100 in certain situations may be improved by correcting for the error caused by rotation (i.e., change in attitude) of the digital imaging system 100 after focus lock. As will be described in more detail subsequently, the attitude sensor 108 may determine the attitude of the digital imaging system 100 both at the time of focus lock (digital imaging system 100 oriented along line ‘A2’) and when a photograph is taken (digital imaging system 100 oriented along line ‘B2’). The difference between these two attitudes is the angle ‘Θ’ depicted in FIG. 1. Using the calculated angle ‘Θ’ and a measurement of the distance along line ‘A2’ when focus was locked (as described in more detail in relation to FIG. 4), the autofocus system 110 may calculate a focus correction distance and correct the focus before the exposure is taken. The focus correction distance may effectively be the distance between lines ‘A1’ and ‘B1’. By correcting the focus in this manner, the digital imaging system 100 provides more accurate focus that is consistent with a user's intentions. The digital imaging system 100 may thus change the plane of focus from line ‘B1’ to line ‘A1’ so that the subject selected at focus lock (i.e., the star) is ‘in focus’ in the final exposure.
  • FIG. 2 depicts a block diagram of a digital camera 200 suitable for use as the digital imaging system 100 of FIG. 1 according to one embodiment. The digital camera 200 of the depicted embodiment includes a processor 202 connected to storage 204, memory 206, an optical sensor array 208, an I/O driver/controller 210, and an autofocus system 110. Processor 202 may include one or more system central processing units (CPUs) or processors to execute instructions. Storage 204 may include storage devices for storing digital images captured by the digital camera 200, such as removable media, including a microdrive or flash media devices such as a Secure Digital (SD)™ card (as defined by the SD Card Association), a CompactFlash® (CF) card, or a Memory Stick. Storage 204 may also include non-removable media such as hard drives or on-board non-volatile memory. Memory 206 may include read-only memory (ROM), random access memory (RAM), or other types of memory (or combinations thereof) containing a plurality of executable instructions which, when executed on processor 202, control the operation of the digital camera 200. The optical sensor array 208 may capture a digital image when exposed to light through lens 104 and store the image in storage 204.
  • The I/O driver/controller 210 may facilitate communications between processor 202 and other components of the digital camera 200, including user input devices and other hardware items. In the depicted embodiment, the digital camera 200 includes two input devices, the shutter release button 212 and the focus lock button 214. A user may actuate the shutter release button 212 to take a photograph (i.e., to request the digital camera 200 to expose the optical sensor array 208 to light for a specified timeframe). A user may actuate the optional focus lock button 214 whenever they wish to lock focus at its current position. By locking focus with the focus lock button 214, a user may then recompose without having the autofocus modify their desired point of focus. In other embodiments, the user may achieve focus lock (and/or exposure lock) by partially depressing the shutter release button 212, eliminating the need for the focus lock button 214. In these embodiments, fully depressing the shutter release button 212 will take an exposure. One skilled in the art will recognize that digital camera 200 may contain other user-actuated switches or buttons, such as playback buttons or manual focus rings 106.
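The two-stage shutter release behavior described above (a partial press locks focus, a full press takes the exposure) can be modeled as a small state machine. This is an illustrative sketch; the class and callback names are assumptions, not the camera's actual interface:

```python
class ShutterRelease:
    """Minimal model of a two-stage shutter release button 212: a half
    press locks focus, a full press triggers the exposure."""

    def __init__(self, on_focus_lock, on_exposure):
        self.on_focus_lock = on_focus_lock
        self.on_exposure = on_exposure
        self.focus_locked = False

    def half_press(self):
        # Lock focus only once per press cycle.
        if not self.focus_locked:
            self.focus_locked = True
            self.on_focus_lock()

    def full_press(self):
        self.half_press()          # a full press implies focus is locked
        self.on_exposure()
        self.focus_locked = False  # release the lock after the exposure

events = []
btn = ShutterRelease(lambda: events.append("lock"), lambda: events.append("expose"))
btn.half_press()   # user locks focus, then recomposes...
btn.full_press()   # ...and takes the picture
print(events)      # ['lock', 'expose']
```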
  • The I/O driver/controller 210 may also facilitate communications between processor 202 and other hardware items, such as a focusing motor 216, autofocus sensors 218, and the attitude sensor 108. The focusing motor 216 may be an electromechanical or other type of motor that drives the lens forward or backward or causes other changes in the physical state of the lens to adjust the focus of the lens, such as by changing the distance between optical elements within the lens. The processor 202, based on commands from the autofocus system 110, may command the direction and speed of the focusing motor 216. The autofocus sensors 218 may be, for example, CCD sensors that look at the difference in intensity among the adjacent pixels to provide input to algorithms to compute the contrast of the actual image elements. The I/O driver/controller 210 may also facilitate communication with other hardware items, such as the lens 104, a shutter (if utilized), LCD display, external ports, mirrors, etc.
  • The autofocus system 110 may be in communication with the processor 202 to receive input from the user input devices and other hardware as well as to send commands to the hardware devices. The autofocus system 110 may, for example, receive input from the shutter release button 212 and focus lock button 214 in order to respond to user commands. The autofocus system 110 may also receive information from the autofocus sensors 218 and attitude sensor 108 that it uses to determine the proper focus. The autofocus system 110 may also receive status information (e.g., current location) from the focusing motor 216 and send commands to the focusing motor 216 to extend, retract, etc.
  • FIG. 3 depicts a conceptual illustration of software components of a digital imaging system 100 such as a digital camera 200 according to one embodiment. The digital imaging system 100 of the depicted embodiment includes a user interface module 302, an image capture module 304, and an autofocus system module 306. The user interface module 302 may receive input from a user, such as actuations of a shutter release button 212 or focus lock button 214, as well as provide output to a user via an LCD display or other output device (e.g., audio device). The image capture module 304 may process the image recorded by the optical sensor array 208, including any noise reduction, sharpening, changes to color or saturation, changing image formats, saving the image to the storage 204, or any other task.
  • The autofocus system module 306 may control the autofocus system 110 and may include sub-modules such as a primary autofocus system 308, a focusing motor controller 310, an autofocus correction module 312, and an attitude sensor interface 314. The primary autofocus system 308 may receive inputs from the autofocus sensors 218 to determine whether or not the image underneath the autofocus sensors is in focus, as is known in the art. The primary autofocus system 308 may also produce commands to the focusing motor controller 310 for transmittal to the focusing motor 216. Feedback from the focusing motor 216 may also be received by the focusing motor controller 310 and utilized by the primary autofocus system 308. The autofocus correction module 312 may determine a focus correction factor based on attitude sensor 108 information received from the attitude sensor interface 314 and based on distance information determined by the primary autofocus system 308. As will be described in more detail in relation to FIG. 4, the autofocus correction module 312 may correct for focusing errors resulting from rotation of a digital imaging system 100 after focus lock has been requested. The attitude sensor interface 314 may provide for interaction between the autofocus system module 306 and the attitude sensor 108 and may optionally perform processing on data received from the attitude sensor 108.
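The role of the autofocus correction module 312 between the attitude sensor interface and the primary autofocus system can be sketched as follows. For simplicity the attitude here is reduced to a single pointing angle in radians, and the correction d(1 − cos θ) is the geometric model implied by FIG. 1; the class, method names, and signatures are illustrative assumptions, not the patent's implementation:

```python
import math

class AutofocusCorrectionModule:
    """Illustrative sketch: record the attitude at focus lock, then at
    exposure time turn the change in attitude into a focus correction
    distance for the primary autofocus system to subtract."""

    def __init__(self, read_attitude):
        self.read_attitude = read_attitude  # stands in for the attitude sensor interface
        self.initial_attitude = None

    def on_focus_lock(self):
        self.initial_attitude = self.read_attitude()

    def focus_correction(self, initial_subject_distance):
        # Change in angle between focus lock and exposure time.
        theta = abs(self.read_attitude() - self.initial_attitude)
        # Distance between the locked plane of focus and the subject.
        return initial_subject_distance * (1.0 - math.cos(theta))

# Focus locked while pointing at 0 rad; exposure taken after a 0.1 rad turn:
attitudes = iter([0.0, 0.1])
module = AutofocusCorrectionModule(lambda: next(attitudes))
module.on_focus_lock()
correction = module.focus_correction(2.0)  # subject locked at 2.0 m
final_distance = 2.0 - correction          # corrected focus distance, ~1.99 m
```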
  • FIG. 4 depicts an example of a flow chart 400 for correcting focus based on a change in attitude according to one embodiment. The method of flow chart 400 may be performed, in one embodiment, by components of a digital imaging system 100 and, in particular, the autofocus system module 306 and its sub-modules. Flow chart 400 begins with optional element 402, where the autofocus system module 306 activates autofocus correction according to embodiments of the present invention. In some embodiments, the autofocus system module 306 may activate autofocus correction in response to a user request input via a button or an entry in a control menu on an LCD display. In other embodiments, the autofocus system module 306 may activate the autofocus correction based on default settings or other means (e.g., autofocus correction automatically on). Once autofocus correction is initiated, the primary autofocus system 308 of the autofocus system module 306 may determine at decision block 404 whether a user has locked the focus, such as by actuating the focus lock button 214 or partially depressing the shutter release button 212. Focus lock may be requested after a user utilizes the autofocus system 110 to focus on a desired subject. Once focus lock is detected, the method of flow chart 400 continues to element 406; otherwise, the method awaits a determination that focus has been locked.
  • Once focus has been locked, the primary autofocus system 308 may determine the initial subject focus distance at element 406. The initial subject focus distance is the distance between the digital imaging system 100 and the point of focus the user locks in, as represented by distance ‘A2’ in FIG. 1. The primary autofocus system 308 may determine the initial subject focus distance using any methodology, including as part of an active autofocus system (where distance measurement is part of the autofocus procedure) or as part of a passive autofocus system (where distance may be measured or derived as part of the autofocus algorithm). Alternatively, the primary autofocus system 308 may determine the initial subject focus distance another way, such as by receiving input from a separate distance-determining device or by calculating the distance from the position of the lens 104 when focused.
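As a hypothetical illustration of the last alternative, deriving subject distance from the focused lens position, the thin-lens relation 1/f = 1/d_o + 1/d_i can be solved for the object distance d_o. The function name, units, and the use of the thin-lens model are illustrative assumptions, not part of the disclosure.

```python
def subject_distance_from_lens_position(focal_length_mm: float,
                                        image_distance_mm: float) -> float:
    """Solve the thin-lens equation 1/f = 1/d_o + 1/d_i for the object
    distance d_o (hypothetical sketch; a real lens needs a calibrated model).

    image_distance_mm is the lens-to-sensor distance: the focal length plus
    the extension applied by the focusing motor.
    """
    if image_distance_mm <= focal_length_mm:
        raise ValueError("lens is focused at or beyond infinity")
    return (focal_length_mm * image_distance_mm) / (image_distance_mm - focal_length_mm)

# A 50 mm lens extended 1 mm beyond its focal length focuses at
# 50 * 51 / 1 = 2550 mm, i.e. 2.55 m.
```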
  • In addition to determining the initial subject focus distance once focus has been locked, the autofocus correction module 312 of the autofocus system module 306 may also determine an initial attitude at element 408. The initial attitude is the attitude of digital imaging system 100 at the time when focus lock is requested. The initial attitude may, in one example, be the attitude of digital imaging system 100 when it is pointing along line ‘A2’ of FIG. 1. In one embodiment, the initial attitude may be measured in relation to the earth's gravitational field, such as when measured by a MEMS-based attitude sensor 108. The reference frame of the initial attitude may be any frame, provided that it is substantially the same as the reference frame of the final attitude determined later: the relevant quantity is the difference between the two attitudes, not the particular value of either attitude, as described subsequently in relation to element 414.
  • At decision block 410, the autofocus correction module 312 may determine that an exposure has been triggered, such as by receiving an indication, directly or indirectly, of the actuation of the shutter release button 212 by a user. If no indication of an exposure is received, the method of flow chart 400 may wait for such an indication; this may occur while the user is recomposing the image after locking focus. Once an indication of the exposure has occurred, the method of flow chart 400 continues to element 412, where the autofocus correction module 312 may determine a final attitude, similarly to the determination of element 408. The final attitude represents the attitude of digital imaging system 100 at the time when the exposure is triggered. The final attitude may, in one example, be the attitude of digital imaging system 100 when it is pointing along line ‘B2’ of FIG. 1. The autofocus correction module 312 may next determine the change in angle between the initial attitude and the final attitude at element 414. The change in angle (represented by angle ‘θ’ in FIG. 1) may represent the total angular rotation of the digital imaging system 100 in roll, pitch, and yaw (i.e., the change in attitude) between the time when focus was locked and the time when the exposure was initiated. In some embodiments, the change in angle may be the difference between the measured initial and final attitudes.
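One illustrative reduction of the attitude difference at element 414 is the angle between the camera's pointing directions at focus lock and at exposure (a pure roll about the optical axis does not move the focus point, so the pointing-direction angle is the quantity that affects subject distance). The vector representation and function names below are assumptions for illustration, not the patent's implementation.

```python
import math

def pointing_vector(pitch: float, yaw: float) -> tuple:
    """Unit vector along the optical axis for a given pitch and yaw (radians)."""
    return (
        math.cos(pitch) * math.cos(yaw),
        math.cos(pitch) * math.sin(yaw),
        math.sin(pitch),
    )

def change_in_angle(initial: tuple, final: tuple) -> float:
    """Angle in radians between the initial and final pointing vectors."""
    dot = sum(a * b for a, b in zip(initial, final))
    # Clamp against floating-point drift before taking the arccosine.
    return math.acos(max(-1.0, min(1.0, dot)))

# Tilting the camera down 10 degrees between focus lock and exposure:
theta = change_in_angle(pointing_vector(0.0, 0.0),
                        pointing_vector(math.radians(10), 0.0))
```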
  • After the change in angle is determined, the method of flow chart 400 continues to element 416, where the autofocus correction module 312 may determine a focus correction distance based on the change in angle. In one embodiment, the focus correction distance may be calculated by the following equation:

  • fcd = d · (1 − cos θ)
  • where ‘fcd’ is the focus correction distance, ‘d’ is the initial subject focus distance, and ‘θ’ is the change in angle. One skilled in the art will understand that other methodologies, such as table lookups, may also be used to determine a focus correction distance based on the change in angle. After determining the focus correction distance, the autofocus correction module 312 may feed back the focus correction distance to the primary autofocus system 308 at element 418. The primary autofocus system 308 may modify the focus point based on the determined focus correction distance at element 420, such as by modifying the initial target subject distance by the focus correction distance to generate a final target subject distance before an exposure is made. In one embodiment, the focus correction distance may be subtracted from the initial target subject distance to determine the final target subject distance. The primary autofocus system 308 may then, at element 422, command the focusing motor controller 310 to move the focusing motor 216 the appropriate amount based on the final target subject distance. The digital imaging system 100 may then expose the digital image at element 424, after which the method of flow chart 400 terminates.
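The fcd equation and the subtraction at element 420 can be sketched together. Note that d − d·(1 − cos θ) = d·cos θ, which is the geometric projection of the locked focus distance onto the new pointing direction. Function names and units are illustrative assumptions.

```python
import math

def focus_correction_distance(d: float, theta: float) -> float:
    """fcd = d * (1 - cos(theta)), per element 416 of flow chart 400.

    d is the initial subject focus distance; theta is the change in
    angle between the initial and final attitudes, in radians.
    """
    return d * (1.0 - math.cos(theta))

def final_target_subject_distance(d: float, theta: float) -> float:
    """Element 420: subtract the correction from the initial distance.

    Algebraically, d - d*(1 - cos(theta)) = d * cos(theta).
    """
    return d - focus_correction_distance(d, theta)

# Focus locked at 3.0 m, then the camera is rotated 10 degrees to
# recompose: the corrected (final) focus distance is about 2.95 m.
corrected = final_target_subject_distance(3.0, math.radians(10))
```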
  • In general, the routines executed to implement the embodiments of the invention may be part of an operating system or a specific application, component, program, module, object, or sequence of instructions. The computer program of the present invention typically comprises a multitude of instructions that will be translated by the native computer into a machine-readable and hence executable format. Also, programs comprise variables and data structures that either reside locally to the program or are found in memory or on storage devices. In addition, various programs described herein may be identified based upon the application for which they are implemented in a specific embodiment of the invention. However, it should be appreciated that any particular program nomenclature that follows is used merely for convenience, and thus the invention should not be limited to use solely in any specific application identified and/or implied by such nomenclature.
  • It will be apparent to those skilled in the art having the benefit of this disclosure that the present invention contemplates systems and methods for improving autofocus in a digital imaging system. It is understood that the forms of the invention shown and described in the detailed description and the drawings are to be taken merely as examples. It is intended that the following claims be interpreted broadly to embrace all the variations of the example embodiments disclosed.

Claims (13)

1. A digital imaging system, the system comprising:
a user interface module, the user interface module being adapted to receive focus lock commands and exposure commands from a user;
an image capture module to generate a digital image; and
an autofocus system module to automatically focus a lens, the autofocus system module comprising:
a primary autofocus system to determine whether the lens is in focus;
an attitude sensor interface, the attitude sensor interface being adapted to receive an indication of an initial attitude in response to a focus lock command and a final attitude in response to an exposure command;
an autofocus correction module to determine a focus correction distance based on the initial attitude and the final attitude; and
wherein the primary autofocus system corrects lens focus based on the focus correction distance before an exposure is made.
2. The system of claim 1, further comprising a focusing motor controller of the autofocus system module to modify the lens focus.
3. The system of claim 1, wherein the primary autofocus system is adapted to determine an initial target subject distance in response to receiving a focus lock command.
4. The system of claim 1, wherein the autofocus correction module is adapted to, in response to an exposure command, determine a final target subject distance based on an initial target subject distance and the focus correction distance.
5. A digital imaging system, the system comprising:
a housing;
a processor within the housing to control operation of the digital imaging system;
an optical sensor array within the housing and in communication with the processor, the optical sensor array being adapted to generate an image in response to light;
an autofocus system in communication with the processor;
an attitude sensor in communication with the processor, the attitude sensor being adapted to determine an initial attitude in response to a focus lock command and a final attitude in response to a shutter release command; and
wherein the processor is adapted to execute a series of operations to determine a focus correction distance based on the difference between the initial attitude and the final attitude, wherein the processor is further adapted to execute a series of operations to modify focus based on the focus correction distance.
6. The system of claim 5, further comprising a lens permanently mounted to the housing to focus light from outside the digital imaging system traversing a light path to the optical sensor array.
7. The system of claim 5, further comprising a lens attached to the housing to focus light from outside the digital imaging system traversing a light path to the optical sensor array.
8. The system of claim 5, further comprising a focusing motor in communication with the processor, the focusing motor being adapted to move a lens to change its point of focus.
9. The system of claim 5, further comprising one or more autofocus sensors in communication with the processor.
10. The system of claim 5, wherein the digital imaging system is a digital camera.
11. The system of claim 5, wherein the attitude sensor is a micro-electro-mechanical (MEMS) sensor.
12. The system of claim 5, wherein the attitude sensor is further adapted to distinguish between a portrait orientation and a landscape orientation.
13. The system of claim 5, wherein the attitude sensor is one or more of a gyroscope or inertial measurement unit (IMU).
US12/127,139 2005-11-03 2008-05-27 Systems For Improved Autofocus in Digital Imaging Systems Abandoned US20080226274A1 (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
US12/127,139 US20080226274A1 (en) 2005-11-03 2008-05-27 Systems For Improved Autofocus in Digital Imaging Systems

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
US11/266,668 US7409149B2 (en) 2005-11-03 2005-11-03 Methods for improved autofocus in digital imaging systems
US12/127,139 US20080226274A1 (en) 2005-11-03 2008-05-27 Systems For Improved Autofocus in Digital Imaging Systems

Related Parent Applications (1)

Application Number Title Priority Date Filing Date
US11/266,668 Continuation US7409149B2 (en) 2005-11-03 2005-11-03 Methods for improved autofocus in digital imaging systems

Publications (1)

Publication Number Publication Date
US20080226274A1 true US20080226274A1 (en) 2008-09-18

Family

ID=37996420

Family Applications (2)

Application Number Title Priority Date Filing Date
US11/266,668 Active 2026-11-08 US7409149B2 (en) 2005-11-03 2005-11-03 Methods for improved autofocus in digital imaging systems
US12/127,139 Abandoned US20080226274A1 (en) 2005-11-03 2008-05-27 Systems For Improved Autofocus in Digital Imaging Systems

Family Applications Before (1)

Application Number Title Priority Date Filing Date
US11/266,668 Active 2026-11-08 US7409149B2 (en) 2005-11-03 2005-11-03 Methods for improved autofocus in digital imaging systems

Country Status (2)

Country Link
US (2) US7409149B2 (en)
JP (1) JP5296307B2 (en)

Cited By (50)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20080131019A1 (en) * 2006-12-01 2008-06-05 Yi-Ren Ng Interactive Refocusing of Electronic Images
US20100128145A1 (en) * 2008-11-25 2010-05-27 Colvin Pitts System of and Method for Video Refocusing
US20100141802A1 (en) * 2008-12-08 2010-06-10 Timothy Knight Light Field Data Acquisition Devices, and Methods of Using and Manufacturing Same
US20100182397A1 (en) * 2009-01-16 2010-07-22 Eun Jeong Choi Connector panel for view camera capable of docking digital single lens reflex camera
US20100265385A1 (en) * 2009-04-18 2010-10-21 Knight Timothy J Light Field Camera Image, File and Configuration Data, and Methods of Using, Storing and Communicating Same
US20110134309A1 (en) * 2006-03-01 2011-06-09 Asia Optical Co., Inc. Method to Evaluate Contrast Value for an Image and Applications Thereof
US20110234841A1 (en) * 2009-04-18 2011-09-29 Lytro, Inc. Storage and Transmission of Pictures Including Multiple Frames
US8648958B2 (en) 2004-10-01 2014-02-11 The Board Of Trustees Of The Leland Stanford Junior University Variable imaging arrangements and methods therefor
US8749620B1 (en) 2010-02-20 2014-06-10 Lytro, Inc. 3D light field cameras, images and files, and methods of using, operating, processing and viewing same
US8768102B1 (en) 2011-02-09 2014-07-01 Lytro, Inc. Downsampling light field images
US8811769B1 (en) 2012-02-28 2014-08-19 Lytro, Inc. Extended depth of field and variable center of perspective in light-field processing
US8831377B2 (en) 2012-02-28 2014-09-09 Lytro, Inc. Compensating for variation in microlens position during light-field image processing
US8948545B2 (en) 2012-02-28 2015-02-03 Lytro, Inc. Compensating for sensor saturation and microlens modulation during light-field image processing
CN104469167A (en) * 2014-12-26 2015-03-25 小米科技有限责任公司 Automatic focusing method and device
US8997021B2 (en) 2012-11-06 2015-03-31 Lytro, Inc. Parallax and/or three-dimensional effects for thumbnail image displays
US8995785B2 (en) 2012-02-28 2015-03-31 Lytro, Inc. Light-field processing and analysis, camera control, and user interfaces and interaction on light-field capture devices
US9001226B1 (en) 2012-12-04 2015-04-07 Lytro, Inc. Capturing and relighting images using multiple devices
US9184199B2 (en) 2011-08-01 2015-11-10 Lytro, Inc. Optical assembly including plenoptic microlens array
US9305375B2 (en) 2014-03-25 2016-04-05 Lytro, Inc. High-quality post-rendering depth blur
US9420276B2 (en) 2012-02-28 2016-08-16 Lytro, Inc. Calibration of light-field camera geometry via robust fitting
US9456141B2 (en) 2013-02-22 2016-09-27 Lytro, Inc. Light-field based autofocus
CN106161950A (en) * 2016-08-09 2016-11-23 乐视控股(北京)有限公司 Focusing position fixing means, device and terminal
US9607424B2 (en) 2012-06-26 2017-03-28 Lytro, Inc. Depth-assigned content for depth-enhanced pictures
US10092183B2 (en) 2014-08-31 2018-10-09 Dr. John Berestka Systems and methods for analyzing the eye
US10129524B2 (en) 2012-06-26 2018-11-13 Google Llc Depth-assigned content for depth-enhanced virtual reality images
US10205896B2 (en) 2015-07-24 2019-02-12 Google Llc Automatic lens flare detection and correction for light-field images
US10275892B2 (en) 2016-06-09 2019-04-30 Google Llc Multi-view scene segmentation and propagation
US10275898B1 (en) 2015-04-15 2019-04-30 Google Llc Wedge-based light-field video capture
US10298834B2 (en) 2006-12-01 2019-05-21 Google Llc Video refocusing
US10334151B2 (en) 2013-04-22 2019-06-25 Google Llc Phase detection autofocus using subaperture images
US10341632B2 (en) 2015-04-15 2019-07-02 Google Llc. Spatial random access enabled video system with a three-dimensional viewing volume
US10354399B2 (en) 2017-05-25 2019-07-16 Google Llc Multi-view back-projection to a light-field
US10412373B2 (en) 2015-04-15 2019-09-10 Google Llc Image capture for virtual reality displays
US10419737B2 (en) 2015-04-15 2019-09-17 Google Llc Data structures and delivery methods for expediting virtual reality playback
US10440407B2 (en) 2017-05-09 2019-10-08 Google Llc Adaptive control for immersive experience delivery
US10444931B2 (en) 2017-05-09 2019-10-15 Google Llc Vantage generation and interactive playback
US10469873B2 (en) 2015-04-15 2019-11-05 Google Llc Encoding and decoding virtual reality video
US10474227B2 (en) 2017-05-09 2019-11-12 Google Llc Generation of virtual reality with 6 degrees of freedom from limited viewer data
US10540818B2 (en) 2015-04-15 2020-01-21 Google Llc Stereo image generation and interactive playback
US10546424B2 (en) 2015-04-15 2020-01-28 Google Llc Layered content delivery for virtual and augmented reality experiences
US10545215B2 (en) 2017-09-13 2020-01-28 Google Llc 4D camera tracking and optical stabilization
US10552947B2 (en) 2012-06-26 2020-02-04 Google Llc Depth-based image blurring
US10567464B2 (en) 2015-04-15 2020-02-18 Google Llc Video compression with adaptive view-dependent lighting removal
US10565734B2 (en) 2015-04-15 2020-02-18 Google Llc Video capture, processing, calibration, computational fiber artifact removal, and light-field pipeline
US10594945B2 (en) 2017-04-03 2020-03-17 Google Llc Generating dolly zoom effect using light field image data
US10679361B2 (en) 2016-12-05 2020-06-09 Google Llc Multi-view rotoscope contour propagation
US10965862B2 (en) 2018-01-18 2021-03-30 Google Llc Multi-camera navigation interface
US20220070357A1 (en) * 2020-08-31 2022-03-03 Advanced Micro Devices, Inc. Instant auto-focus with distance estimation
US11328446B2 (en) 2015-04-15 2022-05-10 Google Llc Combining light-field data with active depth data for depth map generation
US20220311940A1 (en) * 2021-03-25 2022-09-29 Samsung Electronics Co., Ltd. Electronic apparatus including camera and control method thereof

Families Citing this family (20)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US7409149B2 (en) * 2005-11-03 2008-08-05 International Business Machines Corporation Methods for improved autofocus in digital imaging systems
JP4876550B2 (en) * 2005-11-25 2012-02-15 セイコーエプソン株式会社 Imaging apparatus, control method, and control program
US7593627B2 (en) * 2006-08-18 2009-09-22 Sony Ericsson Mobile Communications Ab Angle correction for camera
JP4552974B2 (en) * 2007-06-22 2010-09-29 カシオ計算機株式会社 Camera device, focus control method and program
SG150414A1 (en) * 2007-09-05 2009-03-30 Creative Tech Ltd Methods for processing a composite video image with feature indication
US7391442B1 (en) 2007-11-01 2008-06-24 International Business Machines Corporation Digital camera including a distortion correction system
US7615729B2 (en) * 2007-12-10 2009-11-10 Aptina Imaging Corporation Apparatus and method for resonant lens focusing
WO2010149763A1 (en) 2009-06-26 2010-12-29 Hasselblad A/S Camera focus correction method and system
US8525918B2 (en) 2011-04-20 2013-09-03 Htc Corporation Portable electronic devices and auto-focus control methods for cameras therein
CN103185948B (en) * 2011-12-31 2015-06-10 索尼爱立信移动通讯有限公司 Camera module, electronic device containing camera module and automatic focusing method
JP6029380B2 (en) 2012-08-14 2016-11-24 キヤノン株式会社 Image processing apparatus, imaging apparatus including image processing apparatus, image processing method, and program
US20140204083A1 (en) * 2013-01-23 2014-07-24 Brent Thomson Systems and methods for real-time distortion processing
TWI524211B (en) 2013-09-11 2016-03-01 萬國商業機器公司 Electronic apparatus and display angle adjustment method therewith
US10419658B1 (en) 2014-07-20 2019-09-17 Promanthan Brains LLC, Series Point only Camera optimizing for several directions of interest
US9628695B2 (en) 2014-12-29 2017-04-18 Intel Corporation Method and system of lens shift correction for a camera array
CN106534687A (en) * 2016-11-18 2017-03-22 广东欧珀移动通信有限公司 Control method and control device
CN108663803B (en) * 2017-03-30 2021-03-26 腾讯科技(深圳)有限公司 Virtual reality glasses, lens barrel adjusting method and device
JP2018028678A (en) * 2017-10-05 2018-02-22 株式会社ニコン Imaging apparatus
CN113031195A (en) * 2021-03-10 2021-06-25 江苏金视传奇科技有限公司 Infrared noninductive automatic focusing device
KR20220133630A (en) * 2021-03-25 2022-10-05 삼성전자주식회사 Electronic device comprising camera and method for controlling the same

Citations (11)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US5499074A (en) * 1990-11-09 1996-03-12 Nikon Corporation Autofocus camera
US5677760A (en) * 1994-08-23 1997-10-14 Olympus Optical Co., Ltd. Rangefinding device for use in a camera
US5764291A (en) * 1994-09-30 1998-06-09 Apple Computer, Inc. Apparatus and method for orientation-dependent camera exposure and focus setting optimization
US5900909A (en) * 1995-04-13 1999-05-04 Eastman Kodak Company Electronic still camera having automatic orientation sensing and image correction
US6148149A (en) * 1998-05-26 2000-11-14 Microsoft Corporation Automatic image rotation in digital cameras
US6222584B1 (en) * 1999-11-03 2001-04-24 Inventec Corporation Method of automatically rotating image storage data subject to image capture angle, and the related digital camera
US6262769B1 (en) * 1997-07-31 2001-07-17 Flashpoint Technology, Inc. Method and system for auto rotating a graphical user interface for managing portrait and landscape images in an image capture unit
US20040017506A1 (en) * 2002-07-26 2004-01-29 Livingston Kris R. Camera having camera orientation sensing capability
US6738095B2 (en) * 2002-09-11 2004-05-18 Eastman Kodak Company Orientation-sensitive electronic vertical shutter release lock
US6747690B2 (en) * 2000-07-11 2004-06-08 Phase One A/S Digital camera with integrated accelerometers
US7409149B2 (en) * 2005-11-03 2008-08-05 International Business Machines Corporation Methods for improved autofocus in digital imaging systems

Family Cites Families (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2763041B2 (en) * 1988-03-02 1998-06-11 株式会社ニコン Auto focus camera
US5239333A (en) * 1990-01-05 1993-08-24 Nikon Corporation Camera exposure calculation device dependent on type of scene to be photographed
US5495312A (en) * 1990-01-05 1996-02-27 Nikon Corporation Camera exposure calculation device dependent on type of scene to be photographed and focus condition detecting device dependent on camera attitude
US5311242A (en) * 1991-05-16 1994-05-10 Olympus Optical Co., Ltd. Autofocus camera and method of focus control therefor
JPH09329831A (en) * 1996-06-11 1997-12-22 Canon Inc Camera
US6965397B1 (en) * 1999-11-22 2005-11-15 Sportvision, Inc. Measuring camera attitude
JP2003222937A (en) * 2002-12-26 2003-08-08 Olympus Optical Co Ltd Camera having range-finding device
JP2007258989A (en) * 2006-03-22 2007-10-04 Eastman Kodak Co Digital camera, composition corrector, and composition correcting method


Cited By (79)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US9807287B2 (en) 2004-10-01 2017-10-31 Board Of Trustees Of The Leland Stanford Junior University Variable imaging arrangements and methods therefor
US8717489B2 (en) 2004-10-01 2014-05-06 The Board Of Trustees Of The Leland Stanford Junior University Variable imaging arrangements and methods therefor
US8648958B2 (en) 2004-10-01 2014-02-11 The Board Of Trustees Of The Leland Stanford Junior University Variable imaging arrangements and methods therefor
US9100557B2 (en) 2004-10-01 2015-08-04 The Board Of Trustees Of The Leland Stanford Junior University Variable imaging arrangements and methods therefor
US20110134309A1 (en) * 2006-03-01 2011-06-09 Asia Optical Co., Inc. Method to Evaluate Contrast Value for an Image and Applications Thereof
US8270755B2 (en) * 2006-03-01 2012-09-18 Asia Optical Co., Inc. Method to evaluate contrast value for an image and applications thereof
US9530195B2 (en) 2006-12-01 2016-12-27 Lytro, Inc. Interactive refocusing of electronic images
US10298834B2 (en) 2006-12-01 2019-05-21 Google Llc Video refocusing
US8559705B2 (en) 2006-12-01 2013-10-15 Lytro, Inc. Interactive refocusing of electronic images
US20080131019A1 (en) * 2006-12-01 2008-06-05 Yi-Ren Ng Interactive Refocusing of Electronic Images
US8446516B2 (en) 2008-11-25 2013-05-21 Lytro, Inc. Generating and outputting video data from refocusable light field video data
US8760566B2 (en) 2008-11-25 2014-06-24 Lytro, Inc. Video refocusing
US8279325B2 (en) 2008-11-25 2012-10-02 Lytro, Inc. System and method for acquiring, editing, generating and outputting video data
US20100128145A1 (en) * 2008-11-25 2010-05-27 Colvin Pitts System of and Method for Video Refocusing
US8570426B2 (en) 2008-11-25 2013-10-29 Lytro, Inc. System of and method for video refocusing
US8614764B2 (en) 2008-11-25 2013-12-24 Lytro, Inc. Acquiring, editing, generating and outputting video data
US20100129048A1 (en) * 2008-11-25 2010-05-27 Colvin Pitts System and Method for Acquiring, Editing, Generating and Outputting Video Data
US20100141802A1 (en) * 2008-12-08 2010-06-10 Timothy Knight Light Field Data Acquisition Devices, and Methods of Using and Manufacturing Same
US8724014B2 (en) 2008-12-08 2014-05-13 Lytro, Inc. Light field data acquisition
US8976288B2 (en) 2008-12-08 2015-03-10 Lytro, Inc. Light field data acquisition
US8289440B2 (en) 2008-12-08 2012-10-16 Lytro, Inc. Light field data acquisition devices, and methods of using and manufacturing same
US9467607B2 (en) 2008-12-08 2016-10-11 Lytro, Inc. Light field data acquisition
WO2010077625A1 (en) * 2008-12-08 2010-07-08 Refocus Imaging, Inc. Light field data acquisition devices, and methods of using and manufacturing same
US20100182397A1 (en) * 2009-01-16 2010-07-22 Eun Jeong Choi Connector panel for view camera capable of docking digital single lens reflex camera
US20110234841A1 (en) * 2009-04-18 2011-09-29 Lytro, Inc. Storage and Transmission of Pictures Including Multiple Frames
US8908058B2 (en) 2009-04-18 2014-12-09 Lytro, Inc. Storage and transmission of pictures including multiple frames
US20100265385A1 (en) * 2009-04-18 2010-10-21 Knight Timothy J Light Field Camera Image, File and Configuration Data, and Methods of Using, Storing and Communicating Same
US8749620B1 (en) 2010-02-20 2014-06-10 Lytro, Inc. 3D light field cameras, images and files, and methods of using, operating, processing and viewing same
US8768102B1 (en) 2011-02-09 2014-07-01 Lytro, Inc. Downsampling light field images
US9419049B2 (en) 2011-08-01 2016-08-16 Lytro, Inc. Optical assembly including plenoptic microlens array
US9305956B2 (en) 2011-08-01 2016-04-05 Lytro, Inc. Optical assembly including plenoptic microlens array
US9184199B2 (en) 2011-08-01 2015-11-10 Lytro, Inc. Optical assembly including plenoptic microlens array
US9386288B2 (en) 2012-02-28 2016-07-05 Lytro, Inc. Compensating for sensor saturation and microlens modulation during light-field image processing
US8948545B2 (en) 2012-02-28 2015-02-03 Lytro, Inc. Compensating for sensor saturation and microlens modulation during light-field image processing
US8811769B1 (en) 2012-02-28 2014-08-19 Lytro, Inc. Extended depth of field and variable center of perspective in light-field processing
US9172853B2 (en) 2012-02-28 2015-10-27 Lytro, Inc. Microlens array architecture for avoiding ghosting in projected images
US8995785B2 (en) 2012-02-28 2015-03-31 Lytro, Inc. Light-field processing and analysis, camera control, and user interfaces and interaction on light-field capture devices
US8831377B2 (en) 2012-02-28 2014-09-09 Lytro, Inc. Compensating for variation in microlens position during light-field image processing
US9420276B2 (en) 2012-02-28 2016-08-16 Lytro, Inc. Calibration of light-field camera geometry via robust fitting
US8971625B2 (en) 2012-02-28 2015-03-03 Lytro, Inc. Generating dolly zoom effect using light field image data
US10552947B2 (en) 2012-06-26 2020-02-04 Google Llc Depth-based image blurring
US10129524B2 (en) 2012-06-26 2018-11-13 Google Llc Depth-assigned content for depth-enhanced virtual reality images
US9607424B2 (en) 2012-06-26 2017-03-28 Lytro, Inc. Depth-assigned content for depth-enhanced pictures
US8997021B2 (en) 2012-11-06 2015-03-31 Lytro, Inc. Parallax and/or three-dimensional effects for thumbnail image displays
US9001226B1 (en) 2012-12-04 2015-04-07 Lytro, Inc. Capturing and relighting images using multiple devices
US9456141B2 (en) 2013-02-22 2016-09-27 Lytro, Inc. Light-field based autofocus
US10334151B2 (en) 2013-04-22 2019-06-25 Google Llc Phase detection autofocus using subaperture images
US9305375B2 (en) 2014-03-25 2016-04-05 Lytro, Inc. High-quality post-rendering depth blur
US11911109B2 (en) 2014-08-31 2024-02-27 Dr. John Berestka Methods for analyzing the eye
US10092183B2 (en) 2014-08-31 2018-10-09 Dr. John Berestka Systems and methods for analyzing the eye
US11452447B2 (en) 2014-08-31 2022-09-27 John Berestka Methods for analyzing the eye
US10687703B2 (en) 2014-08-31 2020-06-23 John Berestka Methods for analyzing the eye
US9729775B2 (en) 2014-12-26 2017-08-08 Xiaomi Inc. Auto-focusing method and auto-focusing device
CN104469167A (en) * 2014-12-26 2015-03-25 小米科技有限责任公司 Automatic focusing method and device
US10567464B2 (en) 2015-04-15 2020-02-18 Google Llc Video compression with adaptive view-dependent lighting removal
US11328446B2 (en) 2015-04-15 2022-05-10 Google Llc Combining light-field data with active depth data for depth map generation
US10412373B2 (en) 2015-04-15 2019-09-10 Google Llc Image capture for virtual reality displays
US10419737B2 (en) 2015-04-15 2019-09-17 Google Llc Data structures and delivery methods for expediting virtual reality playback
US10565734B2 (en) 2015-04-15 2020-02-18 Google Llc Video capture, processing, calibration, computational fiber artifact removal, and light-field pipeline
US10275898B1 (en) 2015-04-15 2019-04-30 Google Llc Wedge-based light-field video capture
US10469873B2 (en) 2015-04-15 2019-11-05 Google Llc Encoding and decoding virtual reality video
US10341632B2 (en) 2015-04-15 2019-07-02 Google Llc. Spatial random access enabled video system with a three-dimensional viewing volume
US10540818B2 (en) 2015-04-15 2020-01-21 Google Llc Stereo image generation and interactive playback
US10546424B2 (en) 2015-04-15 2020-01-28 Google Llc Layered content delivery for virtual and augmented reality experiences
US10205896B2 (en) 2015-07-24 2019-02-12 Google Llc Automatic lens flare detection and correction for light-field images
US10275892B2 (en) 2016-06-09 2019-04-30 Google Llc Multi-view scene segmentation and propagation
CN106161950A (en) * 2016-08-09 2016-11-23 乐视控股(北京)有限公司 Focusing position fixing means, device and terminal
US10679361B2 (en) 2016-12-05 2020-06-09 Google Llc Multi-view rotoscope contour propagation
US10594945B2 (en) 2017-04-03 2020-03-17 Google Llc Generating dolly zoom effect using light field image data
US10440407B2 (en) 2017-05-09 2019-10-08 Google Llc Adaptive control for immersive experience delivery
US10444931B2 (en) 2017-05-09 2019-10-15 Google Llc Vantage generation and interactive playback
US10474227B2 (en) 2017-05-09 2019-11-12 Google Llc Generation of virtual reality with 6 degrees of freedom from limited viewer data
US10354399B2 (en) 2017-05-25 2019-07-16 Google Llc Multi-view back-projection to a light-field
US10545215B2 (en) 2017-09-13 2020-01-28 Google Llc 4D camera tracking and optical stabilization
US10965862B2 (en) 2018-01-18 2021-03-30 Google Llc Multi-camera navigation interface
US20220070357A1 (en) * 2020-08-31 2022-03-03 Advanced Micro Devices, Inc. Instant auto-focus with distance estimation
US11902658B2 (en) * 2020-08-31 2024-02-13 Advanced Micro Devices, Inc. Instant auto-focus with distance estimation
US20220311940A1 (en) * 2021-03-25 2022-09-29 Samsung Electronics Co., Ltd. Electronic apparatus including camera and control method thereof
US11743585B2 (en) * 2021-03-25 2023-08-29 Samsung Electronics Co., Ltd. Electronic apparatus including camera and control method thereof

Also Published As

Publication number Publication date
US7409149B2 (en) 2008-08-05
US20070098380A1 (en) 2007-05-03
JP2007128077A (en) 2007-05-24
JP5296307B2 (en) 2013-09-25

Similar Documents

Publication Title
US7409149B2 (en) Methods for improved autofocus in digital imaging systems
JP4769553B2 (en) Imaging device
US9781334B2 (en) Control method, camera device and electronic equipment
JP6659130B2 (en) Image processing apparatus, imaging apparatus, image processing method, and program
US20060062433A1 (en) Image sensing apparatus and control method thereof
EP1936435A1 (en) Optical apparatus with unit for correcting blur of captured image caused by displacement of optical apparatus in optical-axis direction
US7627240B2 (en) Optical device with improved autofocus performance and method related thereto
JP5263310B2 (en) Image generation apparatus, imaging apparatus, and image generation method
WO2016002355A1 (en) Image capturing device and image capturing method
US10901174B2 (en) Camera for limiting shifting of focus adjustment optical system
JP2019033344A (en) Image processing apparatus, image processing method, and program
JP6506036B2 (en) Imaging device
JP4710983B2 (en) Image composition device, imaging device, and image composition method
JP5495598B2 (en) Imaging apparatus and control method thereof
JP5693664B2 (en) IMAGING DEVICE, IMAGING DEVICE CONTROL METHOD, AND PROGRAM
JP4747673B2 (en) Electronic camera and image processing program
US20060187335A1 (en) Image pickup apparatus
JP2011217334A (en) Imaging apparatus and method of controlling the same
JP5970289B2 (en) Imaging apparatus and rotation drive control method for rotating optical element
WO2014097789A1 (en) Image processing device, image processing method, and recording medium
JP2009036987A (en) Photographing device and control method for photographing device
JP2006229697A (en) Image pickup device
JP2013128165A (en) Imaging apparatus
JP2011211260A (en) Panoramic image pickup device and panoramic image synthesizing method thereof
JP2009048126A (en) Photographing equipment and method of controlling same

Legal Events

Code: STCB
Title: Information on status: application discontinuation
Free format text: ABANDONED -- FAILURE TO PAY ISSUE FEE