US20050212955A1 - System and method for analyzing a digital image - Google Patents

System and method for analyzing a digital image

Info

Publication number
US20050212955A1
Authority
US
United States
Prior art keywords
image data
imaging device
determining
preselected
strobe
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Abandoned
Application number
US11/054,291
Inventor
Murray Craig
Gregory Hofer
Susan Manson
Amy Battles
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Hewlett Packard Development Co LP
Original Assignee
Hewlett Packard Development Co LP
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Hewlett Packard Development Co LP filed Critical Hewlett Packard Development Co LP
Priority to US11/054,291 (published as US20050212955A1)
Assigned to HEWLETT-PACKARD DEVELOPMENT COMPANY, L.P. reassignment HEWLETT-PACKARD DEVELOPMENT COMPANY, L.P. ASSIGNMENT OF ASSIGNORS INTEREST (SEE DOCUMENT FOR DETAILS). Assignors: BATTLES, AMY E., HOFER, GREGORY, CRAIG, MURRAY D., MANSON, SUSAN E.
Publication of US20050212955A1
Priority to US11/412,155 (published as US20060239674A1)
Priority to US12/684,505 (issued as US8780232B2)

Classifications

    • H: ELECTRICITY
    • H04: ELECTRIC COMMUNICATION TECHNIQUE
    • H04N: PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N23/00: Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N23/60: Control of cameras or camera modules
    • H04N23/64: Computer-aided capture of images, e.g. transfer from script file into camera, check of taken image quality, advice or proposal for image composition or decision on when to take image
    • H: ELECTRICITY
    • H04: ELECTRIC COMMUNICATION TECHNIQUE
    • H04N: PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N23/00: Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N23/60: Control of cameras or camera modules
    • H04N23/63: Control of cameras or camera modules by using electronic viewfinders
    • H04N23/633: Control of cameras or camera modules by using electronic viewfinders for displaying additional information relating to control or operation of the camera
    • H: ELECTRICITY
    • H04: ELECTRIC COMMUNICATION TECHNIQUE
    • H04N: PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N25/00: Circuitry of solid-state image sensors [SSIS]; Control thereof

Definitions

  • a digital camera captures an image
  • the image is stored electronically in a memory element associated with the camera and is available for immediate viewing. For example, it is common to capture an image using a digital camera and then immediately display the captured image on a display screen associated with the digital camera. This ability to immediately view the image is commonly referred to as “instant review.”
  • the ability to immediately review the captured image allows the user to immediately decide whether the image is satisfactory and worth keeping. The image may then be printed at a later time.
  • a method of analyzing images captured using an imaging device is provided herein.
  • the analysis provides suggestions for changing a parameter of the imaging device during subsequent image capture.
  • FIG. 1 is a block diagram illustrating an embodiment of a digital camera.
  • FIG. 2 is a graphical illustration of an embodiment of an image file.
  • FIG. 3 is a flow chart describing the operation of an embodiment of the image analysis and improvement logic of FIG. 1 .
  • FIG. 4 is a flowchart describing an embodiment of detecting over exposure errors and suggesting corrections thereto.
  • FIG. 5 is a flowchart describing an embodiment of detecting under exposure errors and suggesting corrections thereto.
  • FIG. 6 is a flowchart describing an embodiment of analyzing an image that is over exposed and that was captured using time value mode.
  • FIG. 7 is a flowchart describing an embodiment of analyzing an image for exposure wherein the image was captured using bracketing.
  • FIG. 8 is a flowchart describing embodiments for analyzing an image for blur when the handheld limit has been exceeded and the strobe was not activated.
  • FIG. 9 is a flowchart describing embodiments for analyzing an image for blur when the image was captured using the burst mode, the handheld limit was exceeded, and the strobe was not activated.
  • FIG. 10 is a flowchart describing an embodiment for analyzing an image for white balance errors.
  • Devices and methods for analyzing images are described herein.
  • the devices and methods described herein analyze image data that is representative of images.
  • the devices and methods for analyzing images may be implemented in hardware, software, firmware, or a combination thereof.
  • the system and method for analyzing images are implemented using a combination of hardware, software or firmware that is stored in a memory and that is executable by a suitable instruction execution system.
  • the device is a digital camera wherein software stored on hardware in the camera analyzes image data or otherwise instructs the digital camera to analyze image data.
  • the hardware portion of the system and method for analyzing a captured image can be implemented with any or a combination of the following technologies, which are all well known in the art: a discrete logic circuit(s) having logic gates for implementing logic functions upon data signals, an application specific integrated circuit (ASIC) having appropriate combinational logic gates, a programmable gate array(s) (PGA), a field programmable gate array (FPGA), etc.
  • ASIC: application specific integrated circuit
  • PGA: programmable gate array
  • FPGA: field programmable gate array
  • the software portion of the system and method for analyzing a captured image can be stored in one or more memory elements and executed by a suitable general purpose or application specific processor.
  • the software for analyzing images which comprises an ordered listing of executable instructions for implementing logical functions, can be embodied in any computer-readable medium for use by or in connection with an instruction execution system, apparatus, or device, such as a computer-based system, processor-containing system, or other system that can fetch the instructions from the instruction execution system, apparatus, or device and execute the instructions.
  • a “computer-readable medium” can be any means that can contain, store, communicate, propagate, or transport the program for use by or in connection with the instruction execution system, apparatus, or device.
  • FIG. 1 is a block diagram illustrating an embodiment of a digital camera 100 , which is sometimes referred to herein simply as a camera 100 .
  • the digital camera 100 includes an application specific integrated circuit (ASIC) 102 that executes the image analysis logic 150 described herein.
  • the image analysis logic 150 can be software that is stored in memory and executed by the ASIC 102 .
  • the image analysis logic 150 may be implemented in firmware, which can be stored and executed in the ASIC 102.
  • the digital camera 100 may include additional processors, digital signal processors (DSPs) and ASICs. It should be noted that the ASIC 102 may include other elements, which have been omitted. As described in greater detail below, the ASIC 102 controls many functions of the digital camera 100 .
  • DSPs: digital signal processors
  • the camera 100 includes an image sensor 104 .
  • the image sensor 104 may comprise a charge coupled device (CCD) or an array of complementary metal oxide semiconductor (CMOS) sensors, which are both arrays of light sensors. Both the CCD and the CMOS sensor include a two-dimensional array of photosensors, which are sometimes referred to as pixels.
  • the pixels convert specific wavelengths or colors of light intensities to voltages that are representative of the light intensities. In one embodiment, higher pixel values or voltages are representative of higher intensities of light and lower pixel values are representative of lower intensities of light.
  • the image sensor 104 captures an image of a subject by converting incident light into an analog signal.
  • the analog signal is transmitted via a connection 109 to an analog front end (AFE) processor 111 .
  • the analog front end processor 111 typically includes an analog-to-digital converter for converting the analog signal received from the image sensor 104 into a digital signal.
  • the analog front end processor 111 provides this digital signal as image data via a connection 112 to the ASIC 102 for image processing.
  • the ASIC 102 is coupled to one or more motor drivers 119 via a connection 118 .
  • the motor drivers 119 control the operation of various parameters of the lens 122 via a connection 121 .
  • lens controls such as zoom, focus, aperture and shutter operations can be controlled by the motor drivers 119 .
  • a connection 123 between the lens 122 and the image sensor 104 is shown as a dotted line to illustrate the operation of the lens 122 focusing on a subject and communicating light to the image sensor 104 , which captures the image provided by the lens 122 .
  • the ASIC 102 also sends display data via a connection 124 to a display controller 126 .
  • the display controller may be, for example, a national television system committee (NTSC)/phase alternate line (PAL) encoder, although, depending on the application, other standards for presenting display data may be used.
  • the display controller 126 converts the display data from the ASIC 102 into a signal that can be forwarded via a connection 127 to an image display 128 .
  • the image display 128, which, as an example, may be a liquid crystal display (LCD) or other display, displays the captured image to the user of the digital camera 100.
  • the image display 128 is typically a color display located on the digital camera 100 .
  • the image shown to a user on the image display 128 may be shown before the image is captured and processed, in what is referred to as “live view” mode, or after the image is captured and processed, in what is referred to as “instant review” mode.
  • a previously captured image may be displayed in what is referred to as “review” or “playback” mode.
  • the instant review mode is typically used to display the captured image to the user immediately after the image is captured and the playback mode is typically used to display the captured image to the user sometime after the image has been captured and stored in memory.
  • the instant review mode allows the user of the camera 100 to immediately view the captured image on the display 128 .
  • because the image display 128 is typically small, only gross features, or characteristics, of the image can be visually observed.
  • the image display 128 may not accurately reproduce color, tint, brightness, etc., which may further make it difficult for a user to determine the quality of the captured image.
  • the difficulty in visually determining the quality of the captured image leads to the possibility of saving an image that may include deficiencies that, if visually detected, would likely cause the user to discard the image and attempt to capture another image having better quality.
  • the image analysis logic 150 dynamically analyzes one or more characteristics of the captured image.
  • the analysis logic 150 then presents the user, via the image display 128 and a user interface, an analysis of the captured image.
  • An exemplary dynamic analysis of the data for each pixel in a captured image is described below with reference to FIG. 2 .
  • information associated with each pixel may be analyzed to determine whether a significant number of the pixels forming the image are either black or white.
  • a predominance of white pixels may be indicative of overexposure and a predominance of black pixels may be indicative of underexposure.
  • pixels in an image are examined to determine whether sharp transitions exist between pixels. For example, a black pixel adjoining a white pixel may indicate that the image is in focus, while a black pixel separated from a white pixel by a number of gray pixels may indicate that the image is out of focus.
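The transition test described above can be sketched as follows. This is a hypothetical illustration, not the patent's implementation: the function names and the threshold of 128 are assumptions, and a row of grayscale intensities stands in for a slice of the pixel array.

```python
def max_transition(row):
    """Return the largest intensity change between adjacent pixels in a row."""
    return max(abs(b - a) for a, b in zip(row, row[1:]))

def looks_in_focus(row, threshold=128):
    """Treat the row as in focus if any adjacent-pixel step exceeds threshold."""
    return max_transition(row) >= threshold

sharp = [0, 0, 255, 255]        # abrupt black-to-white edge
soft = [0, 64, 128, 192, 255]   # the same step spread over gray pixels

print(looks_in_focus(sharp))  # True
print(looks_in_focus(soft))   # False
```

A real detector would scan many rows and columns, but the principle matches the text: a step completed between adjacent pixels suggests focus, a step smeared across intermediate gray values suggests blur.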
  • White balance is a characteristic of the image that generally refers to the color balance in the image to ensure that white portions of the image appear white.
  • An image in which each pixel is a different shade of the same color may indicate an image in which the white balance is improperly adjusted.
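One way to operationalize the white balance observation above is to compare per-channel means: if every pixel is a shade of the same color, one channel's mean will dominate. This is a sketch under assumed names and an assumed spread threshold, not the patent's method.

```python
def channel_means(pixels):
    """pixels is a list of (R, G, B) tuples with components in 0..255."""
    n = len(pixels)
    r = sum(p[0] for p in pixels) / n
    g = sum(p[1] for p in pixels) / n
    b = sum(p[2] for p in pixels) / n
    return r, g, b

def white_balance_suspect(pixels, max_spread=60):
    """Flag the image when one channel mean dominates the others."""
    r, g, b = channel_means(pixels)
    return max(r, g, b) - min(r, g, b) > max_spread

neutral = [(120, 120, 120), (200, 200, 200)]
reddish = [(200, 80, 80), (180, 60, 60)]
print(white_balance_suspect(neutral))  # False
print(white_balance_suspect(reddish))  # True
```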
  • an image improvement logic 160 may be provided to present to the user a recommendation in the form of instructions presented on the image display 128 on ways in which to possibly improve a subsequent image.
  • the image improvement logic may suggest adjusting a condition under which the image was captured or adjusting a setting or parameter used to capture the image.
  • the image analysis logic 150 analyzes the captured image and, optionally, the camera settings used to capture the image, and determines a value of one or more characteristics of the captured image. For example, to determine whether the exposure of the image is satisfactory, the image analysis logic 150 may indicate that the image is overexposed if the number of white pixels in the image exceeds a predefined threshold.
  • the image improvement logic 160 may determine whether a condition used to capture the image should be adjusted, or whether a camera setting should be adjusted, to improve a subsequent image. For example, if the image analysis logic 150 determines that the image is underexposed, the image improvement logic 160 may determine that a subsequent image may be improved by activating the camera flash for a subsequent image.
  • the analysis can be used by the image improvement logic 160 to suggest adjustments to the settings to improve a subsequent image. These suggested adjustments to the camera settings or parameters can be presented to the user on a help screen via the image display 128 , or, in an alternative configuration, can be automatically changed for a subsequent image.
  • image analysis logic 150 and the image improvement logic 160 may be a single unit. For example, they may exist in the same firmware or be a single computer program. They have been split into separate functions herein solely for illustration purposes.
  • the ASIC 102 is coupled to a microcontroller 161 via a connection 154 .
  • the microcontroller 161 can be a specific or general purpose microprocessor that controls the various operating aspects and parameters of the digital camera 100 .
  • the microcontroller 161 may be coupled to a user interface 164 via a connection 162 .
  • the user interface 164 may include, for example but not limited to, a keypad, one or more buttons, a mouse or pointing device, a shutter release, and any other buttons or switches that allow the user of the digital camera 100 to input commands.
  • the ASIC 102 is also coupled to various memory modules, which are collectively referred to as memory 136 .
  • the memory 136 may include memory internal to the digital camera 100 and/or memory external to the digital camera 100 .
  • the internal memory may, for example, comprise flash memory and the external memory may comprise, for example, a removable compact flash memory card.
  • the various memory elements may comprise volatile, and/or non-volatile memory, such as, for example but not limited to, synchronous dynamic random access memory (SDRAM) 141 , illustrated as a portion of the memory 136 and flash memory.
  • SDRAM: synchronous dynamic random access memory
  • the memory elements may comprise memory distributed over various elements within the digital camera 100 .
  • the memory 136 may also store the image analysis logic 150, the image improvement logic 160, the settings file 155, and the various software and firmware elements and components (not shown) that allow the digital camera 100 to perform its various functions.
  • the memory also stores an image file 135 , which represents a captured image.
  • the software code i.e., the image analysis logic 150
  • the settings file 155 comprises the various settings used when capturing an image.
  • the exposure time, aperture setting (f-stop), shutter speed, white balance, flash on or off, focus, contrast, saturation, sharpness, ISO speed, exposure compensation, color, resolution and compression, and other camera settings may be stored in the settings file 155.
  • the settings file 155 may be accessed by the image analysis logic 150 to analyze a captured image by, in one example, determining the camera settings used to capture the image that is under analysis.
  • the ASIC 102 executes the image analysis logic 150 so that after an image is captured by the image sensor 104 , the image analysis logic 150 analyzes various characteristics of the captured image. These characteristics may include characteristics of the captured image, or alternatively, may include the settings used to capture the image. Further, if the image improvement logic 160 determines that the image could be improved by changing one or more of the conditions under which the image was captured, or by changing one or more camera settings, then the image improvement logic 160 can either suggest these changes via the user interface 164 and the image display 128 , or can automatically change the settings and prepare the camera for a subsequent image. Embodiments of the analysis are described in greater detail below.
  • FIG. 2 is a graphical illustration of an image file 135 .
  • the image file 135 includes a header portion 202 and a pixel array 208 .
  • the header portion or other portion may include data, sometimes referred to herein as metadata, that indicates settings of the camera or conditions in which the image was captured.
  • the metadata may be analyzed to determine whether improvements to subsequent images may be made.
  • the pixel array 208 comprises a plurality of pixels or pixel values, exemplary ones of which are illustrated using reference numerals 204 , 206 and 212 . Each pixel in the pixel array 208 represents a portion of the captured image represented by the image file 135 .
  • An array size can be, for example, 2272 pixels wide by 1712 pixels high.
  • the image file 135 can also be represented as a table of values for each pixel and can be stored, for example, in the memory 136 of FIG. 1 .
  • each pixel has an associated red (R), green (G), and blue (B) value.
  • the value for each R, G and B component can be, for example, a value between 0 and 255, where the value of each R, G and B component represents the color that the pixel has captured. For example, if pixel 204 has R, G and B values of 0, 0 and 0, respectively (or close to 0, 0, 0), the pixel 204 represents the color black, or is close to black.
  • a respective value of 255 (or close to 255) for each R, G and B component represents the color white, or close to white.
  • R, G and B values between 0 and 255 represent a range of colors between black and white.
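The per-pixel classification described above can be written as a small helper. Since the text only says "close to" 0,0,0 and "close to" 255,255,255, the tolerance of 16 here is an assumption for illustration.

```python
def classify_pixel(r, g, b, tolerance=16):
    """Label a pixel black, white, or other based on its R, G, B values."""
    if all(v <= tolerance for v in (r, g, b)):
        return "black"
    if all(v >= 255 - tolerance for v in (r, g, b)):
        return "white"
    return "other"

print(classify_pixel(0, 0, 0))        # black
print(classify_pixel(250, 252, 255))  # white
print(classify_pixel(128, 64, 200))   # other
```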
  • the data for each pixel in the image file 135 can be analyzed by the image analysis logic 150 to determine characteristics of the image. For example, characteristics including, but not limited to, the exposure, focus or the white balance of the captured image can be analyzed. A predominance of white pixels may be indicative of overexposure and a predominance of black pixels may be indicative of underexposure. To determine whether an image is in focus, pixels in an image are analyzed to determine whether sharp transitions exist between pixels. For example, a black pixel adjoining a white pixel may indicate that the image is in focus, while a black pixel separated from a white pixel by a number of gray pixels may indicate that the image is out of focus. An image in which each pixel is a different shade of the same color may indicate a problem with the white balance of the image. An example of determining the exposure will be described below with respect to FIG. 3 .
  • FIG. 3 is a flow chart 300 describing the operation of an embodiment of the image analysis logic 150 and the image improvement logic 160 of FIG. 1 .
  • Any process descriptions or blocks in the flow chart to follow should be understood as representing modules, segments or portions of code which include one or more executable instructions for implementing specific logical functions or steps in the process, and alternative implementations are included within the scope of the preferred embodiment.
  • functions may be executed out of order from that shown or discussed, including substantially concurrently or in reverse order, depending on the functionality involved, as would be understood by those reasonably skilled in the art of the present invention.
  • the image sensor 104 of FIG. 1 captures an image.
  • the image is stored in the memory 136 as image file 135 .
  • the image represented by the image data is displayed to the user of the digital camera 100 via the image display 128 of FIG. 1 during the “instant review” mode.
  • the instant review mode affords the user the opportunity to view the captured image subsequent to capture.
  • In decision block 306, the user determines whether he or she wants to view the settings with which the image was captured. If the user wants to view the settings, the settings are displayed to the user on the image display 128, as indicated in block 308. If the user does not want to view the settings, then, in decision block 312, it is determined whether the user wants the image analysis logic 150 to analyze the image. If the user does not want the image to be analyzed, then, in block 314, the image can be saved or discarded. Alternatively, the image analysis logic 150 can be invoked automatically without user intervention.
  • the image analysis logic 150 analyzes the data within the image file 135 .
  • the data is analyzed to determine various characteristics of the captured image.
  • the following example will use exposure as the characteristic that is analyzed by the image analysis logic 150 .
  • other characteristics, such as focus and white balance, can be analyzed. Analysis of several of these other characteristics will be described in greater detail below.
  • when analyzing exposure, the image analysis logic 150 performs a pixel-by-pixel analysis to determine whether the image includes a predominance of either black or white pixels. It should be noted that rather than analyzing all the pixels constituting the image, a sample of the pixels may be analyzed. In this example, the data associated with each pixel in the image file 135 is analyzed to determine whether a pixel is a black pixel or a white pixel. Each pixel is analyzed to determine its corresponding R, G and B values. For example, if the R, G and B values for the pixel 204 are all zeros, the pixel is considered a black pixel.
  • Each pixel in the pixel array 208 is analyzed in this manner to determine the number of black or white pixels in the pixel array 208 for this image file.
  • a determination in block 306 that a substantial portion of the pixels in the array 208 are black indicates that the image is likely underexposed.
  • a determination that many of the pixels in the array 208 are white indicates that the image is likely overexposed.
  • the image may be of an all white or an all black subject, in which case the user may choose to disregard the analysis.
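The exposure analysis described above, counting near-black and near-white pixels and flagging a predominance of either, can be sketched as below. The tolerance and the 50% cutoff are assumptions; the patent does not give numeric thresholds.

```python
def analyze_exposure(pixels, tolerance=16, fraction=0.5):
    """pixels: list of (R, G, B) tuples. Returns a verdict string."""
    black = sum(1 for r, g, b in pixels if max(r, g, b) <= tolerance)
    white = sum(1 for r, g, b in pixels if min(r, g, b) >= 255 - tolerance)
    n = len(pixels)
    if black / n > fraction:
        return "likely underexposed"
    if white / n > fraction:
        return "likely overexposed"
    return "exposure acceptable"

dark = [(0, 0, 0)] * 9 + [(200, 200, 200)]
print(analyze_exposure(dark))  # likely underexposed
```

As the text notes, a mostly black or mostly white subject would trigger the same verdict, so the result is a suggestion the user may disregard.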
  • the data in the image file 135 can be analyzed in combination with other data available either in the image file 135 or from the settings file 155 in the camera 100 .
  • additional data, sometimes referred to as metadata, saved in the header 202 of the image file 135 can be analyzed in conjunction with the information from each pixel in the array 208.
  • This information might include, for example, the ISO setting and the aperture setting (f-stop) used to capture the image.
  • These data items can be used in conjunction with the pixel data above to develop additional information regarding the characteristic of the analyzed image. Analysis of the settings will be described in greater detail below.
  • the image analysis logic 150 can also analyze the camera settings used to capture the image and use those settings when analyzing the data in the image file 135 to develop additional data regarding the image file 135 .
  • the image analysis logic 150 can access the settings file 155 in the memory 136 of FIG. 1 to determine, for example, whether the flash was enabled, or to determine the position of the lens when the image was captured. In this manner, the image analysis logic 150 can gather a range of information relating to the captured image to perform an analysis on the captured image file 135 to determine whether the captured image meets certain criteria.
  • the image analysis logic 150 can access the settings file 155 to determine whether the flash was active when the image was captured. If the image analysis logic 150 determines that the flash was turned off, the image analysis logic 150 may communicate with the image improvement logic 160 to recommend that the user activate the flash so that a subsequent image may have less likelihood of being underexposed. It should be noted that the settings file 155 may be appended to the image file 135 .
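Combining the pixel verdict with the settings file, as described above, might look like the following sketch. The verdict labels, the settings dictionary, and its field names are hypothetical; the patent does not specify a metadata schema.

```python
def suggest_improvement(verdict, settings):
    """settings: dict of capture metadata, e.g. {'flash': False, 'iso': 100}."""
    if verdict == "likely underexposed" and not settings.get("flash", False):
        return "Activate the flash for the next shot."
    if verdict == "likely overexposed" and settings.get("flash", False):
        return "Try disabling the flash for the next shot."
    return None  # no applicable suggestion

print(suggest_improvement("likely underexposed", {"flash": False}))
# Activate the flash for the next shot.
```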
  • In decision block 318, it is determined whether the image data analyzed in block 316 represents an acceptable image. This can be an objective determination based on criteria that the user enters into the camera 100 via the user interface 164 of FIG. 1, or criteria preset in the camera 100 at the time of manufacture. Alternatively, the determination of whether the image data represents an acceptable image can be a subjective determination based on user input. If the image is determined to be acceptable, then no further calculations or analysis are performed.
  • the image improvement logic 160 evaluates the settings used to capture the data in the image file 135 to determine whether a condition or setting can be changed to improve the image.
  • the image improvement logic 160 can also develop recommendations to present to the user of the camera to improve a subsequent image. For example, if the analysis in block 316 suggests that the image was underexposed, the image improvement logic 160 may develop “advice” to be presented to the user. In this example, as will be described below, the image improvement logic 160 may suggest that the user activate the flash to improve a subsequent image. This suggestion may be provided to the user via the image display 128 in conjunction with the user interface 164 .
  • an instant review settings and help screen is displayed to the user.
  • the instant review and help screen may include, for example, a thumbnail size display of the image, a display of the settings used to capture the image, an evaluation of the image and, if the user desires, suggestions on ways to improve the image.
  • the evaluation of the image may include, for example, a notification that characteristics, such as exposure, focus and color balance are satisfactory.
  • Suggestions on ways in which to improve the image may be communicated to the user via the image display 128 and may include, for example, changing a condition under which the image was captured, changing a setting with which the image was captured, or a combination of both changing a condition and a setting.
  • In decision block 326, the user determines whether another image is to be captured. If the user does not want to capture another image, the process ends. If, however, in decision block 326, the user wants to capture another image, then, in decision block 332, it is determined whether the user wants to manually change a parameter, such as a condition or setting, for the subsequent image, or whether the parameter is to be set automatically by the digital camera 100 of FIG. 1.
  • a parameter such as a condition or setting
  • If, in decision block 332, the user decides to manually change the setting, then, in block 334, the user changes the setting and the process returns to block 302, where another image is captured and the process repeats. If, however, in decision block 332, the user wants the digital camera 100 to automatically change the setting, then, in block 336, the settings used to capture the previous image are changed according to the new settings determined in block 324. The process then returns to block 302 to capture a subsequent image.
  • the data in the header 202 , FIG. 2 , of an image file 135 is sometimes referred to as metadata.
  • the metadata may include several characteristics related to the camera settings at the time the image was captured. These settings may be settings adjusted manually by the user or automatically by the camera.
  • in some embodiments, the image analysis logic 150 analyzes the metadata, and not the data representative of the pixels 208.
  • the following analysis describes some of the possible anomalies that may be detected by the image analysis logic 150; other embodiments may detect fewer or more anomalies.
  • the pixel values may be analyzed to determine whether a preselected number of pixel values are above or below preselected values.
  • the metadata may also be analyzed to determine the camera settings and ambient conditions at the time the image was captured, to determine whether the camera settings were proper. It is noted that the time of image capture refers to a time at which the digital camera generated image data.
  • FIG. 4 is a flowchart 200 describing an embodiment of detecting over exposure errors and suggesting corrections to overcome the errors.
  • the embodiment of the method set forth in FIG. 4 suggests corrections when the image is over exposed by more than a predetermined amount and the camera is in aperture priority mode.
  • Aperture priority mode enables a user to select an aperture setting during image capture.
  • the digital camera may have the above-described aperture priority mode and another mode wherein the digital camera selects an aperture to use during image capture.
  • In decision block 202, a decision is made as to whether the camera was in aperture priority mode during image capture.
  • aperture priority mode enables a user of the camera to manually select an aperture setting. Data stored in the metadata may indicate whether the camera was in aperture priority mode during image capture. If the camera is not in aperture priority mode, processing proceeds to block 204 where processing continues to the next analysis. More specifically, the suggestion ultimately offered by the flowchart 200 will not be applicable to the camera setting when the camera is not in aperture priority mode. If the camera is in aperture priority mode, the analysis continues to decision block 206 .
  • In decision block 206, a decision is made as to whether the image is over exposed by a predetermined amount. For example, the image may be analyzed to determine if the exposure is greater than a preselected stop value. In the embodiment of the flow chart 200, the decision block 206 determines whether the image is over exposed by more than two-thirds of a stop. It should be noted that other values of the stop may be used in the decision block 206. If the image is not over exposed by more than the preselected stop value, processing continues to block 204 as described above. If the image is over exposed by more than the preselected stop value, processing continues to decision block 208 as described below.
  • The preselected value corresponds to two-thirds of a stop. It should be noted that in other embodiments, determinations may be made as to whether the exposure is between preselected values and an indication may be provided as to the amount of overexposure. A suggestion that the image may be over exposed may be provided by also determining an exposure compensation value set during generation of the image data.
  • the decision block 208 determines whether the exposure compensation is between plus and minus 0.6. It is noted that an exposure compensation of a value other than zero is indicative of a manual user setting. In this embodiment, if the exposure compensation is not within the preselected values, processing proceeds to block 204 as described above. If the exposure compensation is within the preselected values, processing proceeds to block 210 .
  • Block 210 determines the number of stops the image is over exposed. For example, the pixel values may be analyzed to determine the amount of over exposure. Based on the foregoing, block 212 causes the camera to display information related to correcting the over exposure problem. In the embodiment of the flowchart 200 , the information informs the user of the stop value of the over exposure and suggests using a smaller aperture setting, which relates to a larger f-number. Block 212 may also suggest using an automatic mode, wherein the camera selects the aperture and possibly the exposure compensation.
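The decision sequence of flowchart 200 can be summarized in a short sketch. Only the decision structure and thresholds (two-thirds of a stop, plus and minus 0.6) come from the description above; the metadata field names and the function signature are illustrative assumptions.

```python
# Hypothetical sketch of flowchart 200 (over exposure in aperture priority
# mode).  Only the decision structure and thresholds come from the text;
# the metadata field names and signature are assumptions.

def analyze_overexposure(metadata, stops_over):
    """Return a suggestion string, or None when the analysis does not apply."""
    # Decision block 202: was the camera in aperture priority mode?
    if not metadata.get("aperture_priority"):
        return None  # block 204: continue to the next analysis
    # Decision block 206: over exposed by more than two-thirds of a stop?
    if stops_over <= 2.0 / 3.0:
        return None  # block 204
    # Decision block 208: exposure compensation between plus and minus 0.6?
    ev = metadata.get("exposure_compensation", 0.0)
    if not -0.6 < ev < 0.6:
        return None  # block 204
    # Blocks 210 and 212: report the amount, suggest a smaller aperture.
    return (f"Over exposed by about {stops_over:.1f} stops; try a smaller "
            "aperture (larger f-number) or automatic mode.")
```

The same skeleton, with the comparisons reversed, yields the under exposure analysis of flowchart 230.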
  • FIG. 5 is a flowchart 230 describing an embodiment of detecting under exposure errors and suggesting corrections thereto.
  • The method set forth in FIG. 5 suggests corrections when the image is under exposed by more than a predetermined amount and the camera is in aperture priority mode.
  • In one embodiment, the under exposure corresponds to two-thirds of a stop and, in another embodiment, to one stop.
  • At decision block 232, a decision is made as to whether the camera was in aperture priority mode during the generation of image data.
  • Data stored in the metadata may indicate whether the camera was in aperture priority mode. If the camera was not in aperture priority mode during generation of the image data, processing proceeds to block 234 where processing continues to the next analysis. More specifically, the suggestion for improving image quality ultimately offered by the flowchart 230 will not be applicable to the camera setting. If the camera was in aperture priority mode, the analysis continues to decision block 236.
  • At decision block 236, a decision is made as to whether the image is under exposed by a predetermined amount, which may be a preselected stop value.
  • The decision block 236 determines whether the image is under exposed by more than two-thirds of a stop. It should be noted that other under exposure values, such as one stop, may be used in the decision block 236. If the image is not under exposed by more than the preselected stop value, processing continues to block 234 as described above. If the image is under exposed by more than the preselected amount, processing continues to decision block 238 as described below.
  • an indication of under exposure may be assisted by analyzing an exposure compensation setting during the generation of image data.
  • Processing continues to decision block 238, where a determination is made as to whether the exposure compensation was within preselected values. It should be noted that in other embodiments, determinations may be made as to whether the exposure compensation is greater or less than preselected values.
  • the decision block 238 determines whether the exposure compensation is set to zero. It is noted that an exposure compensation of a value other than zero is indicative of a manual user setting. If the exposure compensation is not within the preselected values, processing proceeds to block 234 as described above. If the exposure compensation is within the preselected values, processing proceeds to block 240 . It should be noted that in some embodiments, exposure compensation is not analyzed.
  • Block 240 determines the number of stops the image is under exposed. Based on the foregoing, block 242 causes the camera to display information related to correcting the under exposure problem. In the embodiment of the flowchart 230 , the information informs the user of the stop value of the under exposure and suggests using a larger aperture setting, which relates to a smaller f-number. Block 242 may also suggest setting the camera to automatic mode as described above.
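Blocks 210 and 240 both need the number of stops of exposure error, and the text says only that pixel values may be analyzed. A minimal sketch, assuming linear-light pixel values normalized to [0, 1] and an 18% mid-gray target (both assumptions, not taken from the patent):

```python
import math

# Hypothetical sketch of blocks 210/240: estimate the exposure error in
# stops from pixel values.  Assumes linear-light pixel values normalized to
# [0, 1] and an 18% mid-gray target; a real camera would use more elaborate
# metering.

MID_GRAY = 0.18

def exposure_error_stops(pixels):
    """Positive result: over exposed; negative result: under exposed."""
    mean = sum(pixels) / len(pixels)
    return math.log2(max(mean, 1e-6) / MID_GRAY)
```

A result of +1.0 means the image is roughly one stop over exposed; flowchart 200 would act on any value above two-thirds.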
  • Time value mode is sometimes referred to as Tv mode.
  • The time value mode enables a user to select the shutter speed of the camera, which determines the exposure time during image capture. More specifically, the shutter speed determines the amount of time that the photosensors charge during image capture. If the shutter speed is set too slow, the image may be over exposed. Conversely, if the shutter speed is set too fast, the image may be under exposed.
  • An embodiment of analyzing an image to determine whether the image is over exposed due to an improper setting in time value mode is shown in the flowchart 260 of FIG. 6.
  • a determination is made as to whether the camera was in time value mode during image capture. The decision as to whether the camera was in time value mode during image capture may be made by analyzing the metadata associated with the image. If the camera was not in time value mode, the following analysis is not relevant and processing proceeds to block 264 . Block 264 simply directs the processing to analyze other possible problems with the captured image.
  • the setting of exposure compensation at the time of image capture may provide insight to exposure problems.
  • exposure compensation is analyzed at decision block 266 , where a determination is made as to whether the exposure compensation was set to a preselected value.
  • the decision as to whether the exposure compensation is set to a preselected value may be made by analyzing the metadata associated with the image.
  • the decision block 266 determines whether the exposure compensation is set to zero.
  • the decision block 266 may determine if the exposure compensation is greater than or less than preselected values or between preselected values.
  • If the exposure compensation was not set to the preselected value, processing proceeds to block 264 because the analysis does not have bearing on the camera settings.
  • exposure compensation is not analyzed.
  • decision block 270 determines whether the exposure error is greater than a preselected value, meaning that the image is overexposed. If the image is not overexposed, processing proceeds to block 264 as described above. If the image is overexposed, processing proceeds to block 272 . As described above, the pixel values may be analyzed to determine if the image is over exposed.
  • Block 272 displays information regarding the image. Many different embodiments of the information may be displayed. In one embodiment, the information informs the user that the image is overexposed. The information may also include the amount of the overexposure and a suggestion that the scene brightness was so high that the camera could not select an appropriate F-number. A suggestion of a faster shutter speed or using an automatic mode may be provided to the user. Block 272 may also suggest setting the camera to automatic mode.
  • the analysis of the metadata and image data may determine that the image is under exposed and the camera is in a time value mode.
  • the analysis may be the same as with the flow chart 260 and the description provided above, except a determination at block 270 may determine that the image is under exposed. It follows that the suggestions to correct the problem would be the opposite as those provided at block 272 .
  • the information may indicate that the scene brightness was too low for the camera to select a low enough F-number.
  • the suggestion may include using a slower shutter speed or an automatic mode. As with block 272 , a suggestion may be made to use the automatic mode of the camera.
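The time value mode analysis of flowchart 260, together with its under exposure counterpart described above, can be sketched as follows. The metadata field names are assumptions, and the comparison against a preselected exposure-error value is reduced to a sign test for brevity.

```python
# Hypothetical sketch of the time value (Tv) mode analysis of flowchart 260
# and its under exposure counterpart.  Field names are assumptions; the
# preselected exposure-error threshold is reduced to a sign test.

def analyze_tv_mode(metadata, stops_error):
    """stops_error > 0 means over exposed, < 0 means under exposed."""
    # Was the camera in time value mode during image capture?
    if not metadata.get("time_value_mode"):
        return None  # block 264: next analysis
    # Decision block 266: exposure compensation set to zero?
    if metadata.get("exposure_compensation", 0.0) != 0.0:
        return None  # block 264
    # Decision block 270: judge the exposure error.
    if stops_error > 0:
        # Block 272: scene too bright for the selectable F-numbers.
        return "Over exposed: try a faster shutter speed or automatic mode."
    if stops_error < 0:
        return "Under exposed: try a slower shutter speed or automatic mode."
    return None
```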
  • Some embodiments of the camera include a bracketing mode, which enables users to capture a plurality of images using different settings. More specifically, the camera captures a series of images using at least one preselected range of settings. For example, the camera may capture three images wherein each image has a different exposure compensation. A user may select the best image from the plurality of captured images. The camera may determine that some errors occurred while using the bracketing mode and may suggest procedures to correct the errors during subsequent image captures.
  • The exposure compensation may be set too great to use the bracketing mode properly. More specifically, the absolute value of the exposure compensation may be too great to use the bracketing mode properly.
  • This determination may be made by analyzing the metadata to determine if the exposure compensation is greater or less than preselected values. In one embodiment, the determination is made if bracketing is set and the metadata indicates that the exposure compensation is set to a value greater than 2.3 or a value less than −2.3.
  • the camera may inform the user of the problem. For example, the camera may suggest setting the exposure compensation to a value closer to zero or using the automatic mode.
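The bracketing range check described above amounts to a single comparison. A sketch, with the ±2.3 limit taken from the text and the function signature assumed:

```python
# Hypothetical sketch of the bracketing range check.  The +/-2.3 limit comes
# from the text; the function signature is an assumption.

def bracketing_ev_problem(bracketing_on, exposure_compensation):
    """Return a suggestion, or None when bracketing can proceed normally."""
    if not bracketing_on:
        return None
    if abs(exposure_compensation) > 2.3:
        return ("Exposure compensation is too great for bracketing; set it "
                "closer to zero or use automatic mode.")
    return None
```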
  • the user may want an image to be extremely over exposed.
  • the camera may analyze metadata and image data to determine that the camera was set to over expose images and that further over exposure may be achieved.
  • the images may be captured using varying shutter speeds or ISO speeds.
  • the varying shutter speeds may be set slow which would cause the images to be over exposed.
  • the image data may be examined to determine if it is over exposed. If so, the camera may suggest using a greater exposure compensation during generation of subsequent image data.
  • the exposure of the image may have encountered a maximum value during the bracketing sequence.
  • more than one image would be similar.
  • The camera may suggest lowering the exposure compensation during the bracketing sequence so that the images will vary from one to another. For example, the camera may determine that exposure compensation was set greater than 2.3 and that ten percent of the pixel values are greater than a preselected value. The camera may suggest lowering the exposure compensation to a value between plus and minus 2.0 during the subsequent bracketing sequence. In one embodiment, the camera determines that a maximum exposure of 3.0 stops was obtained during the bracketing sequence and recommends the above-described changes during the subsequent bracketing sequence.
  • the camera may detect that a user wanted greater under exposure during a bracketing sequence.
  • the shutter speed or ISO speed may be set fast so as to cause under exposure.
  • More than one image captured during the bracketing sequence may have reached the maximum under exposure of the camera, so the full use of the bracketing sequence may not be realized.
  • more than one image may be under exposed by a maximum of the camera of 3.0 stops.
  • the camera may suggest reducing the exposure compensation during a subsequent bracketing sequence.
  • the camera may suggest setting the exposure compensation to values of between plus and minus 2.0 during a subsequent bracketing sequence.
  • pixel values are analyzed to determine if they are clipped or dark.
  • Clipped pixel values refer to pixel values that are at a maximum or saturated value. Clipped pixel values may be indicative of an image that is over exposed. Dark pixel values are indicative of an under exposed image. During bracketing, clipped or dark pixels indicate that the exposure of the image is too light or too dark.
  • The bracketing mode may capture a plurality of images using different shutter or ISO speeds. Block 282 may determine if the bracketing mode enables exposure compensation of plus or minus 0.7 or greater. If the camera is not in bracketing mode, processing proceeds to block 284 where other possible problems with the captured image are analyzed. If the camera is in bracketing mode, processing proceeds to decision block 286 where a determination is made as to whether the number of clipped pixels exceeds a preselected value. The number of clipped pixels may be determined by counting the number of pixel values in the image file that are saturated or that exceed a preselected value. In some embodiments, block 286 determines if the number of clipped pixel values exceeds three percent.
  • Block 288 displays information related to the image being over exposed. In one embodiment, a message is displayed indicating that the image is over exposed and suggests setting the exposure compensation closer to zero during a subsequent bracketing sequence. The exposure compensation may also be set to automatic mode.
  • Decision block 290 determines whether the number of dark pixels in the image file exceeds a preselected number. The number of dark pixel values may be determined by counting the number of pixel values that are zero or that are less than a preselected number. In some embodiments, decision block 290 determines if greater than ten percent of the pixel values are less than a preselected value. If the determination of decision block 290 is negative, processing proceeds to block 284 as described above. If the determination of decision block 290 is positive, processing proceeds to block 292. Block 292 causes the camera to display information similar to block 288. However, block 292 may indicate that the exposure compensation is negative and that the image is under exposed. The suggestion is to adjust the exposure compensation closer to zero. Again, the suggestions may also include using the automatic mode.
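The clipped and dark pixel checks of the bracketing analysis can be sketched as follows. The three percent and ten percent thresholds come from the text; the 8-bit saturation and darkness cut-offs (250 and 5) are illustrative assumptions.

```python
# Hypothetical sketch of the clipped/dark pixel checks in the bracketing
# analysis.  The 3% clipped and 10% dark thresholds come from the text; the
# 8-bit cut-offs (250 and 5) are assumptions.

CLIP_LEVEL, DARK_LEVEL = 250, 5

def bracketing_pixel_check(pixels):
    """pixels: iterable of 8-bit luminance values for one captured image."""
    values = list(pixels)
    clipped = sum(1 for v in values if v >= CLIP_LEVEL) / len(values)
    dark = sum(1 for v in values if v <= DARK_LEVEL) / len(values)
    if clipped > 0.03:  # decision block 286
        return "Over exposed: set exposure compensation closer to zero."
    if dark > 0.10:     # decision block 290
        return "Under exposed: set exposure compensation closer to zero."
    return None         # block 284: nothing detected, next analysis
```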
  • One problem that may be analyzed is a situation where one of a plurality of images captured during bracketing mode required a strobe, which did not activate, and the other images did not require the strobe.
  • This situation may be indicative of a blurry image. More specifically, during bracketing mode, if the nominal image does not require a strobe, the other images may be captured without the strobe. If the camera determines that one of the images, other than the nominal image, required a strobe, it may indicate that the shutter speed dropped below the hand held limit for the desired zoom.
  • the hand held limit is a function of the zoom and the shutter speed and represents the limit to which a typical user is able to hold the camera and capture an image without the image being blurred.
  • the hand held limit may be reached by either a narrow or telephoto zoom or a long exposure time used during image capture. Either situation makes the image more susceptible to blurring caused by the user holding the camera, which may cause the camera to shake too much during image capture.
  • the metadata of the images captured during the bracketing are analyzed. Based on the zoom and the exposure time, a determination within the camera may be made that the hand held limit had been reached during image capture. In addition, based on the metadata, a determination may be made that the strobe should have been activated. In this situation, the strobe was forced not to activate because of the bracketing. The determination could then be made that the image may be blurry. A suggestion may be offered to the user to use a tripod or other camera stabilizing device. Other suggestions may include turning the flash on for the entire bracketing sequence.
  • the user captures images with the exposure compensation set to a positive value and the captured images are over exposed.
  • an analysis concludes that the user may have forgotten that the exposure compensation is set to a positive value.
  • The metadata may be analyzed to determine if the image was captured while the exposure compensation was set to a positive value. In some embodiments, the analysis determines if the exposure compensation was set equal to or greater than 0.6. In addition, the number of clipped pixels may be compared to a preselected number to determine if the number of clipped pixels exceeds the preselected number. For example, the analysis may determine if the number of clipped pixels exceeds three percent. If the above-described conditions are met, the camera may display a message to the user indicating that the image may be over exposed and that the exposure compensation is set too high. The camera may suggest lowering the exposure compensation or setting it closer to zero. The camera may also suggest using an automatic exposure compensation mode.
  • the analysis indicates that the subject is over exposed or under exposed.
  • the state of the exposure may be determined by sampling pixel values in a portion of the image. For example, pixel values from the center of the image where the subject is typically located may be analyzed. An excessive number of clipped or dark pixel values in a specific region relative to another region may indicate that the subject is over exposed or under exposed.
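Sampling a center region to judge subject exposure might look like the following sketch. The region bounds, the 8-bit cut-offs, and the 25% decision threshold are illustrative assumptions; the text says only that a portion such as the center may be analyzed.

```python
# Hypothetical sketch of sampling a center region to judge subject exposure.
# Region bounds, cut-offs, and the 25% decision threshold are assumptions.

def region_stats(image, top, bottom, left, right, clip=250, dark=5):
    """Return (clipped_fraction, dark_fraction) for the given region."""
    values = [v for row in image[top:bottom] for v in row[left:right]]
    n = len(values)
    return (sum(v >= clip for v in values) / n,
            sum(v <= dark for v in values) / n)

def subject_exposure(image):
    """image: 2-D list of 8-bit luminance rows."""
    h, w = len(image), len(image[0])
    # Sample the central region, where the subject is typically located.
    clipped, dark = region_stats(image, h // 4, 3 * h // 4, w // 4, 3 * w // 4)
    if clipped > 0.25:
        return "subject over exposed"
    if dark > 0.25:
        return "subject under exposed"
    return "subject exposure ok"
```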
  • the metadata is analyzed to determine that the camera was not in aperture priority mode and the subject, as described above, is under exposed.
  • the subject may be backlit and the light meter may have measured the background rather than the subject.
  • the analysis may indicate that the subject is under exposed.
  • the solution suggested to the user may be to increase the exposure compensation to a positive number such as +0.3.
  • the processing program may also suggest forcing the flash on if the subject is within a preselected range, such as within ten feet of the camera.
  • the metadata is analyzed to determine that the camera is in automatic mode and the subject, as described above, is over exposed and the background is dark.
  • the metadata indicates that the strobe did not activate during image capture.
  • the processing program may indicate that the subject is over exposed.
  • the solution suggested to the user may be to reduce the exposure compensation to a negative number.
  • the processing program analyzes exposure problems with the image wherein the image was captured using spot mode.
  • Spot mode relates to the portion of an image used by the camera during focusing and setting up exposure for the remainder of the image. In spot mode, the portion is typically very small.
  • the determination as to whether the camera was in spot mode during image capture may be determined by analyzing the metadata. Over exposure or under exposure problems in certain portions of the image may be detected by analyzing the pixel values.
  • The analysis uses the center of the image to determine if it is dark and surrounded by bright areas. Such a situation may indicate to the user that the image is over exposed.
  • the processing program may also indicate that the image was captured using spot mode and that the camera relied solely on the dark portion to calculate exposure. Thus, the remaining portion of the image is not properly exposed.
  • the camera may suggest setting the metering to average or center-weighted.
  • the camera may also suggest using an automatic mode.
  • the camera may also suggest using a wider portion of the image during focusing and setting up exposure.
  • the image is analyzed as being light in the center and dark in the surrounding areas.
  • This situation may be detected by analyzing the pixel values corresponding to various portions of the image. If such a situation is detected, the processing program may indicate that the image appears to be under exposed. The processing program may also indicate that the image was captured using the spot mode and that the camera did not use dark regions on the edge of the scene to calculate exposure. The camera may suggest setting the metering to average or center-weighted. The camera may also suggest using a wider portion of the image for focusing and exposure settings.
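The two spot mode checks compare the center of the frame against its surroundings. A sketch, in which the region sizes and the factor-of-two brightness gap used to call the difference significant are illustrative assumptions:

```python
# Hypothetical sketch of the spot mode checks: compare the mean brightness
# of the center against the surrounding area.  Region sizes and the
# factor-of-two gap are assumptions.

def spot_mode_check(image):
    """image: 2-D list of 8-bit luminance rows; returns a finding or None."""
    h, w = len(image), len(image[0])
    center = [v for row in image[h // 3:2 * h // 3]
              for v in row[w // 3:2 * w // 3]]
    surround = [v for i, row in enumerate(image) for j, v in enumerate(row)
                if not (h // 3 <= i < 2 * h // 3 and w // 3 <= j < 2 * w // 3)]
    c = sum(center) / len(center)
    s = sum(surround) / len(surround)
    if c * 2 < s:   # dark center metered: the rest is over exposed
        return "dark center, bright surround: image may be over exposed"
    if s * 2 < c:   # bright center metered: the rest is under exposed
        return "bright center, dark surround: image may be under exposed"
    return None
```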
  • the program may find many situations in which an image is or may be under exposed.
  • the scene may be relatively dark and the strobe may not have activated during image capture.
  • the camera may select a slow shutter speed during image capture.
  • the camera may not have a shutter speed slow enough to properly expose the image of the relatively dark scene.
  • the program may determine that the image is under exposed by analyzing the pixel values as described above.
  • the program may also analyze the shutter speed during image capture, which may be stored in the metadata. If the camera does not have a slower shutter speed, the program may suggest using a higher ISO or a wider zoom. The wider zoom may cause the aperture to open a little wider.
  • the camera may also suggest illuminating the scene.
  • the processing program may analyze several items in the metadata to determine that the image may be blurry due to shaking of the camera at the time the image was captured.
  • An embodiment for determining whether the image may be blurry is shown in the flowchart 300 of FIG. 8 .
  • Other methods of detecting focus problems due to shaking are described further below.
  • the handheld limit is a function of zoom and exposure time.
  • The basis for the handheld limit is that a user holding the camera is likely to shake the camera during image capture, which will blur the image.
  • The camera may be programmed with a handheld number or limit, which may be based on the amount of shaking a typical user imparts while holding the camera. It is noted that a longer exposure time or greater zoom moves the handheld calculation closer to or beyond the handheld limit.
  • the hand held limit refers to a function of exposure time and zoom setting, and may include other variables.
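The patent leaves the hand held limit function unspecified beyond its dependence on zoom and exposure time. A common photographic rule of thumb, used here purely as a stand-in, is that exposure times longer than the reciprocal of the 35 mm-equivalent focal length risk visible shake:

```python
# The patent does not give the hand held limit formula.  A common rule of
# thumb, used here purely as a stand-in, is that exposure times longer than
# 1 / (35 mm-equivalent focal length) seconds risk visible shake.

def handheld_limit_exceeded(exposure_time_s, focal_length_35mm_equiv):
    """Longer zoom (greater focal length) makes the limit stricter."""
    return exposure_time_s > 1.0 / focal_length_35mm_equiv
```

At a 100 mm-equivalent zoom, for example, exposures longer than 1/100 s would be flagged under this rule.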
  • the processing continues to analyze another aspect or potential problem with the image because conditions were not met to proceed with the following analysis.
  • a macro mode is a mode wherein the user captures an image of an object that is significantly close to the camera. For example, the object may be located a few inches from the camera. If all the above-described conditions are met, the process may determine that the image could be out of focus and may proceed to block 316 .
  • Block 316 displays information regarding the possible focus problem and suggestions to overcome the problem. For example, text indicating that the image may be out of focus if it was captured without stabilizing the camera may be displayed. In addition, suggestions of reducing the zoom, improving the lighting, and stabilizing the camera may be provided to the user.
  • Decision block 318 determines whether the zoom was set to a wide angle during image capture. The determination may be made by comparing the setting of the zoom to a preselected value, wherein a zoom setting below the preselected value constitutes a wide angle zoom setting. If the determination of decision block 318 is affirmative, processing proceeds to block 320, where text is displayed to indicate a possible problem with the image. The text may indicate that the image may be out of focus if it was captured without a tripod or the like to stabilize the camera during image capture. Suggestions for correcting the problem may also be displayed and include using an automatic flash or strobe mode and stabilizing the camera during image capture.
  • processing proceeds to decision block 322 .
  • At decision block 322, a determination is made as to whether the zoom setting of the camera during image capture was in a middle region. This may be accomplished by determining whether the zoom was set between two preselected values which represent the middle range of the zoom setting. If the determination of decision block 322 is affirmative, processing proceeds to block 324.
  • text is displayed to suggest that a problem exists with the image and to offer suggestions for improving the image. For example, the text may indicate that the image may be out of focus if it was captured without stabilizing the camera or if the subject was moving. In order to improve a subsequent image, the text may suggest setting the flash to automatic mode, stabilizing the camera, and using a wider zoom.
  • If the determination of decision block 322 is negative, processing proceeds to block 326, which is similar to block 306. In summary, the analysis could not determine any problems with the image and the next parameter will be analyzed. It should be noted that block 326 may never be reached if the zoom settings of decision blocks 312, 318, and 322 encompass all possible zoom settings.
  • the image may be blurred or otherwise out of focus if it is determined that the handheld limit has been exceeded. Blurring may be more prominent in burst mode, which causes several images to be captured within a preselected period.
  • the strobe is typically not activated in burst mode because it does not have time to charge between image captures. Thus, if the images are captured in low light conditions, the shutter speed is slowed, which increases the likelihood that the hand held limit will be reached.
  • An embodiment of analyzing focus problems in burst mode is provided in the flowchart 400 of FIG. 9.
  • the flowchart of FIG. 9 is based on an embodiment wherein the strobe is disabled during burst mode.
  • At decision block 404 of the flowchart 400, a determination is made as to whether the camera was in burst mode during image capture. If the determination of decision block 404 is negative, processing proceeds to block 406. At block 406, it is determined that the remaining analysis has no bearing and processing proceeds to the next analysis. If the determination of decision block 404 is affirmative, processing proceeds to decision block 410 where a determination is made as to whether the handheld limit was exceeded during image capture. As described above, the metadata may be analyzed to determine if the handheld limit has been exceeded. If the determination of decision block 410 is negative, processing proceeds to block 406 as described above.
  • decision block 412 determines if the camera focused at the time the user captured the image. Again, if the determination of decision block 412 is negative, processing proceeds to block 406 as described above. The decision as to whether the camera focused at the time of image capture is sometimes referred to as whether the “focus lock” was achieved, indicating that the focus of the camera was at a preselected threshold during image capture.
  • At decision block 416, a determination is made as to whether the strobe would have otherwise activated during image capture.
  • The determination of decision block 416 is whether the scene illumination was so dim that the strobe would have activated but for the camera being in burst mode.
  • the camera includes sunset mode, which is used to capture images in low light conditions. For example, the shutter speed may be relatively slow.
  • the metadata may be analyzed to determine if the camera was in sunset mode when the image was captured. In such situations, the strobe would typically activate but for the camera being in burst mode.
  • the image may be blurred because of the low light conditions and because the strobe was not activated.
  • Block 418 displays text on the camera indicating that the image may be out of focus if a tripod or other stabilizing device was not used during image capture.
  • the text may indicate that the image was captured during low light conditions with the strobe forced off due to the burst mode. Accordingly, the exposure time was long and may have exceeded the handheld limit.
  • a suggestion of using a tripod or otherwise steadying the camera during image capture may be provided to the user.
  • the camera may also suggest using a strobe, which may require capturing images on a mode other than burst mode.
  • the camera may suggest widening the zoom. As described above, a narrow zoom increases the likelihood of reaching the hand held limit.
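The burst mode analysis of flowchart 400 reduces to four metadata tests; the field names below are illustrative assumptions, while the decision order and the block 418 suggestions follow the text.

```python
# Hypothetical sketch of flowchart 400 (possible blur in burst mode).
# Metadata field names are assumptions.

def analyze_burst_blur(metadata):
    """Return a warning string, or None when the analysis does not apply."""
    # Decision block 404: was the camera in burst mode?
    if not metadata.get("burst_mode"):
        return None  # block 406: next analysis
    # Decision block 410: was the handheld limit exceeded?
    if not metadata.get("handheld_limit_exceeded"):
        return None  # block 406
    # Decision block 412: was focus lock achieved during image capture?
    if not metadata.get("focus_lock"):
        return None  # block 406
    # Decision block 416: would the strobe have activated but for burst mode?
    if not metadata.get("strobe_suppressed_by_burst"):
        return None  # block 406
    # Block 418: warn the user and suggest corrections.
    return ("Image may be out of focus: low light with the strobe forced off "
            "by burst mode.  Use a tripod, a non-burst mode with the strobe, "
            "or a wider zoom.")
```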
  • Some embodiments of the camera include a switch that is used to cause the camera to capture images.
  • the switch may be in a first position when no pressure is applied.
  • the switch may be in a second position when a first force is applied.
  • the second position is typically achieved when a user presses lightly on the switch.
  • The second position causes the camera to focus on a scene. Because the focusing may take a while, the user may have to maintain the switch in the second position for a period while the camera focuses.
  • Many cameras provide a focus indicator, which provides the user with an indication as to whether focus has been achieved or not.
  • Application of more force causes the switch to move into a third position, wherein the third position causes the camera to generate image data.
  • the camera may not achieve focus before an image is captured.
  • the status of the focus may be stored in the metadata associated with each image.
  • The program analyzes the metadata to determine if focus was achieved prior to the generation of image data.
  • the time that the switch was maintained in the second position may provide an indication as to the status of the focus. For example, if the switch was rapidly pressed from the first position to the third position, the rapid pressing may have caused the camera to shake during image capture, which may blur the image. Thus, if the camera found that focus lock was not achieved or if the switch was pressed too fast, the program may indicate that the image may be blurry.
  • The camera suggests slowing the speed at which the switch is pressed.
  • the camera may suggest maintaining the switch in the second position until focus lock is achieved.
  • the camera may also suggest attempting to focus on a high contrast portion of the scene. High contrast portions of a scene usually provide better references for focus.
  • the camera may focus faster if the lighting in the scene is intensified.
  • the program may suggest increasing light intensity during generation of subsequent image data.
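The switch analysis described above can be sketched as two tests. The 0.2 second half-press threshold is an illustrative assumption; the text only says the switch may be pressed "too fast."

```python
# Hypothetical sketch of the shutter-switch analysis.  The 0.2 second
# half-press threshold is an assumption; the text only says "too fast".

MIN_HALF_PRESS_S = 0.2

def switch_blur_check(focus_lock, half_press_duration_s):
    """Return a warning, or None when no switch-related problem is found."""
    if not focus_lock:
        return "Image may be blurry: focus lock was not achieved."
    if half_press_duration_s < MIN_HALF_PRESS_S:
        return ("Image may be blurry: the switch was pressed too fast; hold "
                "the second position until focus lock is achieved.")
    return None
```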
  • the camera includes a close focus and an infinity focus.
  • The close focus, in this embodiment, enables the camera to focus on objects located between a first distance from the camera and infinity.
  • the infinity focus enables the camera to focus on objects located between a second distance from the camera and infinity, wherein the second distance is greater than the first distance.
  • a camera may also include a manual focus mode wherein a user manually focuses the camera.
  • the program may analyze the focus of the camera by analyzing the metadata to determine whether the camera obtained focus lock during generation of the image data.
  • the program may also analyze the focus setting to determine the focus mode of the camera during generation of the image data. If the camera was set to infinity focus during generation of the image data and the camera was unable to focus, the program may suggest using the close focus mode during generation of subsequent image data.
  • the close focus mode increases the range of focus of the camera so that there is a better chance of obtaining focus lock during generation of subsequent image data.
  • the program may suggest using an automatic focus mode wherein the close focus and infinity focus modes are automatic focus modes.
  • the above-described analysis may also apply to a macro mode, wherein the macro mode enables the camera to focus on objects located in close proximity to the camera. If the camera detects an object within the range of the macro mode, but detects that the camera was not set to focus in macro mode during generation of the image data, the program may suggest using another focus mode or moving further away from the object during generation of subsequent image data. The program may also suggest stabilizing the camera during generation of subsequent image data because use of the macro mode increases the possibility of encountering the above-described hand held limit. In another embodiment, the program may suggest reducing the aperture size when the camera is in a macro mode.
  • the camera includes an indicator that provides an indication when focus lock is achieved.
  • the indicator is sometimes referred to as a focus assist indicator.
  • a focus assist indicator may be provided in the camera.
  • an LCD screen may provide an indication as to the status of the focus lock.
  • a light may change color depending on whether or not focus lock is achieved. In low light conditions, the light may irritate the user, so the user, in some embodiments, may disable the light. A problem occurs if the user attempts to capture images with the light disabled because the user may not know whether the camera has achieved focus lock.
  • the status of the indicator may be stored in the metadata. The status may include whether the indicator was disabled during generation of image data and whether the camera achieved focus lock. If the indicator was disabled, the program may suggest enabling the indicator during generation of subsequent image data. In one embodiment, the program may analyze the ambient light intensity and determine that the focus indicator should be enabled if the ambient light intensity is low. As stated above, achieving focus may be difficult in low light conditions, so the program may suggest enabling the focus assist indicator so as to improve the chances of achieving focus lock. In yet another embodiment, the camera may provide the indication to enable the focus assist indicator if the camera did not achieve focus lock.
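The indicator-status analysis above can be sketched as follows. This is an illustrative sketch, not the described firmware: the metadata keys, the light threshold, and the function name are all assumptions.

```python
def focus_indicator_suggestions(metadata, low_light_level=50):
    """Suggest enabling the focus assist indicator from image metadata.

    All metadata keys and the `low_light_level` threshold are assumed
    for illustration only.
    """
    suggestions = []
    if metadata.get("focus_indicator_disabled", False):
        # The indicator was off during capture: suggest re-enabling it.
        suggestions.append("Enable the focus assist indicator.")
        if metadata.get("ambient_light", 255) < low_light_level:
            # Low ambient light makes focusing harder, so the indicator
            # improves the chances of achieving focus lock.
            suggestions.append("Ambient light is low; the indicator may "
                               "improve the chances of achieving focus lock.")
        if not metadata.get("focus_lock", True):
            suggestions.append("Focus lock was not achieved; use the "
                               "indicator to confirm focus before capture.")
    return suggestions
```

A camera in which the indicator was enabled produces no suggestions; a dark scene captured with the indicator disabled and no focus lock produces all three.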
  • the program may analyze a variety of image problems related to the strobe.
  • the strobe may be activated when it is too close to a subject, which results in over exposure of the subject.
  • the strobe may have an exposure compensation associated with it which may be set so as to cause over exposure or under exposure of an image.
  • the program may detect that an object blocked the strobe during image capture.
  • Various strobe anomalies are described below.
  • the intensity of light emitted by the strobe that is able to effectively illuminate a subject decreases rapidly with distance.
  • the strobe will not be effective when it is used to illuminate scenes or objects that are far away so as to be out of the range of the strobe.
  • the effective distance of the strobe is a function of the zoom and other camera settings.
  • the camera analyzes the metadata to determine if the strobe was activated when the image was captured.
  • the metadata or other data associated with an image may provide an indication as to the focal length of the camera when the image was captured. It follows that the zoom setting of the camera may be determined from the metadata or other data including the settings of other camera functions stored in the metadata.
  • This information will determine if the camera was focused beyond the effective range of the strobe and may provide an indication as to the distance between the camera and the scene or object at the time of image capture. It is noted that other embodiments for measuring the distance between the camera and a scene or object may be used by the camera.
  • the pixel values may be analyzed to determine if the image is under exposed. This may be achieved as described above by analyzing the number of pixel values that are at or below a predetermined value. If all the above-described conditions are met, a determination is made that the image may be dark or under exposed because it was captured with the subject beyond the effective range of the strobe. Text may be displayed indicating the reasons for the dark image. A suggestion may be provided that includes moving the camera and the subject closer or turning off the strobe and using a long exposure time. Other suggestions include increasing the exposure compensation of the strobe.
  • the program may make the above-described suggestions if the program determines that other criteria are also met. For example, the program may require that the camera is in an automatic mode to select shutter speed. The program may also only display the above-described information if the ISO was set to four-hundred. Furthermore, the program may require that the strobe be activated or fired at full power during image capture.
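The out-of-range strobe analysis above (strobe fired at full power, automatic shutter, ISO 400, focus beyond the strobe's effective distance, and an under exposed image) may be sketched as below. The metadata keys, the dark-pixel threshold, and the dark fraction are illustrative assumptions.

```python
def strobe_out_of_range_warning(metadata, pixels,
                                dark_value=40, dark_fraction=0.5):
    """Warn when an under exposed image was likely captured with the
    subject beyond the effective range of the strobe.

    Metadata keys and all thresholds are assumptions for illustration.
    """
    # Additional criteria from the description: automatic shutter mode,
    # ISO 400, and the strobe fired at full power.
    if metadata.get("shutter_mode") != "auto":
        return None
    if metadata.get("iso") != 400:
        return None
    if not metadata.get("strobe_full_power", False):
        return None
    # Was the camera focused beyond the strobe's effective distance?
    if metadata.get("focus_distance_m", 0) <= metadata.get("strobe_range_m", 0):
        return None
    # Under exposure test: count pixel values at or below a dark threshold.
    dark = sum(1 for p in pixels if p <= dark_value)
    if dark / len(pixels) < dark_fraction:
        return None
    return ("Image may be dark: the subject was beyond the effective range "
            "of the strobe. Move closer, turn the strobe off and use a long "
            "exposure, or increase the strobe exposure compensation.")
```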
  • the intensity of light emitted by the strobe that is able to effectively illuminate the subject is inversely proportional to the distance between the camera and the subject. In some situations, the strobe may be too close to the subject and may cause the subject to be over exposed.
  • the metadata may be analyzed. If the zoom was set to focus on a close subject or if the camera was set to a macro mode, it is assumed that the camera was located very close to the subject. It is noted that other methods may be used to determine the distance between the object or scene and the camera. The zoom may also be analyzed to determine if the lens was set to a magnification that is below a preselected value.
  • the camera may analyze the pixel values of the image to determine if a predetermined number of the pixel values are clipped or otherwise exceed a predetermined value. As described above, clipped pixel values are indicative of an over exposed image.
  • the camera may display text indicating that the image is likely over exposed.
  • the camera may suggest turning the flash off or moving the camera away from the subject during generation of subsequent image data.
  • the program may require that other criteria be met before the above-described information is displayed.
  • the program may display the above-described information if the camera was set to automatic shutter speed.
  • the program may require that the exposure compensation be set to zero.
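The too-close strobe analysis above (a close subject, clipped pixel values, automatic shutter speed, and zero exposure compensation) may be sketched as follows; the keys, distance, and clipping thresholds are assumptions for illustration.

```python
def strobe_too_close_warning(metadata, pixels, clip_value=250, clip_count=500):
    """Warn of over exposure when the strobe fired very close to the
    subject. Keys and thresholds are illustrative assumptions.
    """
    # Required criteria: automatic shutter speed and zero exposure
    # compensation.
    if metadata.get("shutter_mode") != "auto":
        return None
    if metadata.get("exposure_compensation", 0) != 0:
        return None
    # Macro mode or a very short focus distance implies a nearby subject.
    close = (metadata.get("macro_mode", False)
             or metadata.get("focus_distance_m", 99) < 0.5)
    if not close:
        return None
    # Clipped pixel values are indicative of an over exposed image.
    clipped = sum(1 for p in pixels if p >= clip_value)
    if clipped < clip_count:
        return None
    return ("Image is likely over exposed: turn the flash off or move the "
            "camera away from the subject during subsequent image capture.")
```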
  • the program may determine that the image is over exposed and the object was not excessively close to the camera during image capture.
  • the program may analyze the pixel values as described above and may determine that the strobe activated during image capture. Based on these findings, the program may suggest a plurality of improvements to the image quality.
  • the program determines that the strobe activated during image capture and the image is over exposed.
  • the program may suggest turning off or deactivating the strobe during generation of subsequent image data.
  • the program may analyze other settings of the camera at the time of image capture to determine that other settings or parameters of the camera, such as ISO speed, did not cause the over exposure.
  • the program may analyze a strobe exposure compensation setting at the time of image capture.
  • the value of this setting may be stored in metadata associated with the image data.
  • the program may analyze the strobe exposure compensation. If the program determines that the strobe exposure compensation is too high, the program may suggest that the user use a lower value during generation of subsequent image data.
  • the program may analyze the metadata and other data to determine if the image is possibly under exposed. In one embodiment, the program may analyze the image to determine if it is under exposed. If so, the program may suggest increasing the strobe exposure compensation. Embodiments of this include increasing the power output of the strobe and the pulse time of the strobe. In other embodiments, the program may suggest using a wide angle lens, which may cause the aperture to open wider. In other embodiments, the program may suggest increasing the exposure compensation. For example, the program may suggest setting the exposure compensation to a positive value such as 0.3. In yet other embodiments, the program may suggest moving closer to the subject and reducing the ISO speed. For example, the user may move the camera to approximately 7.5 feet from the subject and set the ISO speed to 100 or to an automatic mode. In some embodiments, the program may provide the above-described information if the strobe activated at maximum power. The program may also require that the shutter speed be in automatic mode or that the ISO speed be 100 or less.
  • the camera may have a mode that provides for capturing images captured under low light. This mode is sometimes referred to as night mode. Night mode typically increases the exposure time of the image, which may cause the hand held limit to be met.
  • the program may determine that the camera was set to the night mode by analyzing the metadata. If the program determines that the camera was in night mode, the program may suggest stabilizing the camera to avoid blurring. The program may also cause information to be displayed indicating that the resulting image may be blurry.
  • the program may cause the above-described information to be displayed if the program determines that the strobe activated, exposure time was greater than one sixtieth of a second, ambient light was low, or the camera focused.
  • the camera determines that it was set to automatic mode, meaning that the processor within the camera determined the best strobe setting. However, the camera also determines that the image is likely under exposed.
  • the metadata is analyzed to determine whether the camera was set in automatic mode and the strobe was activated during image capture. The pixel values are then analyzed as described above to determine if a predetermined number of pixel values are less than a predetermined value. More specifically, the image is analyzed to determine if it is possibly under exposed.
  • the camera may display text indicating that the image may be under exposed.
  • the text may indicate that the strobe or flash likely did not provide adequate illumination for the ambient lighting conditions.
  • the text may suggest overriding the automatic mode and setting the exposure compensation to a positive value, which will increase the exposure time during image capture. Thus, subsequent images may not be under exposed.
  • the following analysis determines whether an object, such as the finger of the user, may have blocked the strobe during image capture.
  • a blocked strobe will result in an under exposed image.
  • the metadata is analyzed to determine if the focus distance used to capture the image was less than a predetermined distance, such as one meter. In such a situation, the subject of the image should be sufficiently illuminated so as not to generate an under exposed image.
  • the metadata may also be analyzed to determine if the strobe was activated during image capture. If the above-described conditions are met, the camera may analyze the image to determine if it is under exposed by greater than a preselected amount, such as one stop.
  • the program may provide information to the user if no reflected light from the strobe is detected, indicating that the strobe did not illuminate the scene.
  • the camera may display text indicating that the image was likely under exposed because an object, such as the finger of the user, blocked the strobe.
  • the text may suggest that the strobe be clear during subsequent image captures.
  • the text may also suggest checking to make sure that the strobe is functioning.
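The blocked-strobe analysis above may be sketched as follows. The one-meter focus distance and the one-stop under exposure limit follow the description; the metadata keys and function name are assumptions.

```python
def strobe_blocked_warning(metadata, under_exposed_stops):
    """Warn that an object, such as a finger, may have blocked the strobe.

    Metadata keys are illustrative assumptions; the distance and stop
    thresholds follow the described example values.
    """
    # The subject must be within one meter, close enough that the strobe
    # should have sufficiently illuminated it.
    if metadata.get("focus_distance_m", 99) >= 1.0:
        return None
    # The strobe must have been activated during image capture.
    if not metadata.get("strobe_fired", False):
        return None
    # The image must be under exposed by more than a preselected amount,
    # such as one stop.
    if under_exposed_stops <= 1.0:
        return None
    return ("Image was likely under exposed because an object, such as a "
            "finger, blocked the strobe. Keep the strobe clear during "
            "subsequent captures and check that it is functioning.")
```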
  • When an image is captured using the strobe and the image includes a reflective surface, such as a window, the image may have a bright spot where the image of the reflected flash was captured.
  • the method used to analyze this condition may consist of analyzing the metadata to determine whether the strobe was activated when the image was captured. Further analysis consists of determining whether the image has a bright spot. This analysis may be accomplished by analyzing the pixel values. If the pixel values indicate that a portion of the image is much brighter than other portions of the image, the camera may determine that the image contains a reflective surface.
  • the program may determine the intensity or amount of reflected strobe light in the scene. If the scene contains intense strobe light, the information described herein may be displayed.
  • the camera may display text indicating that the image may have a bright spot caused by imaging the reflected flash.
  • Suggestions for overcoming this problem include turning the flash off and imaging the reflective object at an angle so that the flash will not reflect directly back to the camera.
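One way the bright-spot analysis might be sketched is to compare coarse regions of the image against the overall brightness; the tiling scheme and the brightness ratio are assumptions, not part of the described embodiment.

```python
def flash_reflection_warning(rows, ratio=3.0):
    """Detect a bright spot consistent with the strobe reflecting back
    off a surface such as a window.

    `rows` is a 2-D list of pixel values; the four-tile split and the
    `ratio` threshold are illustrative assumptions.
    """
    # Mean brightness of the whole image.
    flat = [p for row in rows for p in row]
    overall = sum(flat) / len(flat)
    # Split the image into four coarse tiles (quadrants).
    h, w = len(rows), len(rows[0])
    tiles = [
        [rows[r][c] for r in rs for c in cs]
        for rs in (range(h // 2), range(h // 2, h))
        for cs in (range(w // 2), range(w // 2, w))
    ]
    brightest = max(sum(t) / len(t) for t in tiles)
    # A tile far brighter than the rest of the image suggests a
    # reflected flash.
    if brightest > ratio * overall:
        return ("Image may contain a bright spot from the reflected flash. "
                "Turn the flash off or image the reflective surface at an "
                "angle.")
    return None
```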
  • White balance errors represent color variations due to different types of light sources illuminating a scene. For example, if a scene is illuminated with a fluorescent ambient light, the colors in the scene have a specific temperature. The camera will then process the image data based on the selected illumination source. If the selected illumination source is not correct, the colors in the resulting image may be different than the scene illuminated with ambient light. This error is sometimes referred to as a white balance error.
  • the user may select the illumination source or source of ambient light. For example, the user may select tungsten or fluorescent lighting as the illumination source of the scene being captured.
  • the camera may then capture an image and process image data based on the selected illumination source. If the camera is used to capture images that are illuminated using a different illumination source than the selected light source, the images may be susceptible to white balance errors.
  • the camera may analyze an image to determine if white balance errors may exist.
  • One embodiment of determining if possible white balance errors exist is shown in the flowchart 450 of FIG. 10 .
  • decision block 452 a determination is made as to whether the image was captured using a user selected illumination source. More specifically, the camera determines if the user provided information relating to the illumination source used to illuminate the captured scene. If the determination of decision block 452 is negative, processing proceeds to block 454, wherein the processing proceeds to the next analysis as described in the previous flowcharts.
  • decision block 458 a determination is made as to whether the image was captured using full color mode.
  • the analysis focuses on color, so the analysis only proceeds if the full color mode is selected.
  • the determination as to whether an image was captured while the camera was in full color mode may be made by analyzing the metadata. There is no need to perform this analysis if the image is a black and white image.
  • the preferred illumination source is calculated or otherwise determined by the camera. More specifically, the camera determines which illumination source it would have chosen had the user not selected the illumination source. The analysis may be accomplished by analyzing the metadata and possibly other data to determine the preferred illumination source had the camera been in an automatic mode when the image was captured.
  • Processing then proceeds to decision block 462 where a determination is made as to whether the selected illumination source and the preferred illumination source are the same. In other words, a determination is made as to whether the camera would have selected the same illumination source as the user selected. If the camera would have selected the same illumination source that the user did, processing proceeds to block 454 as described above. In other words, there is no better choice for an illumination source than the one selected by the user and the processing continues to the next analysis.
  • Block 464 causes text to be displayed on the camera that indicates the image may have problems due to white balance errors.
  • the text may indicate the selected illumination source and the preferred illumination source and may suggest changing the selected illumination source to the preferred illumination source.
  • the text may also suggest changing the camera settings so that the camera automatically chooses the illumination source when processing the image data.
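The flowchart logic of blocks 452 through 464 may be sketched as follows; `preferred_source` stands in for the illumination source the camera would have chosen in automatic mode, and the metadata keys are illustrative assumptions.

```python
def white_balance_check(metadata, preferred_source):
    """Warn when the user-selected illumination source differs from the
    one the camera would have chosen automatically.

    `preferred_source` and the metadata keys are illustrative; the
    decision structure follows the described flowchart.
    """
    # Decision block 452: was the illumination source user selected?
    if not metadata.get("user_selected_source"):
        return None  # proceed to the next analysis
    # Decision block 458: the analysis only applies to full color images.
    if metadata.get("color_mode") != "full":
        return None
    # Decision block 462: compare the selected and preferred sources.
    selected = metadata["user_selected_source"]
    if selected == preferred_source:
        return None
    # Block 464: report the possible white balance error.
    return ("Possible white balance error: '{0}' was selected but '{1}' "
            "is preferred. Change the selected source or let the camera "
            "choose automatically.".format(selected, preferred_source))
```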
  • the camera may enable a user to select various illumination sources from a list of illumination sources.
  • the camera may assign a color temperature to each illumination source in the list.
  • the color temperatures are ranges of color temperatures.
  • the camera may analyze the image and select a color temperature that it would use during processing.
  • the program may analyze the color temperature selected by the user and compare it to the color temperature selected by the camera. If the color temperature selected by the camera is the same or within a preselected range of the color temperature selected by the user, the analysis is complete and no suggestions are offered. If, on the other hand, the color temperature selected by the camera differs from the color temperature selected by the user, the program may inform the user that white balance problems may exist in the replicated image. The program may indicate the color temperature selected by the camera. In another embodiment, the program may suggest a light source for the user to select during subsequent image capture, based on the color temperature selected by the camera.
  • the user may select a color temperature corresponding to fluorescent lighting.
  • the camera may determine that a color temperature corresponding to tungsten lighting should have been selected. If the difference in color temperatures is greater than a preselected threshold, the program may suggest using the color temperature corresponding to tungsten lighting. As set forth above, the program may also provide information indicating that a white balance problem may exist.
  • the metadata and other data may be used to provide the user with ways to improve the image quality.
  • the program may analyze the settings or different camera parameters at the time of image capture and may provide suggestions for improving the image during subsequent image capture.
  • Portrait mode is a mode that is typically used for capturing images of people.
  • the subjects are typically located in close proximity to the camera and, thus, are usually within range of the strobe. It is noted that there are many situations when the subject is out of range of the strobe.
  • One embodiment for enhancements determines whether the image was captured using portrait mode. This determination may be made by examining the metadata, which may store information relating to whether or not the camera was set in portrait mode at the time of image capture. If the camera was not in portrait mode, this portion of the analysis concludes because it is not applicable.
  • the metadata may store information regarding whether or not the strobe activated during image capture. Images captured in portrait mode typically have higher quality if they are captured using the strobe. In such a situation, the camera may analyze the image data and determine that the strobe should have been activated. A message may be displayed indicating that the image may be enhanced by use of the strobe.
  • the camera may also determine whether a red eye elimination algorithm has been run on the image data. Such an algorithm removes red eye in images caused by capturing images of people's retinas. If such an algorithm did not run, the camera may suggest running such an algorithm. The red eye algorithm corrects the color of eyes. In one embodiment, the suggestion to run the algorithm may be displayed if the image was captured in portrait mode and the strobe activated during image capture.
  • the camera may analyze other image parameters before suggesting the use of the strobe during image capture.
  • the exposure compensation or exposure time may be analyzed.
  • the exposure compensation or time used during image capture may be stored in the metadata. If the exposure compensation is high or the exposure time is long, the camera may suggest using the strobe. The exposure time is typically selected by the camera. Therefore, if the exposure time is long, it is indicative of a dimly lit scene, which may require the strobe.
  • the camera may also analyze the number of dark pixels in an image.
  • dark pixels are pixel values that are below a predetermined value and represent dark portions of an image.
  • a large number of dark pixel values of an image captured using portrait mode may be indicative of an image having excessive shadows. Accordingly, the analysis may indicate to the user that the subject of the image may have undesired shadows. Therefore, the suggestions for an enhanced image may include using the strobe or setting the ambient lighting conditions used by the camera for image processing to a low light level.
  • the camera may also suggest using a low adaptive lighting setting.
  • the low adaptive lighting setting reduces the effects of low light in the scene.
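The portrait-mode enhancement analysis above may be sketched as follows. The exposure-time threshold, dark-pixel value, and dark fraction are illustrative assumptions; the metadata keys are hypothetical.

```python
def portrait_shadow_suggestions(metadata, pixels, dark_value=40,
                                dark_fraction=0.4, long_exposure_s=1 / 60):
    """Suggest strobe use for portrait images with long exposures or
    many dark pixels. All keys and thresholds are assumptions.
    """
    if metadata.get("mode") != "portrait":
        return []  # the analysis only applies to portrait mode
    suggestions = []
    # A long exposure time is indicative of a dimly lit scene, which
    # may require the strobe.
    if metadata.get("exposure_time_s", 0) > long_exposure_s:
        suggestions.append("Scene appears dimly lit; use the strobe.")
    # Many dark pixel values in a portrait suggest undesired shadows.
    dark = sum(1 for p in pixels if p < dark_value)
    if dark / len(pixels) > dark_fraction:
        suggestions.append("Subject may have undesired shadows; use the "
                           "strobe or a low adaptive lighting setting.")
    return suggestions
```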
  • the image data and the meta data may be analyzed to determine if the contrast in the scene is high or greater than a predetermined value.
  • the following analysis is not performed in panoramic or portrait modes. Images captured using the panoramic mode may have high contrasts due to the nature of capturing panoramic images. Images captured using the portrait mode may be subject to high contrast due to the nature of capturing portrait images.
  • the number of dark and clipped pixels in various portions of the image may be analyzed to determine the contrast. For example, pixel values in the center of the image may be analyzed to determine if they are generally greater than a predetermined value. Pixel values in other regions of the image may be analyzed to determine if they are generally less than a predetermined value. If a high number of pixel values are clipped and dark, the contrast may be too high.
  • the camera may display information suggesting setting the camera to lower ambient lighting as a basis for image processing, which may lower the contrast.
  • the program may suggest setting an adaptive lighting setting lower so as to capture images that may be located in shadows in the scene.
  • the camera may analyze the metadata to determine if the subject is beyond the range of the strobe and if the camera focused during image capture. These additional criteria may have to be met in order for the camera to display the above-described suggestions.
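The pixel-analysis portion of the contrast check (a generally bright center surrounded by generally dark regions) may be sketched as below; the brightness thresholds, region split, and fraction are illustrative assumptions.

```python
def high_contrast_warning(center, edges, bright=220, dark=40, fraction=0.5):
    """Flag high scene contrast: a generally bright center region with
    dark surrounding regions. Thresholds are assumptions.
    """
    # Fraction of center pixels above the bright (clipped) threshold.
    clipped = sum(1 for p in center if p >= bright) / len(center)
    # Fraction of edge pixels below the dark threshold.
    shadowed = sum(1 for p in edges if p <= dark) / len(edges)
    if clipped > fraction and shadowed > fraction:
        return ("Contrast may be too high: base image processing on a "
                "lower ambient lighting level or lower the adaptive "
                "lighting setting to capture detail in the shadows.")
    return None
```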
  • the camera may have a plurality of settings that may be automatic or may be set by a user. Problems may arise if users capture images using manual settings that conflict with one another. For example, if contrast, sharpness, saturation, and adaptive lighting are all set high, the resulting image may appear unrealistic.
  • the camera determines if the contrast, sharpness, and saturation were all set high, or greater than preselected values, during image capture. In addition, the camera determines if the image was captured using a full color mode. All these settings may be stored in the metadata. If all the above-described conditions are met, the camera may display information indicating that the image may appear unrealistic. The program may display a suggestion of reducing at least one of the contrast, sharpness, or saturation settings.
  • the camera determines if the contrast, sharpness, and adaptive lighting were all set high, or greater than preselected values, during image capture. In addition, the camera determines if the image was captured using a full color mode. All these settings may be stored in the metadata. If all the above-described conditions are met, the camera may display information indicating that the image may appear unrealistic. The program may display a suggestion of reducing at least one of the contrast, sharpness, or adaptive lighting.
  • the camera determines if the contrast, saturation, and adaptive lighting were all set high, or greater than preselected values, during image capture. In addition, the camera determines if the image was captured using a full color mode. All these settings may be stored in the metadata. If all the above-described conditions are met, the camera may display information indicating that the image may appear unrealistic. The program may display a suggestion of reducing at least one of the contrast, saturation, or adaptive lighting.
  • the camera determines if the sharpness, saturation, and adaptive lighting were all set high, or greater than preselected values, during image capture. In addition, the camera determines if the image was captured using a full color mode. All these settings may be stored in the metadata. If all the above-described conditions are met, the camera may display information indicating that the image may appear unrealistic. The program may display a suggestion of reducing at least one of the sharpness, saturation, or adaptive lighting.
  • the above-described analysis may only be performed if the camera was in focus when the image was captured. If the camera was not in focus, the analysis may have no bearing on the captured image.
  • the determination as to whether the image was in focus may be made by analyzing the metadata, which may store data indicating whether the image was in focus.
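The four conflicting-settings checks above are the four three-way combinations of contrast, sharpness, saturation, and adaptive lighting, each gated on full color mode and focus. They may be sketched compactly as follows; the 0-to-10 scale and `high` threshold are assumptions.

```python
from itertools import combinations

def conflicting_setting_warnings(metadata, high=7):
    """Warn when three of the four manual settings (contrast, sharpness,
    saturation, adaptive lighting) were all set high at capture time.

    The numeric scale and threshold are illustrative assumptions.
    """
    # The analysis applies only to focused, full color images.
    if metadata.get("color_mode") != "full" or not metadata.get("in_focus"):
        return []
    settings = ("contrast", "sharpness", "saturation", "adaptive_lighting")
    warnings = []
    # Check every three-way combination of the four settings.
    for trio in combinations(settings, 3):
        if all(metadata.get(name, 0) > high for name in trio):
            warnings.append("Image may appear unrealistic; reduce at "
                            "least one of: " + ", ".join(trio))
    return warnings
```

With all four settings high, all four combinations trigger; with exactly three high, only one does.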
  • the ISO speed and the adaptive lighting are analyzed to determine if a possible conflict existed during image capture. If the ISO speed is set above a preselected value and the adaptive lighting is also set above a preselected value, the camera may display information indicating that the image quality may be poor. For example, the camera may indicate that the image may appear grainy or unrealistic because both the ISO speed and the adaptive lighting are set above predetermined thresholds. The camera may also suggest lowering either the ISO speed or the adaptive lighting setting.
  • the predetermined value or threshold for the ISO setting is an ISO speed of 400 or a gain greater than 29.0.
  • the camera may suggest setting either the ISO setting or the adaptive lighting to the default values or setting the camera so that the camera selects the values.
  • the camera may also determine whether it was in focus during image capture. If the camera was not in focus, the poor image quality may be due to focusing problems.
  • the performance of a digital camera can deteriorate as it gets hot.
  • the CCD and the image processing components may produce an image that is degraded when they are operated above a predetermined temperature.
  • the camera may have a temperature sensor and may store the temperature of the camera at the time images are captured. For example, the temperature of the camera at the time of image capture may be stored in the metadata. In one embodiment, the temperature threshold is forty degrees centigrade. If the camera determines that the temperature is above the threshold, the camera may display information indicating that excessive heat may have caused the image to be degraded. The camera may further suggest cooling the camera down prior to capturing more images. Cooling may include turning off the display for a period prior to capturing images.
  • the program measures the temperature of a CCD or the like that is used to generate image data.
  • the temperature of the camera may be calculated by determining the time period in which the display was active. Based on this calculation, the program may display the above-described information.
  • the program may analyze the adaptive lighting setting to determine if the camera temperature was above a preselected temperature during image capture and the adaptive lighting was set high. If the above-described conditions are met, the program may display information indicating that the image may be grainy or otherwise be poor quality. In order to improve subsequent images, the program may suggest reducing the adaptive lighting setting or turning it off. The program may also suggest reducing the temperature of the camera.
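The temperature analysis above may be sketched as follows. The forty-degree threshold follows the description; the metadata keys and the adaptive lighting scale are assumptions.

```python
def heat_warnings(metadata, max_temp_c=40.0, high_adaptive=7):
    """Warn when the camera was above its temperature threshold at
    capture time. Keys and the adaptive lighting scale are assumed;
    the 40 degree centigrade threshold follows the description.
    """
    if metadata.get("temperature_c", 0.0) <= max_temp_c:
        return []
    warnings = ["Excessive heat may have degraded the image; cool the "
                "camera (for example, turn off the display for a period) "
                "before capturing more images."]
    # A high adaptive lighting setting combined with heat may produce a
    # grainy image.
    if metadata.get("adaptive_lighting", 0) > high_adaptive:
        warnings.append("Image may be grainy: reduce or turn off the "
                        "adaptive lighting setting, or cool the camera.")
    return warnings
```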
  • the portrait mode is typically used to capture images that are close to the camera.
  • a wide angle setting of the lens is typically used to capture images or scenes over a wide angle, which are not close to the camera. Capturing an image in portrait mode while using a wide angle lens setting may distort the image.
  • the metadata may store information relating to the zoom setting and whether the camera was in wide angle mode during image capture.
  • the zoom setting and portrait mode may be determined by analyzing the metadata.
  • the zoom setting can be compared to a predetermined value to determine if the zoom setting was great enough to cause distortion in the image.
  • the metadata may be analyzed to determine if the image was captured using the portrait mode. If both these conditions are met, the camera may display information indicating that the image may be distorted. The camera may also suggest moving away from the subject and using a narrower zoom setting.
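The wide-angle-portrait distortion check may be sketched as follows; the focal length threshold and metadata keys are illustrative assumptions standing in for "the zoom setting was great enough to cause distortion."

```python
def wide_angle_portrait_warning(metadata, min_focal_mm=35):
    """Warn of distortion when a portrait was captured at a wide angle
    zoom setting. The focal length threshold is an assumption.
    """
    # The image must have been captured in portrait mode.
    if metadata.get("mode") != "portrait":
        return None
    # A focal length below the threshold indicates a wide angle setting.
    if metadata.get("focal_length_mm", 999) >= min_focal_mm:
        return None
    return ("Image may be distorted: move away from the subject and use "
            "a narrower zoom setting.")
```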
  • the camera may have an action mode, which is used to capture scenes containing moving subjects.
  • Action mode typically includes a very fast shutter speed and other settings that enable the image of the subject to be captured without blurring the image.
  • the metadata may contain information regarding whether an image was captured using action mode.
  • One problem with capturing moving subjects is obtaining proper focus. As the subject moves toward or away from the camera, the focal length changes. In order to enhance an image, the camera may detect that an image was captured using the action mode. The camera may suggest setting the focus at a point where the subject is expected to be during image capture. This will help assure that the camera is properly focused during image capture.
  • the camera may have a switch used to cause the camera to capture an image, wherein the switch has multiple positions.
  • the switch may have a first position when no force is applied. In the first position, the functions associated with the switch may be inactive.
  • a second position of the switch may be reached by applying a first force to the switch. The second position may cause the camera to focus on the scene. It is noted that focusing is typically not instantaneous and may require that the switch be maintained in the second position for a period.
  • the third position of the switch may be achieved by applying a second force to the switch, wherein the second force is greater than the first force.
  • the camera or program may measure the time that the switch is in the second position. Thus, the program is able to determine whether the camera likely achieved focus lock meaning that the camera was able to focus on a scene. If the time that the switch was in the second position was shorter than a preselected time and the camera was in action mode, the program may display information indicating that the image may be blurry. The preselected period may, as an example, be approximately 1.5 seconds. The program may also suggest maintaining the switch in the second position for a longer period during capture of subsequent images.
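The switch-dwell analysis above may be sketched as below. The 1.5 second period follows the described example; the metadata keys are assumptions.

```python
def action_focus_warning(metadata, min_half_press_s=1.5):
    """Warn when the shutter switch left its focusing (second) position
    too soon while in action mode, so focus lock was likely not achieved.

    Metadata keys are assumptions; the approximately 1.5 s period
    follows the described example.
    """
    if metadata.get("mode") != "action":
        return None
    # If the switch stayed in the second position long enough, the
    # camera likely achieved focus lock.
    if metadata.get("half_press_time_s", 0.0) >= min_half_press_s:
        return None
    return ("Image may be blurry: the camera likely did not achieve "
            "focus lock. Hold the switch in its second position longer "
            "before fully pressing it during subsequent captures.")
```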
  • the use of a digital zoom enables a camera user to enlarge a scene, however, the quality or resolution of the scene is degraded as a result. This degradation is more prominent when the user prints or otherwise displays an enlarged image. For example, if the user enlarges the scene to print it on a large sheet of paper, the resolution and, thus, the quality, of the image will be degraded. If the printed image is too large, the quality of the image will be significantly deteriorated.
  • the digital zoom setting used to capture an image may be stored in the metadata. During analysis, the camera may access the metadata to determine the digital zoom setting and provide the user with information regarding possible printing limitations of the image before degradation exceeds a predetermined threshold.
  • the camera determines whether the resolution of the captured image is less than one thousand columns. More specifically, the camera determines if fewer than one thousand columns of photodetectors on the CCD were used to capture the image. The camera may then display information indicating that the largest suggested image that may be reasonably replicated based on the image data is five inches by seven inches or thirteen centimeters by eighteen centimeters. The camera may also determine if fewer than eight hundred columns of photodetectors were used to capture the image. The camera may then display information indicating that the largest suggested image is four inches by six inches or ten centimeters by fifteen centimeters. The camera may also determine if fewer than six hundred columns of photodetectors were used to capture the image. The camera may then display information indicating that the largest suggested image is three and one half inches by five inches or nine centimeters by thirteen centimeters.
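The column thresholds above map directly to suggested maximum print sizes, and may be sketched as follows; the thresholds and sizes follow the description, while the function name is an illustrative assumption.

```python
def max_print_suggestion(columns):
    """Map the number of CCD photodetector columns used for capture to
    the largest suggested print size from the description.
    """
    if columns < 600:
        return "3.5 x 5 in (9 x 13 cm)"
    if columns < 800:
        return "4 x 6 in (10 x 15 cm)"
    if columns < 1000:
        return "5 x 7 in (13 x 18 cm)"
    return None  # no printing limitation to report
```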
  • the resolution used to capture an image may be increased. Therefore, the camera may suggest increasing the resolution, which increases the number of pixels used to capture an image. As a result, larger images can be replicated or displayed without degradation.
  • the camera may suggest eliminating the digital zoom in favor of optical zoom.
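The column thresholds and print-size suggestions above can be sketched as a simple lookup. This is an illustrative sketch, not the patent's implementation; the function name, the return format, and the use of a raw column count are assumptions, while the thresholds (one thousand, eight hundred, and six hundred columns) and the suggested sizes come from the description above.

```python
def suggest_max_print_size(columns_used):
    """Map the number of CCD photodetector columns used to capture an
    image to the largest suggested print size, per the thresholds
    described above. Returns None when no size warning applies."""
    if columns_used >= 1000:
        return None  # resolution sufficient; no suggestion displayed
    if columns_used >= 800:
        return "5 x 7 in (13 x 18 cm)"
    if columns_used >= 600:
        return "4 x 6 in (10 x 15 cm)"
    return "3.5 x 5 in (9 x 13 cm)"
```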

Abstract

A method of analyzing images captured using an imaging device is provided herein. The analysis provides suggestions for changing a parameter of the imaging device during subsequent image capture.

Description

    BACKGROUND
  • With the proliferation of low cost microprocessors, memory and image capture electronics, digital cameras are gaining popularity and are becoming more widely available to a larger number of consumers. One of the advantages of a digital camera over a conventional film camera is that when a digital camera captures an image, the image is stored electronically in a memory element associated with the camera and is available for immediate viewing. For example, it is common to capture an image using a digital camera and then immediately display the captured image on a display screen associated with the digital camera. This ability to immediately view the image is commonly referred to as “instant review.” The ability to immediately review the captured image allows the user to immediately decide whether the image is satisfactory and worth keeping. The image may then be printed at a later time.
  • Many characteristics for determining whether the image is satisfactory may not be readily visually noticeable on the small display associated with many digital cameras. The displays used on the cameras typically are not able to display an image with the clarity of a printed image. Therefore, the user may not be able to determine whether image quality was optimized simply by viewing the image displayed on the display. For example, while the image may appear to be in focus and exposed properly when viewed on the camera display, the image may appear out of focus and improperly exposed when it is printed. Unfortunately, printing the image is a time consuming and costly way to determine whether an image is satisfactory.
  • SUMMARY
  • A method of analyzing images captured using an imaging device is provided herein. The analysis provides suggestions for changing a parameter of the imaging device during subsequent image capture.
  • BRIEF DESCRIPTION OF THE DRAWINGS
  • FIG. 1 is a block diagram illustrating an embodiment of a digital camera.
  • FIG. 2 is a graphical illustration of an embodiment of an image file.
  • FIG. 3 is a flow chart describing the operation of an embodiment of the image analysis and improvement logic of FIG. 1.
  • FIG. 4 is a flowchart describing an embodiment of detecting over exposure errors and suggesting corrections thereto.
  • FIG. 5 is a flowchart describing an embodiment of detecting under exposure errors and suggesting corrections thereto.
  • FIG. 6 is a flowchart describing an embodiment of analyzing an image that is over exposed and that was captured using time value mode.
  • FIG. 7 is a flowchart describing an embodiment of analyzing an image for exposure wherein the image was captured using bracketing.
  • FIG. 8 is a flowchart describing embodiments for analyzing an image for blur when the handheld limit has been exceeded and the strobe was not activated.
  • FIG. 9 is a flowchart describing embodiments for analyzing an image for blur when the image was captured using the burst mode, the handheld limit was exceeded, and the strobe was not activated.
  • FIG. 10 is a flowchart describing an embodiment for analyzing an image for white balance errors.
  • DETAILED DESCRIPTION
  • Devices and methods for analyzing images are described herein. The devices and methods described herein analyze image data that is representative of images. The devices and methods for analyzing images may be implemented in hardware, software, firmware, or a combination thereof. In one embodiment, the system and method for analyzing images are implemented using a combination of hardware, software or firmware that is stored in a memory and that is executable by a suitable instruction execution system. In the embodiments described herein, the device is a digital camera wherein software stored on hardware in the camera analyzes image data or otherwise instructs the digital camera to analyze image data.
  • The hardware portion of the system and method for analyzing a captured image can be implemented with any or a combination of the following technologies, which are all well known in the art: a discrete logic circuit(s) having logic gates for implementing logic functions upon data signals, an application specific integrated circuit (ASIC) having appropriate combinational logic gates, a programmable gate array(s) (PGA), a field programmable gate array (FPGA), etc. The software portion of the system and method for analyzing a captured image can be stored in one or more memory elements and executed by a suitable general purpose or application specific processor.
  • The software for analyzing images, which comprises an ordered listing of executable instructions for implementing logical functions, can be embodied in any computer-readable medium for use by or in connection with an instruction execution system, apparatus, or device, such as a computer-based system, processor-containing system, or other system that can fetch the instructions from the instruction execution system, apparatus, or device and execute the instructions. In the context of this document, a “computer-readable medium” can be any means that can contain, store, communicate, propagate, or transport the program for use by or in connection with the instruction execution system, apparatus, or device.
  • FIG. 1 is a block diagram illustrating an embodiment of a digital camera 100, which is sometimes referred to herein simply as a camera 100. In the implementation to be described below, the digital camera 100 includes an application specific integrated circuit (ASIC) 102 that executes the image analysis logic 150 described herein. As will be described below, the image analysis logic 150 can be software that is stored in memory and executed by the ASIC 102. In an alternative embodiment, the image analysis logic 150 may be implemented in firmware, which can be stored and executed in the ASIC 102. Further, while illustrated using a single ASIC 102, the digital camera 100 may include additional processors, digital signal processors (DSPs) and ASICs. It should be noted that the ASIC 102 may include other elements, which have been omitted. As described in greater detail below, the ASIC 102 controls many functions of the digital camera 100.
  • The camera 100 includes an image sensor 104. The image sensor 104 may comprise a charge coupled device (CCD) or an array of complementary metal oxide semiconductors (CMOS), which are both arrays of light sensors. Both the CCD and the CMOS sensor include a two-dimensional array of photosensors, which are sometimes referred to as pixels. The pixels convert light of specific wavelengths, or colors, into voltages that are representative of the light intensities. In one embodiment, higher pixel values or voltages are representative of higher intensities of light and lower pixel values are representative of lower intensities of light.
  • In one embodiment of the camera 100, the image sensor 104 captures an image of a subject by converting incident light into an analog signal. The analog signal is transmitted via a connection 109 to an analog front end (AFE) processor 111. The analog front end processor 111 typically includes an analog-to-digital converter for converting the analog signal received from the image sensor 104 into a digital signal. The analog front end processor 111 provides this digital signal as image data via a connection 112 to the ASIC 102 for image processing.
  • The ASIC 102 is coupled to one or more motor drivers 119 via a connection 118. The motor drivers 119 control the operation of various parameters of the lens 122 via a connection 121. For example, lens controls, such as zoom, focus, aperture and shutter operations can be controlled by the motor drivers 119. A connection 123 between the lens 122 and the image sensor 104 is shown as a dotted line to illustrate the operation of the lens 122 focusing on a subject and communicating light to the image sensor 104, which captures the image provided by the lens 122.
  • The ASIC 102 also sends display data via a connection 124 to a display controller 126. The display controller may be, for example, a national television system committee (NTSC)/phase alternate line (PAL) encoder, although, depending on the application, other standards for presenting display data may be used. The display controller 126 converts the display data from the ASIC 102 into a signal that can be forwarded via a connection 127 to an image display 128. The image display 128, which, as an example, may be a liquid crystal display (LCD) or other display, displays the captured image to the user of the digital camera 100. The image display 128 is typically a color display located on the digital camera 100.
  • Depending on the configuration of the digital camera 100, the image shown to a user on the image display 128 may be shown before the image is captured and processed, in what is referred to as “live view” mode, or after the image is captured and processed, in what is referred to as “instant review” mode. In some embodiments, a previously captured image may be displayed in what is referred to as “review” or “playback” mode. The instant review mode is typically used to display the captured image to the user immediately after the image is captured and the playback mode is typically used to display the captured image to the user sometime after the image has been captured and stored in memory.
  • The instant review mode allows the user of the camera 100 to immediately view the captured image on the display 128. Unfortunately, because the image display 128 is typically small, only gross features, or characteristics, of the image can be visually observed. Furthermore, the image display 128 may not accurately reproduce color, tint, brightness, etc., which may further make it difficult for a user to determine the quality of the captured image. The difficulty in visually determining the quality of the captured image leads to the possibility of saving an image that may include deficiencies that, if visually detected, would likely cause the user to discard the image and attempt to capture another image having better quality. In order to determine whether the image includes deficiencies that may not be apparent to the user when viewing the captured image on the image display 128 in the instant review mode, the image analysis logic 150 dynamically analyzes one or more characteristics of the captured image. The analysis logic 150 then presents the user, via the image display 128 and a user interface, an analysis of the captured image. An exemplary dynamic analysis of the data for each pixel in a captured image is described below with reference to FIG. 2. In one embodiment, information associated with each pixel may be analyzed to determine whether a significant number of the pixels forming the image are either black or white. A predominance of white pixels may be indicative of overexposure and a predominance of black pixels may be indicative of underexposure.
  • Similar dynamic analyses can be performed to determine whether an image is in focus or to determine whether the white balance of the image is correct. In one embodiment of determining whether an image is in focus, pixels in an image are examined to determine whether sharp transitions exist between pixels. For example, a black pixel adjoining a white pixel may indicate that the image is in focus, while a black pixel separated from a white pixel by a number of gray pixels may indicate that the image is out of focus.
  • White balance is a characteristic of the image that generally refers to the color balance in the image to ensure that white portions of the image appear white. An image in which each pixel is a different shade of the same color may indicate an image in which the white balance is improperly adjusted.
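One way to sketch the check just described, every pixel being a different shade of the same color, is to compare the normalized R, G, B proportions of the pixels, since shades of one color share those proportions. The function name and tolerance are illustrative assumptions; the patent does not specify an algorithm.

```python
def uniform_tint(pixels, tolerance=0.05):
    """Return True if every pixel appears to be a shade of the same
    color, a possible sign of a white balance problem as described
    above. A pixel's color is approximated by its normalized R, G, B
    proportions; shades of one color share those proportions."""
    ratios = []
    for r, g, b in pixels:
        total = r + g + b
        if total == 0:
            continue  # a pure black pixel carries no tint information
        ratios.append((r / total, g / total, b / total))
    if len(ratios) < 2:
        return True
    r0, g0, b0 = ratios[0]
    return all(abs(r - r0) <= tolerance and abs(g - g0) <= tolerance
               and abs(b - b0) <= tolerance for r, g, b in ratios)
```

A uniformly red-tinted frame (all pixels sharing one R:G:B ratio) triggers the flag, while a frame containing distinct hues does not.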
  • Further, an image improvement logic 160 may be provided to present to the user a recommendation in the form of instructions presented on the image display 128 on ways in which to possibly improve a subsequent image. For example, the image improvement logic may suggest adjusting a condition under which the image was captured or adjusting a setting or parameter used to capture the image. As will be described below, in one embodiment the image analysis logic 150 analyzes the captured image and, optionally, the camera settings used to capture the image, and determines a value of one or more characteristics of the captured image. For example, to determine whether the exposure of the image is satisfactory, if a predefined number of white pixels in the image is exceeded, then the image analysis logic 150 may indicate that the image is overexposed. Further, if the image analysis logic 150 determines that one or more characteristics of the captured image is not satisfactory to yield a high quality image, the image improvement logic 160 may determine whether a condition used to capture the image should be adjusted, or whether a camera setting should be adjusted, to improve a subsequent image. For example, if the image analysis logic 150 determines that the image is underexposed, the image improvement logic 160 may determine that a subsequent image may be improved by activating the camera flash for a subsequent image.
  • When the image analysis logic 150 analyzes the data representing the captured image and the setting used to capture the image, the analysis can be used by the image improvement logic 160 to suggest adjustments to the settings to improve a subsequent image. These suggested adjustments to the camera settings or parameters can be presented to the user on a help screen via the image display 128, or, in an alternative configuration, can be automatically changed for a subsequent image.
  • It is noted that the image analysis logic 150 and the image improvement logic 160 may be a single unit. For example, they may exist in the same firmware or be a single computer program. They have been split into separate functions herein solely for illustration purposes.
  • The ASIC 102 is coupled to a microcontroller 161 via a connection 154. The microcontroller 161 can be a specific or general purpose microprocessor that controls the various operating aspects and parameters of the digital camera 100. For example, the microcontroller 161 may be coupled to a user interface 164 via a connection 162. The user interface 164 may include, for example but not limited to, a keypad, one or more buttons, a mouse or pointing device, a shutter release, and any other buttons or switches that allow the user of the digital camera 100 to input commands.
  • The ASIC 102 is also coupled to various memory modules, which are collectively referred to as memory 136. The memory 136 may include memory internal to the digital camera 100 and/or memory external to the digital camera 100. The internal memory may, for example, comprise flash memory and the external memory may comprise, for example, a removable compact flash memory card. The various memory elements may comprise volatile and/or non-volatile memory, such as, for example but not limited to, synchronous dynamic random access memory (SDRAM) 141, illustrated as a portion of the memory 136, and flash memory. Furthermore, the memory elements may comprise memory distributed over various elements within the digital camera 100.
  • The memory 136 may also store the image analysis logic 150, the image improvement logic 160, the settings file 155 and the various software and firmware elements and components (not shown) that allow the digital camera 100 to perform its various functions. The memory also stores an image file 135, which represents a captured image. When the system and method for analyzing an image is implemented in software, the software code (i.e., the image analysis logic 150) is typically executed from the SDRAM 141 in order to enable the efficient execution of the software in the ASIC 102. The settings file 155 comprises the various settings used when capturing an image. For example, the exposure time, aperture setting (f-stop), shutter speed, white balance, flash on or off, focus, contrast, saturation, sharpness, ISO speed, exposure compensation, color, resolution and compression, and other camera settings may be stored in the settings file 155. As will be described below, the settings file 155 may be accessed by the image analysis logic 150 to analyze a captured image by, in one example, determining the camera settings used to capture the image that is under analysis.
  • The ASIC 102 executes the image analysis logic 150 so that after an image is captured by the image sensor 104, the image analysis logic 150 analyzes various characteristics of the captured image. These characteristics may include characteristics of the captured image, or alternatively, may include the settings used to capture the image. Further, if the image improvement logic 160 determines that the image could be improved by changing one or more of the conditions under which the image was captured, or by changing one or more camera settings, then the image improvement logic 160 can either suggest these changes via the user interface 164 and the image display 128, or can automatically change the settings and prepare the camera for a subsequent image. Embodiments of the analysis are described in greater detail below.
  • FIG. 2 is a graphical illustration of an image file 135. The image file 135 includes a header portion 202 and a pixel array 208. The header portion or other portion may include data, sometimes referred to herein as metadata, that indicates settings of the camera or conditions in which the image was captured. The metadata may be analyzed to determine whether improvements to subsequent images may be made. The pixel array 208 comprises a plurality of pixels or pixel values, exemplary ones of which are illustrated using reference numerals 204, 206 and 212. Each pixel in the pixel array 208 represents a portion of the captured image represented by the image file 135. An array size can be, for example, 2272 pixels wide by 1712 pixels high. When processed, the image file 135 can also be represented as a table of values for each pixel and can be stored, for example, in the memory 136 of FIG. 1. For example, each pixel has an associated red (R), green (G), and blue (B) value. The value for each R, G and B component can be, for example, a value between 0 and 255, where the value of each R, G and B component represents the color that the pixel has captured. For example, if pixel 204 has respective R, G and B values of 0, 0 and 0, respectively, (or close to 0,0,0) the pixel 204 represents the color black, or is close to black. Conversely, for the pixel 212, a respective value of 255 (or close to 255) for each R, G and B component represents the color white, or close to white. R, G and B values between 0 and 255 represent a range of colors between black and white.
  • The data for each pixel in the image file 135 can be analyzed by the image analysis logic 150 to determine characteristics of the image. For example, characteristics including, but not limited to, the exposure, focus or the white balance of the captured image can be analyzed. A predominance of white pixels may be indicative of overexposure and a predominance of black pixels may be indicative of underexposure. To determine whether an image is in focus, pixels in an image are analyzed to determine whether sharp transitions exist between pixels. For example, a black pixel adjoining a white pixel may indicate that the image is in focus, while a black pixel separated from a white pixel by a number of gray pixels may indicate that the image is out of focus. An image in which each pixel is a different shade of the same color may indicate a problem with the white balance of the image. An example of determining the exposure will be described below with respect to FIG. 3.
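The focus heuristic just described (a sharp black-to-white transition suggests the image is in focus, while a gradual ramp through gray suggests blur) might be sketched along one row of pixel intensities. The function names and the threshold value are hypothetical placeholders; the text gives no specific number.

```python
def max_transition(scanline):
    """Largest brightness jump between adjacent pixels on one row,
    using 0-255 intensity values. Per the discussion above, a large
    jump (e.g. black directly adjoining white) suggests sharp focus."""
    return max(abs(a - b) for a, b in zip(scanline, scanline[1:]))

def looks_in_focus(scanline, threshold=128):
    # The threshold is an illustrative assumption, not a value
    # specified by the patent.
    return max_transition(scanline) >= threshold
```

An abrupt 0-to-255 edge passes the check; a line that climbs gradually through gray values does not.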
  • FIG. 3 is a flow chart 300 describing the operation of an embodiment of the image analysis logic 150 and the image improvement logic 160 of FIG. 1. Any process descriptions or blocks in the flow chart to follow should be understood as representing modules, segments or portions of code which include one or more executable instructions for implementing specific logical functions or steps in the process, and alternative implementations are included within the scope of the preferred embodiment. For example, functions may be executed out of order from that shown or discussed, including substantially concurrently or in reverse order, depending on the functionality involved, as would be understood by those reasonably skilled in the art of the present invention.
  • In block 302 the image sensor 104 of FIG. 1 captures an image. The image is stored in the memory 136 as image file 135. In block 304, the image represented by the image data is displayed to the user of the digital camera 100 via the image display 128 of FIG. 1 during the “instant review” mode. The instant review mode affords the user the opportunity to view the captured image subsequent to capture.
  • In decision block 306, the user determines whether he or she wants to view the settings with which the image was captured. If the user wants to view the settings, the settings are displayed to the user on the image display 128 as indicated in block 308. If the user does not want to view the settings, then, in decision block 312, it is determined whether the user wants the image analysis logic 150 to analyze the image. If the user does not want the image to be analyzed, then, in block 314 the image can be saved or discarded. Alternatively, the image analysis logic 150 can be invoked automatically without user intervention.
  • In block 316, the image analysis logic 150 analyzes the data within the image file 135. The data is analyzed to determine various characteristics of the captured image. The following example will use exposure as the characteristic that is analyzed by the image analysis logic 150. However, other characteristics, such as focus and white balance, can be analyzed. Analysis of several of these other characteristics will be described in greater detail below.
  • When analyzing exposure, the image analysis logic 150 performs a pixel by pixel analysis to determine whether the image includes a predominance of either black or white pixels. It should be noted that rather than analyzing all the pixels constituting the image, a sample of the pixels may be analyzed. In this example, the data associated with each pixel in the image file 135 is analyzed to determine whether a pixel is a black pixel or a white pixel. Each pixel is analyzed to determine its corresponding R, G and B values. For example, if the R, G and B values for the pixel 204 are all zeros, the pixel is considered a black pixel. Each pixel in the pixel array 208 is analyzed in this manner to determine the number of black or white pixels in the pixel array 208 for this image file. A determination in block 316 that a substantial portion of the pixels in the array 208 are black indicates that the image is likely underexposed. Conversely, a determination that many of the pixels in the array 208 are white indicates that the image is likely overexposed. Of course, the image may be of an all white or an all black subject, in which case the user may choose to disregard the analysis.
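The pixel-by-pixel exposure analysis just described can be sketched as follows. The near-black and near-white cutoffs and the 70% "substantial portion" fraction are illustrative assumptions; the text only describes counting predominantly black or white pixels.

```python
def classify_exposure(pixels, dark=10, bright=245, fraction=0.7):
    """Count near-black and near-white pixels in a list of (R, G, B)
    tuples and flag a predominance of either, as described above.
    The cutoffs and fraction are illustrative placeholders."""
    black = sum(1 for r, g, b in pixels
                if r <= dark and g <= dark and b <= dark)
    white = sum(1 for r, g, b in pixels
                if r >= bright and g >= bright and b >= bright)
    n = len(pixels)
    if black / n >= fraction:
        return "likely underexposed"
    if white / n >= fraction:
        return "likely overexposed"
    return "exposure acceptable"
```

The same function can be applied to a sample of the pixels rather than the full array, as the text notes.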
  • In an alternative embodiment, the data in the image file 135 can be analyzed in combination with other data available either in the image file 135 or from the settings file 155 in the camera 100. For example, additional data, sometimes referred to as metadata, saved in the header 202 of the image file 135 can be analyzed in conjunction with the information from each pixel in the array 208. This information might include, for example, the ISO setting and the aperture setting (f-stop) used to capture the image. These data items can be used in conjunction with the pixel data above to develop additional information regarding the characteristic of the analyzed image. Analysis of the settings will be described in greater detail below.
  • Furthermore, the image analysis logic 150 can also analyze the camera settings used to capture the image and use those settings when analyzing the data in the image file 135 to develop additional data regarding the image file 135. For example, the image analysis logic 150 can access the settings file 155 in the memory 136 of FIG. 1 to determine, for example, whether the flash was enabled, or to determine the position of the lens when the image was captured. In this manner, the image analysis logic 150 can gather a range of information relating to the captured image to perform an analysis on the captured image file 135 to determine whether the captured image meets certain criteria. To illustrate an example, if the image analysis logic 150 determines that the image is underexposed, i.e., the image file contains many black pixels, the image analysis logic 150 can access the settings file 155 to determine whether the flash was active when the image was captured. If the image analysis logic 150 determines that the flash was turned off, the image analysis logic 150 may communicate with the image improvement logic 160 to recommend that the user activate the flash so that a subsequent image may have less likelihood of being underexposed. It should be noted that the settings file 155 may be appended to the image file 135.
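Combining the pixel analysis with the settings file, as just described, might look like the following sketch. The settings key and the message wording are illustrative; only the logic (an underexposed image with the flash off yields a flash suggestion) comes from the text.

```python
def improvement_advice(is_underexposed, settings):
    """Sketch of the improvement step described above: if analysis
    found underexposure and the metadata shows the flash was off,
    suggest activating it. The 'flash_enabled' key is illustrative."""
    if is_underexposed and not settings.get("flash_enabled", False):
        return "Image appears underexposed; try activating the flash."
    return None
```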
  • In decision block 318, it is determined whether the image data analyzed in block 316 represents an acceptable image. This can be an objective determination based on criteria that the user enters into the camera 100 via a user interface 164, FIG. 1, or can be preset in the camera 100 at the time of manufacture. Alternatively, the determination of whether the image data represents an acceptable image can be a subjective determination based on user input. If the image is determined to be acceptable, then no further calculations or analysis are performed.
  • If, however, in decision block 318 the image analysis logic 150 determines that certain conditions under which the image was captured or settings used to capture the image can be changed to improve the image, then, in block 322, the image improvement logic 160 evaluates the settings used to capture the data in the image file 135 to determine whether a condition or setting can be changed to improve the image. In addition, the image improvement logic 160 can also develop recommendations to present to the user of the camera to improve a subsequent image. For example, if the analysis in block 316 suggests that the image was underexposed, the image improvement logic 160 may develop “advice” to be presented to the user. In this example, as will be described below, the image improvement logic 160 may suggest that the user activate the flash to improve a subsequent image. This suggestion may be provided to the user via the image display 128 in conjunction with the user interface 164.
  • In block 324, an instant review settings and help screen is displayed to the user. The instant review and help screen may include, for example, a thumbnail size display of the image, a display of the setting used to capture the image, an evaluation of the image and, if the user desires, suggestions on ways to improve the image. The evaluation of the image may include, for example, a notification that characteristics, such as exposure, focus and color balance are satisfactory. Suggestions on ways in which to improve the image may be communicated to the user via the image display 128 and may include, for example, changing a condition under which the image was captured, changing a setting with which the image was captured, or a combination of both changing a condition and a setting.
  • In decision block 326, the user determines whether another image is to be captured. If the user does not want to capture another image, the process ends. If, however, in decision block 326, the user wants to capture another image, then, in decision block 332, it is determined whether the user wants to manually change a parameter, such as a condition or setting, for the subsequent image or whether the parameter is to be set automatically by the digital camera 100, FIG. 1.
  • If, in decision block 332, the user decides to manually change the setting, then, in block 334, the user changes the setting and the process returns to block 302 where another image is captured and the process repeats. If, however, in decision block 332, the user wants the digital camera 100 to automatically change the setting, then, in block 336, the settings used to capture the previous image are changed according to the new settings determined in block 324. The process then returns to block 302 to capture a subsequent image.
  • Having described some embodiments of analyzing characteristics of an image and camera settings, other embodiments will now be described.
  • In the following embodiments, the data in the header 202, FIG. 2, of an image file 135 is sometimes referred to as metadata. As described above, the metadata may include several characteristics related to the camera settings at the time the image was captured. These settings may be settings adjusted manually by the user or automatically by the camera. In some embodiments of the image analysis logic 150, the metadata, and not the data representative of the pixels 208, is analyzed.
  • It should be noted that the following analysis provides determinations of some of the possible anomalies that may be detected by the image analysis logic 150. Thus, fewer or more possible anomalies may be detected.
  • Exposure Errors
  • Several possible exposure errors or anomalies may be detected by analyzing the metadata and the image data. Several methods may be used to determine these possible exposure errors. For example, as described above, the pixel values may be analyzed to determine whether a preselected number of pixel values are above or below preselected values. The metadata may also be analyzed to determine the camera settings and ambient conditions at the time the image was captured to determine whether the camera settings were proper. It is noted that the time of image capture refers to a time at which the digital camera generated image data.
  • Over Exposure in Aperture Priority Mode
  • Reference is made to FIG. 4, which is a flowchart 200 describing an embodiment of detecting over exposure errors and suggesting corrections to overcome the errors. In summary, the embodiment of the method set forth in FIG. 4 suggests corrections when the image is over exposed by more than a predetermined amount and the camera is in aperture priority mode. Aperture priority mode enables a user to select an aperture setting during image capture. In this embodiment of the digital camera, the digital camera may have the above-described aperture priority mode and another mode wherein the digital camera selects an aperture to use during image capture.
  • In decision block 202, a decision is made as to whether the camera was in aperture priority mode during image capture. As described above, aperture priority mode enables a user of the camera to manually select an aperture setting. Data stored in the metadata may indicate whether the camera was in aperture priority mode during image capture. If the camera was not in aperture priority mode, processing proceeds to block 204 where processing continues to the next analysis. More specifically, the suggestion ultimately offered by the flowchart 200 will not be applicable when the camera is not in aperture priority mode. If the camera was in aperture priority mode, the analysis continues to decision block 206.
  • In decision block 206, a decision is made as to whether the image is over exposed by a predetermined amount. For example, the image may be analyzed to determine if the exposure is greater than a preselected stop value. In the embodiment of the flow chart 200, the decision block 206 determines whether the image is over exposed by more than two-thirds of a stop. It should be noted that other values of the stop may be used in the decision block 206. If the image is not over exposed by more than the preselected stop value, processing continues to block 204 as described above. If the image is over exposed by more than the preselected stop value, processing continues to decision block 208 as described below.
  • In decision block 208, a determination is made as to whether the image is over exposed by more than a preselected value. In one embodiment, the preselected value corresponds to two-thirds stop. It should be noted that in other embodiments, determinations may be made as to whether the exposure is between preselected values and an indication may be provided as to the amount of overexposure. A suggestion that the image may be over exposed may be provided by also determining an exposure compensation value set during generation of the image data. In one embodiment, the decision block 208 determines whether the exposure compensation is between plus and minus 0.6. It is noted that an exposure compensation of a value other than zero is indicative of a manual user setting. In this embodiment, if the exposure compensation is not within the preselected values, processing proceeds to block 204 as described above. If the exposure compensation is within the preselected values, processing proceeds to block 210.
  • At this point, it has been determined that the image is over exposed by a preselected number of stops and the camera is in aperture priority mode. In addition, in this embodiment, the exposure compensation is within the preselected values, which indicates that the user did not manually bias the exposure. Block 210 then determines the number of stops the image is over exposed. For example, the pixel values may be analyzed to determine the amount of over exposure. Based on the foregoing, block 212 causes the camera to display information related to correcting the over exposure problem. In the embodiment of the flowchart 200, the information informs the user of the stop value of the over exposure and suggests using a smaller aperture setting, which relates to a larger f-number. Block 212 may also suggest using an automatic mode, wherein the camera selects the aperture and possibly the exposure compensation.
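The FIG. 4 flow described above can be sketched as a short check in Python. This is an illustrative reading only: the metadata keys, mode name, and message text are assumptions for the sketch, not the patent's actual implementation; the two-thirds-stop and plus-or-minus-0.6 thresholds come from the description.

```python
def check_aperture_priority_overexposure(metadata, overexposure_stops):
    """Return a suggestion string, or None if this analysis does not apply.

    metadata -- dict with assumed keys 'mode' and 'exposure_compensation'
    overexposure_stops -- measured over exposure, in stops (block 210)
    """
    # Decision block 202: camera must be in aperture priority mode.
    if metadata.get("mode") != "aperture_priority":
        return None
    # Decision block 206: image must be over exposed by more than 2/3 stop.
    if overexposure_stops <= 2.0 / 3.0:
        return None
    # Decision block 208: compensation within +/-0.6 suggests the user did
    # not deliberately bias the exposure.
    if not -0.6 <= metadata.get("exposure_compensation", 0.0) <= 0.6:
        return None
    # Block 212: report the error and suggest a smaller aperture.
    return ("Image over exposed by about %.1f stops; try a smaller "
            "aperture (larger f-number) or automatic mode."
            % overexposure_stops)
```

Any path that fails a condition corresponds to block 204, where the camera moves on to the next analysis.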
  • Under Exposure in Aperture Priority Mode
  • Reference is made to FIG. 5, which is a flowchart 230 describing an embodiment of detecting under exposure errors and suggesting corrections thereto. In summary, the method set forth in FIG. 5 suggests corrections when the image is under exposed by more than a predetermined amount and the camera is in aperture priority mode. In one embodiment, the under exposure corresponds to two-thirds stop and in another embodiment, the under exposure corresponds to one stop.
  • In decision block 232, a decision is made as to whether the camera was in aperture priority mode during the generation of image data. Data stored in the metadata may indicate whether the camera was in aperture priority mode. If the camera was not in aperture priority mode during generation of the image data, processing proceeds to block 234 where processing continues to the next analysis. More specifically, the suggestion for improving image quality ultimately offered by the flowchart 230 will not be applicable to the camera setting. If the camera was in aperture priority mode, the analysis continues to decision block 236.
  • In decision block 236, a decision is made as to whether the image is under exposed by a predetermined amount, which may be a preselected stop value. In the embodiment of the flow chart 230, the decision block 236 determines whether the image is under exposed by more than two thirds of a stop. It should be noted that other under exposure values, such as one stop, may be used in the decision block 236. If the image is not under exposed by more than the preselected stop value, processing continues to block 234 as described above. If the image is under exposed by more than the preselected amount, processing continues to decision block 238 as described below.
  • As with over exposure, an indication of under exposure may be assisted by analyzing an exposure compensation setting during the generation of image data. In the embodiment of the analysis of FIG. 5, such an analysis is performed in decision block 238 where a determination is made as to whether the exposure compensation was within preselected values. It should be noted that in other embodiments, determinations may be made as to whether the exposure compensation is greater or less than preselected values. In one embodiment, the decision block 238 determines whether the exposure compensation is set to zero. It is noted that an exposure compensation of a value other than zero is indicative of a manual user setting. If the exposure compensation is not within the preselected values, processing proceeds to block 234 as described above. If the exposure compensation is within the preselected values, processing proceeds to block 240. It should be noted that in some embodiments, exposure compensation is not analyzed.
  • At this point, it has been determined that the image is under exposed by a preselected number of stops and the camera was in aperture priority mode during generation of the image data. In addition, in this embodiment, the exposure compensation was within the preselected values. Block 240 determines the number of stops the image is under exposed. Based on the foregoing, block 242 causes the camera to display information related to correcting the under exposure problem. In the embodiment of the flowchart 230, the information informs the user of the stop value of the under exposure and suggests using a larger aperture setting, which relates to a smaller f-number. Block 242 may also suggest setting the camera to automatic mode as described above.
  • Over Exposure in Time Value Mode
  • The analysis of the metadata and image data may determine that the image is over exposed and the camera is in a time value mode. Time value mode is sometimes referred to as Tv mode. The time value mode enables a user to select the shutter speed of the camera, which determines the exposure time during image capture. More specifically, the shutter speed determines the amount of time that the photosensors charge during image capture. If the shutter speed is set too slow, the image may be over exposed. Likewise, if the shutter speed is set too fast, the image will be under exposed.
  • An embodiment of analyzing an image to determine whether the image is over exposed due to an improper setting in time value mode is shown in the flow chart 260 of FIG. 6. At decision block 262, a determination is made as to whether the camera was in time value mode during image capture. The decision as to whether the camera was in time value mode during image capture may be made by analyzing the metadata associated with the image. If the camera was not in time value mode, the following analysis is not relevant and processing proceeds to block 264. Block 264 simply directs the processing to analyze other possible problems with the captured image.
  • As stated above, in some embodiments, the setting of exposure compensation at the time of image capture may provide insight into exposure problems. In the embodiment of FIG. 6, exposure compensation is analyzed at decision block 266, where a determination is made as to whether the exposure compensation was set to a preselected value. The decision as to whether the exposure compensation is set to a preselected value may be made by analyzing the metadata associated with the image. In one embodiment, the decision block 266 determines whether the exposure compensation is set to zero. In other embodiments, the decision block 266 may determine if the exposure compensation is greater than or less than preselected values or between preselected values. In the embodiment of FIG. 6, if the result of decision block 266 is negative, then processing proceeds to block 264 because the analysis does not have bearing on the camera settings. In some embodiments, exposure compensation is not analyzed.
  • If the result of decision block 266 is affirmative, processing continues to decision block 270. It is noted that in order for processing to reach decision block 270, the camera is in time value mode and the exposure compensation is either at, below, above, or between preselected values. Decision block 270 determines whether the exposure error is greater than a preselected value, meaning that the image is overexposed. If the image is not overexposed, processing proceeds to block 264 as described above. If the image is overexposed, processing proceeds to block 272. As described above, the pixel values may be analyzed to determine if the image is over exposed.
  • Block 272 displays information regarding the image. Many different embodiments of the information may be displayed. In one embodiment, the information informs the user that the image is overexposed. The information may also include the amount of the overexposure and a suggestion that the scene brightness was so high that the camera could not select an appropriate F-number. A suggestion of a faster shutter speed or using an automatic mode may be provided to the user. Block 272 may also suggest setting the camera to automatic mode.
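The FIG. 6 analysis can be sketched as follows. The mode name, metadata keys, and the use of a clipped-pixel fraction to decide overexposure at block 270 are illustrative assumptions for this sketch; the description above leaves the exact overexposure test open.

```python
def check_time_value_overexposure(metadata, pixel_values, clip_level=255,
                                  clip_fraction=0.03):
    """Return a suggestion string, or None if the analysis does not apply."""
    # Decision block 262: camera must have been in time value (Tv) mode.
    if metadata.get("mode") != "time_value":
        return None
    # Decision block 266: exposure compensation at a preselected value
    # (zero in one embodiment).
    if metadata.get("exposure_compensation", 0.0) != 0.0:
        return None
    # Decision block 270: treat the image as over exposed when the share
    # of clipped (saturated) pixels exceeds an assumed fraction.
    clipped = sum(1 for v in pixel_values if v >= clip_level)
    if clipped / len(pixel_values) <= clip_fraction:
        return None
    # Block 272: report and suggest a faster shutter speed or auto mode.
    return ("Image appears over exposed; scene brightness may have "
            "exceeded the selectable F-number range. Try a faster "
            "shutter speed or automatic mode.")
```

The under exposure case in time value mode, described next, mirrors this check with the inequality reversed and the opposite suggestions.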
  • Under Exposure in Time Value Mode
  • The analysis of the metadata and image data may determine that the image is under exposed and the camera is in a time value mode. The analysis may be the same as with the flow chart 260 and the description provided above, except a determination at block 270 may determine that the image is under exposed. It follows that the suggestions to correct the problem would be the opposite as those provided at block 272. For example, the information may indicate that the scene brightness was too low for the camera to select a low enough F-number. The suggestion may include using a slower shutter speed or an automatic mode. As with block 272, a suggestion may be made to use the automatic mode of the camera.
  • Bracketing Problems
  • Some embodiments of the camera include a bracketing mode, which enables users to capture a plurality of images using different settings. More specifically, the camera captures a series of images using at least one preselected range of settings. For example, the camera may capture three images wherein each image has a different exposure compensation. A user may select the best image from the plurality of captured images. The camera may determine that some errors occurred while using the bracketing mode and may suggest procedures to correct the errors during subsequent image captures.
  • Exposure Compensation is Set Too Great
  • In one embodiment, the exposure compensation is set too great to use the bracketing mode properly. More specifically, the absolute value of the exposure compensation may be too great to use the bracketing mode properly. This determination may be made by analyzing the metadata to determine if the exposure compensation is greater or less than preselected values. In one embodiment, the determination is made if bracketing is set and the metadata indicates that the exposure compensation is set to a value greater than 2.3 or a value less than −2.3.
  • If the above conditions are met, the camera may inform the user of the problem. For example, the camera may suggest setting the exposure compensation to a value closer to zero or using the automatic mode.
  • More Over Exposure May Be Desired
  • In one embodiment of bracketing, the user may want an image to be extremely over exposed. The camera may analyze metadata and image data to determine that the camera was set to over expose images and that further over exposure may be achieved. In such an embodiment, the images may be captured using varying shutter speeds or ISO speeds. The varying shutter speeds may be set slow which would cause the images to be over exposed. The image data may be examined to determine if it is over exposed. If so, the camera may suggest using a greater exposure compensation during generation of subsequent image data.
  • In a similar embodiment, the exposure of the image may have encountered a maximum value during the bracketing sequence. Thus, more than one image would be similar. In order to take better advantage of the bracketing, the camera may suggest lowering the exposure compensation during the bracketing sequence. Therefore, the images will vary from one to another. For example, the camera may determine that exposure compensation was set greater than 2.3 and that ten percent of the pixel values are greater than a preselected value. The camera may suggest lowering the exposure compensation to a value between plus and minus 2.0 during the subsequent bracketing sequence. In one embodiment, the camera determines that a maximum exposure of 3.0 stops was obtained during the bracketing sequence and recommends the above-described changes during the subsequent bracketing sequence.
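The maxed-out bracketing case above can be sketched as a single check. The function name, message text, and the 3.0-stop camera maximum are taken as stated in the description; treating them as fixed constants is an assumption of this sketch.

```python
MAX_STOPS = 3.0  # camera's maximum exposure adjustment during bracketing


def bracketing_suggestion(exposure_compensation, achieved_stops):
    """Suggest lowering exposure compensation when a bracketing sequence
    pinned at the camera's exposure limit, producing near-identical frames.
    """
    # Compensation above 2.3 with the 3.0-stop maximum reached means the
    # upper bracketed frames could not differ from one another.
    if exposure_compensation > 2.3 and achieved_stops >= MAX_STOPS:
        return ("Bracketed images may be identical at the exposure limit; "
                "set exposure compensation between -2.0 and +2.0 for the "
                "next sequence.")
    return None
```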
  • More Under Exposure May Be Desired
  • In an embodiment similar to the above-described embodiment, the camera may detect that a user wanted greater under exposure during a bracketing sequence. For example, the shutter speed or ISO speed may be set fast so as to cause under exposure. More than one image captured during the bracketing sequence may have reached the maximum under exposure of the camera, so the full use of the bracketing sequence may not be realized. For example, more than one image may be under exposed by a maximum of the camera of 3.0 stops.
  • As with the section above, the camera may suggest reducing the exposure compensation during a subsequent bracketing sequence. For example, the camera may suggest setting the exposure compensation to values of between plus and minus 2.0 during a subsequent bracketing sequence.
  • Clipped and Dark Pixel Values
  • For the following embodiments, pixel values are analyzed to determine if they are clipped or dark. Clipped pixel values refer to pixel values that are at a maximum or saturated value. Clipped pixel values may be indicative of an image that is over exposed. Dark pixel values are indicative of an under exposed image. During bracketing, clipped or dark pixels are indicative of the exposure of the image being too light or too dark.
  • One embodiment of analyzing the exposure of an image captured using bracketing is shown by the flow chart 280 of FIG. 7. In decision block 282, a determination is made as to whether the camera is in bracketing mode. The bracketing mode may capture a plurality of images using different shutter or ISO speeds. Block 282 may determine if the bracketing mode enables exposure compensation of plus or minus 0.7 or greater. If the camera is not in bracketing mode, processing proceeds to block 284 where other possible problems with the captured image are analyzed. If the camera is in bracketing mode, processing proceeds to decision block 286 where a determination is made as to whether the number of clipped pixels exceeds a preselected value. The number of clipped pixels may be determined by counting the number of pixel values in the image file that are saturated or that exceed a preselected value. In some embodiments, block 286 determines if the number of clipped pixel values exceeds three percent.
  • If the determination from decision block 286 is affirmative, processing proceeds to block 288. More specifically, if the number of clipped pixels exceeds the preselected value, the captured image is probably over exposed. Block 288 displays information related to the image being over exposed. In one embodiment, a message is displayed indicating that the image is over exposed and suggests setting the exposure compensation closer to zero during a subsequent bracketing sequence. The exposure compensation may also be set to automatic mode.
  • If the determination of decision block 286 is negative, processing proceeds to decision block 290. Decision block 290 determines whether the number of dark pixels in the image file exceeds a preselected number. The number of dark pixel values may be determined by counting the number of pixel values that are zero or that are less than a preselected number. In some embodiments, decision block 290 determines if greater than ten percent of the pixel values are less than a preselected value. If the determination of decision block 290 is negative, processing proceeds to block 284 as described above. If the determination of decision block 290 is positive, processing proceeds to block 292. Block 292 causes the camera to display information similar to block 288. However, block 292 may indicate that the exposure compensation is negative and that the image is under exposed. The suggestion is to adjust the exposure compensation closer to zero. Again, the suggestions may also include using the automatic mode.
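The clipped/dark pixel counting of FIG. 7 can be sketched as below. The three percent and ten percent thresholds come from the description; the 8-bit clip level, the dark-level cutoff, and the string return values are illustrative assumptions.

```python
def analyze_bracketed_exposure(pixel_values, clip_level=255, dark_level=8):
    """Classify a bracketed image as 'over', 'under', or None (no problem)."""
    n = len(pixel_values)
    clipped = sum(1 for v in pixel_values if v >= clip_level)
    dark = sum(1 for v in pixel_values if v <= dark_level)
    # Decision block 286: too many clipped (saturated) pixels -> over exposed.
    if clipped / n > 0.03:
        return "over"    # block 288: suggest compensation closer to zero
    # Decision block 290: too many dark pixels -> under exposed.
    if dark / n > 0.10:
        return "under"   # block 292: suggest compensation closer to zero
    return None          # block 284: continue to the next analysis
```

In either failure case, the displayed suggestion is the same in spirit: move the exposure compensation closer to zero or use the automatic mode.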
  • Strobe Required in Bracketing Mode
  • One problem that may be analyzed is a situation where one of a plurality of images captured during bracketing mode required a strobe, which did not activate, and the other images did not require the strobe. This situation may be indicative of a blurry image. More specifically, during bracketing mode, if the nominal image does not require a strobe, the other images may be captured without the strobe. If the camera determines that one of the images, other than the nominal image, required a strobe, it may indicate that the shutter speed dropped below the hand held limit for the desired zoom. The hand held limit is a function of the zoom and the shutter speed and represents the limit to which a typical user is able to hold the camera and capture an image without the image being blurred. The hand held limit may be reached by either a narrow (telephoto) zoom setting or a long exposure time used during image capture. Either situation makes the image more susceptible to blurring caused by the user holding the camera, which may cause the camera to shake too much during image capture.
  • During analysis, the metadata of the images captured during the bracketing are analyzed. Based on the zoom and the exposure time, a determination within the camera may be made that the hand held limit had been reached during image capture. In addition, based on the metadata, a determination may be made that the strobe should have been activated. In this situation, the strobe was forced not to activate because of the bracketing. The determination could then be made that the image may be blurry. A suggestion may be offered to the user to use a tripod or other camera stabilizing device. Other suggestions may include turning the flash on for the entire bracketing sequence.
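The bracketing-strobe analysis above can be sketched as follows. The patent does not give a formula for the hand held limit; the reciprocal-of-focal-length rule used here is a common photographic rule of thumb assumed purely for illustration, as are the parameter names.

```python
def bracket_frame_may_blur(focal_length_mm, exposure_time_s,
                           strobe_needed, strobe_fired):
    """True when a bracketed frame likely blurred because the strobe,
    though needed, was suppressed by the bracketing sequence.

    focal_length_mm -- assumed 35 mm-equivalent focal length (zoom)
    exposure_time_s -- shutter open time from the image metadata
    """
    # Assumed rule of thumb: hand held limit is 1 / focal length seconds.
    handheld_limit_s = 1.0 / focal_length_mm
    past_limit = exposure_time_s > handheld_limit_s
    # Blur is flagged only when the limit was passed AND the strobe was
    # required but forced off by the bracketing sequence.
    return past_limit and strobe_needed and not strobe_fired
```

When the function returns True, the suggestions from the description apply: use a tripod or other stabilizing device, or force the flash on for the entire bracketing sequence.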
  • Exposure Compensation
  • Exposure Compensation Set Too High
  • In a first situation, the user captures images with the exposure compensation set to a positive value and the captured images are over exposed. In summary, an analysis concludes that the user may have forgotten that the exposure compensation is set to a positive value.
  • The metadata may be analyzed to determine if the image was captured while the exposure compensation was set to a positive value. In some embodiments, the analysis determines if the exposure compensation was set equal to or greater than 0.6. In addition, the number of clipped pixels may be compared to a preselected number to determine if the number of clipped pixels exceeds the preselected number. For example, the analysis may determine if the number of clipped pixels exceeds three percent. If the above-described conditions are met, the camera may display a message to the user indicating that the image may be over exposed and that the exposure compensation is set too high. The camera may suggest lowering the exposure compensation or setting it closer to zero. The camera may also suggest using an automatic exposure compensation mode.
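The "compensation set too high" test reduces to two conditions, sketched here. The 0.6 and three percent thresholds are from the description; the 8-bit clip level is an assumption.

```python
def compensation_too_high(exposure_compensation, pixel_values,
                          clip_level=255):
    """True when a positive compensation setting coincides with clipping,
    suggesting the user forgot the compensation was set high."""
    clipped_fraction = (sum(1 for v in pixel_values if v >= clip_level)
                        / len(pixel_values))
    # Compensation at or above 0.6 with more than 3% clipped pixels.
    return exposure_compensation >= 0.6 and clipped_fraction > 0.03
```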
  • Over Exposure or Under Exposure of the Subject
  • In this situation, the analysis indicates that the subject is over exposed or under exposed. The state of the exposure may be determined by sampling pixel values in a portion of the image. For example, pixel values from the center of the image where the subject is typically located may be analyzed. An excessive number of clipped or dark pixel values in a specific region relative to another region may indicate that the subject is over exposed or under exposed.
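The region-sampling idea above can be sketched as follows. Representing the image as a list of rows, using a half-width central window for the subject, and flagging a region when more than half its pixels are clipped or dark are all illustrative assumptions.

```python
def center_region(image, frac=0.5):
    """Return the pixel values in the central frac-by-frac window."""
    h, w = len(image), len(image[0])
    r0, r1 = int(h * (1 - frac) / 2), int(h * (1 + frac) / 2)
    c0, c1 = int(w * (1 - frac) / 2), int(w * (1 + frac) / 2)
    return [v for row in image[r0:r1] for v in row[c0:c1]]


def subject_exposure(image, clip_level=255, dark_level=8):
    """'over' or 'under' when the center (assumed subject location) is
    mostly clipped or dark; None otherwise."""
    center = center_region(image)
    clipped = sum(1 for v in center if v >= clip_level) / len(center)
    dark = sum(1 for v in center if v <= dark_level) / len(center)
    if clipped > 0.5:
        return "over"
    if dark > 0.5:
        return "under"
    return None
```

A fuller implementation would compare the center statistics against the surrounding region, as the description notes, rather than against absolute thresholds alone.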
  • In one embodiment, the metadata is analyzed to determine that the camera was not in aperture priority mode and the subject, as described above, is under exposed. In such a situation, the subject may be backlit and the light meter may have measured the background rather than the subject. The analysis may indicate that the subject is under exposed. The solution suggested to the user may be to increase the exposure compensation to a positive number such as +0.3. The processing program may also suggest forcing the flash on if the subject is within a preselected range, such as within ten feet of the camera.
  • In another embodiment, the metadata is analyzed to determine that the camera is in automatic mode and the subject, as described above, is over exposed and the background is dark. In addition the metadata indicates that the strobe did not activate during image capture. The processing program may indicate that the subject is over exposed. The solution suggested to the user may be to reduce the exposure compensation to a negative number.
  • Exposure Problems in Spot Mode
  • In some embodiments, the processing program analyzes exposure problems with the image wherein the image was captured using spot mode. Spot mode relates to the portion of an image used by the camera during focusing and setting up exposure for the remainder of the image. In spot mode, the portion is typically very small. The determination as to whether the camera was in spot mode during image capture may be determined by analyzing the metadata. Over exposure or under exposure problems in certain portions of the image may be detected by analyzing the pixel values.
  • In one embodiment, the analysis examines the center of the image to determine whether it is dark and surrounded by bright areas. Such a situation may indicate to the user that the image is over exposed. The processing program may also indicate that the image was captured using spot mode and that the camera relied solely on the dark portion to calculate exposure. Thus, the remaining portion of the image is not properly exposed. The camera may suggest setting the metering to average or center-weighted. The camera may also suggest using an automatic mode. The camera may also suggest using a wider portion of the image during focusing and setting up exposure.
  • In another embodiment, the image is analyzed as being light in the center and dark in the surrounding areas. As described above, this situation may be detected by analyzing the pixel values corresponding to various portions of the image. If such a situation is detected, the processing program may indicate that the image appears to be under exposed. The processing program may also indicate that the image was captured using the spot mode and that the camera did not use dark regions on the edge of the scene to calculate exposure. The camera may suggest setting the metering to average or center-weighted. The camera may also suggest using a wider portion of the image for focusing and exposure settings.
  • Under Exposed Image
  • The program may find many situations in which an image is or may be under exposed. In the following situation, the scene may be relatively dark and the strobe may not have activated during image capture. In order to compensate for the dark scene, the camera may select a slow shutter speed during image capture. However, the camera may not have a shutter speed slow enough to properly expose the image of the relatively dark scene. The program may determine that the image is under exposed by analyzing the pixel values as described above. The program may also analyze the shutter speed during image capture, which may be stored in the metadata. If the camera does not have a slower shutter speed, the program may suggest using a higher ISO or a wider zoom. The wider zoom may cause the aperture to open a little wider. The camera may also suggest illuminating the scene.
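The dark-scene check above can be sketched as a small function. The ten percent dark-pixel threshold mirrors the figure used elsewhere in this description; the maximum shutter time and parameter names are assumptions for illustration.

```python
def dark_scene_suggestion(dark_fraction, shutter_time_s, strobe_fired,
                          max_shutter_time_s=1.0):
    """Suggest fixes when a dark scene exhausted the slowest shutter speed.

    dark_fraction -- fraction of pixel values below the dark threshold
    max_shutter_time_s -- assumed slowest shutter speed of the camera
    """
    # The analysis applies only when the strobe did not fire and the
    # image is substantially under exposed.
    if strobe_fired or dark_fraction <= 0.10:
        return None
    # The camera already used its slowest shutter speed, so only other
    # remedies remain.
    if shutter_time_s >= max_shutter_time_s:
        return "Use a higher ISO, a wider zoom, or add light to the scene."
    return None
```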
  • Focus Errors
  • Focus Problems Due to an Unstable Camera
  • The processing program may analyze several items in the metadata to determine that the image may be blurry due to shaking of the camera at the time the image was captured. An embodiment for determining whether the image may be blurry is shown in the flowchart 300 of FIG. 8. Other methods of detecting focus problems due to shaking are described further below.
  • Some portions of the flowchart 300 and the methods described therein describe the handheld limit of the camera. As set forth above, the handheld limit is a function of zoom and exposure time. The basis for the handheld limit is that a user holding the camera will shake it during image capture, which blurs the image. The camera may be programmed with a handheld number or limit, which may be based on the amount a typical user shakes while holding the camera. It is noted that a longer exposure time or greater zoom moves the handheld calculation closer to or beyond the handheld limit.
  • At decision block 304, a determination is made as to whether the handheld limit has been reached. If the hand held limit has not been reached, meaning that the shaking caused by a user probably did not affect the image, processing proceeds to block 306. As noted above, the hand held limit refers to a function of exposure time and zoom setting, and may include other variables. At block 306, the processing continues to analyze another aspect or potential problem with the image because conditions were not met to proceed with the following analysis.
  • At decision block 308, a determination is made as to whether the strobe or flash activated. If the strobe activated, processing proceeds to block 306 as described above because the camera likely compensated for low light conditions and the following analysis is not relevant. If the strobe did not activate, processing proceeds to decision block 310 where a determination is made as to whether the strobe should have been activated. For example, the image may have been captured under low light conditions wherein the strobe was forced not to activate. It should be noted that the hand held limit may be reached even if the strobe activated. Thus, in some embodiments, determinations may be made that the hand held limit was met even if the strobe activated during image capture.
  • If the above-described conditions are met, processing continues to decision block 312 where a determination is made as to whether the camera is in a macro mode. A macro mode is a mode wherein the user captures an image of an object that is significantly close to the camera. For example, the object may be located a few inches from the camera. If all the above-described conditions are met, the process may determine that the image could be out of focus and may proceed to block 316. Block 316 displays information regarding the possible focus problem and suggestions to overcome the problem. For example, text indicating that the image may be out of focus if it was captured without stabilizing the camera may be displayed. In addition, suggestions of reducing the zoom, improving the lighting, and stabilizing the camera may be provided to the user.
  • If the determination of decision block 312 is negative, processing proceeds to decision block 318 where a determination is made regarding the zoom setting used during image capture. More specifically, decision block 318 determines whether the zoom was set to a wide angle during image capture. The determination may be made by comparing the setting of the zoom to a preselected value, wherein a zoom setting below the preselected value constitutes a wide angle zoom setting. If the determination of decision block 318 is affirmative, processing proceeds to block 320, where text is displayed to indicate a possible problem with the image. The text may indicate that the image may be out of focus if it was captured if a tripod or the like was not used to stabilize the camera during image capture. Suggestions for correcting the problem may also be displayed and include using an automatic flash or strobe mode and stabilizing the camera during image capture.
  • If the determination from decision block 318 is negative, processing proceeds to decision block 322. At decision block 322, a determination is made as to whether the zoom setting of the camera during image capture was in a middle region. This may be accomplished by determining whether the zoom was set between two preselected values which represent the middle range of the zoom setting. If the determination of decision block 322 is affirmative, processing proceeds to block 324. At block 324, text is displayed to suggest that a problem exists with the image and to offer suggestions for improving the image. For example, the text may indicate that the image may be out of focus if it was captured without stabilizing the camera or if the subject was moving. In order to improve a subsequent image, the text may suggest setting the flash to automatic mode, stabilizing the camera, and using a wider zoom.
  • If the determination of decision block 322 is negative, processing proceeds to block 326, which is similar to block 306. In summary, the analysis could not determine any problems with the image and the next parameter will be analyzed. It should be noted that block 326 may never be reached if the zoom settings of decision blocks 312, 318, and 322 encompass all possible zoom settings.
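The FIG. 8 branching can be sketched as a single function. The boolean inputs stand in for the metadata tests in blocks 304 through 310; the normalized zoom scale, its wide/middle/telephoto cutoffs, and the message strings are illustrative assumptions not taken from the patent.

```python
def focus_suggestion(handheld_limit_reached, strobe_fired, strobe_needed,
                     macro, zoom):
    """Return advice text per FIG. 8, or None when the analysis does not
    apply.

    zoom -- normalized zoom setting in [0, 1]; < 0.33 is treated as wide,
    0.33-0.66 as middle, > 0.66 as telephoto (assumed thresholds).
    """
    # Blocks 304-310: limit reached, strobe needed but did not fire.
    if not (handheld_limit_reached and strobe_needed and not strobe_fired):
        return None   # block 306: move on to the next analysis
    if macro:         # decision block 312 -> block 316
        return ("Possibly out of focus: reduce zoom, improve lighting, "
                "and stabilize the camera.")
    if zoom < 0.33:   # decision block 318 -> block 320 (wide angle)
        return ("Possibly out of focus: use automatic flash and stabilize "
                "the camera.")
    if zoom <= 0.66:  # decision block 322 -> block 324 (middle zoom)
        return ("Possibly blurred: use automatic flash, stabilize the "
                "camera, or widen the zoom.")
    return None       # block 326: no problem identified by this analysis
```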
  • Focus Problems in Burst Mode
  • As set forth above, the image may be blurred or otherwise out of focus if it is determined that the handheld limit has been exceeded. Blurring may be more prominent in burst mode, which causes several images to be captured within a preselected period. The strobe is typically not activated in burst mode because it does not have time to charge between image captures. Thus, if the images are captured in low light conditions, the shutter speed is slowed, which increases the likelihood that the hand held limit will be reached.
  • An embodiment of analyzing focus problems in burst mode is provided in the flowchart 400 of FIG. 9. The flowchart of FIG. 9 is based on an embodiment wherein the strobe is disabled during burst mode. In decision block 404 of the flowchart 400, a determination is made as to whether the camera was in burst mode during image capture. If the determination of decision block 404 is negative, processing proceeds to block 406. At block 406, it is determined that the remaining analysis has no bearing and processing proceeds to the next analysis. If the determination of decision block 404 is affirmative, processing proceeds to decision block 410 where a determination is made as to whether the handheld limit was exceeded during image capture. As described above, the metadata may be analyzed to determine if the handheld limit has been exceeded. If the determination of decision block 410 is negative, processing proceeds to block 406 as described above.
  • If the determination of decision block 410 is affirmative, processing proceeds to decision block 412. Decision block 412 determines if the camera focused at the time the user captured the image. Again, if the determination of decision block 412 is negative, processing proceeds to block 406 as described above. The decision as to whether the camera focused at the time of image capture is sometimes referred to as whether the “focus lock” was achieved, indicating that the focus of the camera was at a preselected threshold during image capture.
  • If all the above-described conditions have been met, processing proceeds to decision block 416 where a determination is made as to whether the strobe would have otherwise activated during image capture. In other words, decision block 416 determines whether the scene was so dimly lit that the strobe would have activated had the camera not been in burst mode. In one embodiment of the camera described herein, the camera includes a sunset mode, which is used to capture images in low light conditions. For example, the shutter speed may be relatively slow. The metadata may be analyzed to determine if the camera was in sunset mode when the image was captured. In such situations, the strobe would typically activate but for the camera being in burst mode. If the camera was in sunset mode, the image may be blurred because of the low light conditions and because the strobe was not activated. There may be other embodiments of the camera that cause the strobe to be inactive during image capture. For example, macro modes and modes for imaging documents may cause the strobe to be disabled.
  • If the conditions set forth above are met, processing proceeds to block 418. Block 418 displays text on the camera indicating that the image may be out of focus if a tripod or other stabilizing device was not used during image capture. The text may indicate that the image was captured during low light conditions with the strobe forced off due to the burst mode. Accordingly, the exposure time was long and may have exceeded the handheld limit. A suggestion of using a tripod or otherwise steadying the camera during image capture may be provided to the user. The camera may also suggest using a strobe, which may require capturing images in a mode other than burst mode. In another embodiment, the camera may suggest widening the zoom. As described above, a narrow zoom increases the likelihood of reaching the handheld limit.
  • If the determination from decision block 416 is negative, processing proceeds to block 406 because the present analysis has no bearing on the image quality.
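  • The decision chain of flowchart 400 (FIG. 9) can be sketched as follows. This is a minimal illustration, assuming the camera metadata is exposed as a dictionary; the key names are assumptions for the example, not names used in the patent.

```python
def analyze_burst_mode_focus(meta):
    """Sketch of the FIG. 9 decision chain for burst-mode blur analysis.

    `meta` is a hypothetical metadata dictionary; the key names are
    illustrative assumptions, not taken from the patent.
    """
    required = ("burst_mode", "handheld_limit_exceeded",
                "focus_lock_achieved", "strobe_would_have_fired")
    # Blocks 404, 410, 412, and 416: a negative answer at any step means
    # the analysis has no bearing (block 406), so move to the next analysis.
    if not all(meta.get(key) for key in required):
        return None
    # Block 418: all conditions met, so warn the user.
    return ("Image may be out of focus: burst mode forced the strobe off "
            "in low light, so the exposure time may have exceeded the "
            "handheld limit. Try a tripod, a non-burst mode with the "
            "strobe, or a wider zoom.")
```

  • For example, a capture whose metadata shows burst mode, an exceeded handheld limit, focus lock, and a suppressed strobe produces the warning; any missing condition yields no message.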
  • No Focus Lock During Image Capture
  • Some embodiments of the camera include a switch that is used to cause the camera to capture images. The switch may be in a first position when no pressure is applied. The switch may be in a second position when a first force is applied. The second position is typically achieved when a user presses lightly on the switch. The second position causes the camera to focus on a scene. Because focusing may take some time, the user may have to maintain the switch in the second position for a period while the camera focuses. Many cameras provide a focus indicator, which provides the user with an indication as to whether focus has been achieved. Application of more force moves the switch into a third position, wherein the third position causes the camera to generate image data.
  • If the switch is pressed too fast, the camera may not achieve focus before an image is captured. The status of the focus may be stored in the metadata associated with each image. In some embodiments of the camera, the program analyzes the metadata to determine if focus was achieved prior to the generation of image data. In some embodiments, the time that the switch was maintained in the second position may provide an indication as to the status of the focus. For example, if the switch was rapidly pressed from the first position to the third position, the rapid pressing may have caused the camera to shake during image capture, which may blur the image. Thus, if the camera found that focus lock was not achieved or if the switch was pressed too fast, the program may indicate that the image may be blurry.
  • In one embodiment, the camera suggests reducing the speed at which the switch is pressed. For example, the camera may suggest maintaining the switch in the second position until focus lock is achieved. The camera may also suggest attempting to focus on a high contrast portion of the scene. High contrast portions of a scene usually provide better references for focus. In some embodiments, the camera may focus faster if the lighting in the scene is intensified. Thus, the program may suggest increasing light intensity during generation of subsequent image data.
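  • The switch-press analysis above can be sketched as a small advisory function. The half-second threshold for a "too fast" press is an assumption for illustration; the patent does not specify a value.

```python
def shutter_press_advice(focus_locked, press_duration_s, min_press_s=0.5):
    """Suggestions when focus lock fails or the switch was pressed too fast.

    `min_press_s` is an assumed threshold for a rapid press from the first
    position directly to the third position.
    """
    advice = []
    if not focus_locked:
        advice.append("Hold the switch in the second (half-press) position "
                      "until focus lock is achieved.")
        advice.append("Aim at a high-contrast portion of the scene, or "
                      "increase the light intensity.")
    if press_duration_s < min_press_s:
        advice.append("Press the switch more slowly; a rapid press may "
                      "shake the camera and blur the image.")
    return advice
```

  • A rapid press without focus lock produces all three suggestions, while a slow press with focus lock produces none.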
  • Camera Set to Focus at Infinity—Image Out of Focus
  • Many camera embodiments include different focus ranges. In one embodiment, the camera includes a close focus and an infinity focus. The close focus, in this embodiment, enables the camera to focus on objects located between a first distance from the camera and infinity. The infinity focus enables the camera to focus on objects located between a second distance from the camera and infinity, wherein the second distance is greater than the first distance. A camera may also include a manual focus mode wherein a user manually focuses the camera.
  • The program may analyze the focus of the camera by analyzing the metadata to determine whether the camera obtained focus lock during generation of the image data. The program may also analyze the focus setting to determine the focus mode of the camera during generation of the image data. If the camera was set to infinity focus during generation of the image data and the camera was unable to focus, the program may suggest using the close focus mode during generation of subsequent image data. The close focus mode increases the range of focus of the camera so that there is a better chance of obtaining focus lock during generation of subsequent image data. Likewise, if the camera was set to manual focus and did not achieve focus lock, the program may suggest using an automatic focus mode wherein the close focus and infinity focus modes are automatic focus modes.
  • The above-described analysis may also apply to a macro mode, wherein the macro mode enables the camera to focus on objects located in close proximity to the camera. If the camera detects an object within the range of the macro mode, but detects that the camera was not set to focus in macro mode during generation of the image data, the program may suggest using another focus mode or moving further away from the object during generation of subsequent image data. The program may also suggest stabilizing the camera during generation of subsequent image data because use of the macro mode increases the possibility of encountering the above-described hand held limit. In another embodiment, the program may suggest reducing the aperture size when the camera is in a macro mode.
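  • The focus-mode suggestions above can be combined into one illustrative routine. The mode names, the subject-distance input, and the assumed macro-range limit are all hypothetical details added for the example.

```python
def focus_mode_advice(focus_mode, focus_locked, subject_distance_m=None,
                      macro_limit_m=0.5):
    """Suggest a better focus mode based on metadata.

    `macro_limit_m` (an assumed near limit of the normal focus range) and
    the mode names are illustrative, not values from the patent.
    """
    # Macro check: object detected within macro range, but macro not used.
    if (subject_distance_m is not None and subject_distance_m < macro_limit_m
            and focus_mode != "macro"):
        return ("Subject is very close: use macro mode or move further "
                "away, and stabilize the camera.")
    if focus_locked:
        return None  # focus was achieved; no suggestion needed
    if focus_mode == "infinity":
        return ("Focus lock failed: try close focus, which widens the "
                "focus range.")
    if focus_mode == "manual":
        return ("Focus lock failed: try an automatic mode (close or "
                "infinity focus).")
    return None
```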
  • Blurry Image—Focus Assist Indicator Deactivated
  • Some embodiments of the camera include an indicator that provides an indication when focus lock is achieved. The indicator is sometimes referred to as a focus assist indicator. Several versions of the indicator may be provided in the camera. For example, an LCD screen may provide an indication as to the status of the focus lock. In another embodiment, a light may change color depending on whether or not focus lock is achieved. In low light conditions, the light may irritate the user, so the user, in some embodiments, may disable the light. A problem occurs if the user attempts to capture images with the light disabled because the user may not know whether the camera has achieved focus lock.
  • The status of the indicator may be stored in the metadata. The status may include whether the indicator was disabled during generation of image data and whether the camera achieved focus lock. If the indicator was disabled, the program may suggest enabling the indicator during generation of subsequent image data. In one embodiment, the program may analyze the ambient light intensity and determine that the focus indicator should be enabled if the ambient light intensity is low. As stated above, achieving focus may be difficult in low light conditions, so the program may suggest enabling the focus assist indicator so as to improve the chances of achieving focus lock. In yet another embodiment, the camera may provide the indication to enable the focus assist indicator if the camera did not achieve focus lock.
  • Strobe Errors
  • The program may analyze a variety of image problems related to the strobe. For example, the strobe may be activated when it is too close to a subject, which results in over exposure of the subject. The strobe may have an exposure compensation associated with it which may be set so as to cause over exposure or under exposure of an image. In other embodiments, the program may detect that an object blocked the strobe during image capture. Various strobe anomalies are described below.
  • Strobe Used Beyond its Range
  • The intensity of light emitted by the strobe that is able to effectively illuminate a subject decreases rapidly with distance. Thus, the strobe will not be effective when it is used to illuminate scenes or objects that are so far away as to be beyond the range of the strobe. The effective distance of the strobe is a function of the zoom and other camera settings. In one embodiment, the camera analyzes the metadata to determine if the strobe was activated when the image was captured. The metadata or other data associated with an image may provide an indication as to the focal length of the camera when the image was captured. It follows that the zoom setting of the camera may be determined from the metadata or other data, including the settings of other camera functions stored in the metadata. This information may determine whether the camera was focused beyond the effective range of the strobe and may provide an indication as to the distance between the camera and the scene or object at the time of image capture. It is noted that other embodiments for measuring the distance between the camera and a scene or object may be used by the camera.
  • If a determination is made that the camera attempted to capture an image beyond the effective range of the strobe, the pixel values may be analyzed to determine if the image is under exposed. This may be achieved as described above by analyzing the number of pixel values that are at or below a predetermined value. If all the above-described conditions are met, a determination is made that the image may be dark or under exposed because it was captured with the subject beyond the effective range of the strobe. Text may be displayed indicating the reasons for the dark image. A suggestion may be provided that includes moving the camera and the subject closer or turning off the strobe and using a long exposure time. Other suggestions include increasing the exposure compensation of the strobe.
  • In other embodiments, the program may make the above-described suggestions if the program determines that other criteria are also met. For example, the program may require that the camera is in an automatic mode to select shutter speed. The program may also display the above-described information only if the ISO speed was set to 400. Furthermore, the program may require that the strobe be activated or fired at full power during image capture.
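  • The out-of-range analysis above can be sketched as a check over the metadata and an under-exposure test on the pixel values. All threshold values and metadata key names here are illustrative assumptions; the pixels are taken to be 8-bit luminance values.

```python
def strobe_range_check(meta, pixels, dark_value=32, dark_fraction=0.6):
    """Under-exposure check for a subject beyond the strobe's range.

    `dark_value` (the "at or below a predetermined value" test) and
    `dark_fraction` are assumed thresholds; `meta` keys are hypothetical.
    """
    if not meta.get("strobe_fired"):
        return None
    if meta.get("subject_distance_m", 0.0) <= meta.get("strobe_range_m", 0.0):
        return None  # subject within range; this analysis has no bearing
    # Count pixel values at or below the dark threshold.
    dark = sum(1 for p in pixels if p <= dark_value)
    if dark / len(pixels) < dark_fraction:
        return None  # not enough dark pixels to call it under exposed
    return ("Image may be dark: subject was beyond the strobe's effective "
            "range. Move closer, raise the strobe exposure compensation, "
            "or turn the strobe off and use a longer exposure.")
```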
  • Strobe Too Close to the Subject
  • As described above, the intensity of light emitted by the strobe that is able to effectively illuminate the subject decreases as the distance between the camera and the subject increases. In some situations, the strobe may be too close to the subject and may cause the subject to be over exposed. In order to determine the distance between the strobe and the subject at the time the image was captured, the metadata may be analyzed. If the zoom was set to focus on a close subject or if the camera was set to a macro mode, it is assumed that the camera was located very close to the subject. It is noted that other methods may be used to determine the distance between the object or scene and the camera. The zoom may also be analyzed to determine if the lens was set to a magnification that is below a preselected value. If the above-described conditions are met, the camera may analyze the pixel values of the image to determine if a predetermined number of the pixel values are clipped or otherwise exceed a predetermined value. As described above, clipped pixel values are indicative of an over exposed image.
  • If the above-described conditions are met, the camera may display text indicating that the image is likely over exposed. The camera may suggest turning the flash off or moving the camera away from the subject during generation of subsequent image data.
  • In some embodiments, the program may require that other criteria be met before the above-described information is displayed. In one embodiment, the program may display the above-described information if the camera was set to automatic shutter speed. In another embodiment, the program may require that the exposure compensation be set to zero.
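  • The too-close analysis is the mirror image of the out-of-range check: a proximity condition from the metadata followed by a clipped-pixel test. The thresholds and metadata keys below are assumptions for illustration.

```python
def strobe_too_close_check(meta, pixels, clip_value=250, clip_fraction=0.25):
    """Over-exposure check when the strobe fired near the subject.

    A pixel at or above `clip_value` is treated as clipped; `clip_value`,
    `clip_fraction`, and the `meta` keys are illustrative assumptions.
    """
    # Proximity is inferred from macro mode or a close-focus zoom setting.
    close = meta.get("macro_mode") or meta.get("zoom_close_focus")
    if not (meta.get("strobe_fired") and close):
        return None
    clipped = sum(1 for p in pixels if p >= clip_value)
    if clipped / len(pixels) < clip_fraction:
        return None  # too few clipped pixels to call it over exposed
    return ("Image likely over exposed: the strobe may have been too close "
            "to the subject. Turn the flash off or move further away for "
            "subsequent images.")
```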
  • Image is Over Exposed Using the Strobe
  • In other embodiments, the program may determine that the image is over exposed and the object was not excessively close to the camera during image capture. The program may analyze the pixel values as described above and may determine that the strobe activated during image capture. Based on these findings, the program may suggest a plurality of improvements to the image quality.
  • In one embodiment, the program determines that the strobe activated during image capture and the image is over exposed. The program may suggest turning off or deactivating the strobe during generation of subsequent image data. The program may analyze other settings of the camera at the time of image capture to determine that other settings or parameters of the camera, such as ISO speed, did not cause the over exposure.
  • In other embodiments, the program may analyze a strobe exposure compensation setting at the time of image capture. The value of this setting may be stored in metadata associated with the image data. The program may analyze the strobe exposure compensation. If the program determines that the strobe exposure compensation is too high, the program may suggest that the user use a lower value during generation of subsequent image data.
  • Under Exposed Image Using the Strobe
  • The program may analyze the metadata and other data to determine if the image is possibly under exposed. In one embodiment, the program may analyze the image to determine if it is under exposed. If so, the program may suggest increasing the strobe exposure compensation. Embodiments of this include increasing the power output of the strobe and the pulse time of the strobe. In other embodiments, the program may suggest using a wide angle lens, which may cause the aperture to open wider. In other embodiments, the program may suggest increasing the exposure compensation. For example, the program may suggest setting the exposure compensation to a positive value such as 0.3. In yet other embodiments, the program may suggest moving closer to the subject and reducing the ISO speed. For example, the user may move the camera to approximately 7.5 feet from the subject and set the ISO speed to 100 or to an automatic mode. In some embodiments, the program may provide the above-described information if the strobe activated at maximum power. The program may also require that the shutter speed be in automatic mode or that the ISO speed be 100 or less.
  • Strobe Used When the Camera is in Night Mode
  • The camera may have a mode for capturing images under low light. This mode is sometimes referred to as night mode. Night mode typically increases the exposure time of the image, which may cause the handheld limit to be met. The program may determine that the camera was set to the night mode by analyzing the metadata. If the program determines that the camera was in night mode, the program may suggest stabilizing the camera to avoid blurring. The program may also cause information to be displayed indicating that the resulting image may be blurry.
  • In other embodiments, the program may cause the above-described information to be displayed if the program determines that the strobe activated, exposure time was greater than one sixtieth of a second, ambient light was low, or the camera focused.
  • Under Exposed Image Using Automatic Mode
  • In this analysis, the camera determines that it was set to automatic mode, meaning that the processor within the camera selected the strobe setting, yet the image is likely under exposed. The metadata is analyzed to determine that the camera was in automatic mode and that the strobe was activated during image capture. The pixel values are then analyzed as described above to determine if a predetermined number of pixel values are less than a predetermined value. More specifically, the image is analyzed to determine if it is possibly under exposed.
  • If the above conditions are met, the camera may display text indicating that the image may be under exposed. For example, the text may indicate that the strobe or flash likely did not provide adequate illumination for the ambient lighting conditions. The text may suggest overriding the automatic mode and setting the exposure compensation to a positive value, which will increase the exposure time during image capture. Thus, subsequent images may not be under exposed.
  • Object Blocking Strobe During Image Capture
  • The following analysis determines whether an object, such as the finger of the user, may have blocked the strobe during image capture. A blocked strobe will result in an under exposed image. In one embodiment of this analysis, the metadata is analyzed to determine if the focus distance used to capture the image was less than a predetermined distance, such as one meter. In such a situation, the subject of the image should be sufficiently illuminated so as not to generate an under exposed image. The metadata may also be analyzed to determine if the strobe was activated during image capture. If the above-described conditions are met, the camera may analyze the image to determine if it is under exposed by greater than a preselected amount, such as one stop. In other embodiments, the program may provide information to the user if no reflected light from the strobe is detected, indicating that the strobe did not illuminate the scene.
  • If the above-described conditions are met, the camera may display text indicating that the image was likely under exposed because an object, such as the finger of the user, blocked the strobe. The text may suggest that the strobe be kept clear during subsequent image captures. The text may also suggest checking to make sure that the strobe is functioning.
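  • The blocked-strobe analysis can be sketched as below. The under-exposure amount is assumed to be precomputed in stops from the pixel values; the one-meter distance and one-stop thresholds follow the example values given above, and the metadata key names are hypothetical.

```python
def blocked_strobe_check(meta, underexposure_stops, max_distance_m=1.0,
                         stop_threshold=1.0):
    """Check for an object (such as a finger) blocking the strobe.

    `underexposure_stops` is how far the image falls below the target
    exposure, assumed to be computed elsewhere from the pixel values.
    """
    if meta.get("focus_distance_m", max_distance_m) >= max_distance_m:
        return None  # subject too far away for this analysis to apply
    if not meta.get("strobe_fired"):
        return None
    if underexposure_stops <= stop_threshold:
        return None  # less than one stop under exposed; no warning
    return ("Image likely under exposed: an object, such as a finger, may "
            "have blocked the strobe. Keep the strobe clear and check that "
            "it is functioning.")
```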
  • Strobe Activated in Proximity of a Reflective Surface
  • When an image is captured using the strobe and the image includes a reflective surface, such as a window, the image may have a bright spot where the image of the reflected flash was captured. The method used to analyze this condition may consist of analyzing the metadata to determine whether the strobe was activated when the image was captured. Further analysis consists of determining whether the image has a bright spot. This analysis may be accomplished by analyzing the pixel values. If the pixel values indicate that a portion of the image is much brighter than other portions of the image, the camera may determine that the image contains a reflective surface. In some embodiments, the program may determine the intensity or amount of reflected strobe light in the scene. If the scene contains intense strobe light, the information described herein may be displayed.
  • If the above-described conditions are met, the camera may display text indicating that the image may have a bright spot caused by imaging the reflected flash. Suggestions for overcoming this problem include turning the flash off and imaging the reflective object at an angle so that the flash will not reflect directly back to the camera.
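  • One way to sketch the bright-spot test is to divide the image into regions and compare the brightest region against a typical one. The median-based ratio below is an illustrative choice, not the patent's method, and the ratio value is an assumption.

```python
import statistics

def has_strobe_hotspot(tile_means, ratio=4.0):
    """Detect a possible reflected-strobe bright spot.

    `tile_means` are mean luminance values of image regions (for example,
    tiles of a grid laid over the image). A hotspot is flagged when one
    region is much brighter than the median region.
    """
    typical = statistics.median(tile_means)
    return max(tile_means) > ratio * typical
```

  • For example, region means of [20, 22, 21, 200] flag a hotspot, while [20, 22, 21, 25] do not. Using the median rather than the mean keeps the hotspot itself from skewing the reference brightness.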
  • White Balance Errors
  • White balance errors, in summary, represent color variations due to different types of light sources illuminating a scene. For example, if a scene is illuminated with a fluorescent ambient light, the colors in the scene have a specific temperature. The camera will then process the image data based on the selected illumination source. If the selected illumination source is not correct, the colors in the resulting image may be different than the scene illuminated with ambient light. This error is sometimes referred to as a white balance error.
  • User Selects Improper Illumination Source
  • In some embodiments of the camera, the user may select the illumination source or source of ambient light. For example, the user may select tungsten or fluorescent lighting as the illumination source of the scene being captured. The camera may then capture an image and process image data based on the selected illumination source. If the camera is used to capture images that are illuminated using a different illumination source than the selected light source, the images may be susceptible to white balance errors.
  • The camera may analyze an image to determine if white balance errors may exist. One embodiment of determining if possible white balance errors exist is shown in the flowchart 450 of FIG. 10. At decision block 452, a determination is made as to whether the image was captured using a user selected illumination source. More specifically, the camera determines if the user provided information relating to the illumination source used to illuminate the captured scene. If the determination of decision block 452 is negative, processing proceeds to block 454 wherein the processing proceeds to the next analysis as described in the previous flowcharts.
  • If the determination of decision block 452 is affirmative, processing proceeds to decision block 458. At decision block 458, a determination is made as to whether the image was captured using full color mode. In this embodiment, the analysis focuses on color, so the analysis only proceeds if the full color mode is selected. The determination as to whether an image was captured while the camera was in full color mode may be made by analyzing the metadata. There is no need to perform this analysis if the image is a black and white image.
  • If the determination of decision block 458 is affirmative, processing proceeds to block 460. At block 460, the preferred illumination source is calculated or otherwise determined by the camera. More specifically, the camera determines which illumination source it would have chosen had the user not selected the illumination source. The analysis may be accomplished by analyzing the metadata and possibly other data to determine the preferred illumination source had the camera been in an automatic mode when the image was captured.
  • Processing then proceeds to decision block 462 where a determination is made as to whether the selected illumination source and the preferred illumination source are the same. In other words, a determination is made as to whether the camera would have selected the same illumination source as the user selected. If the camera would have selected the same illumination source that the user did, processing proceeds to block 454 as described above. In other words, there is no better choice for an illumination source than the one selected by the user and the processing continues to the next analysis.
  • If the illumination source selected by the user does not match the illumination source that the camera would have chosen, processing proceeds to block 464. Block 464 causes text to be displayed on the camera that indicates the image may have problems due to white balance errors. The text may indicate the selected illumination source and the preferred illumination source and may suggest changing the selected illumination source to the preferred illumination source. The text may also suggest changing the camera settings so that the camera automatically chooses the illumination source when processing the image data.
  • Several embodiments may be used in the white balance analysis. For example, the camera may enable a user to select various illumination sources from a list of illumination sources. The camera may assign a color temperature to each illumination source in the list. In some embodiments, the color temperatures are ranges of color temperatures. During image capture, the camera may analyze the image and select a color temperature that it would use during processing.
  • The program may analyze the color temperature selected by the user and compare it to the color temperature selected by the camera. If the color temperature selected by the camera is the same or within a preselected range of the color temperature selected by the user, the analysis is complete and no suggestions are offered. If, on the other hand, the color temperature selected by the camera differs from the color temperature selected by the user, the program may inform the user that white balance problems may exist in the replicated image. The program may indicate the color temperature selected by the camera. In another embodiment, the program may suggest a light source for the user to select during subsequent image capture, based on the color temperature selected by the camera.
  • Numerous embodiments exist for selection of color temperature. For example, the user may select a color temperature corresponding to fluorescent lighting. The camera may determine that a color temperature corresponding to tungsten lighting should have been selected. If the difference in color temperatures is greater than a preselected threshold, the program may suggest using the color temperature corresponding to tungsten lighting. As set forth above, the program may also provide information indicating that a white balance problem may exist.
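  • The color-temperature comparison can be sketched as follows. The temperatures and the tolerance are illustrative assumptions; roughly 2850 K corresponds to tungsten and roughly 4000 K to fluorescent lighting.

```python
def white_balance_advice(user_temp_k, auto_temp_k, tolerance_k=500):
    """Compare the user-selected and camera-preferred color temperatures.

    `tolerance_k` stands in for the "preselected range" described in the
    text; its value is an assumption for the example.
    """
    if abs(user_temp_k - auto_temp_k) <= tolerance_k:
        return None  # the two choices agree; no suggestion is needed
    return ("Possible white balance error: the selected source is near "
            "{} K, but the camera would have used about {} K. Consider "
            "that source, or an automatic mode.".format(user_temp_k,
                                                        auto_temp_k))
```

  • For example, a user selection of fluorescent (about 4000 K) against a camera preference of tungsten (about 2850 K) exceeds the tolerance and yields a suggestion.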
  • Enhancements
  • The metadata and other data may be used to provide the user with ways to improve the image quality. The program may analyze the settings or different camera parameters at the time of image capture and may provide suggestions for improving the image during subsequent image capture.
  • Portrait Mode Enhancements
  • Portrait mode is a mode that is typically used for capturing images of people. The subjects are typically located in close proximity to the camera and, thus, are usually within range of the strobe. It is noted that there are many situations when the subject is out of range of the strobe.
  • One embodiment for enhancements determines whether the image was captured using portrait mode. This determination may be made by examining the metadata, which may store information relating to whether or not the camera was set in portrait mode at the time of image capture. If the camera was not in portrait mode, this portion of the analysis concludes because it is not applicable.
  • If a determination is made that the image was captured using portrait mode, a determination is made as to whether the strobe activated during image capture. The metadata may store information regarding whether or not the strobe activated during image capture. Images captured in portrait mode typically have higher quality if they are captured using the strobe. In such a situation, the camera may analyze the image data and determine that the strobe should have been activated. A message may be displayed indicating that the image may be enhanced by use of the strobe.
  • The camera may also determine whether a red eye elimination algorithm has been run on the image data. Such an algorithm removes red eye in images, which is caused by imaging people's retinas. If such an algorithm did not run, the camera may suggest running such an algorithm. The red eye algorithm corrects the color of eyes. In one embodiment, the suggestion to run the algorithm may be displayed if the image was captured in portrait mode and the strobe activated during image capture.
  • In other embodiments, the camera may analyze other image parameters before suggesting the use of the strobe during image capture. For example, the exposure compensation or exposure time may be analyzed. As described above, the exposure compensation or time used during image capture may be stored in the metadata. If the exposure compensation is high or the exposure time is long, the camera may suggest using the strobe. The exposure time is typically selected by the camera. Therefore, if the exposure time is long, it is indicative of a dimly lit scene, which may require the strobe.
  • The camera may also analyze the number of dark pixels in an image. As described above, dark pixels are pixel values that are below a predetermined value and represent dark portions of an image. A large number of dark pixel values of an image captured using portrait mode may be indicative of an image having excessive shadows. Accordingly, the analysis may indicate to the user that the subject of the image may have undesired shadows. Therefore, the suggestions for an enhanced image may include using the strobe or setting the ambient lighting conditions used by the camera for image processing to a low light level.
  • The camera may also suggest using a low adaptive lighting setting. The low adaptive lighting setting reduces the effects of low light in the scene.
  • Too High Contrast in the Scene
  • The image data and the meta data may be analyzed to determine if the contrast in the scene is high or greater than a predetermined value. In one embodiment, the following analysis is not performed in panoramic or portrait modes. Images captured using the panoramic mode may have high contrasts due to the nature of capturing panoramic images. Images captured using the portrait mode may be subject to high contrast due to the nature of capturing portrait images.
  • The number of dark and clipped pixels in various portions of the image may be analyzed to determine the contrast. For example, pixel values in the center of the image may be analyzed to determine if they are generally greater than a predetermined value. Pixel values in other regions of the image may be analyzed to determine if they are generally less than a predetermined value. If a high number of pixel values are clipped and dark, the contrast may be too high. The camera may display information suggesting setting the camera to lower ambient lighting as a basis for image processing, which may lower the contrast. In some embodiments, the program may suggest setting an adaptive lighting setting lower so as to capture images that may be located in shadows in the scene.
  • In other embodiments, the camera may analyze the metadata to determine if the subject is beyond the range of the strobe and if the camera focused during image capture. These additional criteria may have to be met in order for the camera to display the above-described suggestions.
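  • The contrast analysis above can be sketched as a two-region test: mostly clipped pixels in the center alongside mostly dark pixels elsewhere. The pixel thresholds, the 50% fractions, and the center/edge split are assumptions for illustration.

```python
def contrast_too_high(center_pixels, edge_pixels, bright=220, dark=35,
                      fraction=0.5, panoramic=False, portrait=False):
    """Flag excessive scene contrast.

    Skipped in panoramic and portrait modes as described above. Pixels
    are assumed to be 8-bit luminance values; all thresholds are
    illustrative.
    """
    if panoramic or portrait:
        return False  # these modes are naturally high contrast
    bright_frac = sum(p >= bright for p in center_pixels) / len(center_pixels)
    dark_frac = sum(p <= dark for p in edge_pixels) / len(edge_pixels)
    return bright_frac >= fraction and dark_frac >= fraction
```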
  • Conflicting Settings
  • As described above, the camera may have a plurality of settings that may be automatic or may be set by a user. Problems may arise if users capture images using manual settings that conflict with one another. For example, if contrast, sharpness, saturation, and adaptive lighting are all set high, the resulting image may appear unrealistic.
  • In one embodiment, the camera determines if the contrast, sharpness, and saturation were all set high, or greater than preselected values, during image capture. In addition, the camera determines if the image was captured using a full color mode. All these settings may be stored in the metadata. If all the above-described conditions are met, the camera may display information indicating that the image may appear unrealistic. The program may display a suggestion of reducing at least one of the contrast, sharpness, or saturation settings.
  • In a similar embodiment, the camera determines if the contrast, sharpness, and adaptive lighting were all set high, or greater than preselected values, during image capture. In addition, the camera determines if the image was captured using a full color mode. All these settings may be stored in the metadata. If all the above-described conditions are met, the camera may display information indicating that the image may appear unrealistic. The program may display a suggestion of reducing at least one of the contrast, sharpness, or adaptive lighting.
  • In a similar embodiment, the camera determines if the contrast, saturation, and adaptive lighting were all set high, or greater than preselected values, during image capture. In addition, the camera determines if the image was captured using a full color mode. All these settings may be stored in the metadata. If all the above-described conditions are met, the camera may display information indicating that the image may appear unrealistic. The program may display a suggestion of reducing at least one of the contrast, saturation, or adaptive lighting.
  • In a similar embodiment, the camera determines if the sharpness, saturation, and adaptive lighting were all set high, or greater than preselected values, during image capture. In addition, the camera determines if the image was captured using a full color mode. All these settings may be stored in the metadata. If all the above-described conditions are met, the camera may display information indicating that the image may appear unrealistic. The program may display a suggestion of reducing at least one of the sharpness, saturation, or adaptive lighting.
  • In a related embodiment, the above-described analysis may only be performed if the camera was in focus when the image was captured. If the camera was not in focus, the analysis may have no bearing on the captured image. Whether the image was in focus may be determined by analyzing the metadata, which may store data indicating whether the image was in focus.
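The four conflicting-settings embodiments above all check whether some three of the four settings were set high in full color mode, gated on the image being in focus. A compact sketch (names, dictionary layout, and thresholds are assumptions, not the patent's data format):

```python
# The four triples of settings the embodiments above check for conflicts.
CONFLICT_TRIPLES = (
    ("contrast", "sharpness", "saturation"),
    ("contrast", "sharpness", "adaptive_lighting"),
    ("contrast", "saturation", "adaptive_lighting"),
    ("sharpness", "saturation", "adaptive_lighting"),
)

def conflicting_settings(metadata, thresholds):
    """Return the first triple of settings that were all above their
    preselected thresholds in full color mode, or None. Skipped entirely
    when the image was out of focus, since the analysis would then have
    no bearing on the captured image."""
    if not metadata.get("in_focus") or metadata.get("color_mode") != "full":
        return None
    for triple in CONFLICT_TRIPLES:
        if all(metadata[name] > thresholds[name] for name in triple):
            return triple
    return None
```

The returned triple would drive the on-camera suggestion to reduce at least one of those settings.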
  • ISO Speed and Adaptive Light Conflict
  • In one embodiment of the camera, the ISO speed and the adaptive lighting are analyzed to determine if a possible conflict existed during image capture. If the ISO speed is set above a preselected value and the adaptive lighting is also set above a preselected value, the camera may display information indicating that the image quality may be poor. For example, the camera may indicate that the image may appear grainy or unrealistic because both the ISO speed and the adaptive lighting are set above predetermined thresholds. The camera may also suggest lowering either the ISO speed or the adaptive lighting setting.
  • In one embodiment, the predetermined threshold for the ISO setting is a speed of 400 or a gain greater than 29.0. The camera may suggest setting either the ISO setting or the adaptive lighting to the default values, or setting the camera so that the camera selects the values. The camera may also determine whether it was in focus during image capture. If the camera was not in focus, the poor image quality may be due to focusing problems.
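The ISO/adaptive-lighting conflict check might look like the following sketch. The ISO 400 and gain 29.0 thresholds come from the description; the function name, the adaptive-lighting threshold, and the exact comparison operators are assumptions.

```python
def iso_adaptive_conflict(iso_speed, gain, adaptive_lighting, in_focus,
                          iso_limit=400, gain_limit=29.0, adaptive_limit=1):
    """Warn when both the ISO speed (or the equivalent sensor gain) and
    the adaptive lighting setting exceeded their thresholds at capture."""
    if not in_focus:
        return None  # poor quality may instead be a focusing problem
    if (iso_speed >= iso_limit or gain > gain_limit) and adaptive_lighting > adaptive_limit:
        return ("Image may appear grainy or unrealistic; lower the ISO speed "
                "or the adaptive lighting setting, or restore the defaults.")
    return None
```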
  • Camera is Too Hot
  • As with most electronic devices, the performance of a digital camera can deteriorate as it gets hot. For example, the CCD and the image processing components may produce a degraded image when they are operated above a predetermined temperature.
  • The camera may have a temperature sensor and may store the temperature of the camera at the time images are captured. For example, the temperature of the camera at the time of image capture may be stored in the metadata. In one embodiment, the temperature threshold is forty degrees centigrade. If the camera determines that the temperature is above the threshold, the camera may display information indicating that excessive heat may have caused the image to be degraded. The camera may further suggest cooling the camera down prior to capturing more images. Cooling may include turning off the display for a period prior to capturing images.
  • In one embodiment, the program measures the temperature of a CCD or the like that is used to generate image data. In another embodiment, the temperature of the camera may be calculated by determining the time period in which the display was active. Based on this calculation, the program may display the above-described information.
  • In a related embodiment, the program may analyze the adaptive lighting setting to determine if the camera temperature was above a preselected temperature during image capture and the adaptive lighting was set high. If the above-described conditions are met, the program may display information indicating that the image may be grainy or otherwise be poor quality. In order to improve subsequent images, the program may suggest reducing the adaptive lighting setting or turning it off. The program may also suggest reducing the temperature of the camera.
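The temperature checks above can be sketched as one helper. The forty-degree threshold is from the description; the display-time heating model (linear degrees-per-minute over ambient), the function name, and the adaptive-lighting threshold are assumptions for illustration.

```python
def heat_warnings(temperature_c=None, display_on_seconds=None,
                  adaptive_lighting=0, temp_limit_c=40.0,
                  ambient_c=25.0, degrees_per_minute=0.5,
                  adaptive_limit=1):
    """Return advisory messages when the camera may have been too hot at
    capture. With no sensor reading, estimate the temperature from how
    long the display was active (hypothetical linear heating model)."""
    if temperature_c is None and display_on_seconds is not None:
        temperature_c = ambient_c + degrees_per_minute * (display_on_seconds / 60.0)
    warnings = []
    if temperature_c is not None and temperature_c > temp_limit_c:
        warnings.append("Excessive heat may have degraded the image; "
                        "cool the camera (e.g. turn off the display for a period).")
        if adaptive_lighting > adaptive_limit:
            warnings.append("Image may be grainy; reduce or turn off the "
                            "adaptive lighting setting.")
    return warnings
```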
  • Portrait Mode Using Wide Angle
  • As described above, the portrait mode is typically used to capture images that are close to the camera. A wide angle setting of the lens, on the other hand, is typically used to capture images or scenes over a wide angle, which are not close to the camera. Capturing an image in portrait mode while using a wide angle lens setting may distort the image. The metadata may store information relating to the zoom setting and whether the camera was in wide angle mode during image capture.
  • The zoom setting and portrait mode may be determined by analyzing the metadata. The zoom setting can be compared to a predetermined value to determine if the zoom setting was great enough to cause distortion in the image. Likewise, the metadata may be analyzed to determine if the image was captured using the portrait mode. If both these conditions are met, the camera may display information indicating that the image may be distorted. The camera may also suggest moving away from the subject and using a narrower zoom setting.
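A sketch of the portrait/wide-angle check, treating the zoom setting as a focal length compared against a wide-angle cutoff. The function name, the 35 mm cutoff, and the use of focal length as the zoom measure are assumptions, not values from the description.

```python
def portrait_wide_angle_warning(capture_mode, focal_length_mm,
                                wide_angle_limit_mm=35.0):
    """Warn when portrait mode was combined with a wide-angle zoom
    setting, which may distort the subject."""
    if capture_mode == "portrait" and focal_length_mm < wide_angle_limit_mm:
        return ("Image may be distorted; move away from the subject and "
                "use a narrower (longer) zoom setting.")
    return None
```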
  • Action Mode with Shutter Lag
  • The camera may have an action mode, which is used to capture scenes containing moving subjects. Action mode typically includes a very fast shutter speed and other settings that enable the image of the subject to be captured without blurring. The metadata may contain information regarding whether an image was captured using action mode. One problem with capturing moving subjects is obtaining proper focus. As the subject moves toward or away from the camera, the required focus distance changes. In order to enhance an image, the camera may detect that an image was captured using the action mode. The camera may suggest setting the focus at a point where the subject is expected to be during image capture. This helps ensure that the camera is properly focused during image capture.
  • In some embodiments, the camera may have a switch used to cause the camera to capture an image, wherein the switch has multiple positions. The switch may have a first position when no force is applied. In the first position, the functions associated with the switch may be inactive. A second position of the switch may be reached by applying a first force to the switch. The second position may cause the camera to focus on the scene. It is noted that focusing is typically not instantaneous and may require that the switch be maintained in the second position for a period. A third position of the switch may be achieved by applying a second force to the switch, wherein the second force is greater than the first force.
  • The camera or program may measure the time that the switch is in the second position. Thus, the program is able to determine whether the camera likely achieved focus lock, meaning that the camera was able to focus on the scene. If the time that the switch was in the second position was shorter than a preselected time and the camera was in action mode, the program may display information indicating that the image may be blurry. The preselected period may, as an example, be approximately 1.5 seconds. The program may also suggest maintaining the switch in the second position for a longer period during capture of subsequent images.
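The focus-lock timing check reduces to a simple comparison. The 1.5-second period is the example given above; the function and parameter names are assumptions.

```python
def focus_lock_warning(capture_mode, half_press_seconds, min_lock_seconds=1.5):
    """Warn when, in action mode, the shutter switch was held in its
    second (focus) position too briefly for focus lock to be likely."""
    if capture_mode == "action" and half_press_seconds < min_lock_seconds:
        return ("Image may be blurry; hold the switch in its second "
                "position longer before fully pressing it.")
    return None
```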
  • High Digital Zoom Resulting in Low Image Quality
  • The use of a digital zoom enables a camera user to enlarge a scene; however, the quality or resolution of the scene is degraded as a result. This degradation is more prominent when the user prints or otherwise displays an enlarged image. For example, if the user enlarges the scene to print it on a large sheet of paper, the resolution and, thus, the quality of the image will be degraded. If the printed image is too large, the quality of the image will be significantly degraded. The digital zoom setting used to capture an image may be stored in the metadata. During analysis, the camera may access the metadata to determine the digital zoom setting and provide the user with information regarding possible printing limitations of the image before degradation exceeds a predetermined threshold.
  • In one embodiment, the camera determines whether the resolution of the captured image is less than one thousand columns. More specifically, the camera determines if less than one thousand columns of photodetectors on the CCD were used to capture the image. The camera may then display information indicating that the largest suggested image that may be reasonably replicated based on the image data is five inches by seven inches or thirteen centimeters by eighteen centimeters. The camera may also determine if less than eight hundred columns of photodetectors were used to capture the image. The camera may then display information indicating that the largest suggested image is four inches by six inches or ten centimeters by fifteen centimeters. The camera may also determine if less than six hundred columns of photodetectors were used to capture the image. The camera may then display information indicating that the largest suggested image is three and one-half inches by five inches or nine centimeters by thirteen centimeters.
  • Alternatively, the resolution used to capture an image may be increased. The camera may therefore suggest increasing the resolution, which increases the number of pixels used to capture an image, so that larger images can be replicated or displayed without degradation.
  • In another embodiment, the camera may suggest eliminating the digital zoom in favor of optical zoom.
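The column-count thresholds and suggested print sizes described above map directly to a lookup; only the function name is an assumption here.

```python
def max_print_size(columns):
    """Largest suggested print size given the number of CCD photodetector
    columns used to capture the image (thresholds from the description)."""
    if columns < 600:
        return "3.5 x 5 in (9 x 13 cm)"
    if columns < 800:
        return "4 x 6 in (10 x 15 cm)"
    if columns < 1000:
        return "5 x 7 in (13 x 18 cm)"
    return None  # no print-size limitation suggested
```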

Claims (244)

1. A method of imaging an object, said method comprising:
generating image data, using an imaging device, said generating having at least one adjustable parameter associated therewith, said at least one adjustable parameter being set at a first value;
analyzing said first value of said at least one adjustable parameter; and
determining a second value of said at least one adjustable parameter based on said analyzing.
2. The method of claim 1, wherein said first value is associated with said image data.
3. The method of claim 1, and further comprising displaying information on said imaging device related to setting said at least one adjustable parameter to said second value.
4. The method of claim 1, wherein said at least one adjustable parameter is exposure.
5. The method of claim 1, wherein said analyzing comprises analyzing an aperture associated with said imaging device to determine whether an image represented by said image data is over exposed.
6. The method of claim 1, wherein said imaging device comprises an aperture having a first mode wherein the aperture size is setable by a user and a second mode wherein said aperture size is setable by said imaging device;
wherein said aperture size is an at least one adjustable parameter; wherein said image data comprises a plurality of pixel values; and wherein said method further comprises:
determining whether said aperture size used during generation of said image data was set by a user;
determining whether a preselected number of said pixel values exceed a preselected value; and
wherein said determining a second value comprises determining that the size of said aperture should be reduced during generation of subsequent image data if said preselected number of pixel values exceeded said preselected value and said aperture size was set by a user.
7. The method of claim 6 and further comprising displaying information suggesting using a smaller aperture size if said preselected number of pixel values exceeds said preselected value.
8. The method of claim 6 and further comprising displaying information on said imaging device indicating that an image represented by said image data may be over exposed.
9. The method of claim 6 and further comprising displaying information indicating the approximate number of stops that an image represented by said image data is over exposed if said preselected number of pixel values exceeds said preselected value.
10. The method of claim 6, wherein said determining whether said preselected number of said pixel values exceeds said preselected value corresponds to analyzing said pixel values to determine if an image represented by said image data is over exposed by approximately two-thirds stop or greater.
11. The method of claim 1, wherein said analyzing comprises analyzing an aperture associated with said imaging device to determine whether an image represented by said image data is under exposed.
12. The method of claim 1, wherein said imaging device comprises an aperture having a first mode wherein the aperture size is setable by a user and a second mode wherein said aperture size is setable by said imaging device; wherein said aperture size is an at least one adjustable parameter; wherein said image data comprises a plurality of pixel values; and wherein said method further comprises:
determining whether said aperture size used during generation of said image data was set by a user; and
analyzing said pixel values to determine if a preselected number of said pixel values are less than a preselected value;
wherein said determining a second value comprises determining that the size of said aperture should be increased during generation of subsequent image data if said preselected number of pixel values are less than said preselected value and said aperture size used during generation of said image data was set by a user.
13. The method of claim 12 and further comprising displaying information on said imaging device suggesting using a larger aperture size if said preselected number of pixel values are less than said preselected value.
14. The method of claim 12 and further comprising displaying information on said imaging device indicating that an image represented by said image data may be under exposed.
15. The method of claim 12 and further comprising displaying information on said imaging device indicating the approximate number of stops that an image represented by said image data is under exposed if said preselected number of pixel values are less than said preselected value.
16. The method of claim 12, wherein said analyzing said pixel values to determine if a preselected number of said pixel values are less than a preselected value corresponds to analyzing said pixel values to determine if an image represented by said image data is under exposed by one stop or greater.
17. The method of claim 1, wherein said analyzing comprises analyzing a shutter speed associated with said imaging device to determine whether an image represented by said image data is over exposed.
18. The method of claim 1, wherein said imaging device comprises a shutter having a shutter speed associated therewith, said shutter having a first mode wherein said shutter speed is setable by a user and a second mode wherein said shutter speed is setable by said imaging device; wherein said shutter speed is an at least one adjustable parameter; wherein said image data comprises a plurality of pixel values; and wherein said method further comprises:
determining whether said shutter speed used during generation of said image data was set by a user;
analyzing said pixel values to determine if a preselected number of said pixel values exceed a preselected value; and
wherein said determining a second value comprises determining that the speed of said shutter should be increased during generation of subsequent image data if said preselected number of pixel values exceed said preselected value and said shutter speed used during the generation of said image data was set by a user.
19. The method of claim 18 and further comprising displaying information on said imaging device suggesting using a faster shutter speed if said preselected number of pixel values exceed said preselected value.
20. The method of claim 18 and further comprising displaying information on said imaging device indicating that an image represented by said image data may be overexposed.
21. The method of claim 18, wherein said analyzing said pixel values to determine if a preselected number of said pixel values exceed said preselected value corresponds to analyzing said pixel values to determine if an image represented by said image data is over exposed by approximately one stop or greater.
22. The method of claim 1, wherein said analyzing comprises analyzing a shutter speed associated with said imaging device to determine whether an image represented by said image data is under exposed.
23. The method of claim 1, wherein said imaging device comprises a shutter having a shutter speed associated therewith, said shutter having a first mode wherein said shutter speed is setable by a user and a second mode wherein said shutter speed is setable by said imaging device; wherein said shutter speed is an at least one adjustable parameter; wherein said image data comprises a plurality of pixel values; and wherein said method further comprises:
determining whether said shutter speed used during generation of said image data was set by a user;
analyzing said pixel values to determine if a preselected number of said pixel values are less than a preselected value; and
wherein said determining a second value comprises determining that the speed of said shutter should be decreased during generation of subsequent image data if said preselected number of pixel values are less than said preselected value and said shutter speed used during the generation of said image data was set by a user.
24. The method of claim 23 and further comprising displaying information on said imaging device suggesting using a slower shutter speed if said preselected number of pixel values are less than said preselected value.
25. The method of claim 23 and further comprising displaying information on said imaging device indicating that an image represented by said image data may be under exposed.
26. The method of claim 23, wherein said analyzing said pixel values to determine if a preselected number of said pixel values are less than said preselected value corresponds to analyzing said pixel values to determine if the image represented by said image data is under exposed by approximately one stop or greater.
27. The method of claim 1, wherein said analyzing comprises analyzing a shutter speed and an aperture size associated with said imaging device to determine whether an image represented by said image data is under exposed.
28. The method of claim 1, wherein said imaging device comprises a shutter having a shutter speed associated therewith, said shutter having a first mode wherein said shutter speed is setable by a user and a second mode wherein said shutter speed is setable by said imaging device; wherein said shutter speed is an at least one adjustable parameter; said imaging device further comprising an aperture having a first mode wherein the aperture size is setable by a user and a second mode wherein said aperture size is setable by said imaging device; wherein said aperture size is an at least one adjustable parameter; wherein said image data comprises a plurality of pixel values; and wherein said method further comprises:
determining whether said shutter speed used during generation of said image data was set by a user;
determining whether said aperture size used during generation of said image data was set by a user;
analyzing said pixel values to determine if a preselected number of said pixel values are less than a preselected value;
wherein said determining a second value comprises determining that said aperture size should be increased during generation of subsequent image data if said aperture size is less than a preselected value based on said shutter speed and said aperture size was set by a user, if said shutter speed was set by a user, and if said preselected number of said pixel values are less than said preselected value.
29. The method of claim 28 and further comprising displaying information on said imaging device indicating that an image represented by said image data may be under exposed.
30. The method of claim 28 and further comprising displaying information on said imaging device indicating that said aperture size should be increased during generation of subsequent image data.
31. The method of claim 1, wherein said analyzing comprises analyzing a shutter speed and an aperture size associated with said imaging device to determine whether an image represented by said image data is over exposed.
32. The method of claim 1, wherein said imaging device comprises a shutter having a shutter speed associated therewith, said shutter having a first mode wherein said shutter speed is setable by a user and a second mode wherein said shutter speed is setable by said imaging device; wherein said shutter speed is an at least one adjustable parameter; said imaging device further comprising an aperture having a first mode wherein the aperture size is setable by a user and a second mode wherein said aperture size is setable by said imaging device; wherein said aperture size is an at least one adjustable parameter; and wherein said image data comprises a plurality of pixel values; and wherein said method further comprises:
determining whether said shutter speed used during generation of said image data was set by a user;
determining whether said aperture size used during generation of said image data was set by a user;
analyzing said pixel values to determine if a preselected number of said pixel values exceed a preselected value;
wherein said determining a second value comprises determining that said aperture size should be reduced during generation of subsequent image data if said aperture size was set by a user and is greater than a preselected value based on said shutter speed, if said shutter speed was set by a user and if said preselected number of said pixel values exceed said preselected value.
33. The method of claim 32 and further comprising displaying information on said imaging device indicating that an image represented by said image data may be over exposed.
34. The method of claim 32 and further comprising displaying information on said imaging device indicating that said aperture size should be reduced during generation of subsequent image data.
35. The method of claim 1, wherein said imaging device comprises a shutter having a shutter speed associated therewith, said shutter having a first mode wherein said shutter speed is setable by a user and a second mode wherein said shutter speed is setable by said imaging device; wherein said shutter speed is an at least one adjustable parameter; said imaging device further comprising an aperture having a first mode wherein the aperture size is setable by a user and a second mode wherein said aperture size is setable by said imaging device; wherein said aperture size is an at least one adjustable parameter; and wherein said image data comprises a plurality of pixel values; and wherein said method further comprises:
determining whether said shutter speed used during generation of said image data was set by a user;
determining whether said aperture size used during generation of said image data was set by a user;
analyzing said pixel values to determine if a preselected number of said pixel values are less than a preselected value;
wherein said determining a second value comprises determining that said shutter speed should be reduced during generation of subsequent image data if said shutter speed is faster than a preselected value based on said aperture size, if said shutter speed and said aperture size were set by a user; and if said preselected number of pixel values are less than said preselected value.
36. The method of claim 35 and further comprising displaying information on said imaging device indicating that an image represented by said image data may be under exposed.
37. The method of claim 35 and further comprising displaying information on said imaging device indicating that said shutter speed should be reduced during generation of subsequent image data.
38. The method of claim 1, wherein said imaging device comprises a shutter having a shutter speed associated therewith, said shutter having a first mode wherein said shutter speed is setable by a user and a second mode wherein said shutter speed is setable automatically by said imaging device; wherein said shutter speed is said at least one adjustable parameter; said imaging device further comprising an aperture having a first mode wherein the aperture size is setable by a user and a second mode wherein said aperture size is setable automatically by said imaging device; wherein said aperture size is said at least one adjustable parameter; and wherein said image data comprises a plurality of pixel values; and wherein said method further comprises:
determining whether said shutter speed used during generation of said image data was set by a user;
determining whether said aperture size used during generation of said image data was set by a user;
analyzing said pixel values to determine if a preselected number of said pixel values exceed a preselected value;
wherein said determining a second value comprises determining that said shutter speed should be increased during generation of subsequent image data if said shutter speed is less than a preselected value based on said aperture size, if said preselected number of pixel values exceed said preselected value, and if said shutter speed and said aperture size were set by a user.
39. The method of claim 38 and further comprising displaying information on said imaging device indicating that an image represented by said image data may be over exposed.
40. The method of claim 38 and further comprising displaying information on said imaging device indicating that said shutter speed should be increased during generation of subsequent image data.
41. The method of claim 1, wherein said analyzing comprises analyzing a plurality of shutter speeds and at least one exposure compensation associated with said imaging device to determine whether a plurality of images represented by said image data require more exposure.
42. The method of claim 1, wherein said generating image data comprises generating image data representative of a plurality of images, said generating being accomplished within a preselected period; wherein said imaging device comprises a shutter speed associated therewith and wherein said images are captured using shutter speeds between a first shutter speed and a second shutter speed; wherein said imaging device comprises an adjustable exposure compensation, said exposure compensation being an at least one adjustable parameter; wherein said method further comprises:
analyzing pixel values representative of said plurality of images;
wherein said determining a second value comprises determining that a greater exposure compensation should be used during generation of subsequent image data if a preselected number of said pixel values are less than a preselected value.
43. The method of claim 42, wherein said determining a second value comprises determining that greater exposure compensation should be used during capture of subsequent images if said image is over exposed by an amount equal to or greater than approximately three stops.
44. The method of claim 42 and further comprising displaying information on said imaging device indicating that greater exposure compensation may be desired.
45. The method of claim 42 and further comprising displaying information on said imaging device indicating that a greater exposure compensation may be used during generation of subsequent image data.
46. The method of claim 1, wherein said analyzing comprises analyzing a plurality of shutter speeds and at least one exposure compensation associated with said imaging device to determine whether a plurality of images represented by said image data require less exposure.
47. The method of claim 1, wherein said generating image data comprises generating image data representative of a plurality of images, said generating being accomplished within a preselected period; wherein said imaging device comprises a shutter speed associated therewith and wherein said images are captured using shutter speeds between a first shutter speed and a second shutter speed; wherein said imaging device comprises an exposure compensation, said exposure compensation being said at least one adjustable parameter; wherein said method further comprises:
analyzing pixel values representative of said plurality of images;
wherein said determining a second value comprises determining that a reduced exposure compensation should be used if a preselected number of said pixel values are greater than a preselected value.
48. The method of claim 47, wherein said determining a second value comprises determining that a reduced exposure compensation should be used during generation of subsequent image data if the image is under exposed by an amount equal to or less than approximately three stops.
49. The method of claim 47 and further comprising displaying information on said imaging device indicating that more under exposure may be desired during generation of subsequent image data.
50. The method of claim 47 and further comprising displaying information on said imaging device indicating that a reduced exposure compensation may be used during generation of subsequent image data.
51. The method of claim 1, wherein said analyzing comprises analyzing at least one shutter speed associated with said imaging device to determine whether an image represented by said image data is over exposed.
52. The method of claim 1, wherein said generating image data comprises generating image data representative of a plurality of images, said generating being accomplished within a preselected period; wherein said imaging device has a shutter speed associated therewith and wherein said images are captured using shutter speeds between a first shutter speed and a second shutter speed; wherein said shutter speed is an at least one adjustable parameter; wherein said method further comprises:
analyzing pixel values representative of said image;
wherein said determining a second value comprises determining a faster first shutter speed or a faster second shutter speed for generation of subsequent image data if a preselected number of said pixel values representative of one of said plurality of images are greater than a preselected value.
53. The method of claim 52 and further comprising displaying information on said imaging device indicating that an image represented by said image data may be overexposed.
54. The method of claim 52 and further comprising displaying information on said imaging device indicating that a faster first shutter speed may be used during generation of subsequent image data.
55. The method of claim 52 and further comprising displaying information on said imaging device indicating that a faster second shutter speed may be used during generation of subsequent image data.
56. The method of claim 1, wherein said analyzing comprises analyzing at least one shutter speed associated with said imaging device to determine whether an image represented by said image data is under exposed.
57. The method of claim 1, wherein said generating image data comprises generating image data representative of a plurality of images, said generating being accomplished within a preselected period; wherein said imaging device has a shutter speed associated therewith and wherein said images are captured using shutter speeds between a first shutter speed and a second shutter speed; wherein said shutter speed is said at least one adjustable parameter; wherein said method further comprises:
analyzing pixel values representative of said image;
wherein said determining a second value comprises determining a slower first shutter speed or a slower second shutter speed for generation of subsequent image data if a preselected number of said pixel values representative of one of said plurality of images are less than a preselected value.
58. The method of claim 57 and further comprising displaying information on said imaging device indicating that an image represented by said image data may be underexposed.
59. The method of claim 57 and further comprising displaying information on said imaging device indicating that a slower first shutter speed may be used during generation of subsequent image data.
60. The method of claim 57 and further comprising displaying information on said imaging device indicating that a slower second shutter speed may be used during generation of subsequent image data.
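Claims 52-60 above describe counting pixel values against preselected brightness thresholds and suggesting a faster shutter speed when too many pixels are bright (overexposure) or a slower one when too many are dark (underexposure). A minimal sketch of that test, with all threshold values chosen arbitrarily for illustration rather than taken from the specification:

```python
def suggest_shutter_adjustment(pixels, bright_thresh=240, dark_thresh=15,
                               count_thresh=1000):
    """Illustrative sketch of the pixel-threshold tests in claims 52 and 57.

    `pixels` is a flat iterable of 8-bit luminance values. The thresholds
    are assumptions for this example: many values near 255 suggest
    overexposure (a faster shutter speed next time); many values near 0
    suggest underexposure (a slower shutter speed next time).
    """
    bright = sum(1 for p in pixels if p > bright_thresh)
    dark = sum(1 for p in pixels if p < dark_thresh)
    if bright > count_thresh:
        return "faster"   # claim 52: image may be overexposed
    if dark > count_thresh:
        return "slower"   # claim 57: image may be underexposed
    return "keep"
```

The same counting pass also drives the display messages of claims 53-55 and 58-60, which report the suspected over- or underexposure to the user.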
61. The method of claim 1, wherein said generating comprises generating image data representative of a plurality of images and wherein said analyzing comprises analyzing the status of a strobe associated with said imaging device during generation of said image data to determine whether one or more of said images may be blurry.
62. The method of claim 1, wherein said generating image data comprises generating image data representative of a plurality of images, said generating being accomplished within a preselected period; wherein said imaging device has a shutter speed associated therewith and wherein said images are captured using shutter speeds between a first shutter speed and a second shutter speed; wherein said shutter speed is an at least one adjustable parameter; wherein said method further comprises:
determining whether a strobe associated with said imaging device activated during capture of at least one of said images and did not activate during capture of at least one of said images;
wherein said determining a second value comprises determining that said strobe should activate during capture of all images, or should not activate during capture of all images, during a subsequent period if said strobe activated during capture of at least one of said images and did not activate during capture of at least one of said images.
63. The method of claim 62 and further comprising displaying information on said imaging device indicating that at least one of said images may be blurry.
64. The method of claim 62 and further comprising displaying an indication on said imaging device, wherein said indication comprises a message that one of said plurality of images may be blurry.
65. The method of claim 62, and further comprising:
calculating a hand held limit, said hand held limit being based, at least in part, on a zoom setting and exposure time of said imaging device during image capture; and
providing an indication on said imaging device that one of said plurality of images may have poor quality if said hand held limit has been exceeded.
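Claim 65 above ties a "hand held limit" to the zoom setting and exposure time used during capture. The patent does not give the formula, so the sketch below assumes the common reciprocal rule of thumb: handheld blur becomes likely once the exposure is longer than one over the effective focal length in seconds.

```python
def exceeds_handheld_limit(exposure_time_s, focal_length_mm, crop_factor=1.0):
    """Illustrative sketch of the hand-held-limit test in claim 65.

    The reciprocal rule used here (blur likely when exposure_time >
    1 / effective focal length) is an assumption, not the patent's
    formula. `crop_factor` converts the lens focal length to its
    35 mm-equivalent value.
    """
    limit_s = 1.0 / (focal_length_mm * crop_factor)
    return exposure_time_s > limit_s
```

When the limit is exceeded, the device would display the poor-quality indication described in claim 65 and the stabilization suggestion of claim 66.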
66. The method of claim 64 and further comprising displaying an indication on said imaging device suggesting stabilizing said imaging device during a subsequent period in which images are captured.
67. The method of claim 1, wherein said analyzing comprises analyzing an exposure compensation setting associated with said imaging device to determine whether an image represented by said image data is over exposed.
68. The method of claim 1, wherein exposure compensation is an at least one adjustable parameter; wherein said image data comprises a plurality of pixel values; wherein said method further comprises determining whether a preselected number of pixel values exceed a preselected value; and wherein said determining a second value comprises determining a lower exposure compensation associated with the generation of subsequent image data if said preselected number of pixel values exceed said preselected value.
69. The method of claim 68, wherein said determining whether a preselected number of pixel values exceed a preselected value comprises determining whether said exposure compensation corresponds to an exposure compensation greater than 0.6.
70. The method of claim 68 and further comprising displaying information on said imaging device suggesting reducing said exposure compensation associated with the generation of subsequent image data.
71. The method of claim 68 and further comprising displaying information on said imaging device indicating that an image represented by said image data may be over exposed.
72. The method of claim 1, wherein said analyzing comprises analyzing an exposure compensation setting associated with said imaging device to determine whether an image represented by said image data is under exposed.
73. The method of claim 1, wherein exposure compensation is an at least one adjustable parameter; wherein said image data comprises a plurality of pixel values; wherein said method further comprises determining whether a preselected number of pixel values are less than a preselected value; and wherein said determining a second value comprises determining a greater exposure compensation associated with the generation of subsequent image data if said preselected number of pixel values are less than said preselected value.
74. The method of claim 73, wherein said determining whether a preselected number of pixel values are less than a preselected value comprises determining whether the exposure compensation corresponds to an exposure compensation less than negative 0.6.
75. The method of claim 73 and further comprising displaying information on said imaging device suggesting increasing said exposure compensation.
76. The method of claim 73 and further comprising displaying information on said imaging device indicating that an image represented by said image data may be under exposed.
77. The method of claim 73, wherein said imaging device further comprises a strobe, and wherein said determining a second value further comprises determining a greater exposure compensation associated with the generation of subsequent image data if said preselected number of pixel values are less than said preselected value and if said strobe activated during generation of said image data.
78. The method of claim 1, wherein said analyzing comprises analyzing the power output of a strobe used during the generation of said image data to determine whether an image represented by said image data is under exposed.
79. The method of claim 1, wherein said imaging device comprises a strobe; wherein said imaging device has an exposure compensation associated therewith, said exposure compensation being an at least one adjustable parameter; wherein said method further comprises:
analyzing pixel values associated with said image data to determine whether a preselected number of pixel values are less than a preselected value;
determining whether said strobe activated during generation of said image data;
determining the power output of said strobe during said generation of image data;
wherein said determining a second value comprises determining that exposure compensation should be increased during generation of subsequent image data if said preselected number of pixel values were less than said preselected value and said power output of said strobe was less than a preselected power.
80. The method of claim 78 and further comprising displaying information on said imaging device indicating that an image represented by said image data may be under exposed.
81. The method of claim 78 and further comprising displaying information on said imaging device suggesting increasing said exposure compensation during generation of subsequent image data.
82. The method of claim 1, wherein the generation of image data comprises sampling a portion of an image represented by said image data to determine at least one setting to be applied to the remainder of said image, said analyzing comprising analyzing said portion of said image to determine whether another portion should be used during generation of subsequent image data.
83. The method of claim 1, wherein said generating image data further comprises sampling a preselected portion of said image and wherein exposure is based on said sampling, said portion of said image that is sampled being an at least one adjustable parameter; and further comprising determining whether the exposure of said image is out of a preselected range; and wherein said determining a second value of said at least one adjustable parameter comprises determining another portion of said image to be sampled in association with generating subsequent image data.
84. The method of claim 83, wherein said determining a second value comprises sampling a larger area of said image.
85. The method of claim 83, wherein said determining a second value comprises determining a second portion of said image to be sampled, said second portion not including the previous portion.
86. The method of claim 83, wherein said determining whether an exposure is out of a preselected range comprises determining whether a preselected number of pixel values representative of said image are greater than a preselected value.
87. The method of claim 83, wherein said determining whether an exposure is out of a preselected range comprises determining whether a preselected number of pixel values representative of said image are less than a preselected value.
88. The method of claim 83 and further comprising displaying information on said imaging device indicating that an image represented by said image data may be improperly exposed.
89. The method of claim 83 and further comprising displaying information on said imaging device suggesting correcting exposure errors during generation of subsequent image data.
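Claims 82-89 above describe metering from a sampled portion of the image and, when the exposure lands out of range, choosing a different portion for the next capture; claim 84 in particular samples a larger area. A sketch of that region adjustment, where the `(x, y, w, h)` rectangle format and the 25% growth factor are assumptions made for this example:

```python
def next_metering_region(region, image_w, image_h, grow=0.25):
    """Illustrative sketch of claim 84's 'sample a larger area'.

    Expands the metering rectangle `region` (x, y, width, height) by
    `grow` of its size on each side, clamped to the frame boundaries,
    so subsequent exposure is based on a larger sample of the scene.
    """
    x, y, w, h = region
    dx, dy = int(w * grow), int(h * grow)
    nx, ny = max(0, x - dx), max(0, y - dy)
    nw = min(image_w - nx, w + 2 * dx)
    nh = min(image_h - ny, h + 2 * dy)
    return (nx, ny, nw, nh)
```

Claim 85's alternative, choosing a second non-overlapping portion, would instead translate the rectangle rather than grow it.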
90. The method of claim 1, wherein said analyzing comprises analyzing a shutter speed associated with said imaging device and pixel values associated with said image data to determine whether an image represented by said image data is under exposed.
91. The method of claim 1, wherein said imaging device comprises an ISO speed associated therewith, said ISO speed being an at least one adjustable parameter; said method further comprising determining whether a preselected number of pixel values representative of said image are less than a preselected value; and wherein said determining a second value comprises determining a slower ISO speed if said preselected number of pixel values are less than a preselected value.
92. The method of claim 91 and further comprising displaying information on said imaging device indicating that an image represented by said image data may be under exposed.
93. The method of claim 91 and further comprising displaying information on said imaging device suggesting reducing said ISO speed during generation of subsequent image data.
94. The method of claim 91, wherein said imaging device comprises a zoom, said zoom being an at least one adjustable parameter; wherein said method further comprises determining a wider zoom if said preselected number of pixel values are less than a preselected value and said imaging device does not comprise said slower ISO speed.
95. The method of claim 1, wherein said imaging device comprises a zoom having a zoom setting associated therewith, said zoom setting being an at least one adjustable parameter; said method further comprising determining whether a preselected number of pixel values representative of said image are less than a preselected value; and wherein said determining a second value comprises determining a wider zoom setting if said preselected number of pixel values are less than a preselected value.
96. The method of claim 95 and further comprising displaying information on said imaging device indicating that an image represented by said image data may be under exposed.
97. The method of claim 95 and further comprising displaying information on said imaging device suggesting using a wider zoom during generation of subsequent image data.
98. The method of claim 1, wherein said analyzing comprises analyzing a strobe exposure compensation associated with said imaging device to determine whether an image represented by said image data is under exposed.
99. The method of claim 1, wherein said imaging device comprises a strobe, said strobe having a strobe exposure compensation associated therewith; said strobe exposure compensation being an at least one adjustable parameter; said method further comprising:
determining whether a preselected number of pixel values comprising said image data are less than a preselected value;
wherein said determining a second value comprises determining a greater strobe exposure compensation during generation of subsequent image data if said preselected number of pixel values comprising said image data are less than said preselected value.
100. The method of claim 99 and further comprising displaying text on said imaging device indicating that an image representative of said image data may be under exposed.
101. The method of claim 99 and further comprising displaying text on said imaging device suggesting using a greater strobe exposure compensation during generation of subsequent image data.
102. The method of claim 1, wherein said analyzing comprises analyzing a strobe exposure compensation associated with said imaging device to determine whether an image represented by said image data is over exposed.
103. The method of claim 1, wherein said imaging device comprises a strobe, said strobe having a strobe exposure compensation associated therewith; said strobe exposure compensation being an at least one adjustable parameter; said method further comprising:
determining whether a preselected number of pixel values comprising said image data are greater than a preselected value;
wherein said determining a second value comprises determining a lower strobe exposure compensation if said preselected number of pixel values comprising said image data are greater than said preselected value.
104. The method of claim 103 and further comprising displaying text on said imaging device indicating that an image representative of said image data may be over exposed.
105. The method of claim 103 and further comprising displaying text on said imaging device suggesting using a lower strobe exposure compensation during generation of subsequent image data.
106. The method of claim 1, wherein said analyzing comprises analyzing the exposure compensation associated with said generating based on the distance between a scene and said imaging device to determine whether an image represented by said image data is under exposed.
107. The method of claim 1, wherein said imaging device comprises a strobe, and an exposure compensation; said exposure compensation being an at least one adjustable parameter; said method further comprising:
determining the distance between said imaging device and an object to which said image data was generated during the generation of said image data;
determining the exposure compensation used during generation of said image data;
determining whether said distance was greater than a preselected distance;
wherein said determining a second value comprises determining a greater exposure compensation if said distance was greater than said preselected distance and said exposure compensation was less than a preselected value.
108. The method of claim 107 and further comprising displaying information on said imaging device indicating that an image representative of said image data may be under exposed.
109. The method of claim 107 and further comprising displaying text on said imaging device suggesting using a greater exposure compensation during generation of subsequent image data.
110. The method of claim 1, wherein said analyzing comprises analyzing the ISO speed associated with said generating said image data based on the distance between a scene and said imaging device to determine whether an image represented by said image data is under exposed.
111. The method of claim 1, wherein said imaging device comprises a strobe and an ISO speed associated with said generating; said ISO speed being an at least one adjustable parameter; said method further comprising:
determining the distance between said imaging device and an object to which said image data was generated during the generation of said image data;
determining the ISO speed used during generation of said image data;
determining whether said distance was greater than a preselected distance;
wherein said determining a second value comprises determining a faster ISO speed if said ISO speed used during generation of image data was less than a preselected speed, said distance was greater than said preselected distance and said strobe activated during generation of image data.
112. The method of claim 111 and further comprising displaying text on said imaging device indicating that an image representative of said image data may be under exposed.
113. The method of claim 111 and further comprising displaying text on said imaging device suggesting using a faster ISO speed during generation of subsequent image data.
114. The method of claim 1, wherein said analyzing comprises analyzing the distance between an object of which image data was generated and said imaging device to determine whether an image represented by said image data is under exposed if a strobe associated with said imaging device activated during generation of said image data.
115. The method of claim 1, wherein said imaging device comprises a strobe; said strobe having a first setting wherein said strobe activates during image capture and a second setting wherein said strobe does not activate during image capture; wherein a distance between an object to which image data is being generated and said imaging device is an at least one adjustable parameter; and wherein said method further comprises:
determining if said strobe activated during said image capture;
determining said distance between said object and said imaging device during generation of said image data;
determining if said distance exceeds a preselected distance; and
determining that a shorter distance should be used during subsequent image data generation if said strobe activated and said distance exceeded said preselected distance.
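Claim 115 above reduces to a simple conjunction: the strobe fired, but the subject sat beyond the strobe's effective reach, so the next shot should be taken from a shorter distance. A sketch of that check, where the 4 m maximum strobe range is an assumed value standing in for the patent's "preselected distance":

```python
def subject_too_far_for_strobe(strobe_fired, distance_m, max_range_m=4.0):
    """Illustrative sketch of claim 115: suggest a shorter subject
    distance for the next capture when the strobe activated but the
    subject was beyond its assumed effective range (here 4 m)."""
    return strobe_fired and distance_m > max_range_m
```

A `True` result would trigger the underexposure warning of claim 116 and the move-closer suggestion of claim 117.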
116. The method of claim 115 and further comprising displaying information on said imaging device indicating that an image representative of said image data may be under exposed.
117. The method of claim 115 and further comprising displaying information on said imaging device suggesting that the distance between an object being imaged and the imaging device should be reduced during generation of subsequent image data.
118. The method of claim 1, wherein said analyzing comprises analyzing the intensity of a strobe associated with said imaging device during generation of said image data to determine whether an image represented by said image data is over exposed.
119. The method of claim 1, wherein said imaging device comprises a strobe; said strobe having a first setting wherein said strobe activates during image capture and a second setting wherein said strobe does not activate during image capture; wherein an at least one adjustable parameter is said strobe setting; and wherein said method further comprises:
determining if said strobe activated during generation of said image data;
determining whether an image represented by said image data is exposed greater than a preselected amount;
wherein said determining a second value comprises determining that said strobe should not activate during generation of subsequent image data if said strobe activated during generation of said image data and an image represented by said image data is exposed greater than said preselected amount.
120. The method of claim 119 and further comprising displaying information on said imaging device indicating that an image representative of said image data may be over exposed.
121. The method of claim 119 and further comprising displaying information on said imaging device suggesting inactivating said strobe during generation of subsequent image data.
122. The method of claim 1, wherein said imaging device comprises a strobe; said strobe having a strobe exposure compensation associated therewith; wherein an at least one adjustable parameter is said strobe exposure compensation; and wherein said method further comprises:
determining said strobe exposure compensation during generation of said image data;
determining whether an image represented by said image data is exposed greater than a preselected amount;
wherein said determining a second value comprises determining that said strobe exposure compensation should be reduced if an image represented by said image data is exposed greater than said preselected amount.
123. The method of claim 122 and further comprising displaying information on said imaging device indicating that an image representative of said image data may be over exposed.
124. The method of claim 122 and further comprising displaying information on said imaging device suggesting reducing said strobe exposure compensation during generation of subsequent image data.
125. The method of claim 1, wherein said imaging device comprises a strobe; said strobe having a first setting wherein said strobe activates during image capture and a second setting wherein said strobe does not activate during image capture; wherein an at least one adjustable parameter is said strobe setting; and wherein said method further comprises:
determining if said strobe activated during generation of said image data;
determining a distance between an object to which image data is being generated and said imaging device;
wherein said determining a second value comprises determining that said strobe should not activate during generation of subsequent image data if said strobe activated during generation of said image data and said distance is less than a preselected value.
126. The method of claim 125 and further comprising displaying information on said imaging device indicating that an image representative of said image data may be over exposed.
127. The method of claim 125 and further comprising displaying information on said imaging device suggesting inactivating said strobe during generation of subsequent image data.
128. The method of claim 1, wherein said analyzing comprises analyzing the distance between a scene and said imaging device during generation of said image data and determining whether a strobe associated with said imaging device activated during generation of said image data to determine whether an image represented by said image data is over exposed.
129. The method of claim 1, wherein said imaging device comprises a strobe; said strobe having a first setting wherein said strobe activates during generation of image data and a second setting wherein said strobe does not activate during generation of image data; wherein a distance between an object to which image data is being generated and said imaging device is an at least one adjustable parameter; and wherein said method further comprises:
determining if said strobe activated during generation of said image data;
determining a first distance between said object and said imaging device at the time of said generating;
wherein said determining a second value comprises determining a second distance during generation of subsequent image data, said second distance being greater than said first distance, if said first distance is less than a preselected value and said strobe activated during said generating.
130. The method of claim 129 and further comprising displaying information on said imaging device indicating that an image representative of said image data may be over exposed.
131. The method of claim 129 and further comprising displaying information on said imaging device suggesting increasing said distance during generation of subsequent image data.
132. The method of claim 1, wherein said analyzing comprises determining whether a strobe associated with said imaging device activated during generation of said image data and determining the exposure time associated with said generating to determine whether an image represented by said image data is blurry.
133. The method of claim 1, wherein said imaging device comprises a strobe; said strobe having a first setting wherein said strobe activates during image capture and a second setting wherein said strobe does not activate during image capture; wherein the stability of said imaging device during image data generation is an at least one adjustable parameter; and wherein said method further comprises:
determining if said strobe activated during generation of said image data;
determining the exposure time of said generating;
wherein said determining a second value comprises determining that a greater stability of said imaging device may be used during generation of subsequent image data.
134. The method of claim 133 and further comprising displaying information on said imaging device indicating that an image representative of said image data may be blurry.
135. The method of claim 133 and further comprising displaying information on said imaging device suggesting increasing said stability of said imaging device during generation of subsequent image data.
136. The method of claim 1, wherein said analyzing comprises analyzing return strobe light to determine whether a strobe associated with said imaging device was blocked during generation of said image data.
137. The method of claim 1, wherein said imaging device comprises a strobe; said at least one adjustable parameter being clearance between said strobe and an object of which image data was generated, said method further comprising:
determining whether said object was within a preselected distance from said strobe during generation of said image data;
determining whether said image data includes less than a preselected amount of return light from said strobe;
wherein said determining a second value of said at least one adjustable parameter comprises determining that an obstruction is located between said strobe and said object if said object was within said preselected distance from said strobe during generation of said image data and said image data includes less than said preselected amount of return light from said strobe.
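Claims 136-137 above infer a blocked strobe (for instance, a finger over the flash) from two facts taken together: the subject was close enough that strobe light should have returned, yet the captured frame shows little of it. A sketch of that inference, with the 3 m "near" distance and the return-light level threshold both assumed for illustration:

```python
def strobe_possibly_blocked(distance_m, mean_return_level,
                            near_m=3.0, min_level=40):
    """Illustrative sketch of claims 136-137: an obstruction between the
    strobe and the subject is suspected when the subject was within an
    assumed near range yet the mean 8-bit brightness of the returned
    frame fell below an assumed minimum level."""
    return distance_m <= near_m and mean_return_level < min_level
```

When this returns `True`, the device would display the obstruction warning of claim 138.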
138. The method of claim 137 and further comprising displaying information on said imaging device indicating that an obstruction exists between said strobe and said object.
139. The method of claim 1, wherein said imaging device comprises a strobe, said at least one adjustable parameter being reflectivity of a scene in which said image data was generated, said method further comprising:
determining whether said strobe activated during generation of said image data;
determining whether an image represented by said image data includes a portion that is brighter by a predetermined amount than at least one other portion of said image, said brighter portion including light reflected from said strobe;
wherein said determining a second value comprises determining that said reflectivity of said scene should be reduced during generation of subsequent image data.
140. The method of claim 139 and further comprising displaying information on said imaging device indicating that said image may include a reflection of said strobe.
141. The method of claim 139 and further comprising displaying information on said imaging device suggesting reducing the reflectivity of said scene during generation of subsequent image data.
142. The method of claim 139 and further comprising displaying information on said imaging device suggesting generating subsequent image data at an angle relative to reflective objects in said scene.
143. The method of claim 1, wherein said analyzing comprises analyzing a color temperature setting to determine whether another color temperature setting should be used during generation of subsequent image data.
144. The method of claim 1, wherein a user selects a first color temperature associated with a light source that illuminates a scene during generation of said image data; and wherein an at least one adjustable parameter is the selection of said color temperature; said method further comprising:
processing said image data based, at least in part, on said first color temperature;
determining a second color temperature associated with a light source illuminating said scene during generation of said image data, wherein said imaging device selects said second color temperature;
comparing said first color temperature to said second color temperature;
wherein said determining a second value of said at least one adjustable parameter comprises selecting said second color temperature as a basis for processing of subsequently generated image data if said first color temperature is different than said second color temperature.
145. The method of claim 144 and further comprising displaying information on said imaging device indicating said second color temperature.
146. The method of claim 144, wherein said determining said second color temperature comprises analyzing ambient light during said generating.
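Claims 143-146 above compare the user-selected color temperature with one the device itself measures from ambient light, and switch to the measured value for subsequent processing when the two disagree. A sketch of that comparison, where the 500 K mismatch tolerance is an assumption for this example:

```python
def choose_color_temperature(user_kelvin, measured_kelvin, tolerance_k=500):
    """Illustrative sketch of claim 144: if the user-selected white
    balance differs from the device-measured one by more than an
    assumed tolerance, prefer the measured value for processing
    subsequently generated image data."""
    if abs(user_kelvin - measured_kelvin) > tolerance_k:
        return measured_kelvin
    return user_kelvin
```

For example, a user who left the camera on a daylight setting (~5500 K) under tungsten light (~3200 K) would have the measured 3200 K selected, and claim 145's display would report it.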
147. The method of claim 1, wherein said imaging device comprises a strobe; wherein the stability of said imaging device during said generation of image data is an at least one adjustable parameter; said method further comprising:
determining if said strobe activated during said generation of image data;
determining the shutter speed during said generation of image data;
wherein said determining a second value comprises determining that the stability of said imaging device relative to an object of which image data was generated should be increased during generation of subsequent image data if said strobe did not activate during said generation and said shutter speed was slower than a preselected speed.
148. The method of claim 147, and further comprising displaying information on said imaging device indicating that an image represented by said image data may be blurry.
149. The method of claim 147, and further comprising displaying information on said imaging device indicating that the stability of said imaging device may be increased during said generation of subsequent image data.
150. The method of claim 1, wherein said imaging device comprises a strobe and a zoom, and wherein the status of said strobe during said generation of image data is an at least one adjustable parameter; said method further comprising:
determining if said strobe activated during said generation of image data;
determining the ambient light intensity during said generation of image data;
determining the zoom setting during generation of said image data;
wherein said determining a second value comprises determining that said strobe should activate during generation of subsequent image data if said strobe did not activate during said generation, said ambient light intensity was less than a preselected intensity, and said zoom setting was greater than a preselected value.
151. The method of claim 150, and further comprising displaying information on said imaging device indicating that an image represented by said image data may be blurry.
152. The method of claim 150, and further comprising displaying information on said imaging device indicating that said strobe should activate during generation of subsequent image data.
153. The method of claim 150, wherein said determining a second value comprises determining that said strobe should activate during generation of subsequent image data if said strobe did not activate during said generation, said ambient light intensity was less than a preselected intensity, and said zoom setting was between a first value and a second value.
154. The method of claim 150, wherein said determining a second value comprises determining that said strobe should activate during generation of subsequent image data if said strobe did not activate during said generation, said ambient light intensity was less than a preselected intensity, and said zoom setting was less than a preselected value.
155. The method of claim 1, wherein said imaging device comprises a strobe and a zoom, and wherein the stability of said imaging device is an at least one adjustable parameter; said method further comprising:
determining if said strobe activated during said generation of image data;
determining the ambient light intensity during said generation of image data;
determining the zoom setting during generation of said image data;
wherein said determining a second value comprises determining that the stability of said imaging device should be increased relative to an object to which image data is generated if said strobe did not activate during said generation of image data, said ambient light intensity was less than a preselected intensity, and said zoom setting was wider than a preselected value.
156. The method of claim 155, and further comprising displaying information on said imaging device indicating that an image represented by said image data may be blurry.
157. The method of claim 155, and further comprising displaying information on said imaging device indicating that the stability of said imaging device should be increased during generation of subsequent image data.
158. The method of claim 155, wherein said determining a second value comprises determining that the stability of said imaging device should be increased relative to an object to which image data is generated if said strobe did not activate during said generation of image data, said ambient light intensity was less than a preselected intensity, and said zoom setting was between a first value and a second value.
159. The method of claim 1, wherein said analyzing comprises determining a zoom setting used by said imaging device during said generating to determine whether an image represented by said image data is blurry.
160. The method of claim 1, wherein said imaging device comprises a strobe and a zoom setting, and wherein said zoom setting is an at least one adjustable parameter; said method further comprising:
determining if said strobe activated during said generation of image data;
determining the zoom setting during generation of said image data;
wherein said determining a second value comprises determining that said zoom setting should be widened if said zoom setting was narrower than a preselected amount during generation of said image data, and if said strobe did not activate during said generation of image data.
161. The method of claim 160, and further comprising displaying information on said imaging device indicating that an image represented by said image data may be blurry.
162. The method of claim 160, and further comprising displaying information on said imaging device indicating said zoom setting should be widened during generation of subsequent image data.
163. The method of claim 1, wherein said imaging device comprises a strobe and said imaging device has a shutter speed associated therewith; wherein said generating image data comprises generating image data representative of a plurality of images within a preselected period; and wherein the stability of said imaging device during generation of image data is an at least one adjustable parameter; said method further comprising:
determining the shutter speed of said imaging device during said generating image data;
wherein said determining a second value comprises determining that said stability of said imaging device should be increased during subsequent generation of image data if said strobe did not activate during said generation of image data and if said shutter speed is slower than a preselected value.
164. The method of claim 163, and further comprising displaying information on said imaging device indicating that an image represented by said image data may be blurry.
165. The method of claim 163, and further comprising displaying information on said imaging device indicating said stability of said imaging device should be increased during generation of subsequent image data.
166. The method of claim 1, wherein said imaging device comprises a strobe and wherein said imaging device has an ISO speed associated therewith; wherein said generating image data comprises generating image data representative of a plurality of images within a preselected period; and wherein the intensity of ambient light of a scene to which image data is generated is an at least one adjustable parameter; said method further comprising:
determining the ISO speed of said imaging device during said generating image data;
wherein said determining a second value comprises determining that the intensity of said ambient light should be increased during subsequent generation of image data if said strobe did not activate during said generation of image data and if said ISO speed is slower than a preselected value.
167. The method of claim 166 and further comprising displaying information on said imaging device indicating that an image represented by said image data may be blurry.
168. The method of claim 166 and further comprising displaying information on said imaging device indicating the intensity of said ambient light should be increased during generation of subsequent image data.
169. The method of claim 166, wherein the increasing the intensity of ambient light comprises activating said strobe.
170. The method of claim 1, wherein said imaging device comprises a zoom and has an ISO speed associated therewith; wherein said generating image data comprises generating image data representative of a plurality of images within a preselected period; and wherein the setting of said zoom at the time said image data was generated is an at least one adjustable parameter; said method further comprising:
determining the ISO speed of said imaging device during said generating image data;
wherein said determining a second value comprises determining that the zoom setting should be widened during subsequent generation of image data if said zoom setting was set narrower than a preselected value during said generation of image data and if said ISO speed was slower than a preselected value.
171. The method of claim 170, and further comprising displaying information on said imaging device indicating that an image represented by said image data may be blurry.
172. The method of claim 170, and further comprising displaying information on said imaging device indicating that said zoom setting should be widened during generation of subsequent image data.
173. The method of claim 1, wherein said analyzing comprises analyzing the speed at which a switch associated with said imaging device was activated, said switch causing said imaging device to focus and generate said image data.
174. The method of claim 1, wherein said imaging device comprises a switch, said switch having a first position when no force is applied, a second position when a first force is applied, and a third position when a second force is applied, said second force being greater than said first force; wherein said switch being located in said second position causes said imaging device to focus on a scene; wherein said switch being in said third position causes said imaging device to generate said image data; and wherein the time between said switch being in said second position and said third position is an at least one adjustable parameter; said method further comprising:
measuring the time between said switch being in said second position and said third position;
wherein said determining a second value comprises determining that said time should be decreased during subsequent generation of image data if said time was greater than a preselected value.
175. The method of claim 174, and further comprising displaying information on said imaging device indicating that an image represented by said image data may be blurry.
176. The method of claim 174, and further comprising displaying information on said imaging device indicating the speed at which said switch is activated should be decreased during generation of subsequent image data.
177. The method of claim 1, wherein said analyzing comprises analyzing the focus of said imaging device to determine if an image represented by said image data is out of focus.
178. The method of claim 1, wherein said imaging device comprises a focus processor for focusing a scene; wherein the focus of a scene is an at least one adjustable parameter; said method further comprising:
determining whether the focus of said scene is greater than a preselected value using said focus processor during the generation of said image data;
wherein said determining a second value comprises increasing said focus during generation of subsequent image data if said focus was not greater than said preselected value.
179. The method of claim 178 and further comprising displaying information on said imaging device indicating that an image represented by said image data may be blurry.
180. The method of claim 178 and further comprising displaying information on said imaging device indicating that said focus should be increased during generation of subsequent image data.
181. The method of claim 178, wherein said preselected value constitutes a focus lock via said focus processor.
182. The method of claim 1, wherein said imaging device comprises a first focus mode and a second focus mode; wherein said imaging device focuses on objects located between a first distance and infinity from said imaging device when said imaging device is in said first focus mode; wherein said imaging device focuses on objects located between a second distance and infinity from said imaging device when said imaging device is in said second focus mode, said second distance being greater than said first distance; and wherein the focus mode of said imaging device during the generation of image data is an at least one adjustable parameter; said method further comprising:
determining if said imaging device attempted to focus on an object located less than said second distance during the generation of image data;
wherein said determining a second value comprises determining that said first focus mode should be used during generation of subsequent image data if said imaging device was in said second focus mode during the generation of image data and said imaging device attempted to focus on an object located less than said second distance from said imaging device during generation of image data.
183. The method of claim 182, and further comprising displaying information on said imaging device indicating that an image represented by said image data may be blurry.
184. The method of claim 182, and further comprising displaying information on said imaging device indicating that said focus mode should be changed during generation of subsequent image data.
185. The method of claim 1, wherein said imaging device comprises a focus processor and an indicator associated with said focus processor; wherein said indicator has a first mode wherein said indicator is enabled and a second mode wherein said indicator is disabled; wherein said indicator provides an indication of focus lock; and wherein the mode of said indicator is an at least one adjustable parameter; said method further comprising:
determining whether said imaging device achieved focus lock on a scene during generation of said image data;
determining whether the ambient light intensity of said scene is below a preselected value;
wherein said determining a second value comprises enabling said indicator during generation of subsequent image data if said indicator was disabled and said ambient light intensity was less than said preselected value.
186. The method of claim 185, and further comprising displaying information on said imaging device indicating that an image represented by said image data may be blurry.
187. The method of claim 186, and further comprising displaying information on said imaging device indicating that said indicator should be enabled during generation of subsequent image data.
188. The method of claim 1, wherein said imaging device comprises an adjustable aperture and a focal distance, said focal distance being less than a preselected distance; the aperture size being an at least one adjustable parameter;
wherein said generating image data comprises generating image data representative of a scene, the distance between said imaging device and said scene being less than said preselected distance;
wherein said determining a second value comprises determining a smaller aperture size during generation of subsequent image data.
189. The method of claim 188, and further comprising displaying information on said imaging device indicating that an image represented by said image data may be blurry.
190. The method of claim 188, and further comprising displaying information on said imaging device indicating that said aperture size should be reduced during generation of subsequent image data.
191. The method of claim 1, wherein said analyzing comprises analyzing an adaptive lighting setting to determine if an image represented by said image data is exposed between preselected values.
192. The method of claim 1, wherein said imaging device comprises a portrait mode and a landscape mode, the imaging device being configured to generate image data representative of close images in portrait mode and distant images in landscape mode; wherein said imaging device processes said image data based on an adaptive lighting setting, said adaptive lighting setting being an at least one adjustable parameter; and wherein said method further comprises:
determining whether said imaging device was in said portrait mode during the generation of said image data;
determining whether a strobe associated with said imaging device activated during the generation of said image data;
analyzing said image data to determine if a preselected number of pixel values associated with said image data are greater than a preselected value and if a preselected number of pixel values are less than a preselected value;
wherein said determining a second value comprises determining that said adaptive lighting should be set to low during generation of subsequent image data if said imaging device was in said portrait mode, if said strobe did not activate during generation of said image data, and if said preselected number of pixel values are greater than said preselected value and if said preselected number of pixel values are less than said preselected value.
193. The method of claim 192, and further comprising displaying information on said imaging device indicating that an image represented by said image data may contain shadows.
194. The method of claim 192, and further comprising displaying information on said imaging device indicating that said adaptive lighting setting may be set to low during generation of subsequent image data.
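The pixel-value test shared by claims 192-197 checks for simultaneous clipped highlights and deep shadows in the captured data. A minimal sketch follows, with assumed 8-bit thresholds and a count in place of the claims' "preselected" numbers of pixels and values:

```python
def scene_has_harsh_contrast(pixels, bright_thresh=230, dark_thresh=25,
                             min_count=1000):
    """Flag the scene when at least min_count pixel values exceed
    bright_thresh AND at least min_count fall below dark_thresh,
    i.e. both highlights and shadows are present at once.
    All three thresholds are assumptions, not claimed values."""
    bright = sum(1 for p in pixels if p > bright_thresh)
    dark = sum(1 for p in pixels if p < dark_thresh)
    return bright >= min_count and dark >= min_count
```

Claims 192-194 respond to a positive result by lowering the adaptive lighting setting; claims 195-197 respond by activating the strobe; the detection step is the same.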
195. The method of claim 1, wherein said imaging device comprises a portrait mode and a landscape mode, the imaging device being configured to generate image data representative of close images in portrait mode and distant images in landscape mode; wherein said imaging device comprises a strobe, the status of said strobe during generation of said image data being an at least one adjustable parameter; and wherein said method further comprises:
determining whether said imaging device was in said portrait mode during the generation of said image data;
determining whether a strobe associated with said imaging device activated during the generation of said image data;
analyzing said image data to determine if a preselected number of pixel values associated with said image data are greater than a preselected value and if a preselected number of pixel values are below a preselected value;
wherein said determining a second value comprises determining that said strobe should be activated during generation of subsequent image data if said imaging device was in said portrait mode, if said strobe did not activate during generation of said image data, and if said preselected number of pixel values are greater than said preselected value and if said preselected number of pixel values are below said preselected value.
196. The method of claim 195, and further comprising displaying information on said imaging device indicating that an image represented by said image data may contain shadows.
197. The method of claim 195, and further comprising displaying information on said imaging device indicating that said strobe should be activated during generation of subsequent image data.
198. The method of claim 1, wherein said analyzing comprises analyzing the status of a redeye correction processor to determine whether said redeye correction processor should be enabled.
199. The method of claim 1, wherein said imaging device comprises a portrait mode and a landscape mode, the imaging device being configured to generate image data representative of close images in portrait mode and distant images in landscape mode; wherein said imaging device comprises a redeye processor, wherein said redeye processor corrects eyes of subjects to which said image data was generated; the status of said redeye processor being an at least one adjustable parameter; and wherein said method further comprises:
determining whether said imaging device was in said portrait mode during the generation of said image data;
determining whether a strobe associated with said imaging device activated during the generation of said image data;
wherein said determining a second value comprises determining that said redeye processor should be activated during generation of subsequent image data if said imaging device was in said portrait mode, if said strobe activated during generation of said image data, and if said redeye processor was not activated during generation of said image data.
200. The method of claim 199 and further comprising displaying information on said imaging device indicating that eyes of a subject in a scene represented by said image data may appear red.
201. The method of claim 199 and further comprising displaying information on said imaging device indicating that said redeye processor should be activated during generation of subsequent image data.
202. The method of claim 1, wherein said imaging device comprises settings for contrast, sharpness, and saturation; wherein said contrast, sharpness, and saturation comprise said adjustable parameters; and wherein said method further comprises:
determining whether the contrast setting is greater than a preselected value;
determining whether the sharpness setting is greater than a preselected value;
determining whether the saturation setting is greater than a preselected value;
wherein determining a second value comprises determining that at least one of said contrast setting, said sharpness setting, or said saturation setting should be reduced during generation of subsequent image data if all of said contrast, sharpness, and saturation settings are greater than said preselected values.
203. The method of claim 202 and further comprising displaying information on said imaging device indicating that an image represented by said image data may appear unrealistic.
204. The method of claim 202 and further comprising displaying information on said imaging device indicating that the setting of at least one of said contrast, said sharpness, or said saturation should be decreased during generation of subsequent image data if all of said contrast, sharpness, and saturation settings are greater than said preselected values.
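Claims 202-216 all follow one pattern: if every enhancement setting exceeds its preselected value, at least one of them should be reduced. The sketch below covers the contrast/sharpness/saturation variant of claims 202-204; a single shared limit is an assumption for brevity, since the claims permit a distinct preselected value per setting:

```python
def settings_to_reduce(contrast: float, sharpness: float,
                       saturation: float, limit: float = 0.75) -> list[str]:
    """Return the settings eligible for reduction. Per claims 202-204,
    the recommendation fires only when ALL settings exceed the limit;
    which one(s) to actually reduce is left to the user."""
    if contrast > limit and sharpness > limit and saturation > limit:
        return ["contrast", "sharpness", "saturation"]
    return []
```

Claims 205-216 extend the same conjunction to subsets that include the adaptive lighting setting; only the list of inputs changes.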
205. The method of claim 1, wherein said imaging device comprises settings for contrast, sharpness, saturation, and adaptive lighting; wherein said contrast, sharpness, saturation, and adaptive lighting comprise said adjustable parameters; and wherein said method further comprises:
determining whether the contrast setting is greater than a preselected value;
determining whether the sharpness setting is greater than a preselected value;
determining whether the saturation setting is greater than a preselected value;
determining whether the setting of said adaptive lighting is greater than a preselected value;
wherein determining a second value comprises determining that at least one of said contrast setting, said sharpness setting, said saturation setting, or said adaptive lighting setting should be reduced during generation of subsequent image data if all of said contrast, sharpness, saturation, and adaptive lighting settings are greater than said preselected values.
206. The method of claim 205, and further comprising displaying information on said imaging device indicating that an image represented by said image data may appear unrealistic.
207. The method of claim 205, and further comprising displaying information on said imaging device indicating that the setting of at least one of said contrast, said sharpness, said saturation, or said adaptive lighting should be decreased during generation of subsequent image data.
208. The method of claim 1, wherein said imaging device comprises settings for contrast, sharpness, and adaptive lighting; wherein said contrast, sharpness, and adaptive lighting comprise said adjustable parameters; and wherein said method further comprises:
determining whether the contrast setting is greater than a preselected value;
determining whether the sharpness setting is greater than a preselected value;
determining whether the setting of said adaptive lighting is greater than a preselected value;
wherein determining a second value comprises determining that at least one of said contrast setting, said sharpness setting, or said adaptive lighting setting should be reduced during generation of subsequent image data if all of said contrast, sharpness, and adaptive lighting settings are greater than said preselected values.
209. The method of claim 208, and further comprising displaying information on said imaging device indicating that an image represented by said image data may appear unrealistic.
210. The method of claim 208, and further comprising displaying information on said imaging device indicating that the setting of at least one of said contrast, said sharpness, or said adaptive lighting should be decreased during generation of subsequent image data.
211. The method of claim 1, wherein said imaging device comprises settings for contrast, saturation, and adaptive lighting; wherein said contrast, saturation, and adaptive lighting comprise said adjustable parameters; and wherein said method further comprises:
determining whether the contrast setting is greater than a preselected value;
determining whether the saturation setting is greater than a preselected value;
determining whether the setting of said adaptive lighting is greater than a preselected value;
wherein determining a second value comprises determining that at least one of said contrast setting, said saturation setting, or said adaptive lighting setting should be reduced during generation of subsequent image data if all of said contrast, saturation, and adaptive lighting settings are greater than said preselected values.
212. The method of claim 211, and further comprising displaying information on said imaging device indicating that an image represented by said image data may appear unrealistic.
213. The method of claim 211, and further comprising displaying information on said imaging device indicating that the setting of at least one of said contrast, said saturation, or said adaptive lighting should be decreased during generation of subsequent image data.
214. The method of claim 1, wherein said imaging device comprises settings for sharpness, saturation, and adaptive lighting; wherein said sharpness, saturation, and adaptive lighting comprise said adjustable parameters; and wherein said method further comprises:
determining whether the sharpness setting is greater than a preselected value;
determining whether the saturation setting is greater than a preselected value;
determining whether the setting of said adaptive lighting is greater than a preselected value;
wherein determining a second value comprises determining that at least one of said sharpness setting, said saturation setting, or said adaptive lighting setting should be reduced during generation of subsequent image data if all of said sharpness, saturation, and adaptive lighting settings are greater than said preselected values.
215. The method of claim 214 and further comprising displaying information on said imaging device indicating that an image represented by said image data may appear unrealistic.
216. The method of claim 214 and further comprising displaying information on said imaging device indicating that the setting of at least one of said sharpness, said saturation, or said adaptive lighting should be decreased during generation of subsequent image data.
217. The method of claim 1, wherein said imaging device has an ISO speed associated therewith; said imaging device further comprising an adaptive lighting setting; wherein said ISO speed and said adaptive lighting setting are adjustable parameters; said method further comprising:
determining whether said ISO speed exceeded a preselected value during the generation of said image data;
determining whether said adaptive lighting setting exceeded a preselected value during the generation of said image data;
wherein said determining a second value comprises determining that either said ISO speed or said adaptive lighting setting should be reduced during generation of subsequent image data if said ISO speed exceeded said preselected value and said adaptive lighting setting exceeded said preselected value.
218. The method of claim 217, wherein said preselected ISO speed corresponds to about 400.
219. The method of claim 217, and further comprising displaying information on said imaging device indicating that an image represented by said image data may appear unrealistic.
220. The method of claim 217, and further comprising displaying information on said imaging device indicating that the setting of at least one of said adaptive lighting or said ISO speed should be decreased during generation of subsequent image data.
221. The method of claim 1, wherein said analyzing comprises determining the temperature of said imaging device.
222. The method of claim 1, wherein said imaging device comprises a temperature sensor and wherein the temperature of said imaging device at the time said image data was generated is an at least one adjustable parameter; wherein said method further comprises:
determining the temperature of said imaging device at the time said image data was generated;
wherein determining a second value comprises determining a lower temperature for said imaging device during generation of subsequent image data if said temperature of said imaging device was above a preselected temperature.
223. The method of claim 222, wherein said imaging device comprises a photodetector array and wherein said determining the temperature of said imaging device comprises determining the temperature of said photodetector array.
224. The method of claim 222 and further comprising displaying information on said imaging device indicating that said imaging device was too hot during generation of said image data.
225. The method of claim 222 and further comprising displaying information on said imaging device indicating that the temperature of said imaging device should be reduced during generation of subsequent image data.
226. The method of claim 222, wherein said imaging device comprises a display, and wherein said determining the temperature of said imaging device comprises determining the time in which said display has been active.
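Claims 222-226 flag captures made while the device was above a preselected temperature, and claim 226 permits inferring that temperature from how long the display has been active. In the sketch below, the threshold, ambient baseline, and linear heating rate are all assumptions; the claims do not specify any of them:

```python
MAX_SENSOR_TEMP_C = 45.0           # assumed "preselected temperature"
DEG_C_PER_MINUTE_DISPLAY = 0.5     # assumed heating rate while display is on
AMBIENT_C = 25.0                   # assumed starting temperature

def estimated_temp_from_display(minutes_on: float) -> float:
    """Estimate device temperature from display-on time (claim 226).
    A linear model is used purely for illustration."""
    return AMBIENT_C + DEG_C_PER_MINUTE_DISPLAY * minutes_on

def recommend_cooler(temp_c: float) -> bool:
    """Per claims 222, 224, and 225: flag the capture if the device
    was above the preselected temperature when the image was taken."""
    return temp_c > MAX_SENSOR_TEMP_C
```

Claims 227-231 combine the same temperature check with the adaptive lighting setting, recommending a lower setting rather than (or in addition to) a cooler device.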
227. The method of claim 1, wherein said imaging device comprises a temperature sensor and an adaptive lighting setting; wherein said adaptive lighting setting is an at least one adjustable parameter; wherein said method further comprises:
determining the temperature of said imaging device at the time said image data was generated;
determining said adaptive lighting setting at the time said image data was generated;
wherein determining a second value comprises determining a lower adaptive lighting setting during generation of subsequent image data if said temperature was above a preselected temperature and said adaptive lighting setting was set above a preselected value.
228. The method of claim 227, wherein said imaging device comprises a photodetector array and wherein said determining the temperature of said imaging device comprises determining the temperature of said photodetector array.
229. The method of claim 227 and further comprising displaying information on said imaging device indicating that said imaging device was too hot for said adaptive lighting setting during the generation of said image data.
230. The method of claim 227 and further comprising displaying information on said imaging device indicating that said adaptive lighting setting should be reduced during generation of subsequent image data.
231. The method of claim 227, wherein said imaging device comprises a display, and wherein said determining the temperature of said imaging device comprises determining the time in which said display has been active.
232. The method of claim 1, wherein said imaging device comprises a portrait mode, said portrait mode causing at least one setting of said imaging device to capture images within a predetermined distance from said imaging device; said imaging device further comprising a zoom; wherein the setting of said zoom is an at least one adjustable parameter; wherein said method further comprises:
determining if said imaging device was in said portrait mode during generation of said image data;
determining said setting of said zoom during generation of said image data;
wherein said determining a second value comprises determining a narrower zoom setting during generation of subsequent image data if said imaging device was in portrait mode and said zoom setting was wider than a preselected value during the generation of said image data.
233. The method of claim 232, and further comprising displaying information on said imaging device indicating that an image represented by said image data may appear unrealistic.
234. The method of claim 232, and further comprising displaying information on said imaging device indicating that said setting of said zoom should be narrowed during generation of subsequent image data.
235. The method of claim 1, wherein said imaging device comprises an action mode, said action mode causing at least one setting of said imaging device to capture images of moving objects; wherein said imaging device further comprises a switch, said switch having a first position when no force is applied, a second position when a first force is applied, and a third position when a second force is applied, said second force being greater than said first force; wherein said switch being located in said second position causes said imaging device to focus on a scene; wherein said switch being in said third position causes said imaging device to generate said image data; and wherein the time that said switch is in said second position is an at least one adjustable parameter; said method further comprising:
determining whether said imaging device was in said action mode during generation of said image data;
determining the time that said switch was in said second position;
wherein said determining a second value comprises determining that the time said switch is in said second position should be increased during generation of subsequent image data if said imaging device was in action mode during generation of said image data and if said time was less than a preselected time.
236. The method of claim 235, wherein said preselected time is about one and one half seconds.
237. The method of claim 235 and further comprising displaying information on said imaging device indicating that an image represented by said image data may be blurry.
238. The method of claim 235 and further comprising displaying information on said imaging device indicating that the time said switch is in said second position should be increased during generation of subsequent image data.
239. The method of claim 1, wherein said imaging device comprises a digital zoom, said digital zoom being an at least one adjustable parameter; said method further comprising:
determining whether said digital zoom was used during the generation of said image data;
wherein said determining a second value comprises eliminating said digital zoom during generation of subsequent image data.
240. The method of claim 239 and further comprising displaying information on said imaging device suggesting eliminating said digital zoom during generation of subsequent image data.
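Claims 239–240 reduce to a simple advisory rule: if digital zoom was used for the capture, suggest eliminating it for subsequent images. A hedged sketch of that rule (the function name and message wording are illustrative assumptions):

```python
def digital_zoom_advice(used_digital_zoom: bool):
    """Per claims 239-240: if digital zoom was used during capture, suggest
    eliminating it (e.g., by relying on optical zoom or moving closer) for
    subsequent image data. Returns None when no advice applies."""
    if used_digital_zoom:
        return "Consider turning off digital zoom for the next shot."
    return None
```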
241. A digital camera comprising:
at least one computer readable medium; and
computer readable program code stored on said at least one computer readable medium, said computer readable program code comprising instructions for operating said digital camera by:
generating image data, using an imaging device, said generating having at least one adjustable parameter associated therewith, said at least one first adjustable parameter being set at a first value;
analyzing said first value of said at least one first adjustable parameter; and
determining a second value of said at least one adjustable parameter based on said analyzing.
242. The digital camera of claim 241, wherein said first value is associated with said image data.
243. The digital camera of claim 241, and further comprising displaying information on said imaging device related to setting said at least one adjustable parameter to said second value.
244. A digital camera comprising:
generating means for generating image data, using an imaging device, said generating having at least one adjustable parameter associated therewith, said at least one first adjustable parameter being set at a first value;
analyzing means for analyzing said first value of said at least one first adjustable parameter; and
determining means for determining a second value of said at least one adjustable parameter based on said analyzing.
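Taken together, the apparatus claims 241–244 describe a generate / analyze / determine loop: capture image data with an adjustable parameter at a first value, analyze that first value, and determine a second value for subsequent captures. A minimal, hypothetical sketch of that loop; the exposure-time interpretation, the preselected target, and the doubling rule are illustrative assumptions, not taken from the patent:

```python
def determine_second_value(first_value: float, preselected_target: float) -> float:
    """Analyze the first value of an adjustable parameter (claims 241, 244)
    and determine a second value for use in subsequent image generation."""
    if first_value < preselected_target:
        # E.g., lengthen an exposure time that fell below the target,
        # without overshooting the target itself.
        return min(first_value * 2.0, preselected_target)
    return first_value

# Example: a 1/100 s exposure analyzed against a 1/50 s target.
print(determine_second_value(0.01, 0.02))  # 0.02
```

The second value could then be applied automatically or, as in claim 243, surfaced to the user as displayed information suggesting the new setting.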
US11/054,291 2003-06-12 2005-02-08 System and method for analyzing a digital image Abandoned US20050212955A1 (en)

Priority Applications (3)

Application Number Priority Date Filing Date Title
US11/054,291 US20050212955A1 (en) 2003-06-12 2005-02-08 System and method for analyzing a digital image
US11/412,155 US20060239674A1 (en) 2003-06-12 2006-04-26 System and method for analyzing a digital image
US12/684,505 US8780232B2 (en) 2003-06-12 2010-01-08 System and method for analyzing a digital image

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
US10/461,600 US20040252217A1 (en) 2003-06-12 2003-06-12 System and method for analyzing a digital image
US11/054,291 US20050212955A1 (en) 2003-06-12 2005-02-08 System and method for analyzing a digital image

Related Parent Applications (2)

Application Number Title Priority Date Filing Date
US10/461,600 Continuation US20040252217A1 (en) 2003-06-12 2003-06-12 System and method for analyzing a digital image
US10/461,600 Continuation-In-Part US20040252217A1 (en) 2003-06-12 2003-06-12 System and method for analyzing a digital image

Related Child Applications (2)

Application Number Title Priority Date Filing Date
US11/412,155 Continuation-In-Part US20060239674A1 (en) 2003-06-12 2006-04-26 System and method for analyzing a digital image
US12/684,505 Division US8780232B2 (en) 2003-06-12 2010-01-08 System and method for analyzing a digital image

Publications (1)

Publication Number Publication Date
US20050212955A1 true US20050212955A1 (en) 2005-09-29

Family

ID=33511283

Family Applications (3)

Application Number Title Priority Date Filing Date
US10/461,600 Abandoned US20040252217A1 (en) 2003-06-12 2003-06-12 System and method for analyzing a digital image
US11/054,291 Abandoned US20050212955A1 (en) 2003-06-12 2005-02-08 System and method for analyzing a digital image
US12/684,505 Active 2026-10-16 US8780232B2 (en) 2003-06-12 2010-01-08 System and method for analyzing a digital image

Family Applications Before (1)

Application Number Title Priority Date Filing Date
US10/461,600 Abandoned US20040252217A1 (en) 2003-06-12 2003-06-12 System and method for analyzing a digital image

Family Applications After (1)

Application Number Title Priority Date Filing Date
US12/684,505 Active 2026-10-16 US8780232B2 (en) 2003-06-12 2010-01-08 System and method for analyzing a digital image

Country Status (3)

Country Link
US (3) US20040252217A1 (en)
JP (1) JP2005006330A (en)
DE (1) DE102004007649A1 (en)

Cited By (63)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20050219386A1 (en) * 2004-04-06 2005-10-06 Stavely Donald J Imaging device with burst zoom mode
US20050219385A1 (en) * 2004-03-25 2005-10-06 Fuji Photo Film Co., Ltd. Device for preventing red eye, program therefor, and recording medium storing the program
US20060047704A1 (en) * 2004-08-31 2006-03-02 Kumar Chitra Gopalakrishnan Method and system for providing information services relevant to visual imagery
US20060158532A1 (en) * 2005-01-19 2006-07-20 Fuji Photo Film Co., Ltd. Image capturing apparatus and image capturing method
US20070005490A1 (en) * 2004-08-31 2007-01-04 Gopalakrishnan Kumar C Methods and System for Distributed E-commerce
US20070077053A1 (en) * 2005-09-30 2007-04-05 Casio Computer Co., Ltd. Imaging device, imaging method and program
US20070229850A1 (en) * 2006-04-04 2007-10-04 Boxternal Logics, Llc System and method for three-dimensional image capture
US20070263112A1 (en) * 2006-04-27 2007-11-15 Sony Corporation Image pickup apparatus and image pickup method as well as program
US20080056706A1 (en) * 2006-08-29 2008-03-06 Battles Amy E Photography advice based on captured image attributes and camera settings
US20080088728A1 (en) * 2006-09-29 2008-04-17 Minoru Omaki Camera
US20080187235A1 (en) * 2006-10-19 2008-08-07 Sony Corporation Image processing apparatus, imaging apparatus, imaging processing method, and computer program
US20090016565A1 (en) * 2007-07-11 2009-01-15 Sriram Kulumani Image analysis
US20090022414A1 (en) * 2007-07-20 2009-01-22 Microsoft Corporation High dynamic range image hallucination
US20090073306A1 (en) * 2007-09-13 2009-03-19 Samsung Electronics Co., Ltd. Method, medium, and apparatus for setting exposure time
US20090172756A1 (en) * 2007-12-31 2009-07-02 Motorola, Inc. Lighting analysis and recommender system for video telephony
US7689009B2 (en) 2005-11-18 2010-03-30 Fotonation Vision Ltd. Two stage detection for photographic eye artifacts
US7738015B2 (en) 1997-10-09 2010-06-15 Fotonation Vision Limited Red-eye filter method and apparatus
US20100188533A1 (en) * 2009-01-26 2010-07-29 Canon Kabushiki Kaisha Image sensing apparatus and control method thereof
US20100214483A1 (en) * 2009-02-24 2010-08-26 Robert Gregory Gann Displaying An Image With An Available Effect Applied
US7804531B2 (en) 1997-10-09 2010-09-28 Fotonation Vision Limited Detecting red eye filter and apparatus using meta-data
US20100255875A1 (en) * 2007-11-22 2010-10-07 Keisuke Oozeki Imaging device, information processing terminal, mobile telephone, program, and light emission control method
US20100259649A1 (en) * 2009-04-10 2010-10-14 Kazuki Aisaka Photographing apparatus and method, and program
US7865036B2 (en) 2005-11-18 2011-01-04 Tessera Technologies Ireland Limited Method and apparatus of correcting hybrid flash artifacts in digital images
US20110069179A1 (en) * 2009-09-24 2011-03-24 Microsoft Corporation Network coordinated event capture and image storage
US7916190B1 (en) 1997-10-09 2011-03-29 Tessera Technologies Ireland Limited Red-eye filter method and apparatus
US7920723B2 (en) 2005-11-18 2011-04-05 Tessera Technologies Ireland Limited Two stage detection for photographic eye artifacts
US20110093264A1 (en) * 2004-08-31 2011-04-21 Kumar Gopalakrishnan Providing Information Services Related to Multimodal Inputs
US20110092251A1 (en) * 2004-08-31 2011-04-21 Gopalakrishnan Kumar C Providing Search Results from Visual Imagery
US20110096187A1 (en) * 2003-06-26 2011-04-28 Tessera Technologies Ireland Limited Digital Image Processing Using Face Detection Information
US7962629B2 (en) 2005-06-17 2011-06-14 Tessera Technologies Ireland Limited Method for establishing a paired connection between media devices
US7965875B2 (en) 2006-06-12 2011-06-21 Tessera Technologies Ireland Limited Advances in extending the AAM techniques from grayscale to color images
US7970182B2 (en) 2005-11-18 2011-06-28 Tessera Technologies Ireland Limited Two stage detection for photographic eye artifacts
US7995804B2 (en) 2007-03-05 2011-08-09 Tessera Technologies Ireland Limited Red eye false positive filtering using face location and orientation
US8000526B2 (en) 2007-11-08 2011-08-16 Tessera Technologies Ireland Limited Detecting redeye defects in digital images
US8036460B2 (en) 2004-10-28 2011-10-11 DigitalOptics Corporation Europe Limited Analyzing partial face regions for red-eye detection in acquired digital images
US8055067B2 (en) 2007-01-18 2011-11-08 DigitalOptics Corporation Europe Limited Color segmentation
US8081254B2 (en) 2008-08-14 2011-12-20 DigitalOptics Corporation Europe Limited In-camera based method of detecting defect eye with high accuracy
US20120069044A1 (en) * 2006-03-30 2012-03-22 Hannstar Display Corporation Dynamic gamma control method for lcd
US8170294B2 (en) 2006-11-10 2012-05-01 DigitalOptics Corporation Europe Limited Method of detecting redeye in a digital image
US20120113298A1 (en) * 2009-02-08 2012-05-10 Wan-Yu Chen Image evaluation method, image capturing method and digital camera thereof for evaluating and capturing images according to composition of the images
US8184900B2 (en) 2006-02-14 2012-05-22 DigitalOptics Corporation Europe Limited Automatic detection and correction of non-red eye flash defects
US8212864B2 (en) 2008-01-30 2012-07-03 DigitalOptics Corporation Europe Limited Methods and apparatuses for using image acquisition data to detect and correct image defects
US8265399B2 (en) 2003-06-26 2012-09-11 DigitalOptics Corporation Europe Limited Detecting orientation of digital images using face detection information
US20130120604A1 (en) * 2004-05-13 2013-05-16 Sony Corporation Image capturing system, image capturing device, and image capturing method
US8503818B2 (en) 2007-09-25 2013-08-06 DigitalOptics Corporation Europe Limited Eye defect detection in international standards organization images
US8520093B2 (en) 2003-08-05 2013-08-27 DigitalOptics Corporation Europe Limited Face tracker and partial face tracker for red-eye filter method and apparatus
US8810715B1 (en) 2012-04-20 2014-08-19 Seth A. Rudin Methods and systems for user guided automatic exposure control
US20150130828A1 (en) * 2012-08-16 2015-05-14 Fujifilm Corporation Image file generation device and display device
CN105103534A (en) * 2013-03-27 2015-11-25 富士胶片株式会社 Image capturing apparatus, calibration method, program, and recording medium
US9412007B2 (en) 2003-08-05 2016-08-09 Fotonation Limited Partial face detector red-eye filter method and apparatus
US20160330374A1 (en) * 2014-01-07 2016-11-10 Dacuda Ag Adaptive camera control for reducing motion blur during real-time image capture
US20170230558A1 (en) * 2016-02-04 2017-08-10 KARL STORZ, Imaging, Inc. Exposure control method and system for an image capture device
US20180174365A1 (en) * 2016-12-19 2018-06-21 Intel Corporation Image stream switcher
US10142522B2 (en) 2013-12-03 2018-11-27 Ml Netherlands C.V. User feedback for real-time checking and improving quality of scanned image
US10298898B2 (en) 2013-08-31 2019-05-21 Ml Netherlands C.V. User feedback for real-time checking and improving quality of scanned image
US10410321B2 (en) 2014-01-07 2019-09-10 MN Netherlands C.V. Dynamic updating of a composite image
US10484561B2 (en) 2014-05-12 2019-11-19 Ml Netherlands C.V. Method and apparatus for scanning and printing a 3D object
US20200204687A1 (en) * 2018-12-21 2020-06-25 Xerox Corporation Ambient lighting indicating machine status conditions
US10798793B2 (en) * 2018-12-04 2020-10-06 Canon Kabushiki Kaisha Strobe device capable of emitting assist continuous light, and method of controlling same
US20200351439A1 (en) * 2015-08-31 2020-11-05 Snap Inc. Dynamic image-based adjustment of image capture parameters
US10939035B2 (en) 2016-12-07 2021-03-02 Zte Corporation Photograph-capture method, apparatus, terminal, and storage medium
US20210256759A1 (en) * 2019-07-25 2021-08-19 Nvidia Corporation Performance of ray-traced shadow creation within a scene
US11533439B2 (en) * 2020-05-29 2022-12-20 Sanjeev Kumar Singh Multi capture settings of multi light parameters for automatically capturing multiple exposures in digital camera and method

Families Citing this family (36)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US7359572B2 (en) * 2003-03-26 2008-04-15 Microsoft Corporation Automatic analysis and adjustment of digital images with exposure problems
US7532234B2 (en) * 2003-06-19 2009-05-12 Microsoft Corporation Automatic analysis and adjustment of digital images upon acquisition
TW200640244A (en) * 2005-05-13 2006-11-16 Avision Inc Optical scan module of a scan device
JP2006332789A (en) * 2005-05-23 2006-12-07 Nippon Telegr & Teleph Corp <Ntt> Video photographing method, apparatus, and program, and storage medium for storing the program
JP4288612B2 (en) * 2005-09-14 2009-07-01 ソニー株式会社 Image processing apparatus and method, and program
JP2007184733A (en) * 2006-01-05 2007-07-19 Fujifilm Corp Imaging apparatus and photographing mode display method
TW200805111A (en) * 2006-07-14 2008-01-16 Asustek Comp Inc Method for controlling the function of application software and computer readable recording medium for storing program thereof
JP4725452B2 (en) * 2006-08-04 2011-07-13 株式会社ニコン Digital camera and image processing program
US8355059B2 (en) * 2009-02-06 2013-01-15 Canon Kabushiki Kaisha Image capturing apparatus and control method thereof
US9077905B2 (en) * 2009-02-06 2015-07-07 Canon Kabushiki Kaisha Image capturing apparatus and control method thereof
US8726324B2 (en) * 2009-03-27 2014-05-13 Motorola Mobility Llc Method for identifying image capture opportunities using a selected expert photo agent
JP5397059B2 (en) * 2009-07-17 2014-01-22 ソニー株式会社 Image processing apparatus and method, program, and recording medium
JP5182312B2 (en) * 2010-03-23 2013-04-17 株式会社ニコン Image processing apparatus and image processing program
US11403739B2 (en) * 2010-04-12 2022-08-02 Adobe Inc. Methods and apparatus for retargeting and prioritized interpolation of lens profiles
US20130286234A1 (en) * 2012-04-25 2013-10-31 Atif Hussain Method and apparatus for remotely managing imaging
US9277148B2 (en) 2012-06-06 2016-03-01 Board Of Regents, The University Of Texas System Maximizing perceptual quality and naturalness of captured images
US9215433B2 (en) * 2014-02-11 2015-12-15 Duelight Llc Systems and methods for digital photography
JP6159105B2 (en) * 2013-03-06 2017-07-05 キヤノン株式会社 Imaging apparatus and control method thereof
JP2015011320A (en) * 2013-07-02 2015-01-19 キヤノン株式会社 Imaging device and control method of the same
US10075654B2 (en) * 2013-07-04 2018-09-11 Sony Corporation Method, apparatus and system for image processing
US20150042843A1 (en) * 2013-08-09 2015-02-12 Broadcom Corporation Systems and methods for improving images
KR102146853B1 (en) * 2013-12-27 2020-08-21 삼성전자주식회사 Photographing apparatus and method
CN106030614A (en) 2014-04-22 2016-10-12 史內普艾德有限公司 System and method for controlling a camera based on processing an image captured by other camera
US10191986B2 (en) 2014-08-11 2019-01-29 Microsoft Technology Licensing, Llc Web resource compatibility with web applications
US9705637B2 (en) 2014-08-19 2017-07-11 Microsoft Technology Licensing, Llc Guard band utilization for wireless data communication
US9805483B2 (en) * 2014-08-21 2017-10-31 Microsoft Technology Licensing, Llc Enhanced recognition of charted data
US9524429B2 (en) 2014-08-21 2016-12-20 Microsoft Technology Licensing, Llc Enhanced interpretation of character arrangements
US9397723B2 (en) 2014-08-26 2016-07-19 Microsoft Technology Licensing, Llc Spread spectrum wireless over non-contiguous channels
US9723200B2 (en) * 2014-10-15 2017-08-01 Microsoft Technology Licensing, Llc Camera capture recommendation for applications
US10542204B2 (en) 2015-08-05 2020-01-21 Microsoft Technology Licensing, Llc Methods and apparatuses for capturing multiple digital image frames
WO2017029488A2 (en) * 2015-08-14 2017-02-23 Metail Limited Methods of generating personalized 3d head models or 3d body models
CN106658691B (en) * 2017-03-10 2020-01-14 Oppo广东移动通信有限公司 Display control method and device and mobile terminal
JP2018152724A (en) * 2017-03-13 2018-09-27 オリンパス株式会社 Information terminal device, information processing system, information processing method, and information processing program
JP2019021991A (en) * 2017-07-12 2019-02-07 オリンパス株式会社 Imaging element, imaging apparatus, imaging program and imaging method
CN109302570B (en) * 2018-10-23 2021-01-22 深圳市宸电电子有限公司 Night vision environment detection processing method based on ROI (region of interest) sub-image brightness value
US11212460B2 (en) 2020-02-28 2021-12-28 Hand Held Products, Inc. Apparatuses, methods, and computer program products for flicker reduction in a multi-sensor environment

Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20030076420A1 (en) * 2001-09-06 2003-04-24 Yuji Akiyama Image processing apparatus for print process of photographed image
US6930718B2 (en) * 2001-07-17 2005-08-16 Eastman Kodak Company Revised recapture camera and method
US6970199B2 (en) * 2001-10-05 2005-11-29 Eastman Kodak Company Digital camera using exposure information acquired from a scene

Family Cites Families (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JPH11352541A (en) * 1998-06-09 1999-12-24 Minolta Co Ltd Camera
US6608650B1 (en) * 1998-12-01 2003-08-19 Flashpoint Technology, Inc. Interactive assistant process for aiding a user in camera setup and operation
US6714249B2 (en) 1998-12-31 2004-03-30 Eastman Kodak Company Producing panoramic digital images by digital camera systems
US6539177B2 (en) * 2001-07-17 2003-03-25 Eastman Kodak Company Warning message camera and method
JP3868273B2 (en) * 2001-11-16 2007-01-17 オリンパス株式会社 Camera shake detection method
US7573514B2 (en) * 2005-02-03 2009-08-11 Eastman Kodak Company Digital imaging system with digital zoom warning

Patent Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US6930718B2 (en) * 2001-07-17 2005-08-16 Eastman Kodak Company Revised recapture camera and method
US20030076420A1 (en) * 2001-09-06 2003-04-24 Yuji Akiyama Image processing apparatus for print process of photographed image
US6970199B2 (en) * 2001-10-05 2005-11-29 Eastman Kodak Company Digital camera using exposure information acquired from a scene

Cited By (128)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US7738015B2 (en) 1997-10-09 2010-06-15 Fotonation Vision Limited Red-eye filter method and apparatus
US7916190B1 (en) 1997-10-09 2011-03-29 Tessera Technologies Ireland Limited Red-eye filter method and apparatus
US7852384B2 (en) 1997-10-09 2010-12-14 Fotonation Vision Limited Detecting red eye filter and apparatus using meta-data
US7847839B2 (en) 1997-10-09 2010-12-07 Fotonation Vision Limited Detecting red eye filter and apparatus using meta-data
US7847840B2 (en) 1997-10-09 2010-12-07 Fotonation Vision Limited Detecting red eye filter and apparatus using meta-data
US8203621B2 (en) 1997-10-09 2012-06-19 DigitalOptics Corporation Europe Limited Red-eye filter method and apparatus
US7804531B2 (en) 1997-10-09 2010-09-28 Fotonation Vision Limited Detecting red eye filter and apparatus using meta-data
US7787022B2 (en) 1997-10-09 2010-08-31 Fotonation Vision Limited Red-eye filter method and apparatus
US8264575B1 (en) 1997-10-09 2012-09-11 DigitalOptics Corporation Europe Limited Red eye filter method and apparatus
US7746385B2 (en) 1997-10-09 2010-06-29 Fotonation Vision Limited Red-eye filter method and apparatus
US8265399B2 (en) 2003-06-26 2012-09-11 DigitalOptics Corporation Europe Limited Detecting orientation of digital images using face detection information
US8126208B2 (en) 2003-06-26 2012-02-28 DigitalOptics Corporation Europe Limited Digital image processing using face detection information
US8131016B2 (en) 2003-06-26 2012-03-06 DigitalOptics Corporation Europe Limited Digital image processing using face detection information
US8224108B2 (en) * 2003-06-26 2012-07-17 DigitalOptics Corporation Europe Limited Digital image processing using face detection information
US20110096187A1 (en) * 2003-06-26 2011-04-28 Tessera Technologies Ireland Limited Digital Image Processing Using Face Detection Information
US9412007B2 (en) 2003-08-05 2016-08-09 Fotonation Limited Partial face detector red-eye filter method and apparatus
US8520093B2 (en) 2003-08-05 2013-08-27 DigitalOptics Corporation Europe Limited Face tracker and partial face tracker for red-eye filter method and apparatus
US20050219385A1 (en) * 2004-03-25 2005-10-06 Fuji Photo Film Co., Ltd. Device for preventing red eye, program therefor, and recording medium storing the program
US20050219386A1 (en) * 2004-04-06 2005-10-06 Stavely Donald J Imaging device with burst zoom mode
US7920180B2 (en) * 2004-04-06 2011-04-05 Hewlett-Packard Development Company, L.P. Imaging device with burst zoom mode
US20130120604A1 (en) * 2004-05-13 2013-05-16 Sony Corporation Image capturing system, image capturing device, and image capturing method
US8787748B2 (en) * 2004-05-13 2014-07-22 Sony Corporation Image capturing system, image capturing device, and image capturing method
US8965195B2 (en) 2004-05-13 2015-02-24 Sony Corporation Image capturing system, image capturing device, and image capturing method
US9998647B2 (en) 2004-05-13 2018-06-12 Sony Corporation Image capturing system, image capturing device, and image capturing method
US10999487B2 (en) 2004-05-13 2021-05-04 Sony Group Corporation Image capturing system, image capturing device, and image capturing method
US9467610B2 (en) 2004-05-13 2016-10-11 Sony Corporation Image capturing system, image capturing device, and image capturing method
US20070005490A1 (en) * 2004-08-31 2007-01-04 Gopalakrishnan Kumar C Methods and System for Distributed E-commerce
US9639633B2 (en) 2004-08-31 2017-05-02 Intel Corporation Providing information services related to multimodal inputs
US20110092251A1 (en) * 2004-08-31 2011-04-21 Gopalakrishnan Kumar C Providing Search Results from Visual Imagery
US20110093264A1 (en) * 2004-08-31 2011-04-21 Kumar Gopalakrishnan Providing Information Services Related to Multimodal Inputs
US20060047704A1 (en) * 2004-08-31 2006-03-02 Kumar Chitra Gopalakrishnan Method and system for providing information services relevant to visual imagery
US8370323B2 (en) 2004-08-31 2013-02-05 Intel Corporation Providing information services related to multimodal inputs
US8265388B2 (en) 2004-10-28 2012-09-11 DigitalOptics Corporation Europe Limited Analyzing partial face regions for red-eye detection in acquired digital images
US8036460B2 (en) 2004-10-28 2011-10-11 DigitalOptics Corporation Europe Limited Analyzing partial face regions for red-eye detection in acquired digital images
US20060158532A1 (en) * 2005-01-19 2006-07-20 Fuji Photo Film Co., Ltd. Image capturing apparatus and image capturing method
US7580058B2 (en) * 2005-01-19 2009-08-25 Fujifilm Corporation Image capturing apparatus and image capturing method
US7962629B2 (en) 2005-06-17 2011-06-14 Tessera Technologies Ireland Limited Method for establishing a paired connection between media devices
US7957637B2 (en) 2005-09-30 2011-06-07 Casio Computer Co., Ltd. Imaging device, imaging method and program
US20100272425A1 (en) * 2005-09-30 2010-10-28 Casio Computer Co., Ltd. Imaging device, imaging method and program
US20070077053A1 (en) * 2005-09-30 2007-04-05 Casio Computer Co., Ltd. Imaging device, imaging method and program
US7865036B2 (en) 2005-11-18 2011-01-04 Tessera Technologies Ireland Limited Method and apparatus of correcting hybrid flash artifacts in digital images
US7689009B2 (en) 2005-11-18 2010-03-30 Fotonation Vision Ltd. Two stage detection for photographic eye artifacts
US7920723B2 (en) 2005-11-18 2011-04-05 Tessera Technologies Ireland Limited Two stage detection for photographic eye artifacts
US8126218B2 (en) 2005-11-18 2012-02-28 DigitalOptics Corporation Europe Limited Two stage detection for photographic eye artifacts
US7953252B2 (en) 2005-11-18 2011-05-31 Tessera Technologies Ireland Limited Two stage detection for photographic eye artifacts
US7970183B2 (en) 2005-11-18 2011-06-28 Tessera Technologies Ireland Limited Two stage detection for photographic eye artifacts
US7970182B2 (en) 2005-11-18 2011-06-28 Tessera Technologies Ireland Limited Two stage detection for photographic eye artifacts
US7970184B2 (en) 2005-11-18 2011-06-28 Tessera Technologies Ireland Limited Two stage detection for photographic eye artifacts
US8131021B2 (en) 2005-11-18 2012-03-06 DigitalOptics Corporation Europe Limited Two stage detection for photographic eye artifacts
US8126217B2 (en) 2005-11-18 2012-02-28 DigitalOptics Corporation Europe Limited Two stage detection for photographic eye artifacts
US8180115B2 (en) 2005-11-18 2012-05-15 DigitalOptics Corporation Europe Limited Two stage detection for photographic eye artifacts
US8175342B2 (en) 2005-11-18 2012-05-08 DigitalOptics Corporation Europe Limited Two stage detection for photographic eye artifacts
US7869628B2 (en) 2005-11-18 2011-01-11 Tessera Technologies Ireland Limited Two stage detection for photographic eye artifacts
US8160308B2 (en) 2005-11-18 2012-04-17 DigitalOptics Corporation Europe Limited Two stage detection for photographic eye artifacts
US8184900B2 (en) 2006-02-14 2012-05-22 DigitalOptics Corporation Europe Limited Automatic detection and correction of non-red eye flash defects
US20120069044A1 (en) * 2006-03-30 2012-03-22 Hannstar Display Corporation Dynamic gamma control method for lcd
US20070229850A1 (en) * 2006-04-04 2007-10-04 Boxternal Logics, Llc System and method for three-dimensional image capture
US20070263112A1 (en) * 2006-04-27 2007-11-15 Sony Corporation Image pickup apparatus and image pickup method as well as program
US8081217B2 (en) * 2006-04-27 2011-12-20 Sony Corporation Image pickup apparatus for use with a construction table
US20100188515A1 (en) * 2006-04-27 2010-07-29 Sony Corporation Image pickup apparatus for use with a construction table
US8004583B2 (en) * 2006-04-27 2011-08-23 Sony Corporation Image pickup apparatus for use with a construction table
US7965875B2 (en) 2006-06-12 2011-06-21 Tessera Technologies Ireland Limited Advances in extending the AAM techniques from grayscale to color images
US7668454B2 (en) 2006-08-29 2010-02-23 Hewlett-Packard Development Company, L.P. Photography advice based on captured image attributes and camera settings
US20080056706A1 (en) * 2006-08-29 2008-03-06 Battles Amy E Photography advice based on captured image attributes and camera settings
US20080088728A1 (en) * 2006-09-29 2008-04-17 Minoru Omaki Camera
US8089546B2 (en) * 2006-09-29 2012-01-03 Olympus Corporation Camera with interchangeable lens unit
US7834915B2 (en) * 2006-10-19 2010-11-16 Sony Corporation Image processing apparatus, imaging apparatus, imaging processing method, and computer program
US20080187235A1 (en) * 2006-10-19 2008-08-07 Sony Corporation Image processing apparatus, imaging apparatus, imaging processing method, and computer program
US8170294B2 (en) 2006-11-10 2012-05-01 DigitalOptics Corporation Europe Limited Method of detecting redeye in a digital image
US8055067B2 (en) 2007-01-18 2011-11-08 DigitalOptics Corporation Europe Limited Color segmentation
US7995804B2 (en) 2007-03-05 2011-08-09 Tessera Technologies Ireland Limited Red eye false positive filtering using face location and orientation
US8233674B2 (en) 2007-03-05 2012-07-31 DigitalOptics Corporation Europe Limited Red eye false positive filtering using face location and orientation
US20090016565A1 (en) * 2007-07-11 2009-01-15 Sriram Kulumani Image analysis
US20090022414A1 (en) * 2007-07-20 2009-01-22 Microsoft Corporation High dynamic range image hallucination
US8346002B2 (en) 2007-07-20 2013-01-01 Microsoft Corporation High dynamic range image hallucination
US8537269B2 (en) * 2007-09-13 2013-09-17 Samsung Electronics Co., Ltd. Method, medium, and apparatus for setting exposure time
US20090073306A1 (en) * 2007-09-13 2009-03-19 Samsung Electronics Co., Ltd. Method, medium, and apparatus for setting exposure time
US8503818B2 (en) 2007-09-25 2013-08-06 DigitalOptics Corporation Europe Limited Eye defect detection in international standards organization images
US8000526B2 (en) 2007-11-08 2011-08-16 Tessera Technologies Ireland Limited Detecting redeye defects in digital images
US8036458B2 (en) 2007-11-08 2011-10-11 DigitalOptics Corporation Europe Limited Detecting redeye defects in digital images
US20130162888A1 (en) * 2007-11-22 2013-06-27 Nec Corporation Imaging device, information processing terminal, mobile telephone, program, and light emission control method
US20100255875A1 (en) * 2007-11-22 2010-10-07 Keisuke Oozeki Imaging device, information processing terminal, mobile telephone, program, and light emission control method
US8896753B2 (en) * 2007-11-22 2014-11-25 Lenovo Innovations Limited (Hong Kong) Control of light emission at different brightnesses corresponding to operation mode
US20090172756A1 (en) * 2007-12-31 2009-07-02 Motorola, Inc. Lighting analysis and recommender system for video telephony
US8212864B2 (en) 2008-01-30 2012-07-03 DigitalOptics Corporation Europe Limited Methods and apparatuses for using image acquisition data to detect and correct image defects
US8081254B2 (en) 2008-08-14 2011-12-20 DigitalOptics Corporation Europe Limited In-camera based method of detecting defect eye with high accuracy
US20100188533A1 (en) * 2009-01-26 2010-07-29 Canon Kabushiki Kaisha Image sensing apparatus and control method thereof
US8400526B2 (en) * 2009-01-26 2013-03-19 Canon Kabushiki Kaisha Image sensing apparatus and control method thereof
US20120113298A1 (en) * 2009-02-08 2012-05-10 Wan-Yu Chen Image evaluation method, image capturing method and digital camera thereof for evaluating and capturing images according to composition of the images
US9258458B2 (en) * 2009-02-24 2016-02-09 Hewlett-Packard Development Company, L.P. Displaying an image with an available effect applied
US20100214483A1 (en) * 2009-02-24 2010-08-26 Robert Gregory Gann Displaying An Image With An Available Effect Applied
US20100259649A1 (en) * 2009-04-10 2010-10-14 Kazuki Aisaka Photographing apparatus and method, and program
US8253826B2 (en) * 2009-04-10 2012-08-28 Sony Corporation Photographing apparatus and method, and program
US20110069179A1 (en) * 2009-09-24 2011-03-24 Microsoft Corporation Network coordinated event capture and image storage
US8810715B1 (en) 2012-04-20 2014-08-19 Seth A. Rudin Methods and systems for user guided automatic exposure control
US20150130828A1 (en) * 2012-08-16 2015-05-14 Fujifilm Corporation Image file generation device and display device
US9384536B2 (en) * 2012-08-16 2016-07-05 Fujifilm Corporation Image file generation device and image file display device
CN105103534A (en) * 2013-03-27 2015-11-25 富士胶片株式会社 Image capturing apparatus, calibration method, program, and recording medium
US20150365661A1 (en) * 2013-03-27 2015-12-17 Fujifilm Corporation Image capturing apparatus, calibration method, and non-transitory computer-readable medium
US10171803B2 (en) * 2013-03-27 2019-01-01 Fujifilm Corporation Image capturing apparatus, calibration method, and non-transitory computer-readable medium for calculating parameter for a point image restoration process
US10841551B2 (en) 2013-08-31 2020-11-17 Ml Netherlands C.V. User feedback for real-time checking and improving quality of scanned image
US10298898B2 (en) 2013-08-31 2019-05-21 Ml Netherlands C.V. User feedback for real-time checking and improving quality of scanned image
US11563926B2 (en) 2013-08-31 2023-01-24 Magic Leap, Inc. User feedback for real-time checking and improving quality of scanned image
US11115565B2 (en) 2013-12-03 2021-09-07 Ml Netherlands C.V. User feedback for real-time checking and improving quality of scanned image
US10375279B2 (en) 2013-12-03 2019-08-06 Ml Netherlands C.V. User feedback for real-time checking and improving quality of scanned image
US11798130B2 (en) 2013-12-03 2023-10-24 Magic Leap, Inc. User feedback for real-time checking and improving quality of scanned image
US10142522B2 (en) 2013-12-03 2018-11-27 Ml Netherlands C.V. User feedback for real-time checking and improving quality of scanned image
US10455128B2 (en) 2013-12-03 2019-10-22 Ml Netherlands C.V. User feedback for real-time checking and improving quality of scanned image
US11315217B2 (en) 2014-01-07 2022-04-26 Ml Netherlands C.V. Dynamic updating of a composite image
US10410321B2 (en) 2014-01-07 2019-09-10 Ml Netherlands C.V. Dynamic updating of a composite image
US20160330374A1 (en) * 2014-01-07 2016-11-10 Dacuda Ag Adaptive camera control for reducing motion blur during real-time image capture
US11516383B2 (en) 2014-01-07 2022-11-29 Magic Leap, Inc. Adaptive camera control for reducing motion blur during real-time image capture
US10708491B2 (en) * 2014-01-07 2020-07-07 Ml Netherlands C.V. Adaptive camera control for reducing motion blur during real-time image capture
US10484561B2 (en) 2014-05-12 2019-11-19 Ml Netherlands C.V. Method and apparatus for scanning and printing a 3D object
US11245806B2 (en) 2014-05-12 2022-02-08 Ml Netherlands C.V. Method and apparatus for scanning and printing a 3D object
US20200351439A1 (en) * 2015-08-31 2020-11-05 Snap Inc. Dynamic image-based adjustment of image capture parameters
US20170230558A1 (en) * 2016-02-04 2017-08-10 KARL STORZ, Imaging, Inc. Exposure control method and system for an image capture device
US9986169B2 (en) * 2016-02-04 2018-05-29 KARL STORZ, Imaging, Inc. Exposure control method and system for an image capture device
US10939035B2 (en) 2016-12-07 2021-03-02 Zte Corporation Photograph-capture method, apparatus, terminal, and storage medium
US11079842B2 (en) * 2016-12-19 2021-08-03 Intel Corporation Image stream switcher
US20180174365A1 (en) * 2016-12-19 2018-06-21 Intel Corporation Image stream switcher
US10691201B2 (en) * 2016-12-19 2020-06-23 Intel Corporation Image stream switcher
US10798793B2 (en) * 2018-12-04 2020-10-06 Canon Kabushiki Kaisha Strobe device capable of emitting assist continuous light, and method of controlling same
US11438465B2 (en) * 2018-12-21 2022-09-06 Xerox Corporation Ambient lighting indicating machine status conditions
US20200204687A1 (en) * 2018-12-21 2020-06-25 Xerox Corporation Ambient lighting indicating machine status conditions
US20210256759A1 (en) * 2019-07-25 2021-08-19 Nvidia Corporation Performance of ray-traced shadow creation within a scene
US11847733B2 (en) * 2019-07-25 2023-12-19 Nvidia Corporation Performance of ray-traced shadow creation within a scene
US11533439B2 (en) * 2020-05-29 2022-12-20 Sanjeev Kumar Singh Multi capture settings of multi light parameters for automatically capturing multiple exposures in digital camera and method

Also Published As

Publication number Publication date
DE102004007649A1 (en) 2005-01-13
JP2005006330A (en) 2005-01-06
US20100123805A1 (en) 2010-05-20
US20040252217A1 (en) 2004-12-16
US8780232B2 (en) 2014-07-15

Similar Documents

Publication Publication Date Title
US8780232B2 (en) System and method for analyzing a digital image
KR102376901B1 (en) Imaging control method and imaging device
JP4657457B2 (en) How to automatically determine the final exposure setting for a solid-state camera without a separate photometric circuit
US8106965B2 (en) Image capturing device which corrects a target luminance, based on which an exposure condition is determined
US20060239674A1 (en) System and method for analyzing a digital image
US7868929B2 (en) White balance correcting method and image-pickup apparatus
JP4542058B2 (en) Imaging apparatus and imaging method
US20110242366A1 (en) Imaging apparatus, imaging method, storage medium, and integrated circuit
US20080129860A1 (en) Digital camera
JP2002232906A (en) White balance controller
JP2002010108A (en) Device and method for processing image signal
US20090167932A1 (en) Image capturing apparatus
US20060198625A1 (en) Imaging device and imaging method
CN114666511B (en) Method and device for automatically obtaining optimal exposure value on tunable spectrum camera
JP4831175B2 (en) Imaging apparatus and imaging method
JP5149055B2 (en) Imaging device
JP4029206B2 (en) Imaging device
JP5970871B2 (en) Electronic camera
US20110128404A1 (en) Imaging apparatus, image processing program, image processing apparatus, and image processing method
EP2146501B1 (en) Image processing apparatus, image-capturing apparatus, and image processing program
JP3822486B2 (en) Electronic camera and signal processing method
JP2020191506A (en) Defect pixel detection device and imaging device, and defect pixel detection method
JP3792555B2 (en) Brightness adjustment method and imaging apparatus
JP2005167465A (en) Digital camera and imaging method of digital camera
US20220321764A1 (en) Illumination control device, imaging device, and storage medium

Legal Events

Date Code Title Description
AS Assignment

Owner name: HEWLETT-PACKARD DEVELOPMENT COMPANY, L.P., TEXAS

Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNORS:CRAIG, MURRAY D.;HOFER, GREGORY;MANSON, SUSAN E.;AND OTHERS;REEL/FRAME:016615/0200;SIGNING DATES FROM 20050519 TO 20050520

STCB Information on status: application discontinuation

Free format text: ABANDONED -- FAILURE TO RESPOND TO AN OFFICE ACTION