US20060012684A1 - Method and apparatus for capturing images - Google Patents

Method and apparatus for capturing images

Info

Publication number
US20060012684A1
Authority
US
United States
Prior art keywords
image data
camera
component
memory
light
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Abandoned
Application number
US11/233,349
Inventor
Jonathan Kagle
Gilad Odinak
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Microsoft Technology Licensing LLC
Original Assignee
Microsoft Corp
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Microsoft Corp filed Critical Microsoft Corp
Priority to US11/233,349 priority Critical patent/US20060012684A1/en
Publication of US20060012684A1 publication Critical patent/US20060012684A1/en
Assigned to MICROSOFT TECHNOLOGY LICENSING, LLC reassignment MICROSOFT TECHNOLOGY LICENSING, LLC ASSIGNMENT OF ASSIGNORS INTEREST (SEE DOCUMENT FOR DETAILS). Assignors: MICROSOFT CORPORATION
Abandoned legal-status Critical Current

Classifications

    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N23/00Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N23/60Control of cameras or camera modules
    • H04N23/63Control of cameras or camera modules by using electronic viewfinders
    • H04N23/633Control of cameras or camera modules by using electronic viewfinders for displaying additional information relating to control or operation of the camera
    • H04N23/634Warning indications
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N23/00Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N23/70Circuitry for compensating brightness variation in the scene
    • H04N23/74Circuitry for compensating brightness variation in the scene by influencing the scene brightness using illuminating means
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N23/00Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N23/80Camera processing pipelines; Components thereof
    • H04N23/84Camera processing pipelines; Components thereof for processing colour signals
    • H04N23/88Camera processing pipelines; Components thereof for processing colour signals for colour balance, e.g. white-balance circuits or colour temperature control

Abstract

A first set of image data is acquired based on a first frame of light entering the camera before receiving an instruction to capture an image from a user. The first set of image data is then used to test the performance of at least one hardware component in the camera.

Description

    REFERENCE TO RELATED APPLICATIONS
  • The present application is a Continuation of U.S. patent application Ser. No. 09/505,223, filed on Feb. 16, 2000, which claims priority benefit from U.S. Provisional Application 60/136,517, filed on May 28, 1999 and entitled “EARLY IMAGE ACQUISITION”.
  • BACKGROUND OF THE INVENTION
  • The present application relates to digital cameras. In particular, the present invention relates to image control and processing in digital cameras.
  • Digital cameras capture images by converting light into electrical signals and processing the electrical signals to produce a set of image data. The image data is then stored in a long-term memory for later retrieval.
  • The processing of captured images is compute-intensive, often making the user wait for the final picture after the shutter button is pressed. For users who want to take a series of photographs in rapid succession, this delay is undesirable.
  • SUMMARY OF THE INVENTION
  • A first set of image data is acquired based on a first frame of light entering the camera before receiving an instruction to capture an image from a user. The first set of image data is then used to test the performance of at least one hardware component in the camera.
  • BRIEF DESCRIPTION OF THE DRAWINGS
  • FIG. 1 is a general block diagram of a camera under an embodiment of the present invention.
  • FIG. 2 is a flow diagram of a method under one embodiment of the present invention.
  • FIG. 3 is a block diagram of software components under one embodiment of the present invention.
  • FIG. 4 is a flow diagram of an alternative method under one embodiment of the present invention.
  • FIG. 5 is a flow diagram of an alternative method under one embodiment of the present invention.
  • FIG. 6 is a flow diagram of a method of verifying pre-processing results under one embodiment of the present invention.
  • DETAILED DESCRIPTION OF ILLUSTRATIVE EMBODIMENTS
  • FIG. 1 is a block diagram of a camera 200 under an embodiment of the present invention. Light 202 enters camera 200 through a lens assembly 204, which focuses the light onto a photo array 206. In many embodiments, lens assembly 204 is controlled by an auto-focus controller 208 that modifies lens assembly 204 to bring an image into focus on photo array 206.
  • Photo array 206 is constructed of an array of specialized transistors known as Charge Coupled Devices (CCDs). For black-and-white images, each CCD represents a separate pixel in the captured image. For color images, a set of CCD pixels with overlaid color filters is combined to represent a single pixel, with each CCD providing information on a different color of light.
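The combination of color-filtered CCD readings into a single pixel can be sketched as follows. The patent does not specify a filter layout, so a 2x2 group with two green, one red, and one blue sensor is assumed here purely for illustration; the function name and dictionary keys are likewise hypothetical.

```python
def combine_ccd_group(group):
    """Map a 2x2 group of color-filtered sensor readings to one (R, G, B) pixel.

    `group` is a dict of raw intensities keyed by filter color; the two
    green sensors are averaged into a single green value (an assumed rule,
    not one stated in the patent).
    """
    red = group["r"]
    green = (group["g1"] + group["g2"]) / 2.0  # average the two green sensors
    blue = group["b"]
    return (red, green, blue)
```
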
  • Based on the output of the CCDs, photo array 206 transmits an analog signal to an analog-to-digital (A/D) converter 210. A/D converter 210 converts the analog signal into a series of digital values, with each value representing the brightness or intensity of a pixel's color. The digital values produced by A/D converter 210 are stored in a register 212 that receives read and write control signals from an image processor 214. By controlling when A/D converter 210 can write to register 212, image processor 214 is able to capture and store values representing a single frame of light.
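A minimal sketch of the two mechanisms this paragraph describes: quantizing an analog sample into an n-bit digital value, and a register whose writes are gated by a control signal so the processor can freeze a single frame. The class and function names, bit depth, and full-scale range are assumptions for illustration.

```python
def quantize(analog_value, full_scale=1.0, bits=8):
    """Convert an analog sample in [0, full_scale] to an n-bit digital value."""
    levels = (1 << bits) - 1
    clamped = min(max(analog_value, 0.0), full_scale)
    return round(clamped / full_scale * levels)

class FrameRegister:
    """Register whose writes are gated by the image processor's control signal,
    modeling how image processor 214 controls when the A/D converter may write."""

    def __init__(self):
        self.data = []
        self.write_enabled = False

    def write(self, value):
        if self.write_enabled:  # writes are dropped unless the processor enables them
            self.data.append(value)
```
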
  • Image processor 214 and auto-focus controller 208 are able to access and use the image data in register 212. Auto-focus controller 208 uses the data to adjust lens assembly 204. Image processor 214 uses the image data for a number of processing functions described further below.
  • In most embodiments, image processor 214 processes the image data in register 212 using software components stored in code storage 216. Intermediate results of this processing are stored in a second register 218, and the final image data resulting from the processing is stored in long-term storage 220.
  • In most embodiments, image processor 214 receives input from a capture button 222 that is depressed by the user to indicate when the user wants to capture an image. In some embodiments, the user is able to press capture button 222 half-way down to indicate that they want the camera to prepare to capture an image by, for example, activating the auto-focus feature of the camera.
  • Image processor 214 is also able to set parameters for a flash 224, which is controlled by a flash controller 226. The flash parameters are stored in register 228 by image processor 214 and include parameters such as a red-eye reduction flash parameter, a duration parameter, and a brightness parameter.
  • Under the present invention, image processor 214 performs some processing functions before the user indicates that they wish to capture an image. Thus, before capture button 222 is fully depressed, image processor 214 performs one or more functions such as white balance, contrast adjustment, and red-eye reduction. After the user fully depresses capture button 222, image processor 214 performs additional post-capture processing functions. The results of the pre-capture and post-capture processing functions are then combined to produce a final set of image data. By performing some of the image processing functions before the user tries to capture an image, the present invention reduces the post-capture processing time.
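The pre-capture/post-capture split described above can be sketched like this. All names and the particular operations (a white-balance gain and contrast range computed before capture, then applied after) are assumptions chosen only to show work being moved ahead of the shutter press; the patent does not prescribe these formulas.

```python
def pre_capture(frame):
    """Work done on a preview frame before the capture button is pressed.

    Returns parameters (assumed here: a white-balance gain and a contrast
    range) that the post-capture phase can reuse without recomputing.
    """
    return {
        "white_balance_gain": 255.0 / max(frame),
        "contrast_range": max(frame) - min(frame),
    }

def post_capture(frame, pre_results):
    """Apply the precomputed parameters to the frame captured on button press,
    so less processing remains after the shutter is pressed."""
    gain = pre_results["white_balance_gain"]
    return [min(255, round(p * gain)) for p in frame]
```
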
  • FIG. 2 is a flow diagram of a method for capturing images under one embodiment of the present invention. Under one embodiment, the process of FIG. 2 is performed by image processor 214 by invoking a set of software components. FIG. 3 provides a block diagram of some of these components, which are described below in connection with the method of FIG. 2.
  • The method of FIG. 2 begins at step 300 and proceeds to step 302 where a system control component 400 invoked by image processor 214 causes image data for a frame of light to be written to register 212. System control 400 then retrieves the stored values from register 212 and at step 304 invokes a pre-capture process control 402 that controls one or more pre-capture processing functions.
  • Under one embodiment, pre-capture process control 402 invokes one or more separate image processing components such as white balance component 404, red-eye reduction component 406, contrast adjustment component 408, flesh tone correction component 410, and edge enhancement component 412 of FIG. 3. When invoked, each of these components is given access to the image data. When a component completes its operation, it typically returns one or more processing values to pre-capture process control 402. Under some embodiments, pre-capture process control 402 invokes two or more of the image processing components in parallel. In other embodiments, the image processing components are invoked serially.
  • In some embodiments, pre-capture process control 402 also invokes processing functions to test the performance of some hardware components of the camera. For example, under one embodiment, pre-capture process control 402 invokes a defective pixel detection component 428, which determines if one or more of the CCD transistors is malfunctioning. In other embodiments, pre-capture process control 402 invokes a memory availability component 430, which determines if long-term image storage 220 of FIG. 1 has enough available memory to accommodate the next image. If there is insufficient available memory, system control 400 either sends a message to the user through a display driver 423 in operating system 416 or reallocates the memory resources by remapping virtual address space, terminating other components, or minimizing other components. When system control 400 reallocates the memory resources, it acts as a memory management component. As shown in FIG. 1, this message is provided to the user through a display 230 that is controlled by a display interface 232. Display interface 232 receives its display data from display driver 423 in image processor 214.
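The two hardware tests named above can be sketched as below. The patent does not state how defective pixels are detected; this sketch assumes a stuck CCD element reports the same value across several preview frames, which is one possible heuristic, not the patented method. The memory check is a direct comparison of free space against the expected image size.

```python
def find_defective_pixels(frames):
    """Flag sensor positions whose reading never changes across preview frames.

    Assumed heuristic: a malfunctioning (stuck) CCD element reports an
    identical value in every frame regardless of the scene.
    `frames` is a list of equal-length lists of pixel readings.
    """
    n = len(frames[0])
    return [i for i in range(n)
            if len({frame[i] for frame in frames}) == 1]

def has_room_for_next_image(free_bytes, expected_image_bytes):
    """Memory availability check run before the next capture; if it fails,
    the system would warn the user or reallocate memory."""
    return free_bytes >= expected_image_bytes
```
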
  • Returning to the method of FIG. 2, after step 304, system control 400 checks to see if the user has pressed the image capture button at step 306. Under some embodiments, information as to whether the user has pressed the button is provided by a user input component 414 shown in operating system 416 in FIG. 3. If the capture button has not been pressed at step 306, the process of FIG. 2 returns to step 302, and steps 302 and 304 are repeated.
  • If the capture button has been pressed at step 306, system control component 400 captures image data for a second frame of light at step 308. System control component 400 then invokes a post-capture process control component 418 that invokes one or more image processing components at step 310. Some of the image processing components, such as compression component 420 of FIG. 3, change the image data as they process it. If another image processing component is invoked after a component that changes the image data, the subsequent image processing component receives the modified image data. Post-capture process control 418 also acts as an image production control that utilizes the results of the pre-capture processing components and the post-capture processing components to produce a final set of image data.
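The chaining behavior described above, where a component that modifies the image data hands the modified data to the next component, can be sketched as a simple pipeline. The component implementations (a trivial sharpen step and a run-length compressor) are assumptions standing in for components 418 and 420; only the data-flow pattern reflects the patent.

```python
def sharpen(data):
    """Placeholder processing component that modifies the image data."""
    return [min(255, v + 5) for v in data]

def run_length_compress(data):
    """Placeholder compression component: (value, run_length) pairs."""
    out = []
    for v in data:
        if out and out[-1][0] == v:
            out[-1] = (v, out[-1][1] + 1)
        else:
            out.append((v, 1))
    return out

def run_post_capture_chain(data, components):
    """Invoke components in order; each later component receives the data
    as modified by the components before it."""
    for component in components:
        data = component(data)
    return data
```
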
  • At step 312, the final set of image data is stored in long-term image storage 220 of FIG. 1. Under the embodiment of FIG. 3, the process of storing the final image data is controlled by operating system 416. After the final image data has been stored, the process of FIG. 2 ends at step 314.
  • FIG. 4 provides a flow diagram for an alternative method of capturing images under the present invention. The process of FIG. 4 starts at step 480 and continues at step 482 where system control 400 waits for the user to depress the capture button. When the user depresses the capture button, system control 400 uses a flash control component 422 in operating system 416 of FIG. 3 to trigger a red-eye reduction flash at step 484 of FIG. 4. At step 486, system control 400 captures image data generated from a frame of light that includes light from the red-eye reduction flash. The red-eye reduction flash illuminates the subject of the photograph so that the captured image has color, contrast, and brightness characteristics more similar to the image produced when the main flash is later triggered, as described below.
  • At step 488, system control 400 invokes pre-capture process control 402, which invokes one or more processing components as described above.
  • In addition to using some or all of the processing components described above, some embodiments that utilize the method of FIG. 4 perform an additional pre-capture processing function. This additional function is performed by flash control 424 in FIG. 3 and involves adjusting the parameters of the camera's flash system. For example, in some embodiments, flash control 424 changes the duration and/or brightness of the flash. The changes to the flash parameters are stored in register 228 of FIG. 1 through flash control 422 of operating system 416.
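One way the flash-parameter adjustment might look is sketched below. The patent says only that duration and/or brightness are changed; the specific rule here, scaling by how far the red-eye-reduction test frame's mean brightness falls short of a target, is entirely an assumption, as are the function name, target level, and units.

```python
def adjust_flash_parameters(mean_brightness, target=128, base_duration_ms=2.0):
    """Hypothetical adjustment of main-flash parameters from a test frame.

    Scales flash duration (and caps brightness) by the ratio of a target
    brightness to the measured mean brightness of the frame lit by the
    red-eye reduction flash. These values would be stored in register 228.
    """
    scale = target / max(mean_brightness, 1)
    return {
        "duration_ms": base_duration_ms * scale,
        "brightness": min(1.0, scale / 2),
    }
```
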
  • When the pre-capture processing is complete, the method of FIG. 4 continues at step 490 where system control 400 triggers the main flash. The image generated by the light of the main flash is then captured at step 492. At step 494, system control 400 invokes post-capture process control 418, which invokes one or more post-capture processing functions as described above. Post-capture process control 418 utilizes the results of the pre-capture processing and post-capture processing to form a final set of image data that is stored before the process of FIG. 4 ends at step 496.
  • FIG. 5 provides a flow diagram of an alternative method for capturing images under the present invention. The method of FIG. 5 begins at step 520 and proceeds to step 522 where system control 400 waits to receive a pre-capture event. The pre-capture event can be generated by a software routine based on a timer or some other triggering event or can be generated when the user presses the capture button half-way down. When the pre-capture event occurs, the method of FIG. 5 continues at step 524 where system control 400 captures image data for a first frame of light. At step 526, system control 400 invokes pre-capture process control 402, which invokes one or more pre-capture processing components as described above in connection with FIGS. 2 and 3. In the method of FIG. 5, the pre-capture processing components can include an auto-focus component such as auto-focus component 426 of FIG. 3.
  • After pre-capture process control 402 has finished executing, system control 400 waits for the user to fully depress the capture button at step 528. When the user fully depresses the capture button, system control 400 retrieves a new set of image data based on a second frame of light that entered the camera just after the capture button was depressed. This is shown in step 530 of FIG. 5. System control 400 then invokes post-capture process control 418 at step 532. Post-capture process control 418 invokes one or more post-capture processing components and utilizes the results from the pre-capture processing components and the post-capture processing components to form a set of final image data. System control 400 then stores the final image data before the method of FIG. 5 ends at step 534.
  • Under some embodiments of the invention, post-capture process control 418 invokes a verify pre-capture results component 432 to verify the results from one or more of the pre-capture components before utilizing those results. Such verification is helpful because the lighting conditions may have changed between the time when the image data for the first frame of light was collected and the time when the image data for the second frame of light was collected. FIG. 6 is a flow diagram of one method for verifying the pre-capture processing results.
  • The method of FIG. 6 begins at step 600 where a portion of the image data for the second frame of light is selected. At step 602, this subset of image data is provided to the pre-capture processing component that is having its results verified. This produces a verification result that is compared to the original result produced by the pre-capture processing component for the first frame of light. This comparison is performed in step 604. The difference between the two results is compared to a threshold at step 606. If the two results differ by more than the threshold, all of the image data for the second frame of light is passed to the pre-capture processing component to generate a new result at step 608. If the two results differ by less than the threshold, the original result formed from the first frame of light is used at step 610. Thus, if image conditions have changed substantially between the first and second frames of light, the pre-capture processing functions are repeated using the second frame of light instead of the first frame of light. However, if image conditions have not changed substantially between the frames of light, the results formed from the first frame of light are used to determine the final image data.
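The FIG. 6 verification flow can be sketched as follows. The sample fraction, threshold value, and the use of a simple numeric difference are assumptions; the patent specifies only that a subset of the second frame is processed, compared against the first-frame result, and the full second frame reprocessed when the difference exceeds a threshold.

```python
def verify_pre_capture_result(component, original_result, second_frame,
                              sample_fraction=0.25, threshold=10):
    """Verify a pre-capture result against the second frame (FIG. 6 sketch).

    Runs `component` on a subset of the second frame (steps 600-602),
    compares against the first-frame result (steps 604-606), and recomputes
    on the full second frame only if the difference exceeds the threshold
    (step 608); otherwise the original result is reused (step 610).
    """
    n = max(1, int(len(second_frame) * sample_fraction))
    verification_result = component(second_frame[:n])
    if abs(verification_result - original_result) > threshold:
        return component(second_frame)   # conditions changed: recompute fully
    return original_result               # conditions stable: reuse pre-capture result
```
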
  • Although the present invention has been described with reference to particular embodiments, workers skilled in the art will recognize that changes may be made in form and detail without departing from the spirit and scope of the invention.

Claims (20)

1. A method in a camera, the method comprising:
acquiring a first set of image data based on a first frame of light entering the camera before receiving an instruction to capture an image from a user; and
testing the performance of at least one hardware component in the camera using the first set of image data.
2. The method of claim 1 wherein testing the performance of at least one hardware component comprises determining if the camera has enough available memory to store the first set of image data.
3. The method of claim 2 further comprising determining that the camera does not have enough available memory to store the first set of image data and indicating to the user that there is insufficient memory before acquiring a second set of image data.
4. The method of claim 2 further comprising determining that the camera does not have enough available memory to store the first set of image data and reallocating memory before acquiring a second set of image data.
5. The method of claim 4 wherein reallocating memory comprises terminating a software component.
6. The method of claim 4 wherein reallocating memory comprises minimizing a software component.
7. The method of claim 4 wherein reallocating memory comprises remapping virtual address space.
8. The method of claim 1 wherein testing the performance of at least one hardware component comprises determining if a light sensor is working.
9. The method of claim 8 wherein testing the performance of at least one hardware component comprises determining if each light sensor in an array of light sensors is working.
10. The method of claim 9 further comprising providing an indication of which light sensors are not working before acquiring a second set of image data.
11. The method of claim 1 further comprising receiving a pre-capture event before acquiring the first set of image data.
12. The method of claim 11 wherein the pre-capture event is an indication that a user has pressed a capture button half-way down.
13. The method of claim 11 wherein the pre-capture event is produced by a software routine.
14. A camera having processor-executable components for capturing images, the components comprising:
an image acquisition component capable of acquiring image data representing a first frame of light; and
a hardware testing component capable of testing a hardware component in the camera based on image data acquired by the image acquisition component for the first frame of light.
15. The camera of claim 14 wherein the hardware testing component comprises a memory availability component capable of performing a memory resource function to determine if there is sufficient available memory to store the image data.
16. The camera of claim 15 further comprising a user notification component capable of notifying the user when there is insufficient memory to store the image data before the image acquisition component acquires image data for a second frame of light.
17. The camera of claim 15 further comprising a memory management component that is capable of reallocating memory resources before the image acquisition component acquires image data for a second frame of light.
18. The camera of claim 17 wherein the memory management component reallocates memory resources by terminating components.
19. The camera of claim 17 wherein the memory management component reallocates memory by remapping a virtual address space.
20. The camera of claim 14 wherein the hardware testing component comprises a defective pixel detection component that is capable of identifying defective light sensors in the camera based on image data acquired for the first frame of light.
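The memory-availability test of claims 2–7 can be sketched as follows. The function and parameter names (`ensure_memory_for_frame`, `available_memory`, `background_components`) are hypothetical; the patent does not define an API, and only one of the three claimed reallocation strategies (terminating a software component, claim 5) is shown.

```python
# Hypothetical sketch of the memory test and reallocation described in
# claims 2-7. Names and units here are invented for illustration.

def ensure_memory_for_frame(available_memory, frame_size,
                            background_components):
    """Return (ok, freed) after trying to make room for the image data.

    background_components is a list of (name, memory_held) pairs;
    components are terminated (claim 5) until the frame fits or none
    remain, in which case the user would be notified (claim 3).
    """
    freed = []
    while available_memory < frame_size and background_components:
        # Reallocate memory by terminating a software component.
        component, size = background_components.pop()
        available_memory += size
        freed.append(component)
    return available_memory >= frame_size, freed
```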
US11/233,349 1999-05-28 2005-09-22 Method and apparatus for capturing images Abandoned US20060012684A1 (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
US11/233,349 US20060012684A1 (en) 1999-05-28 2005-09-22 Method and apparatus for capturing images

Applications Claiming Priority (3)

Application Number Priority Date Filing Date Title
US13651799P 1999-05-28 1999-05-28
US09/505,223 US6967680B1 (en) 1999-05-28 2000-02-16 Method and apparatus for capturing images
US11/233,349 US20060012684A1 (en) 1999-05-28 2005-09-22 Method and apparatus for capturing images

Related Parent Applications (1)

Application Number Title Priority Date Filing Date
US09/505,223 Continuation US6967680B1 (en) 1999-05-28 2000-02-16 Method and apparatus for capturing images

Publications (1)

Publication Number Publication Date
US20060012684A1 true US20060012684A1 (en) 2006-01-19

Family

ID=35344935

Family Applications (2)

Application Number Title Priority Date Filing Date
US09/505,223 Expired - Lifetime US6967680B1 (en) 1999-05-28 2000-02-16 Method and apparatus for capturing images
US11/233,349 Abandoned US20060012684A1 (en) 1999-05-28 2005-09-22 Method and apparatus for capturing images

Family Applications Before (1)

Application Number Title Priority Date Filing Date
US09/505,223 Expired - Lifetime US6967680B1 (en) 1999-05-28 2000-02-16 Method and apparatus for capturing images

Country Status (1)

Country Link
US (2) US6967680B1 (en)

Families Citing this family (51)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US7738015B2 (en) 1997-10-09 2010-06-15 Fotonation Vision Limited Red-eye filter method and apparatus
US7630006B2 (en) 1997-10-09 2009-12-08 Fotonation Ireland Limited Detecting red eye filter and apparatus using meta-data
US7042505B1 (en) 1997-10-09 2006-05-09 Fotonation Ireland Ltd. Red-eye filter method and apparatus
US6489989B1 (en) * 1999-09-15 2002-12-03 Electric Planet, Inc. System, method and article of manufacture for executing a video setup protocol
JP3854833B2 (en) * 2001-09-10 2006-12-06 キヤノン株式会社 Imaging apparatus, signal processing method thereof, program, and storage medium
CN1269364C (en) * 2002-05-01 2006-08-09 佳能株式会社 Photographic device, photographic method, programme and recording medium
JP3950783B2 (en) * 2002-11-07 2007-08-01 キヤノン株式会社 Imaging device
US8254674B2 (en) 2004-10-28 2012-08-28 DigitalOptics Corporation Europe Limited Analyzing partial face regions for red-eye detection in acquired digital images
US8593542B2 (en) 2005-12-27 2013-11-26 DigitalOptics Corporation Europe Limited Foreground/background separation using reference images
US7620218B2 (en) 2006-08-11 2009-11-17 Fotonation Ireland Limited Real-time face tracking with reference images
US7920723B2 (en) 2005-11-18 2011-04-05 Tessera Technologies Ireland Limited Two stage detection for photographic eye artifacts
US8155397B2 (en) 2007-09-26 2012-04-10 DigitalOptics Corporation Europe Limited Face tracking in a camera processor
US9692964B2 (en) 2003-06-26 2017-06-27 Fotonation Limited Modification of post-viewing parameters for digital images using image region or feature information
US7574016B2 (en) 2003-06-26 2009-08-11 Fotonation Vision Limited Digital image processing using face detection information
US7587085B2 (en) 2004-10-28 2009-09-08 Fotonation Vision Limited Method and apparatus for red-eye detection in an acquired digital image
US7792970B2 (en) 2005-06-17 2010-09-07 Fotonation Vision Limited Method for establishing a paired connection between media devices
US7970182B2 (en) 2005-11-18 2011-06-28 Tessera Technologies Ireland Limited Two stage detection for photographic eye artifacts
US8036458B2 (en) 2007-11-08 2011-10-11 DigitalOptics Corporation Europe Limited Detecting redeye defects in digital images
US8896725B2 (en) 2007-06-21 2014-11-25 Fotonation Limited Image capture device with contemporaneous reference image capture mechanism
US7269292B2 (en) 2003-06-26 2007-09-11 Fotonation Vision Limited Digital image adjustable compression and resolution using face detection information
US8330831B2 (en) 2003-08-05 2012-12-11 DigitalOptics Corporation Europe Limited Method of gathering visual meta data using a reference image
US8682097B2 (en) 2006-02-14 2014-03-25 DigitalOptics Corporation Europe Limited Digital image enhancement with reference images
US7440593B1 (en) 2003-06-26 2008-10-21 Fotonation Vision Limited Method of improving orientation and color balance of digital images using face detection information
US7536036B2 (en) * 2004-10-28 2009-05-19 Fotonation Vision Limited Method and apparatus for red-eye detection in an acquired digital image
US7689009B2 (en) 2005-11-18 2010-03-30 Fotonation Vision Ltd. Two stage detection for photographic eye artifacts
US7844076B2 (en) 2003-06-26 2010-11-30 Fotonation Vision Limited Digital image processing using face detection and skin tone information
US8948468B2 (en) 2003-06-26 2015-02-03 Fotonation Limited Modification of viewing parameters for digital images using face detection information
US8989453B2 (en) 2003-06-26 2015-03-24 Fotonation Limited Digital image processing using face detection information
US7565030B2 (en) 2003-06-26 2009-07-21 Fotonation Vision Limited Detecting orientation of digital images using face detection information
US8170294B2 (en) 2006-11-10 2012-05-01 DigitalOptics Corporation Europe Limited Method of detecting redeye in a digital image
US9129381B2 (en) 2003-06-26 2015-09-08 Fotonation Limited Modification of post-viewing parameters for digital images using image region or feature information
US7471846B2 (en) 2003-06-26 2008-12-30 Fotonation Vision Limited Perfecting the effect of flash within an image acquisition devices using face detection
US8498452B2 (en) 2003-06-26 2013-07-30 DigitalOptics Corporation Europe Limited Digital image processing using face detection information
US9412007B2 (en) 2003-08-05 2016-08-09 Fotonation Limited Partial face detector red-eye filter method and apparatus
US8520093B2 (en) 2003-08-05 2013-08-27 DigitalOptics Corporation Europe Limited Face tracker and partial face tracker for red-eye filter method and apparatus
US8320641B2 (en) 2004-10-28 2012-11-27 DigitalOptics Corporation Europe Limited Method and apparatus for red-eye detection using preview or other reference images
US7315631B1 (en) 2006-08-11 2008-01-01 Fotonation Vision Limited Real-time face tracking in a digital image acquisition device
US7599577B2 (en) 2005-11-18 2009-10-06 Fotonation Vision Limited Method and apparatus of correcting hybrid flash artifacts in digital images
JP4735255B2 (en) * 2005-12-29 2011-07-27 オムロン株式会社 Image processing apparatus and operation condition setting method for image processing apparatus
WO2007095553A2 (en) 2006-02-14 2007-08-23 Fotonation Vision Limited Automatic detection and correction of non-red eye flash defects
ATE497218T1 (en) 2006-06-12 2011-02-15 Tessera Tech Ireland Ltd ADVANCES IN EXPANSING AAM TECHNIQUES FROM GRAYSCALE TO COLOR IMAGES
US7916897B2 (en) 2006-08-11 2011-03-29 Tessera Technologies Ireland Limited Face tracking for controlling imaging parameters
US7403643B2 (en) 2006-08-11 2008-07-22 Fotonation Vision Limited Real-time face tracking in a digital image acquisition device
US8055067B2 (en) 2007-01-18 2011-11-08 DigitalOptics Corporation Europe Limited Color segmentation
KR101159830B1 (en) 2007-03-05 2012-06-26 디지털옵틱스 코포레이션 유럽 리미티드 Red eye false positive filtering using face location and orientation
US8503818B2 (en) 2007-09-25 2013-08-06 DigitalOptics Corporation Europe Limited Eye defect detection in international standards organization images
US8212864B2 (en) 2008-01-30 2012-07-03 DigitalOptics Corporation Europe Limited Methods and apparatuses for using image acquisition data to detect and correct image defects
CN106919911A (en) 2008-07-30 2017-07-04 快图有限公司 Modified using the automatic face and skin of face detection
US8081254B2 (en) 2008-08-14 2011-12-20 DigitalOptics Corporation Europe Limited In-camera based method of detecting defect eye with high accuracy
US8379917B2 (en) 2009-10-02 2013-02-19 DigitalOptics Corporation Europe Limited Face recognition performance using additional image features
JP2014050042A (en) * 2012-09-03 2014-03-17 Toshiba Corp Image processor and solid-state imaging device

Citations (12)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US5189519A (en) * 1988-03-22 1993-02-23 Canon Kabushiki Kaisha Electronic camera with exposure control
US5260774A (en) * 1989-04-20 1993-11-09 Canon Kabushiki Kaisha White balance control for still image sensing apparatus
US5563658A (en) * 1994-12-16 1996-10-08 Eastman Kodak Company Electronic camera with rapid automatic focus of an image upon an image sensor
US5751354A (en) * 1994-04-28 1998-05-12 Canon Kabushiki Kaisha Image sensing apparatus and method with exposure performed based on focus evaluation values
US5929908A (en) * 1995-02-03 1999-07-27 Canon Kabushiki Kaisha Image sensing apparatus which performs dynamic range expansion and image sensing method for dynamic range expansion
US6038571A (en) * 1996-01-31 2000-03-14 Kabushiki Kaisha Toshiba Resource management method and apparatus for information processing system of multitasking facility
US6327001B1 (en) * 1992-12-01 2001-12-04 Canon Kabushiki Kaisha Image processing system and information processing apparatus
US6489989B1 (en) * 1999-09-15 2002-12-03 Electric Planet, Inc. System, method and article of manufacture for executing a video setup protocol
US6614408B1 (en) * 1998-03-25 2003-09-02 W. Stephen G. Mann Eye-tap for electronic newsgathering, documentary video, photojournalism, and personal safety
US6618084B1 (en) * 1997-11-05 2003-09-09 Stmicroelectronics, Inc. Pixel correction system and method for CMOS imagers
US6690417B1 (en) * 1997-09-30 2004-02-10 Canon Kabushiki Kaisha Image processing method, an image processing apparatus, an image input device, a photographing device, a photographing system, a communication device, a communication system, and a storage medium
US6727943B1 (en) * 1999-01-20 2004-04-27 Nikon Corporation Electronic camera with white balance adjustment recording medium storing image processing program for white balance adjustment

Family Cites Families (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP3335572B2 (en) * 1997-11-28 2002-10-21 沖電気工業株式会社 Auto focus device


Also Published As

Publication number Publication date
US6967680B1 (en) 2005-11-22

Similar Documents

Publication Publication Date Title
US6967680B1 (en) Method and apparatus for capturing images
CN101867724B (en) Imaging device and imaging method
CN101334578B (en) Image photographing apparatus, image photographing method, and computer program
JP4626493B2 (en) Image processing apparatus, image processing method, program for image processing method, and recording medium recording program for image processing method
US7692696B2 (en) Digital image acquisition system with portrait mode
US8477993B2 (en) Image taking apparatus and image taking method
US7352394B1 (en) Image modification based on red-eye filter analysis
US7352390B2 (en) Digital camera for capturing image data with rapid response between activating a capture trigger and capturing image data
CN101213828B (en) Method and apparatus for incorporating iris color in red-eye correction
JP4413235B2 (en) Electronic camera
JP4974812B2 (en) Electronic camera
CN103227902A (en) Imaging device, display control method, and program
JP5180349B2 (en) Imaging apparatus, method, and program
US20080100720A1 (en) Cutout Effect For Digital Photographs
CN102082909B (en) Digital photographing apparatus and control the method for this digital photographing apparatus
US7319489B2 (en) Camera with strobe light
US7414650B2 (en) System and method for capturing image data
JP4307061B2 (en) Imaging device
JP5077294B2 (en) Image photographing apparatus, image photographing method, and computer program
JP2008028956A (en) Imaging apparatus and method for generating image signal for detecting target therein
US7570289B2 (en) Electronic camera having a clamp circuit and performing a clamping process on an image signal
KR100231922B1 (en) Image transformation method for digital still camera
JP4120690B2 (en) Still image capturing apparatus and still image capturing method in still image capturing apparatus
US11812158B2 (en) Image capture device having control to insert or remove filter, control method, and storage medium
CN113347490B (en) Video processing method, terminal and storage medium

Legal Events

Date Code Title Description
STCB Information on status: application discontinuation

Free format text: ABANDONED -- FAILURE TO RESPOND TO AN OFFICE ACTION

AS Assignment

Owner name: MICROSOFT TECHNOLOGY LICENSING, LLC, WASHINGTON

Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNOR:MICROSOFT CORPORATION;REEL/FRAME:034766/0001

Effective date: 20141014