US20090160970A1 - Remote determination of image-acquisition settings and opportunities

Info

Publication number
US20090160970A1
Authority
US (United States)
Prior art keywords
image, acquisition, information, digital camera, settings
Legal status
Abandoned (status assumed; not a legal conclusion)
Application number
US11/961,497
Inventor
John R. Fredlund
Bruce H. Pillman
Andrew C. Gallagher
Andrew C. Blose
John N. Border
Kevin M. Gobeyn
Richard B. Wheeler
Michael J. Telek
Current Assignee
Eastman Kodak Co
Original Assignee
Individual
Family has litigation: first worldwide family litigation filed (Darts-ip global patent litigation dataset, CC BY 4.0)
Application filed by Individual
Priority to US 11/961,497
Assigned to Eastman Kodak Company (assignors: Telek, Wheeler, Border, Fredlund, Gallagher, Pillman, Blose, Gobeyn)
Priority to PCT/US2008/013483 (WO2009085099A2)
Publication of US20090160970A1
Priority to US 13/151,304 (US 8,305,452 B2)
Patent release to Eastman Kodak Company and affiliated Kodak entities (assignors: Citicorp North America, Inc.; Wilmington Trust, National Association)
Assigned to Monument Peak Ventures, LLC (release by secured party; assignor: Intellectual Ventures Fund 83 LLC)
Status: Abandoned

Classifications

    • H: Electricity
    • H04: Electric communication technique
    • H04N: Pictorial communication, e.g. television
    • H04N 23/00: Cameras or camera modules comprising electronic image sensors; control thereof
    • H04N 23/70: Circuitry for compensating brightness variation in the scene
    • H04N 23/60: Control of cameras or camera modules
    • H04N 23/63: Control of cameras or camera modules by using electronic viewfinders
    • H04N 23/633: Control by using electronic viewfinders for displaying additional information relating to control or operation of the camera

Definitions

  • This invention relates to remote determination of image-acquisition settings and opportunities for a digital camera based at least upon pre-image-acquisition information obtained by the digital camera.
  • parameters configured to control exposure time affect motion blur
  • parameters configured to control f/number affect depth-of-field
  • all or some of these parameters can be controlled and are conveniently referred to herein as image-acquisition settings.
  • scene modes are essentially collections of image-acquisition settings, which direct the camera to optimize parameters given the user's selection of scene type.
  • scene modes are limited in several ways.
  • One limitation is that the user must select a scene mode for it to be effective, which is often inconvenient, and shifts the burden of scene determination from the image-acquisition device to the user.
  • the average user generally understands little of the utility and usage of the scene modes.
  • a second limitation is that scene modes tend to oversimplify the possible kinds of scenes being acquired.
  • a common scene mode is “portrait”, which is optimized for capturing images of people.
  • Another common scene mode is “snow”, which is optimized to acquire a subject against a background of snow with different parameters. If a user wishes to acquire a portrait against a snowy background, the user must choose either portrait or snow, but the user cannot combine aspects of each. Many other combinations exist, and creating scene modes for the varying combinations is cumbersome at best.
  • a backlit scene can be very much like a scene with a snowy background, in that subject matter is surrounded by background with a higher brightness.
  • pre-image-acquisition information is obtained by a digital camera and transmitted to a system external to the digital camera.
  • this external system is referred to herein as an “image-acquisition-setting providing system”, or an “IAS Providing System,” and is configured to provide image-acquisition settings to the digital camera.
  • the digital camera receives the image-acquisition settings from the IAS Providing System in response to the step of transmitting the pre-image-acquisition information. Subsequently, the digital camera performs an image-acquisition sequence based at least upon the received image-acquisition settings.
  • embodiments of the present invention allow the determination of image-acquisition settings to be performed remotely from the digital camera, where data-processing resources and available data sources can greatly exceed those within the digital camera.
  • the remote system need not be limited to a select group of “scene modes” and can identify image-acquisition settings customized for the particular pre-image-acquisition information provided by the digital camera.
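The exchange described above can be sketched as follows. This is an illustrative assumption, not an implementation from the disclosure: all names (`collect_piai`, `IASProvidingSystem`, the rule mapping illumination to exposure) are hypothetical.

```python
def collect_piai():
    """Gather pre-image-acquisition information from on-board sensors."""
    return {
        "illumination_lux": 12000,
        "position": (44.25, -74.05),   # latitude, longitude
        "orientation_deg": 270,        # compass heading
        "distance_to_subject_m": 3.5,
    }

class IASProvidingSystem:
    """Stand-in for the remote image-acquisition-setting provider."""
    def determine_settings(self, piai):
        # A remote system with richer resources would analyze the PIAI
        # deeply; here a trivial rule: bright scenes get short exposures.
        exposure_s = 1 / 500 if piai["illumination_lux"] > 5000 else 1 / 60
        return {"exposure_s": exposure_s, "f_number": 8.0, "iso": 100}

def acquire(piai, ias_system):
    settings = ias_system.determine_settings(piai)    # transmit / receive
    return {"settings_used": settings, "piai": piai}  # acquisition sequence

frame = acquire(collect_piai(), IASProvidingSystem())
```

The point of the split is that `determine_settings` runs remotely, where it can draw on data sources and compute the camera lacks.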
  • pre-image-acquisition information may include audio information, illumination information, camera position information, camera orientation information, motion information, an announcement of the digital camera's presence, temperature information, humidity information, ceiling detection information, distance-to-subject information, spectral information, etc.
  • some or all of the pre-image-acquisition information may be generated, at least in part, by the digital camera itself or by a system external to the digital camera, such as a global positioning system (“GPS”), known in the art.
  • the digital camera may determine whether or not it is appropriate to acquire an image based at least upon an analysis of the received image-acquisition settings. For example, the received image-acquisition settings may require the digital camera to operate in a manner that it deems will produce an unacceptable image. Consequently, the digital camera may present an indication configured to warn a user of the digital camera that performing the image-acquisition sequence is not appropriate or to advise the user to take an action to improve the appropriateness of performing the image-acquisition sequence.
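A camera-side appropriateness check of the kind just described might look like the sketch below. The thresholds (the common 1/focal-length handheld rule, an ISO ceiling) are illustrative assumptions; the patent leaves the camera's acceptability criterion open.

```python
def check_appropriateness(settings, focal_length_mm=50):
    """Return warnings if received settings imply an unacceptable image."""
    warnings = []
    if settings.get("exposure_s", 0) > 1.0 / focal_length_mm:
        warnings.append("Exposure too long for handheld use: "
                        "immobilize the camera (e.g., use a tripod).")
    if settings.get("iso", 0) > 3200:
        warnings.append("High ISO may produce unacceptable noise.")
    return warnings
```

An empty list means the camera proceeds; a non-empty list is presented to the user as the warning or advice indication.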
  • the IAS Providing System can include in its image-acquisition settings an indication of whether the digital camera is even permitted to acquire images. These embodiments allow, for example, an event operator to prevent images of the event from being acquired.
  • the digital camera generates image data from the image-acquisition sequence and transmits at least the image data and the pre-image-acquisition information to an image processing system external to the digital camera for processing.
  • the image processing system may or may not be the same system as the IAS Providing System.
  • the digital camera may obtain image-acquisition information (as opposed to pre-image-acquisition information) contemporaneously with the image-acquisition sequence.
  • the digital camera may transmit the pre-image-acquisition information, the image-acquisition information, and the image data to the image processing system for processing.
  • image-acquisition information includes audio information, illumination information, camera position information, camera orientation information, motion information, temperature information, humidity information, ceiling detection information, distance to subject information, spectral information such as histograms, etc.
  • information in the image-acquisition information and the pre-image-acquisition information may be of the same category.
  • both the pre-image acquisition information and the image-acquisition information may include illumination information.
  • the digital camera may be configured to verify the consistency between the information of the same category. For example, the digital camera may be configured to verify that illumination conditions have not substantially changed from the time the pre-image-acquisition information was obtained and the time the image-acquisition information was obtained.
  • the external image processing system may not only perform the corrective image processing, but also may perform the verification of consistency between pre-image-acquisition and image-acquisition information.
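The consistency verification between same-category readings can be sketched as below, whether it runs on the camera or on the external image processing system. The 25% tolerance is an arbitrary illustrative threshold, not a value from the disclosure.

```python
def is_consistent(piai, iai, tolerance=0.25):
    """Check that a same-category reading (here: illumination) did not
    change substantially between pre-acquisition and acquisition time."""
    before = piai["illumination_lux"]
    after = iai["illumination_lux"]
    return abs(after - before) <= tolerance * before
```

When the check fails, downstream processing can branch, for example compensating for an illumination change that occurred between the two measurements.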
  • pre-image acquisition information includes a present time.
  • the pre-image acquisition information may be transmitted to a system external to the digital camera.
  • the digital camera may receive image-opportunity information from the system, where such information is configured to guide a user of the digital camera towards an image-acquisition opportunity.
  • the image-opportunity information indicates at least a time period in which the image-acquisition opportunity exists.
  • the digital camera may present the image-opportunity information, or a derivative thereof, to a user of the digital camera. Accordingly, users can become informed of image-acquisition opportunities currently available nearby.
  • FIG. 1 illustrates a system for identifying image-acquisition settings, according to an embodiment of the present invention
  • FIG. 2 illustrates a method for identifying image-acquisition settings, according to an embodiment of the present invention
  • FIG. 3 illustrates an alternative method for identifying image-acquisition settings, according to an embodiment of the present invention
  • FIG. 4 illustrates a method for identifying image-acquisition settings and processing image data acquired based at least upon the identified image-acquisition settings, according to an embodiment of the present invention
  • FIG. 5 illustrates a method for identifying an image-acquisition opportunity, according to embodiments of the present invention.
  • FIG. 6 illustrates the presentation of an indication of an image-acquisition opportunity, according to an embodiment of the present invention.
  • Embodiments of the present invention allow the determination of image-acquisition settings to be performed remotely from the digital camera, where data-processing resources can greatly exceed those within the digital camera.
  • the remote system need not be limited to a select group of “scene modes” and can identify image-acquisition settings customized for the particular pre-image-acquisition information provided by the digital camera. Consequently, better-tailored image-acquisition settings can be generated, and generated more quickly, than conventional techniques.
  • the term “image-acquisition” is intended to refer to the process of acquiring an image performed by a camera.
  • image-acquisition is to be differentiated from processes that occur downstream of image-acquisition, such as processes that pertain to determining what to do with already-acquired images.
  • the word “or” is used in this disclosure in a non-exclusive sense.
  • FIG. 1 illustrates a system 100 for identifying image-acquisition settings, according to an embodiment of the present invention.
  • the system 100 includes a digital camera 101 , an image-acquisition-setting (“IAS”) providing system 102 , an optional pre-image-acquisition information (“PIAI”) providing system 104 , and an optional image processing system 105 .
  • the term “system” is intended to include one or more devices configured to collectively perform a set of one or more functions.
  • the broken line 106 indicates that the IAS providing system 102 and the image processing system 105 may be part of a same system or a common device.
  • the PIAI providing system 104 also may be part of a common system or a common device with the image processing system 105 or IAS providing system 102 .
  • the IAS providing system 102 , the PIAI providing system 104 , and the image processing system 105 are communicatively connected to the digital camera 101 .
  • the phrase “communicatively connected” is intended to include any type of connection between devices, whether wired or wireless, in which data may be communicated.
  • the digital camera 101 is configured to implement the processes of the various embodiments of the present invention, including the example processes of FIGS. 2-5 described herein.
  • the digital camera includes a data processing system including one or more data processing devices and a processor-accessible memory system that facilitate implementation of the processes of the various embodiments of the present invention, including the example processes of FIGS. 2-5 described herein.
  • the processor-accessible memory system includes one or more processor-accessible memories configured to store information, including the information needed to execute the processes of the various embodiments of the present invention, including the example processes of FIGS. 2-5 described herein.
  • the phrase “processor-accessible memory” is intended to include any processor-accessible data storage device, whether volatile or nonvolatile, electronic, magnetic, optical, or otherwise, including but not limited to, RAM, ROM, hard disks, and flash memories.
  • the digital camera 101 also includes an optional sensor system 103 configured to obtain pre-image-acquisition information or image-acquisition information.
  • the sensor system 103 may include sensors, known in the art, for obtaining audio information, illumination information, camera position information, camera orientation information, motion information, an announcement of the digital camera's presence, temperature information, humidity information, ceiling detection information, distance to subject information, spectral information such as histograms, etc.
  • pre-image-acquisition information may be obtained by devices within the PIAI providing system 104 and transmitted to the digital camera 101 .
  • although FIG. 1 shows the system 104 as providing pre-image-acquisition information, one skilled in the art will appreciate that the system 104 may provide image-acquisition information in addition to or in lieu of the pre-image-acquisition information.
  • FIG. 2 illustrates a method 200 implemented by the digital camera 101 for identifying image-acquisition settings, according to an embodiment of the present invention.
  • the digital camera obtains pre-image-acquisition information (“PIAI”).
  • the PIAI may include, for example, audio information, illumination information, camera position information, camera orientation information, motion information, an announcement of the digital camera's presence, temperature information, humidity information, ceiling detection information, distance to subject information, or spectral information such as histograms.
  • the PIAI may be obtained via the sensor system 103 or from one or more external devices within the PIAI providing system 104 .
  • the digital camera 101 transmits the PIAI to the image-acquisition-setting (“IAS”) providing system 102 .
  • the IAS providing system 102 may include one or more data processing devices, another camera, a server, etc.
  • the IAS providing system 102 may possess computing power superior to that of the digital camera 101 . Consequently, given the PIAI, the IAS providing system 102 determines appropriate image-acquisition settings for the digital camera 101 in order to improve the quality of an image about to be acquired by the digital camera 101 .
  • the IAS providing system 102 transmits these image-acquisition settings to the digital camera and, consequently, at step 203 , the digital camera 101 receives the image-acquisition settings from the IAS providing system 102 .
  • the PIAI may include a measure of the dynamic range of a scene.
  • a daylight scene containing brightly lit portions and dark shadows, such as a field of Holsteins viewed from under a copse of Ginkgo trees with dense foliage, may have a dynamic range greater than the camera's image-acquisition system is capable of acquiring.
  • a measurement of dynamic range may be provided as PIAI to the IAS providing system 102 .
  • the IAS providing system 102 uses this measurement information to determine that multiple image-acquisitions are required to render the scene as best as possible, and also to determine the optimum exposures for each of the image-acquisitions. These image-acquisition settings are sent to the digital camera 101 .
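An IAS-side bracket planner of the kind just described might be sketched as follows. The step size and one-stop overlap are illustrative assumptions; the disclosure only says the system determines how many acquisitions are needed and at what exposures.

```python
import math

def plan_exposures(scene_dr_stops, camera_dr_stops,
                   base_exposure_s=1 / 125, overlap_stops=1.0):
    """Plan bracketed exposures covering the scene's dynamic range."""
    if scene_dr_stops <= camera_dr_stops:
        return [base_exposure_s]          # one acquisition suffices
    step = camera_dr_stops - overlap_stops  # usable stops per extra frame
    extra = math.ceil((scene_dr_stops - camera_dr_stops) / step)
    n = 1 + extra
    # center the bracket on the base exposure, spaced by `step` stops
    offsets = [(i - (n - 1) / 2) * step for i in range(n)]
    return [base_exposure_s * 2 ** o for o in offsets]
```

A scene whose measured range fits the sensor yields a single exposure; the Holstein-and-Ginkgo scene above would yield a bracket of two or more.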
  • the digital camera 101 performs an image-acquisition sequence based at least upon the received image-acquisition settings.
  • the IAS providing system 102 can quickly provide accurate image-acquisition settings for the digital camera 101 without the need for excessive data processing capabilities on the digital camera 101 itself.
  • the image-acquisition settings include a triggering signal that, when received by the digital camera 101 , instructs the digital camera 101 to initiate the image-acquisition sequence at step 204 .
  • the IAS providing system 102 can be configured to provide a triggering signal in the image-acquisition settings when a leading racecar approaches the finish line, thereby allowing respective digital cameras to acquire an image of the racecar crossing the finish line at precisely the right moment.
  • the IAS providing system can be configured to provide a triggering signal when a particular amusement ride car enters a digital camera's field of view.
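The triggering behavior can be sketched as a camera loop that applies incoming settings but initiates the acquisition sequence only when a message carries the trigger flag. The channel and field names are illustrative assumptions.

```python
import queue
import threading

def camera_loop(settings_channel, acquired):
    """Camera thread: consume settings, fire only on a trigger flag."""
    while True:
        settings = settings_channel.get()
        if settings.get("trigger"):
            acquired.append(settings)   # initiate the acquisition sequence
            break

# The IAS providing system pushes settings; the final message carries the
# triggering signal (e.g., the lead racecar nears the finish line).
channel, shots = queue.Queue(), []
cam = threading.Thread(target=camera_loop, args=(channel, shots))
cam.start()
channel.put({"exposure_s": 1 / 1000, "trigger": False})
channel.put({"exposure_s": 1 / 1000, "trigger": True})
cam.join()
```

This is why the trigger is delivered *inside* the image-acquisition settings: the camera needs no local scene understanding to fire at the right moment.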
  • FIG. 3 illustrates a method 300 for identifying image-acquisition settings, according to an embodiment of the present invention.
  • the method 300 begins with the same steps 201 - 203 initially discussed with respect to FIG. 2 .
  • the method 300 differs, however, in that it includes optional steps 304 and 305 .
  • the digital camera 101 can determine at step 304 whether image-acquisition is permitted.
  • the image-acquisition settings may include an indication of whether or not the digital camera 101 is even allowed to acquire an image.
  • the pre-image-acquisition information obtained by the digital camera 101 may include an announcement of the camera's presence or an indication of the camera's location.
  • the IAS providing system 102 may use this information to determine that the digital camera 101 , based upon its identification or its location, for example, is not allowed to acquire images. Alternately, the IAS providing system 102 may use this information to determine that the digital camera 101 , based upon its location and orientation direction, for example, is not allowed to acquire images in some directions, and is only allowed to acquire images in other directions.
  • the pre-image-acquisition information obtained by the digital camera 101 may include image content from a preliminary image acquisition. The IAS providing system 102 may use the image content to determine whether or not the digital camera 101 is allowed to acquire images.
  • the IAS providing system 102 may consider in its determination the color content, textures, line direction, the number of faces detected, whether particular faces are detected, face location, the number of objects detected, whether particular objects are detected, or object location in the image content.
  • the above features may be useful, for example, to event organizers.
  • the event organizers may, consequently, have the ability to prevent images of the event from being acquired by particular cameras or any camera.
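An IAS-side permission rule of the kind an event operator might configure is sketched below. The rule set (a camera-ID blacklist, a venue bounding box, restricted shooting directions) is an illustrative assumption covering the location-, identification-, and orientation-based checks described above.

```python
def acquisition_permitted(camera_id, position, heading_deg,
                          banned_ids, venue_box, restricted_headings):
    """Decide whether a camera may acquire an image at its location/heading."""
    if camera_id in banned_ids:
        return False
    (lat_min, lat_max), (lon_min, lon_max) = venue_box
    inside = (lat_min <= position[0] <= lat_max
              and lon_min <= position[1] <= lon_max)
    if inside:
        for lo, hi in restricted_headings:
            if lo <= heading_deg <= hi:
                return False   # not allowed to shoot in this direction
    return True
```

The result travels back to the camera inside the image-acquisition settings, where step 304 consumes it.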
  • if image-acquisition is not permitted, the method 300 may abort at step 306 . Otherwise, processing may proceed to step 305 , if this step is implemented, or directly to step 310 where an image-acquisition sequence is performed.
  • the digital camera 101 may determine, based at least upon the received image acquisition settings, that image acquisition is or is not appropriate.
  • the IAS providing system 102 may provide image acquisition settings that the digital camera 101 deems will produce an unacceptable image.
  • the digital camera 101 may determine that image acquisition is not appropriate at step 305 , and may present a warning indication to a user.
  • the digital camera 101 may also present advice to the user regarding how the user can improve the image acquisition prospects at step 307 . Note that the advice presented to the user may be received with the Image-Acquisition Settings in step 203 .
  • advice to the user may be to instruct the user to immobilize digital camera 101 so that a long exposure or multiple exposures could be taken to acquire the silhouettes of the trees and the reflection of the moon off the water.
  • the images can be combined in digital camera 101 or in Image Processing System 105 .
  • the IAS providing system 102 uses the PIAI (including location, direction, distance-to-subject, and average illumination) and other non-PIAI information, such as weather information and moonrise information, to determine that the likely subject is a moonrise, and incorporates the immobilization warning along with the image-acquisition settings in step 203 .
  • at step 308 , the digital camera 101 may either abort the performance of the image-acquisition sequence at step 309 or perform the image-acquisition sequence based at least upon the received image-acquisition settings at step 310 .
  • the decision whether to abort at step 309 or perform the image-acquisition sequence at step 310 may be determined at least in part by user input.
  • FIG. 4 illustrates a method 400 for identifying image-acquisition settings and processing image data acquired based at least upon the identified image-acquisition settings, according to an embodiment of the present invention.
  • the method 400 is a continuation of the processes of FIG. 2 or 3 . Consequently, the method 400 begins with step 204 in FIG. 2 or step 310 in FIG. 3 where an image-acquisition sequence is performed.
  • the digital camera 101 generates image data from the performed image-acquisition sequence.
  • the image-acquisition sequence may include the acquisition of multiple images, where the multiple images are configured to be synthesized into a single image, a collage, or a video.
  • the resulting image data from the image-acquisition sequence, as well as the pre-image-acquisition information may be transmitted to the image processing system 105 .
  • This transmission occurs at step 412 and also may optionally include image-acquisition information (“IAI”).
  • the IAI may include the same information as the PIAI, but be obtained at different points in time. For example, the PIAI is obtained before image acquisition and the IAI is obtained contemporaneously or substantially contemporaneously with image acquisition.
  • the information transmitted at step 412 is used by the image processing system 105 to process the image data.
  • the aforementioned moon picture is one example that can benefit from processing in image-processing system 105 .
  • processing system 105 can integrate the knowledge of these parameters into processing the acquired images to provide an improved composite image that does not unduly compress the range of either the moon or the silhouettes of the trees and the reflections from the water.
  • Another example is an image-acquisition looking over Spectacle Lake from the top of Good Luck Mountain on a hazy summer midday.
  • processing system 105 can integrate the knowledge of these parameters along with weather information into processing the acquired image to provide an improved processed image by expanding the dynamic range of the acquired image.
  • either the digital camera 101 or the image processing system 105 may check for consistency between the PIAI and the IAI. If there are inconsistencies between the PIAI and the IAI, the image processing system 105 may process the image data differently than if the PIAI and the IAI were consistent.
  • An image-acquisition taken through the window of a moving automobile is an example.
  • PIAI location information, in this case, will be significantly different from IAI location information.
  • processing system 105 can determine that there is motion associated with the image-acquisition. This motion is detected apart from or in addition to IAI information on camera shake that is determined from small accelerations that are the result of user movements. Additionally, direction and velocity can be determined from the differing locations. These factors can be figured into processing the acquired image and motion blur can be minimized.
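The motion inference from differing PIAI and IAI positions can be sketched as below. Positions are taken as local (x, y) metres for simplicity (real data would be geographic coordinates), and the 1 m/s motion threshold is an illustrative assumption.

```python
import math

def detect_motion(piai, iai, min_speed_mps=1.0):
    """Infer gross camera motion (e.g., shooting from a moving car) from
    the displacement between the PIAI-time and IAI-time positions."""
    (x0, y0), t0 = piai["position_m"], piai["time_s"]
    (x1, y1), t1 = iai["position_m"], iai["time_s"]
    dx, dy, dt = x1 - x0, y1 - y0, t1 - t0
    speed = math.hypot(dx, dy) / dt
    heading = math.degrees(math.atan2(dx, dy)) % 360  # 0 deg = +y (north)
    return {"moving": speed >= min_speed_mps,
            "speed_mps": speed, "heading_deg": heading}
```

The resulting speed and direction are exactly the factors the image processing system can fold into deblurring the acquired image.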
  • the digital camera 101 receives the processed image at step 413 .
  • FIG. 5 illustrates a method 500 for identifying an image-acquisition opportunity, according to an embodiment of the present invention.
  • the embodiment of FIG. 5 begins at step 501 where the digital camera 101 obtains PIAI indicating at least a present time.
  • the PIAI is transmitted to an external system at step 502 , such external system being configured at least to determine whether or not an improved image-acquisition opportunity exists for the digital camera 101 .
  • the external system may determine that a popular imaging opportunity is available to the digital camera 101 nearby and within a current or upcoming span of time.
  • the external system may be the IAS providing system 102 , and the image-opportunity information may be provided with or within the image-acquisition settings provided to the digital camera 101 .
  • FIG. 6 illustrates an example 600 of image-opportunity information presented by the digital camera 101 to a user.
  • the image-opportunity information indicates a period of time in which an imaging opportunity exists or a moment in time in which the imaging opportunity exists, as shown at reference numeral 601 .
  • FIG. 6 illustrates an embodiment of step 504 where the image-opportunity information is presented to the user.
  • although embodiments are described in which an IAS providing system provides all image-acquisition settings to a digital camera, the invention is not so limited.
  • the IAS providing system may provide only some image-acquisition settings, while the digital camera provides others.
  • the IAS providing system may provide image acquisition settings redundant to those provided by the digital camera itself, for purposes of verification or improvement of the image acquisition settings provided by the digital camera.
  • the digital camera may receive image acquisition settings from the IAS providing system and process them or modify them before arriving at a final set of image acquisition settings ultimately used in an image-acquisition sequence. By processing or otherwise modifying the received image acquisition settings, it can be said that the image-acquisition sequence is performed based at least upon a derivative of the received image acquisition settings, because the received image-acquisition settings have been processed or modified in some manner.
  • although step 304 in FIG. 3 is described in the context of a digital camera using image-acquisition settings to determine whether image-acquisition is permitted, one skilled in the art will appreciate that the invention is not so limited.
  • image-acquisition settings received by a digital camera may include additional information that pertains to image-acquisition permissions, such as what metadata, or how much detail in the metadata, is recorded by a digital camera along with an image-acquisition sequence.
  • IAS providing system may be configured to provide image acquisition settings that prevent digital cameras within a particular location from recording their location in metadata when performing an image-acquisition sequence.
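A metadata-permission policy of this kind can be sketched as a filter applied before metadata is recorded. The field names and policy shape are illustrative assumptions.

```python
def apply_metadata_policy(metadata, policy):
    """Drop metadata fields an IAS-provided policy forbids recording,
    e.g. a venue that bars cameras from recording their location."""
    blocked = set(policy.get("blocked_fields", []))
    return {k: v for k, v in metadata.items() if k not in blocked}
```

The camera still acquires the image; only what it is permitted to record alongside the image changes.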
  • an IAS providing system may provide multiple sets of image acquisition settings to a digital camera, each of the multiple sets of image acquisition settings being configured to control a subset of the multiple image acquisitions in the image acquisition sequence.
  • each set of image acquisition settings may be configured to control one of the multiple image acquisitions in the image acquisition sequence.
  • step 304 in FIG. 3 is described in the context of determining whether the digital camera 101 is permitted to acquire an image. Such determination may be made based at least upon digital camera location, digital camera orientation, or image content included in pre-image acquisition information.
  • the digital camera location, digital camera orientation, or image content in the pre-image acquisition information need not only be used to determine whether or not a digital camera is permitted to acquire an image, but also may be used to allow a digital camera to acquire an image while limiting how that image is acquired.
  • the image-acquisition settings provided by the IAS providing system 102 at step 203 may be used to limit how an image is acquired.
  • the IAS providing system 102 may limit the digital camera 101 to using at least (a) a particular image-acquisition mode or parameter or (b) one of a particular set of image-acquisition modes or parameters, such as a focus distance, camera operating mode, flash mode, still or video mode, panorama or non-panorama mode, etc.
  • These features may be useful, for example, for allowing an image to be acquired while preventing (a) a particular use of the digital camera or (b) a particular object from being represented in the image.
  • the IAS providing system 102 can allow image-acquisitions at an event, but prevent flashes from being used.
  • an owner of a building wants to prevent images of the building from being acquired.
  • the IAS providing system 102 can allow images to be acquired in the vicinity of the building, but require a short focus distance.
  • the short focus distance would allow close-up portrait images of people standing in front of the building to be acquired, but would cause the building in the background of such images to be blurry.
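Why a required short focus distance blurs the building can be checked with a thin-lens defocus sketch. The lens parameters and the 0.03 mm circle-of-confusion threshold are illustrative assumptions, not values from the disclosure.

```python
def background_blur_mm(focal_mm, f_number, focus_m, background_m):
    """Blur-circle diameter (mm, on sensor) for an object at background_m
    when the lens is focused at focus_m (thin-lens model)."""
    f = focal_mm
    s = focus_m * 1000.0       # focus distance in mm
    x = background_m * 1000.0  # background distance in mm
    return (f * f / (f_number * (s - f))) * abs(x - s) / x

# Focused at 1.5 m for a portrait, a building 30 m away produces a blur
# circle far larger than a typical 0.03 mm circle of confusion, so it
# renders unsharp while the nearby subject stays in focus.
portrait_blur = background_blur_mm(50, 2.8, 1.5, 30)
```

Conversely, setting the focus at the building's distance drives the blur to zero, which is exactly the acquisition the IAS providing system forbids.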

Abstract

Pre-image-acquisition information is obtained by a digital camera and transmitted to a system external to the digital camera. The system is configured to provide image-acquisition settings to the digital camera. In this regard, the digital camera receives the image-acquisition settings from the external system and performs an image-acquisition sequence based at least upon the received image-acquisition settings. Accordingly, the determination of image-acquisition settings can be performed remotely from the digital camera, where data-processing resources can greatly exceed those within the digital camera.

Description

    FIELD OF THE INVENTION
  • This invention relates to remote determination of image-acquisition settings and opportunities for a digital camera based at least upon pre-image-acquisition information obtained by the digital camera.
  • BACKGROUND
  • Many parameters affect the quality and usefulness of an image of a scene acquired by a camera. For example, parameters configured to control exposure time affect motion blur, parameters configured to control f/number affect depth-of-field, and so forth. In many cameras, all or some of these parameters can be controlled and are conveniently referred to herein as image-acquisition settings.
  • Methods for controlling exposure and focus parameters are well known in both film-based and electronic cameras. However, the level of intelligence in these systems is limited by resource and time constraints in the camera. In many cases, knowing the type of scene being acquired can lead easily to improved selection of image-acquisition parameters. For example, knowing a scene is a portrait allows the camera to select a wider aperture to minimize depth-of-field. Knowing a scene is a sports/action scene allows the camera to automatically limit exposure time to control motion blur and adjust gain (exposure index) and aperture accordingly. Knowing the scene is a sunset suggests that the color balance will be shifted from the norm and that high saturation is likely to be desired. Knowing a scene is a snow scene indicates that a special mapping of input brightness to output values is desired. Because this knowledge is useful in guiding simple exposure control systems, many film, video, and digital still cameras include a number of scene modes that can be selected by the user. These scene modes are essentially collections of image-acquisition settings, which direct the camera to optimize parameters given the user's selection of scene type.
  • The use of scene modes is limited in several ways. One limitation is that the user must select a scene mode for it to be effective, which is often inconvenient, and shifts the burden of scene determination from the image-acquisition device to the user. The average user generally understands little of the utility and usage of the scene modes.
  • A second limitation is that scene modes tend to oversimplify the possible kinds of scenes being acquired. For example, a common scene mode is “portrait”, which is optimized for capturing images of people. Another common scene mode is “snow”, which is optimized to acquire a subject against a background of snow with different parameters. If a user wishes to acquire a portrait against a snowy background, the user must choose either portrait or snow, but the user cannot combine aspects of each. Many other combinations exist, and creating scene modes for the varying combinations is cumbersome at best. In another example, a backlit scene can be very much like a scene with a snowy background, in that subject matter is surrounded by background with a higher brightness. Few users are likely to understand the concept of a backlit scene and realize it has crucial similarity to a “snow” scene. A camera developer wishing to help users with backlit scenes will probably have to add a scene mode for backlit scenes, even though it may be identical to the snow scene mode.
  • Both of these scenarios illustrate the problem of describing photographic scenes in a way accessible to a casual user. The number of scene modes required expands greatly and becomes difficult to navigate. The proliferation of scene modes thus ends up exacerbating the problem that many users find scene modes excessively complex.
  • Attempts to automate the selection of a scene mode have been made, for example, in United States Patent Application Publication No. 2003/0007076 by Noriyuki Okisu et al. and U.S. Pat. No. 6,301,440 to Rudolf M. Bolle et al. A limitation of such automated methods is that they tend to be computationally intensive relative to the simpler methods. In this regard, cameras tend to be relatively limited in computing resources, in order to reduce cost, cut energy drain, and the like. Consequently, a noticeable lag between shutter trip and image acquisition occurs in some cameras. Such lag is highly undesirable when a subject to be photographed is in motion. One solution to the problem of lag is avoidance of highly time-consuming computations, which leads back to the also-undesirable use of fewer, manually selected modes with associated image-acquisition settings.
  • Accordingly, a need in the art exists for improved solutions for determining image-acquisition settings in a computationally-sensitive environment.
  • SUMMARY
  • The above-described problems are addressed and a technical solution is achieved in the art by systems and methods for identifying image-acquisition settings, according to various embodiments of the present invention. In an embodiment of the present invention, pre-image-acquisition information is obtained by a digital camera and transmitted to a system external to the digital camera. Such an external system is referred to herein as an “image-acquisition-setting providing system”, or an “IAS Providing System,” and is configured to provide image-acquisition settings to the digital camera. In this regard, the digital camera receives the image-acquisition settings from the IAS Providing System in response to the step of transmitting the pre-image-acquisition information. Subsequently, the digital camera performs an image-acquisition sequence based at least upon the received image-acquisition settings.
  • Accordingly, embodiments of the present invention allow the determination of image-acquisition settings to be performed remotely from the digital camera, where data-processing resources and available data sources can greatly exceed those within the digital camera. In this regard, the remote system need not be limited to a select group of “scene modes” and can identify image-acquisition settings customized for the particular pre-image-acquisition information provided by the digital camera.
  • Examples of pre-image-acquisition information may include audio information, illumination information, camera position information, camera orientation information, motion information, an announcement of the digital camera's presence, temperature information, humidity information, ceiling detection information, distance-to-subject information, spectral information, etc. In this regard, some or all of the pre-image acquisition information may be generated, at least in part, by the digital camera itself or by a system external to the digital camera, such as a global positioning system (“GPS”), known in the art.
  • In some embodiments, the digital camera may determine whether or not it is appropriate to acquire an image based at least upon an analysis of the received image-acquisition settings. For example, the received image-acquisition settings may require the digital camera to operate in a manner that it deems will produce an unacceptable image. Consequently, the digital camera may present an indication configured to warn a user of the digital camera that performing the image-acquisition sequence is not appropriate or to advise the user to take an action to improve the appropriateness of performing the image-acquisition sequence.
  • In some embodiments, the IAS Providing System can include in its image-acquisition settings an indication of whether the digital camera is even permitted to acquire images. These embodiments allow, for example, an event operator to prevent images of the event from being acquired.
  • According to some embodiments of the present invention, the digital camera generates image data from the image-acquisition sequence and transmits at least the image data and the pre-image-acquisition information to an image processing system external to the digital camera for processing. The image processing system may or may not be the same system as the IAS Providing System. In some embodiments, the digital camera may obtain image-acquisition information (as opposed to pre-image-acquisition information) contemporaneously with the image-acquisition sequence. In this regard, the digital camera may transmit the pre-image-acquisition information, the image-acquisition information, and the image data to the image processing system for processing.
  • Examples of image-acquisition information include audio information, illumination information, camera position information, camera orientation information, motion information, temperature information, humidity information, ceiling detection information, distance-to-subject information, spectral information such as histograms, etc. In this regard, in some embodiments, information in the image-acquisition information and the pre-image-acquisition information is of a same category. For example, both the pre-image-acquisition information and the image-acquisition information may include illumination information. In some of these embodiments, the digital camera may be configured to verify the consistency between the information of the same category. For example, the digital camera may be configured to verify that illumination conditions have not substantially changed between the time the pre-image-acquisition information was obtained and the time the image-acquisition information was obtained. If a meaningful difference did occur, the user may be notified, corrective image processing may occur, or both. In the embodiments where the image data is transmitted to an external image processing system, the external image processing system may not only perform the corrective image processing, but also may perform the verification of consistency between pre-image-acquisition and image-acquisition information.
  • According to some embodiments of the present invention, pre-image-acquisition information includes a present time. The pre-image-acquisition information may be transmitted to a system external to the digital camera. In response, the digital camera may receive image-opportunity information from the system, where such information is configured to guide a user of the digital camera towards an image-acquisition opportunity. The image-opportunity information indicates at least a time period in which the image-acquisition opportunity exists. The digital camera may then present the image-opportunity information, or a derivative thereof, to a user of the digital camera. Accordingly, users can become informed of image-acquisition opportunities currently available nearby.
  • In addition to the embodiments described above, further embodiments will become apparent by reference to the drawings and by study of the following detailed description.
  • BRIEF DESCRIPTION OF THE DRAWINGS
  • The present invention will be more readily understood from the detailed description of exemplary embodiments presented below considered in conjunction with the attached drawings, of which:
  • FIG. 1 illustrates a system for identifying image-acquisition settings, according to an embodiment of the present invention;
  • FIG. 2 illustrates a method for identifying image-acquisition settings, according to an embodiment of the present invention;
  • FIG. 3 illustrates an alternative method for identifying image-acquisition settings, according to an embodiment of the present invention;
  • FIG. 4 illustrates a method for identifying image-acquisition settings and processing image data acquired based at least upon the identified image-acquisition settings, according to an embodiment of the present invention;
  • FIG. 5 illustrates a method for identifying an image-acquisition opportunity, according to embodiments of the present invention; and
  • FIG. 6 illustrates the presentation of an indication of an image-acquisition opportunity, according to an embodiment of the present invention.
  • It is to be understood that the attached drawings are for purposes of illustrating the concepts of the invention and may not be to scale.
  • DETAILED DESCRIPTION
  • Embodiments of the present invention allow the determination of image-acquisition settings to be performed remotely from the digital camera, where data-processing resources can greatly exceed those within the digital camera. In this regard, the remote system need not be limited to a select group of “scene modes” and can identify image-acquisition settings customized for the particular pre-image-acquisition information provided by the digital camera. Consequently, better-tailored image-acquisition settings can be generated, and generated more quickly, than conventional techniques.
  • It should be noted that the phrase “image-acquisition” is intended to refer to the process of acquiring an image performed by a camera. In this regard, “image-acquisition” is to be differentiated from processes that occur down-stream from image-acquisition, such as processes that pertain to determining what to do with already-acquired images. It should also be noted that, unless otherwise explicitly noted or required by context, the word “or” is used in this disclosure in a non-exclusive sense.
  • FIG. 1 illustrates a system 100 for identifying image-acquisition settings, according to an embodiment of the present invention. The system 100 includes a digital camera 101, an image-acquisition-setting (“IAS”) providing system 102, an optional pre-image-acquisition information (“PIAI”) providing system 104, and an optional image processing system 105. The term “system” is intended to include one or more devices configured to collectively perform a set of one or more functions. In this regard, the broken line 106 indicates that the IAS providing system 102 and the image processing system 105 may be part of a same system or a common device. Further in this regard, although not shown in FIG. 1, the PIAI providing system 104 also may be part of a common system or a common device with the image processing system 105 or IAS providing system 102.
  • The IAS providing system 102, the PIAI providing system 104, and the image processing system 105 are communicatively connected to the digital camera 101. The phrase “communicatively connected” is intended to include any type of connection between devices, whether wired or wireless, in which data may be communicated.
  • The digital camera 101 is configured to implement the processes of the various embodiments of the present invention, including the example processes of FIGS. 2-5 described herein. Although not shown in FIG. 1, the digital camera includes a data processing system including one or more data processing devices and a processor-accessible memory system that facilitate implementation of the processes of the various embodiments of the present invention, including the example processes of FIGS. 2-5 described herein. The processor-accessible memory system includes one or more processor-accessible memories configured to store information, including the information needed to execute the processes of the various embodiments of the present invention, including the example processes of FIGS. 2-5 described herein. The phrase “processor-accessible memory” is intended to include any processor-accessible data storage device, whether volatile or nonvolatile, electronic, magnetic, optical, or otherwise, including but not limited to, RAM, ROM, hard disks, and flash memories.
  • The digital camera 101 also includes an optional sensor system 103 configured to obtain pre-image-acquisition information or image-acquisition information. In this regard, the sensor system 103 may include sensors, known in the art, for obtaining audio information, illumination information, camera position information, camera orientation information, motion information, an announcement of the digital camera's presence, temperature information, humidity information, ceiling detection information, distance to subject information, spectral information such as histograms, etc. Alternatively or in addition, pre-image-acquisition information may be obtained by devices within the PIAI providing system 104 and transmitted to the digital camera 101. Although FIG. 1 shows the system 104 as providing pre-image-acquisition information, one skilled in the art will appreciate that the system 104 may provide image-acquisition information in addition to or in lieu of the pre-image-acquisition information.
  • FIG. 2 illustrates a method 200 implemented by the digital camera 101 for identifying image-acquisition settings, according to an embodiment of the present invention. At step 201, the digital camera obtains pre-image-acquisition information (“PIAI”). The PIAI may include, for example, audio information, illumination information, camera position information, camera orientation information, motion information, an announcement of the digital camera's presence, temperature information, humidity information, ceiling detection information, distance to subject information, or spectral information such as histograms. The PIAI may be obtained via the sensor system 103 or from one or more external devices within the PIAI providing system 104.
  • At step 202 the digital camera 101 transmits the PIAI to the image-acquisition-setting (“IAS”) providing system 102. The IAS providing system 102 may include one or more data processing devices, another camera, a server, etc. In this regard, the IAS providing system 102 may include computing power superior to that within the digital camera 101. Consequently, using the PIAI, the IAS providing system 102 determines appropriate image-acquisition settings for the digital camera 101 in order to improve the quality of an image about to be acquired by the digital camera 101. The IAS providing system 102 transmits these image-acquisition settings to the digital camera and, consequently, at step 203, the digital camera 101 receives the image-acquisition settings from the IAS providing system 102.
  • For example, the PIAI may include a measure of the dynamic range of a scene. A daylight scene containing brightly lit portions and dark shadows, such as a scene of a field of Holsteins viewed from under a copse of Ginko trees with dense foliage, may have a dynamic range that is greater than the camera's image acquisition system is capable of acquiring. A measurement of dynamic range, whether acquired by multiple pre-image-acquisitions by the image acquisition system, or by a sensor designed to measure dynamic range, may be provided as PIAI to the IAS providing system 102. The IAS providing system 102 uses this measurement information to determine that multiple image-acquisitions are required to render the scene as best as possible, and also to determine the optimum exposures for each of the image-acquisitions. These image-acquisition settings are sent to the digital camera 101.
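Although the disclosure contains no source code, the dynamic-range determination described above lends itself to a brief sketch. The following is a hypothetical illustration of how an IAS providing system might derive bracketed exposures from the measured scene dynamic range; the function name, the 1-stop overlap heuristic, and all numeric values are assumptions, not part of the disclosure.

```python
# Illustrative sketch: deriving bracketed exposures when the measured scene
# dynamic range exceeds what the camera can acquire in a single exposure.
import math

def bracket_exposures(scene_range_stops, sensor_range_stops, base_ev=0.0):
    """Return a list of exposure-compensation values (in EV) covering the scene.

    If the scene fits in one exposure, a single acquisition at base_ev suffices.
    Otherwise, successive exposures are spaced so that, with a 1-stop overlap,
    they jointly cover the scene's dynamic range.
    """
    if scene_range_stops <= sensor_range_stops:
        return [base_ev]
    step = sensor_range_stops - 1.0          # assumed 1-stop overlap per frame
    count = math.ceil((scene_range_stops - sensor_range_stops) / step) + 1
    span = (count - 1) * step
    # Center the bracket on the base exposure.
    return [round(base_ev - span / 2 + i * step, 2) for i in range(count)]

# A 14-stop scene (sunlit field seen from dense shade) on a 10-stop sensor
# needs two overlapping exposures.
print(bracket_exposures(14, 10))   # → [-4.5, 4.5]
```

These exposure values would then be transmitted to the digital camera 101 as image-acquisition settings.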
  • At step 204, the digital camera 101 performs an image-acquisition sequence based at least upon the received image-acquisition settings. In view of the high bandwidth and quick data transmission times currently available, and the improvements to bandwidth and data transmission times that will become available, the IAS providing system 102 can quickly provide accurate image-acquisition settings for the digital camera 101 without the need for excessive data processing capabilities on the digital camera 101 itself.
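The overall flow of method 200, seen from the camera's side, can be sketched as follows. The IAS providing system is stubbed as a local function, and all dictionary keys, thresholds, and settings values are illustrative assumptions rather than anything specified by the disclosure.

```python
# Minimal sketch of method 200 (steps 201-204) from the camera's side.

def obtain_piai():
    # Step 201: gather pre-image-acquisition information via the sensor system.
    return {"illumination_lux": 120, "location": (43.16, -77.61), "audio_db": 55}

def ias_providing_system(piai):
    # Stub for steps 202-203: the external system maps PIAI to settings.
    dim = piai["illumination_lux"] < 200
    return {"exposure_ms": 100 if dim else 33, "iso": 800 if dim else 100}

def perform_acquisition(settings):
    # Step 204: the camera applies the received settings.
    return f"acquired at {settings['exposure_ms']} ms, ISO {settings['iso']}"

piai = obtain_piai()
settings = ias_providing_system(piai)   # transmit PIAI, receive settings
print(perform_acquisition(settings))    # → acquired at 100 ms, ISO 800
```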
  • It should be noted that, in some embodiments of the present invention, the image-acquisition settings include a triggering signal that, when received by the digital camera 101, instructs the digital camera 101 to initiate the image-acquisition sequence at step 204. Such a feature may be useful, for instance, when precise timing for image acquisition is needed. For example, at a racing event, the IAS providing system 102 can be configured to provide a triggering signal in the image-acquisition settings when a leading racecar approaches the finish line, thereby allowing respective digital cameras to acquire an image of the racecar crossing the finish line at precisely the right moment. Or, at an amusement park, for example, the IAS providing system can be configured to provide a triggering signal when a particular amusement ride car enters a digital camera's field of view.
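The triggering-signal variant can be sketched as a camera that defers its image-acquisition sequence until the received settings carry a trigger. The "trigger" field and the stubbed settings stream below are assumptions for illustration only.

```python
# Sketch: the camera waits for a triggering signal in the received settings
# (e.g., the leading racecar approaching the finish line) before firing.

def ias_settings_stream():
    # Stub: settings arrive without a trigger until the decisive moment.
    yield {"exposure_ms": 2, "trigger": False}
    yield {"exposure_ms": 2, "trigger": False}
    yield {"exposure_ms": 1, "trigger": True}   # leader nears the finish line

def acquire_on_trigger(stream):
    for settings in stream:
        if settings.get("trigger"):
            return f"fired with exposure {settings['exposure_ms']} ms"
    return "no trigger received"

print(acquire_on_trigger(ias_settings_stream()))  # → fired with exposure 1 ms
```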
  • FIG. 3 illustrates a method 300 for identifying image-acquisition settings, according to an embodiment of the present invention. The method 300 begins with the same steps 201-203 initially discussed with respect to FIG. 2. The method 300 differs, however, in that it includes optional steps 304 and 305. In particular, after receipt of the image-acquisition settings at step 203, the digital camera 101 can determine at step 304 whether image-acquisition is permitted. To elaborate, the image-acquisition settings may include an indication of whether or not the digital camera 101 is even allowed to acquire an image. For example, the pre-image-acquisition information obtained by the digital camera 101 may include an announcement of the camera's presence or an indication of the camera's location. The IAS providing system 102 may use this information to determine that the digital camera 101, based upon its identification or its location, for example, is not allowed to acquire images. Alternately, the IAS providing system 102 may use this information to determine that the digital camera 101, based upon its location and orientation direction, for example, is not allowed to acquire images in some directions, and is only allowed to acquire images in other directions. In some embodiments, the pre-image-acquisition information obtained by the digital camera 101 may include image content from a preliminary image acquisition. The IAS providing system 102 may use the image content to determine whether or not the digital camera 101 is allowed to acquire images. For example, the IAS providing system 102 may consider in its determination the color content, textures, line direction, the number of faces detected, whether particular faces are detected, face location, the number of objects detected, whether particular objects are detected, or object location in the image content. The above features may be useful, for example, to event organizers, who may, consequently, have the ability to prevent images of the event from being acquired by particular cameras or by any camera. In this regard, at step 304, if image acquisition is not permitted, the method 300 may abort at step 306. Otherwise, processing may proceed to step 305, if this step is implemented, or directly to step 310, where an image-acquisition sequence is performed.
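One hypothetical way an IAS providing system might implement the location- and orientation-based permission determination feeding step 304 is sketched below. The geofence center, radius, blocked bearing window, and distance approximation are all invented for illustration.

```python
# Sketch: deciding from camera location and orientation (reported as PIAI)
# whether acquisition is permitted. Values are assumed, not from the patent.
import math

RESTRICTED = {"center": (40.7580, -73.9855), "radius_m": 150,
              "blocked_bearings": (45, 135)}   # degrees, clockwise from north

def distance_m(a, b):
    # Equirectangular approximation, adequate at geofence scale.
    lat = math.radians((a[0] + b[0]) / 2)
    dx = math.radians(b[1] - a[1]) * math.cos(lat) * 6_371_000
    dy = math.radians(b[0] - a[0]) * 6_371_000
    return math.hypot(dx, dy)

def acquisition_permitted(location, bearing_deg):
    inside = distance_m(location, RESTRICTED["center"]) <= RESTRICTED["radius_m"]
    lo, hi = RESTRICTED["blocked_bearings"]
    aimed_at_it = lo <= bearing_deg % 360 <= hi
    # Inside the zone, only some orientation directions are allowed.
    return not (inside and aimed_at_it)

print(acquisition_permitted((40.7580, -73.9855), 90))   # → False
print(acquisition_permitted((40.7580, -73.9855), 200))  # → True
```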
  • If step 305 is performed, the digital camera 101 may determine, based at least upon the received image acquisition settings, that image acquisition is or is not appropriate. In this regard, the IAS providing system 102 may provide image acquisition settings that the digital camera 101 deems will produce an unacceptable image. In this case, the digital camera 101 may determine that image acquisition is not appropriate at step 305, and may present a warning indication to a user. In this regard, the digital camera 101 may also present advice to the user regarding how the user can improve the image acquisition prospects at step 307. Note that the advice presented to the user may be received with the Image-Acquisition Settings in step 203. For example, when the user is attempting to acquire a photograph of the moon rising over the trees on the far side of an Adirondack lake, advice to the user, whether determined by the digital camera 101 or by IAS providing system 102, may be to instruct the user to immobilize digital camera 101 so that a long exposure or multiple exposures could be taken to acquire the silhouettes of the trees and the reflection of the moon off the water. In the case of multiple exposures, the images can be combined in digital camera 101 or in Image Processing System 105. In an embodiment, the Image-Acquisition-Setting Providing system 102 uses the PIAI, including location, direction, distance-to-subject, and average illumination, and other non-PIAI information, such as weather information and moonrise information, to determine that the likely subject is a moonrise and incorporates the immobilization warning along with the Image-Acquisition Settings in step 203.
  • After presenting the indication and optional advice at step 307, the digital camera 101 may either 308 abort the performance of an image-acquisition sequence at step 309 or perform an image-acquisition sequence based at least upon the received image acquisition settings at step 310. The decision whether to abort at step 309 or perform the image acquisition sequence at step 310 may be determined at least by user input.
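Steps 305 and 307 can be sketched as a camera-side check on the received settings. The 1/15-second handheld exposure limit is a common rule of thumb, used here only as an assumption; the disclosure does not specify how unacceptability is judged.

```python
# Sketch: the camera deems received settings inappropriate (step 305) and
# presents advice (step 307), e.g. for the long-exposure moonrise scenario.

HANDHELD_LIMIT_MS = 66   # ~1/15 s; longer exposures risk blur without support

def check_appropriateness(settings):
    """Return (appropriate, advice) for the received image-acquisition settings."""
    if settings.get("exposure_ms", 0) > HANDHELD_LIMIT_MS:
        return (False, "Immobilize the camera (e.g., tripod) for this long exposure.")
    return (True, None)

ok, advice = check_appropriateness({"exposure_ms": 2000})  # moonrise scenario
print(ok, "-", advice)
```

If `ok` is false, the camera would present the warning and advice, then either abort (step 309) or proceed (step 310) based on user input.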
  • FIG. 4 illustrates a method 400 for identifying image-acquisition settings and processing image data acquired based at least upon the identified image-acquisition settings, according to an embodiment of the present invention. The method 400 is a continuation of the processes of FIG. 2 or 3. Consequently, the method 400 begins with step 204 in FIG. 2 or step 310 in FIG. 3, where an image-acquisition sequence is performed. At step 411, the digital camera 101 generates image data from the performed image-acquisition sequence. In this regard, the image-acquisition sequence may include the acquisition of multiple images, where the multiple images are configured to be synthesized into a single image, a collage, or a video. Regardless, the resulting image data from the image-acquisition sequence, as well as the pre-image-acquisition information, may be transmitted to the image processing system 105. This transmission occurs at step 412 and also may optionally include image-acquisition information (“IAI”). The IAI may include the same categories of information as the PIAI, but is obtained at a different point in time. For example, the PIAI is obtained before image acquisition, and the IAI is obtained contemporaneously or substantially contemporaneously with image acquisition.
  • The information transmitted at step 412 is used by the image processing system 105 to process the image data. The aforementioned moon picture is one example that can benefit from processing in image-processing system 105. Using PIAI, including, e.g., location, direction, distance-to-subject, and average illumination, and IAI, such as detected camera movement, processing system 105 can integrate the knowledge of these parameters into processing the acquired images to provide an improved composite image that does not unduly compress the range of either the moon or the silhouettes of the trees and the reflections from the water. Another example is an image-acquisition looking over Spectacle Lake from the top of Good Luck Mountain on a hazy summer midday. Using PIAI, including, e.g., location, direction, distance to subject and average illumination and IAI such as humidity, processing system 105 can integrate the knowledge of these parameters along with weather information into processing the acquired image to provide an improved processed image by expanding the dynamic range of the acquired image.
  • Although not shown in FIG. 4, either the digital camera 101 or the image processing system 105 may check for consistency between the PIAI and the IAI. If there are inconsistencies between the PIAI and the IAI, the image processing system 105 may process the image data differently than if the PIAI and the IAI were consistent. An image-acquisition taken through the window of a moving automobile is an example. PIAI location information, in this case, will be significantly different than IAI location information. Assuming accurate location information in both PIAI and IAI information, processing system 105 can determine that there is motion associated with the image-acquisition. This motion is detected apart from or in addition to IAI information on camera shake that is determined from small accelerations that are the result of user movements. Additionally, direction and velocity can be determined from the differing locations. These factors can be figured into processing the acquired image and motion blur can be minimized. After processing of the image by the image processing system 105, the digital camera 101 receives the processed image at step 413.
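The location-based consistency check just described can be sketched as follows. The field names and the equirectangular distance approximation are assumptions, not part of the disclosure; the point is only that differing PIAI and IAI locations imply motion whose speed and direction can inform processing.

```python
# Sketch: inferring camera motion from differing PIAI and IAI locations,
# as in the moving-automobile example above.
import math

def motion_from_locations(piai, iai):
    """Estimate speed (m/s) from PIAI and IAI location/time stamps."""
    (lat1, lon1), t1 = piai["location"], piai["time_s"]
    (lat2, lon2), t2 = iai["location"], iai["time_s"]
    lat = math.radians((lat1 + lat2) / 2)
    dx = math.radians(lon2 - lon1) * math.cos(lat) * 6_371_000
    dy = math.radians(lat2 - lat1) * 6_371_000
    dt = t2 - t1
    return math.hypot(dx, dy) / dt if dt > 0 else 0.0

piai = {"location": (43.1600, -77.6100), "time_s": 0.0}
iai  = {"location": (43.1600, -77.6097), "time_s": 1.0}   # through a car window
speed = motion_from_locations(piai, iai)
print(f"{speed:.1f} m/s")   # roughly 24 m/s: flag motion-aware processing
```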
  • FIG. 5 illustrates a method 500 for identifying an image-acquisition opportunity, according to an embodiment of the present invention. The embodiment of FIG. 5 begins at step 501 where the digital camera 101 obtains PIAI indicating at least a present time. The PIAI is transmitted to an external system at step 502, such external system being configured at least to determine whether or not an improved image-acquisition opportunity exists for the digital camera 101. For example, if the PIAI includes camera orientation information and camera location information, as well as a present time, the external system may determine that a popular imaging opportunity is available to the digital camera 101 nearby and within a current or upcoming span of time. The external system may be the IAS providing system 102, and the image-opportunity information may be provided with or within the image-acquisition settings provided to the digital camera 101.
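The external system's side of FIG. 5 might be sketched as matching the camera's reported time and location against a catalog of known imaging opportunities. The catalog entry, proximity thresholds, and time representation below are invented purely for illustration.

```python
# Sketch: the external system matches a camera's PIAI (present time and
# location) against known image-acquisition opportunities.

OPPORTUNITIES = [
    {"name": "Moonrise over the lake", "location": (43.80, -74.30),
     "window_s": (72000, 75600)},   # 20:00-21:00, seconds since midnight
]

def nearby_opportunities(piai, catalog=OPPORTUNITIES):
    hits = []
    for opp in catalog:
        dlat = abs(piai["location"][0] - opp["location"][0])
        dlon = abs(piai["location"][1] - opp["location"][1])
        close = dlat < 0.02 and dlon < 0.02          # crude proximity test
        start, end = opp["window_s"]
        timely = start <= piai["time_s"] <= end
        if close and timely:
            hits.append(opp["name"])
    return hits

piai = {"location": (43.801, -74.299), "time_s": 73000}
print(nearby_opportunities(piai))   # → ['Moonrise over the lake']
```

The matched names and time windows would then be returned to the digital camera 101 as image-opportunity information.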
  • FIG. 6 illustrates an example 600 of image-opportunity information presented by the digital camera 101 to a user. The image-opportunity information, in this example, indicates a period of time in which an imaging opportunity exists or a moment in time in which the imaging opportunity exists, as shown at reference numeral 601. In this regard, FIG. 6 illustrates an embodiment of step 504 where the image-opportunity information is presented to the user.
  • It is to be understood that the exemplary embodiments are merely illustrative of the present invention and that many variations of the above-described embodiments can be devised by one skilled in the art without departing from the scope of the invention.
  • For example, although often described with the view that an IAS providing system provides all image acquisition settings to a digital camera, one skilled in the art will appreciate that the invention is not so limited. For instance, the IAS providing system may provide only some image acquisition settings, while the digital camera provides others. Or, the IAS providing system may provide image acquisition settings redundant to those provided by the digital camera itself, for purposes of verification or improvement of the image acquisition settings provided by the digital camera. Or, still, the digital camera may receive image acquisition settings from the IAS providing system and process them or modify them before arriving at a final set of image acquisition settings ultimately used in an image-acquisition sequence. By processing or otherwise modifying the received image acquisition settings, it can be said that the image-acquisition sequence is performed based at least upon a derivative of the received image acquisition settings, because the received image-acquisition settings have been processed or modified in some manner.
  • For another example, although step 304 in FIG. 3 is described in the context of a digital camera using image-acquisition settings to determine whether image-acquisition is permitted, one skilled in the art will appreciate that the invention is not so limited. For instance, image-acquisition settings received by a digital camera may include additional information that pertains to image-acquisition permissions, such as which metadata, or how much detail within the metadata, a digital camera records along with an image-acquisition sequence. In one instance, an IAS providing system may be configured to provide image-acquisition settings that prevent digital cameras within a particular location from recording their location in metadata when performing an image-acquisition sequence.
  • For yet another example, although this disclosure describes that received image-acquisition settings may be used to perform an image-acquisition sequence involving multiple image acquisitions, one skilled in the art will appreciate that the invention is not so limited. For instance, an IAS providing system may provide multiple sets of image-acquisition settings to a digital camera, each of the multiple sets being configured to control a subset of the multiple image acquisitions in the image-acquisition sequence. In one case, each set of image-acquisition settings may be configured to control one of the multiple image acquisitions in the image-acquisition sequence.
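The assignment of setting sets to subsets of the acquisitions might be sketched as follows. The structure is hypothetical; the `controls` and `params` keys are assumptions used only for illustration:

```python
def assign_settings(setting_sets, num_acquisitions):
    """Map each image acquisition in the sequence to the set of
    image-acquisition settings configured to control it; acquisitions
    not explicitly covered fall back to the first set."""
    plan = {}
    for s in setting_sets:
        for idx in s["controls"]:
            plan[idx] = s["params"]
    default = setting_sets[0]["params"]
    return [plan.get(i, default) for i in range(num_acquisitions)]

setting_sets = [
    {"controls": [0, 1], "params": {"exposure_ms": 10}},
    {"controls": [2], "params": {"exposure_ms": 40, "flash": "on"}},
]
schedule = assign_settings(setting_sets, 3)
```

Here the first set controls acquisitions 0 and 1 while the second controls acquisition 2, matching the case where each set controls a subset of the multiple image acquisitions.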
  • For still yet another example, step 304 in FIG. 3 is described in the context of determining whether the digital camera 101 is permitted to acquire an image. Such determination may be made based at least upon digital camera location, digital camera orientation, or image content included in pre-image acquisition information. However, one skilled in the art will appreciate that the digital camera location, digital camera orientation, or image content in the pre-image acquisition information need not only be used to determine whether or not a digital camera is permitted to acquire an image, but also may be used to allow a digital camera to acquire an image while limiting how that image is acquired. To elaborate, the image-acquisition settings provided by the IAS providing system 102 at step 203 may be used to limit how an image is acquired. For instance, the IAS providing system 102 may limit the digital camera 101 to using at least (a) a particular image-acquisition mode or parameter or (b) one of a particular set of image-acquisition modes or parameters, such as a focus distance, camera operating mode, flash mode, still or video mode, panorama or non-panorama mode, etc. These features may be useful, for example, for allowing an image to be acquired while preventing (a) a particular use of the digital camera or (b) a particular object from being represented in the image. For example, the IAS providing system 102 can allow image-acquisitions at an event, but prevent flashes from being used. For another example, assume that an owner of a building wants to prevent images of the building from being acquired. The IAS providing system 102 can allow images to be acquired in the vicinity of the building, but require a short focus distance. The short focus distance would allow close-up portrait images of people standing in front of the building to be acquired, but would cause the building in the background of such images to be blurry.
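The focus-distance restriction in the building example could be modeled as a simple clamp applied to the requested focus distance while the camera is inside the restricted vicinity. The cap value and names below are hypothetical:

```python
MAX_FOCUS_M = 3.0  # hypothetical cap keeping a distant building blurred

def constrain_focus(requested_m, in_restricted_zone):
    """Permit image acquisition but cap the focus distance inside a
    restricted zone, so close-up portraits stay sharp while a protected
    building in the background remains out of focus."""
    if in_restricted_zone:
        return min(requested_m, MAX_FOCUS_M)
    return requested_m
```

For instance, `constrain_focus(50.0, True)` returns 3.0, so a building 50 meters away stays blurry, while `constrain_focus(1.5, True)` leaves a portrait subject at 1.5 meters unaffected.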
  • It is therefore intended that all such variations be included within the scope of the following claims and their equivalents.
  • PARTS LIST
    • 100 System
    • 101 Digital camera
    • 102 IAS providing system
    • 103 Sensor system
    • 104 PIAI providing system
    • 105 Image processing system
    • 106 Broken line
    • 200 Method
    • 201 Step
    • 202 Step
    • 203 Step
    • 204 Step
    • 300 Method
    • 304 Step
    • 305 Step
    • 306 Step
    • 307 Step
    • 308 Or symbol
    • 309 Step
    • 310 Step
    • 400 Method
    • 411 Step
    • 412 Step
    • 413 Step
    • 500 Method
    • 501 Step
    • 502 Step
    • 503 Step
    • 504 Step
    • 600 Example
    • 601 Moment in time

Claims (25)

1. A method implemented by a digital camera for identifying image-acquisition settings, the method comprising the steps of:
obtaining pre-image-acquisition information;
transmitting the pre-image-acquisition information to a system external to the digital camera, the system configured to provide image-acquisition settings;
receiving the image-acquisition settings from the system in response to the transmitting step; and
performing an image-acquisition sequence based at least upon the received image-acquisition settings or a derivative thereof.
2. The method of claim 1, wherein at least some of the pre-image-acquisition information is received from a system external to the digital camera.
3. The method of claim 1, wherein the system is a portable digital assistant, another camera, or a server.
4. The method of claim 1, wherein the pre-image-acquisition information includes only an announcement of the digital camera's presence.
5. The method of claim 1, wherein the pre-image-acquisition information includes an announcement of the digital camera's presence.
6. The method of claim 1, further comprising the step of determining that it is appropriate to perform the image-acquisition sequence based at least upon an analysis of the received image-acquisition settings.
7. The method of claim 6, further comprising the step of determining that performing the image-acquisition sequence is not appropriate based at least upon an analysis of the received image-acquisition settings.
8. The method of claim 7, further comprising the step of presenting an indication configured to warn a user of the digital camera that performing the image-acquisition sequence is not appropriate.
9. The method of claim 7, further comprising the step of modifying the image acquisition settings to improve the appropriateness of performing the image-acquisition sequence.
10. The method of claim 1, wherein the image-acquisition sequence involves multiple image acquisitions, and wherein the multiple images are configured to be synthesized into a single image, a collage, or a video.
11. The method of claim 1, wherein the received image-acquisition settings include an indication that performing the image-acquisition sequence is permitted, and wherein the method further comprises the step of determining that the image-acquisition sequence is permitted based at least upon the indication.
12. The method of claim 1, further comprising the steps of:
generating image data from the image-acquisition sequence; and
transmitting at least the image data and the pre-image-acquisition information to an image processing system external to the digital camera for processing.
13. The method of claim 12, wherein the image-processing system and the system configured to provide image-acquisition settings are a same system.
14. The method of claim 12, further comprising the step of receiving a processed image from the image processing system in response to the step of transmitting the image data and the pre-image-acquisition information.
15. The method of claim 12, further comprising the step of obtaining image-acquisition information contemporaneously with the step of performing the image-acquisition sequence, wherein the step of transmitting at least the image data and the pre-image-acquisition information includes transmitting at least the image data, the pre-image-acquisition information, and the image-acquisition information.
16. The method of claim 15, wherein the image-acquisition information includes audio information, illumination information, camera position information, camera orientation information, motion information, temperature information, humidity information, ceiling detection information, distance-to-subject information, or spectral information.
17. The method of claim 15, wherein information in the pre-image-acquisition information and the image-acquisition information is of a same category.
18. The method of claim 17, further comprising the step of verifying consistency between the information of the same category.
19. The method of claim 1, wherein the image-acquisition sequence involves multiple image acquisitions, and wherein the received image acquisition settings include different image acquisition settings for different ones of the multiple image acquisitions.
20. The method of claim 1, further comprising the step of determining what metadata or an amount of detail in metadata that is recorded by a digital camera along with the image-acquisition sequence based at least upon an analysis of the received image-acquisition settings.
21. The method of claim 11, wherein the pre-image-acquisition information includes (a) digital camera location information, (b) digital camera orientation information, (c) image content information from an image acquired by the digital camera just prior to performing the image acquisition sequence, or (d) combinations thereof, upon which the indication that performing the image-acquisition sequence is permitted is based.
22. The method of claim 1, wherein
(1) the pre-image-acquisition information includes (a) digital camera location information, (b) digital camera orientation information, (c) image content information from an image acquired by the digital camera just prior to performing the image acquisition sequence, or (d) combinations thereof,
(2) the image-acquisition settings limit the digital camera to using at least (a) a particular image-acquisition mode or parameter or (b) one of a particular set of image-acquisition modes or parameters, and
(3) the image-acquisition settings are configured to prevent (a) a particular use of the digital camera or (b) a particular object from being represented in an image acquired by the digital camera.
23. The method of claim 22, wherein the image-acquisition mode(s) or parameter(s) include(s) a focus distance, a camera operating mode, a flash mode, a still or video mode, or a panorama or non-panorama mode.
24. A method implemented by a digital camera for identifying an image-acquisition opportunity, the method comprising the steps of:
obtaining pre-image-acquisition information indicating at least a present time;
transmitting the pre-image-acquisition information to a system external to the digital camera;
receiving image-opportunity information from the system in response to the transmitting step, the image-opportunity information configured to guide a user of the digital camera towards the image acquisition opportunity, wherein the image-opportunity information indicates at least a time period in which the image-acquisition opportunity exists; and
presenting the image-opportunity information in a manner configured to present the image-opportunity information, or a derivative thereof, to a user of the digital camera.
25. A system comprising:
an image-acquisition-setting (“IAS”) providing system configured to provide image-acquisition settings; and
a digital camera external to the IAS providing system, the digital camera configured at least to:
obtain pre-image-acquisition information;
transmit the pre-image-acquisition information to the IAS providing system;
receive image-acquisition settings from the IAS providing system in response to transmitting the pre-image-acquisition information to the IAS providing system; and
perform an image-acquisition sequence based at least upon the received image-acquisition settings.
US11/961,497 2007-12-20 2007-12-20 Remote determination of image-acquisition settings and opportunities Abandoned US20090160970A1 (en)

Priority Applications (3)

Application Number Priority Date Filing Date Title
US11/961,497 US20090160970A1 (en) 2007-12-20 2007-12-20 Remote determination of image-acquisition settings and opportunities
PCT/US2008/013483 WO2009085099A2 (en) 2007-12-20 2008-12-08 Remote determination of image-acquisition settings and opportunities
US13/151,304 US8305452B2 (en) 2007-12-20 2011-06-02 Remote determination of image-acquisition settings and opportunities

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
US11/961,497 US20090160970A1 (en) 2007-12-20 2007-12-20 Remote determination of image-acquisition settings and opportunities

Related Child Applications (1)

Application Number Title Priority Date Filing Date
US13/151,304 Continuation US8305452B2 (en) 2007-12-20 2011-06-02 Remote determination of image-acquisition settings and opportunities

Publications (1)

Publication Number Publication Date
US20090160970A1 true US20090160970A1 (en) 2009-06-25

Family

ID=40460007

Family Applications (2)

Application Number Title Priority Date Filing Date
US11/961,497 Abandoned US20090160970A1 (en) 2007-12-20 2007-12-20 Remote determination of image-acquisition settings and opportunities
US13/151,304 Active US8305452B2 (en) 2007-12-20 2011-06-02 Remote determination of image-acquisition settings and opportunities

Family Applications After (1)

Application Number Title Priority Date Filing Date
US13/151,304 Active US8305452B2 (en) 2007-12-20 2011-06-02 Remote determination of image-acquisition settings and opportunities

Country Status (2)

Country Link
US (2) US20090160970A1 (en)
WO (1) WO2009085099A2 (en)

Cited By (156)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20110043658A1 (en) * 2008-04-30 2011-02-24 Sony Corporation Information recording apparatus, image capturing apparatus, information recording method, and program
US20130262565A1 (en) * 2012-03-27 2013-10-03 Sony Corporation Server, client terminal, system, and storage medium
US20140122702A1 (en) * 2012-10-31 2014-05-01 Elwha Llc Methods and systems for monitoring and/or managing device data
US20150049211A1 (en) * 2013-08-19 2015-02-19 Lg Electronics Inc. Mobile terminal and control method thereof
US9083770B1 (en) 2013-11-26 2015-07-14 Snapchat, Inc. Method and system for integrating real time communication features in applications
US9094137B1 (en) 2014-06-13 2015-07-28 Snapchat, Inc. Priority based placement of messages in a geo-location based event gallery
US9225897B1 (en) * 2014-07-07 2015-12-29 Snapchat, Inc. Apparatus and method for supplying content aware photo filters
US9237202B1 (en) 2014-03-07 2016-01-12 Snapchat, Inc. Content delivery network for ephemeral objects
WO2016014657A1 (en) * 2014-07-23 2016-01-28 Ebay Inc. Use of camera metadata for recommendations
US9276886B1 (en) 2014-05-09 2016-03-01 Snapchat, Inc. Apparatus and method for dynamically configuring application component tiles
US9385983B1 (en) 2014-12-19 2016-07-05 Snapchat, Inc. Gallery of messages from individuals with a shared interest
US9396354B1 (en) 2014-05-28 2016-07-19 Snapchat, Inc. Apparatus and method for automated privacy protection in distributed images
US20160248958A1 (en) * 2013-10-28 2016-08-25 Canon Kabushiki Kaisha Image capturing apparatus, external apparatus, image capturing system, method for controlling image capturing apparatus, computer program, and computer-readable storage medium
US9537811B2 (en) 2014-10-02 2017-01-03 Snap Inc. Ephemeral gallery of ephemeral messages
CN106339022A (en) * 2015-07-09 2017-01-18 中华映管股份有限公司 Electronic Device And Multimedia Control Method Thereof
US9705831B2 (en) 2013-05-30 2017-07-11 Snap Inc. Apparatus and method for maintaining a message thread with opt-in permanence for entries
US9721394B2 (en) 2012-08-22 2017-08-01 Snaps Media, Inc. Augmented reality virtual content platform apparatuses, methods and systems
US9742713B2 (en) 2013-05-30 2017-08-22 Snap Inc. Apparatus and method for maintaining a message thread with opt-in permanence for entries
US9843720B1 (en) 2014-11-12 2017-12-12 Snap Inc. User interface for accessing media at a geographic location
US9854219B2 (en) 2014-12-19 2017-12-26 Snap Inc. Gallery of videos set to an audio time line
US9866999B1 (en) 2014-01-12 2018-01-09 Investment Asset Holdings Llc Location-based messaging
US9882907B1 (en) 2012-11-08 2018-01-30 Snap Inc. Apparatus and method for single action control of social network profile access
US9881094B2 (en) 2015-05-05 2018-01-30 Snap Inc. Systems and methods for automated local story generation and curation
US9886458B2 (en) 2012-11-26 2018-02-06 Elwha Llc Methods and systems for managing one or more services and/or device data
US9936030B2 (en) 2014-01-03 2018-04-03 Investel Capital Corporation User content sharing system and method with location-based external content integration
US9948492B2 (en) 2012-10-30 2018-04-17 Elwha Llc Methods and systems for managing data
US10055717B1 (en) 2014-08-22 2018-08-21 Snap Inc. Message processor with application prompts
US10082926B1 (en) 2014-02-21 2018-09-25 Snap Inc. Apparatus and method for alternate channel communication initiated through a common message thread
US10091325B2 (en) 2012-10-30 2018-10-02 Elwha Llc Methods and systems for data services
US10102680B2 (en) 2015-10-30 2018-10-16 Snap Inc. Image based tracking in augmented reality systems
US10123166B2 (en) 2015-01-26 2018-11-06 Snap Inc. Content request by location
US10135949B1 (en) 2015-05-05 2018-11-20 Snap Inc. Systems and methods for story and sub-story navigation
US10133705B1 (en) 2015-01-19 2018-11-20 Snap Inc. Multichannel system
US10157449B1 (en) 2015-01-09 2018-12-18 Snap Inc. Geo-location-based image filters
US10165402B1 (en) 2016-06-28 2018-12-25 Snap Inc. System to track engagement of media items
US10203855B2 (en) 2016-12-09 2019-02-12 Snap Inc. Customized user-controlled media overlays
US10216957B2 (en) 2012-11-26 2019-02-26 Elwha Llc Methods and systems for managing data and/or services for devices
US10219111B1 (en) 2018-04-18 2019-02-26 Snap Inc. Visitation tracking system
US10223397B1 (en) 2015-03-13 2019-03-05 Snap Inc. Social graph based co-location of network users
US10284508B1 (en) 2014-10-02 2019-05-07 Snap Inc. Ephemeral gallery of ephemeral messages with opt-in permanence
US10311916B2 (en) 2014-12-19 2019-06-04 Snap Inc. Gallery of videos set to an audio time line
US10319149B1 (en) 2017-02-17 2019-06-11 Snap Inc. Augmented reality anamorphosis system
US10327096B1 (en) 2018-03-06 2019-06-18 Snap Inc. Geo-fence selection system
US10334307B2 (en) 2011-07-12 2019-06-25 Snap Inc. Methods and systems of providing visual content editing functions
US10348662B2 (en) 2016-07-19 2019-07-09 Snap Inc. Generating customized electronic messaging graphics
US10354425B2 (en) 2015-12-18 2019-07-16 Snap Inc. Method and system for providing context relevant media augmentation
US10387514B1 (en) 2016-06-30 2019-08-20 Snap Inc. Automated content curation and communication
US10387730B1 (en) 2017-04-20 2019-08-20 Snap Inc. Augmented reality typography personalization system
US10423983B2 (en) 2014-09-16 2019-09-24 Snap Inc. Determining targeting information based on a predictive targeting model
US10430838B1 (en) 2016-06-28 2019-10-01 Snap Inc. Methods and systems for generation, curation, and presentation of media collections with automated advertising
US10439972B1 (en) 2013-05-30 2019-10-08 Snap Inc. Apparatus and method for maintaining a message thread with opt-in permanence for entries
US10474321B2 (en) 2015-11-30 2019-11-12 Snap Inc. Network resource location linking and visual content sharing
US10499191B1 (en) 2017-10-09 2019-12-03 Snap Inc. Context sensitive presentation of content
US10523625B1 (en) 2017-03-09 2019-12-31 Snap Inc. Restricted group content collection
US10581782B2 (en) 2017-03-27 2020-03-03 Snap Inc. Generating a stitched data stream
US10582277B2 (en) 2017-03-27 2020-03-03 Snap Inc. Generating a stitched data stream
US10616239B2 (en) 2015-03-18 2020-04-07 Snap Inc. Geo-fence authorization provisioning
US10614828B1 (en) 2017-02-20 2020-04-07 Snap Inc. Augmented reality speech balloon system
US10623666B2 (en) 2016-11-07 2020-04-14 Snap Inc. Selective identification and order of image modifiers
US10630894B2 (en) * 2016-06-01 2020-04-21 Canon Kabushiki Kaisha Notification system, wearable device, information processing apparatus, control method thereof, and computer-readable storage medium
US10638256B1 (en) 2016-06-20 2020-04-28 Pipbin, Inc. System for distribution and display of mobile targeted augmented reality content
US10657708B1 (en) 2015-11-30 2020-05-19 Snap Inc. Image and point cloud based tracking and in augmented reality systems
US10678818B2 (en) 2018-01-03 2020-06-09 Snap Inc. Tag distribution visualization system
US10679389B2 (en) 2016-02-26 2020-06-09 Snap Inc. Methods and systems for generation, curation, and presentation of media collections
US10679393B2 (en) 2018-07-24 2020-06-09 Snap Inc. Conditional modification of augmented reality object
US10740974B1 (en) 2017-09-15 2020-08-11 Snap Inc. Augmented reality system
WO2020168956A1 (en) * 2019-02-18 2020-08-27 华为技术有限公司 Method for photographing the moon and electronic device
US10805696B1 (en) 2016-06-20 2020-10-13 Pipbin, Inc. System for recording and targeting tagged content of user interest
US10817898B2 (en) 2015-08-13 2020-10-27 Placed, Llc Determining exposures to content presented by physical objects
US10824654B2 (en) 2014-09-18 2020-11-03 Snap Inc. Geolocation-based pictographs
US10834525B2 (en) 2016-02-26 2020-11-10 Snap Inc. Generation, curation, and presentation of media collections
US10839219B1 (en) 2016-06-20 2020-11-17 Pipbin, Inc. System for curation, distribution and display of location-dependent augmented reality content
US10862951B1 (en) 2007-01-05 2020-12-08 Snap Inc. Real-time display of multiple images
US10885136B1 (en) 2018-02-28 2021-01-05 Snap Inc. Audience filtering system
US10915911B2 (en) 2017-02-03 2021-02-09 Snap Inc. System to determine a price-schedule to distribute media content
US10933311B2 (en) 2018-03-14 2021-03-02 Snap Inc. Generating collectible items based on location information
US10948717B1 (en) 2015-03-23 2021-03-16 Snap Inc. Reducing boot time and power consumption in wearable display systems
US10952013B1 (en) 2017-04-27 2021-03-16 Snap Inc. Selective location-based identity communication
US10963529B1 (en) 2017-04-27 2021-03-30 Snap Inc. Location-based search mechanism in a graphical user interface
US10979752B1 (en) 2018-02-28 2021-04-13 Snap Inc. Generating media content items based on location information
US10993069B2 (en) 2015-07-16 2021-04-27 Snap Inc. Dynamically adaptive media content delivery
US10997760B2 (en) 2018-08-31 2021-05-04 Snap Inc. Augmented reality anthropomorphization system
US11017173B1 (en) 2017-12-22 2021-05-25 Snap Inc. Named entity recognition visual context and caption data
US11023514B2 (en) 2016-02-26 2021-06-01 Snap Inc. Methods and systems for generation, curation, and presentation of media collections
US11030787B2 (en) 2017-10-30 2021-06-08 Snap Inc. Mobile-based cartographic control of display content
US11037372B2 (en) 2017-03-06 2021-06-15 Snap Inc. Virtual vision system
US11044393B1 (en) 2016-06-20 2021-06-22 Pipbin, Inc. System for curation and display of location-dependent augmented reality content in an augmented estate system
US11128715B1 (en) 2019-12-30 2021-09-21 Snap Inc. Physical friend proximity in chat
US11163941B1 (en) 2018-03-30 2021-11-02 Snap Inc. Annotating a collection of media content items
US11170393B1 (en) 2017-04-11 2021-11-09 Snap Inc. System to calculate an engagement score of location based media content
US11182383B1 (en) 2012-02-24 2021-11-23 Placed, Llc System and method for data collection to validate location data
US11201981B1 (en) 2016-06-20 2021-12-14 Pipbin, Inc. System for notification of user accessibility of curated location-dependent content in an augmented estate
US11199957B1 (en) 2018-11-30 2021-12-14 Snap Inc. Generating customized avatars based on location information
US11206615B2 (en) 2019-05-30 2021-12-21 Snap Inc. Wearable device location systems
US11218838B2 (en) 2019-10-31 2022-01-04 Snap Inc. Focused map-based context information surfacing
US11216869B2 (en) 2014-09-23 2022-01-04 Snap Inc. User interface to augment an image using geolocation
US11228551B1 (en) 2020-02-12 2022-01-18 Snap Inc. Multiple gateway message exchange
US11232040B1 (en) 2017-04-28 2022-01-25 Snap Inc. Precaching unlockable data elements
US20220046163A1 (en) * 2016-11-01 2022-02-10 Snap Inc. Fast video capture and sensor adjustment
US11250075B1 (en) 2017-02-17 2022-02-15 Snap Inc. Searching social media content
US11249614B2 (en) 2019-03-28 2022-02-15 Snap Inc. Generating personalized map interface with enhanced icons
US11265273B1 (en) 2017-12-01 2022-03-01 Snap, Inc. Dynamic media overlay with smart widget
US11290851B2 (en) 2020-06-15 2022-03-29 Snap Inc. Location sharing using offline and online objects
US11294936B1 (en) 2019-01-30 2022-04-05 Snap Inc. Adaptive spatial density based clustering
US11301117B2 (en) 2019-03-08 2022-04-12 Snap Inc. Contextual information in chat
US11314776B2 (en) 2020-06-15 2022-04-26 Snap Inc. Location sharing using friend list versions
US11343323B2 (en) 2019-12-31 2022-05-24 Snap Inc. Augmented reality objects registry
US11361493B2 (en) 2019-04-01 2022-06-14 Snap Inc. Semantic texture mapping system
US11388226B1 (en) 2015-01-13 2022-07-12 Snap Inc. Guided personal identity based actions
US11430091B2 (en) 2020-03-27 2022-08-30 Snap Inc. Location mapping for large scale augmented-reality
US11429618B2 (en) 2019-12-30 2022-08-30 Snap Inc. Surfacing augmented reality objects
US11455082B2 (en) 2018-09-28 2022-09-27 Snap Inc. Collaborative achievement interface
US11475254B1 (en) 2017-09-08 2022-10-18 Snap Inc. Multimodal entity identification
US11483267B2 (en) 2020-06-15 2022-10-25 Snap Inc. Location sharing using different rate-limited links
US11500525B2 (en) 2019-02-25 2022-11-15 Snap Inc. Custom media overlay system
US11503432B2 (en) 2020-06-15 2022-11-15 Snap Inc. Scalable real-time location sharing framework
US11507614B1 (en) 2018-02-13 2022-11-22 Snap Inc. Icon based tagging
US11516167B2 (en) 2020-03-05 2022-11-29 Snap Inc. Storing data based on device location
US11533444B2 (en) * 2017-07-19 2022-12-20 Fujifilm Business Innovation Corp. Image processing device
US11558709B2 (en) 2018-11-30 2023-01-17 Snap Inc. Position service to determine relative position to map features
US11574431B2 (en) 2019-02-26 2023-02-07 Snap Inc. Avatar based on weather
US11601783B2 (en) 2019-06-07 2023-03-07 Snap Inc. Detection of a physical collision between two client devices in a location sharing system
US11601888B2 (en) 2021-03-29 2023-03-07 Snap Inc. Determining location using multi-source geolocation data
US11606755B2 (en) 2019-05-30 2023-03-14 Snap Inc. Wearable device location systems architecture
US11611691B2 (en) 2018-09-11 2023-03-21 Profoto Aktiebolag Computer implemented method and a system for coordinating taking of a picture using a camera and initiation of a flash pulse of at least one flash device
US11616745B2 (en) 2017-01-09 2023-03-28 Snap Inc. Contextual generation and selection of customized media content
US11619501B2 (en) 2020-03-11 2023-04-04 Snap Inc. Avatar based on trip
US11625443B2 (en) 2014-06-05 2023-04-11 Snap Inc. Web document enhancement
US11631276B2 (en) 2016-03-31 2023-04-18 Snap Inc. Automated avatar generation
US11645324B2 (en) 2021-03-31 2023-05-09 Snap Inc. Location-based timeline media content system
US11676378B2 (en) 2020-06-29 2023-06-13 Snap Inc. Providing travel-based augmented reality content with a captured image
US11675831B2 (en) 2017-05-31 2023-06-13 Snap Inc. Geolocation based playlists
US11714535B2 (en) 2019-07-11 2023-08-01 Snap Inc. Edge gesture interface with smart interactions
US11729343B2 (en) 2019-12-30 2023-08-15 Snap Inc. Including video feed in message thread
US11734712B2 (en) 2012-02-24 2023-08-22 Foursquare Labs, Inc. Attributing in-store visits to media consumption based on data collected from user devices
US11751015B2 (en) 2019-01-16 2023-09-05 Snap Inc. Location-based context information sharing in a messaging system
US11776256B2 (en) 2020-03-27 2023-10-03 Snap Inc. Shared augmented reality system
US11785161B1 (en) 2016-06-20 2023-10-10 Pipbin, Inc. System for user accessibility of tagged curated augmented reality content
US11799811B2 (en) 2018-10-31 2023-10-24 Snap Inc. Messaging and gaming applications communication platform
US11809624B2 (en) 2019-02-13 2023-11-07 Snap Inc. Sleep detection in a location sharing system
US11816853B2 (en) 2016-08-30 2023-11-14 Snap Inc. Systems and methods for simultaneous localization and mapping
US11821742B2 (en) 2019-09-26 2023-11-21 Snap Inc. Travel based notifications
US11829834B2 (en) 2021-10-29 2023-11-28 Snap Inc. Extended QR code
US11843456B2 (en) 2016-10-24 2023-12-12 Snap Inc. Generating and displaying customized avatars in media overlays
US11842411B2 (en) 2017-04-27 2023-12-12 Snap Inc. Location-based virtual avatars
US11852554B1 (en) 2019-03-21 2023-12-26 Snap Inc. Barometer calibration in a location sharing system
US11860888B2 (en) 2018-05-22 2024-01-02 Snap Inc. Event detection system
US11863866B2 (en) 2019-02-01 2024-01-02 Profoto Aktiebolag Housing for an intermediate signal transmission unit and an intermediate signal transmission unit
US11870743B1 (en) 2017-01-23 2024-01-09 Snap Inc. Customized digital avatar accessories
US11868414B1 (en) 2019-03-14 2024-01-09 Snap Inc. Graph-based prediction for contact suggestion in a location sharing system
US11877211B2 (en) 2019-01-14 2024-01-16 Snap Inc. Destination sharing in location sharing system
US11876941B1 (en) 2016-06-20 2024-01-16 Pipbin, Inc. Clickable augmented reality content manager, system, and network
US11893208B2 (en) 2019-12-31 2024-02-06 Snap Inc. Combined map icon with action indicator
US11900418B2 (en) 2016-04-04 2024-02-13 Snap Inc. Mutable geo-fencing system
US11925869B2 (en) 2012-05-08 2024-03-12 Snap Inc. System and method for generating and displaying avatars
US11943192B2 (en) 2020-08-31 2024-03-26 Snap Inc. Co-location connection service

Families Citing this family (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
TW201222429A (en) * 2010-11-23 2012-06-01 Inventec Corp Web camera device and operating method thereof
CN103188423A (en) * 2011-12-27 2013-07-03 富泰华工业(深圳)有限公司 Camera shooting device and camera shooting method
US8928776B2 (en) * 2012-11-21 2015-01-06 International Business Machines Corporation Camera resolution modification based on intended printing location
US9060127B2 (en) 2013-01-23 2015-06-16 Orcam Technologies Ltd. Apparatus for adjusting image capture settings
EP3649774A1 (en) * 2017-07-03 2020-05-13 C/o Canon Kabushiki Kaisha Method and system for auto-setting cameras
CN112470472B (en) * 2018-06-11 2023-03-24 无锡安科迪智能技术有限公司 Blind compression sampling method and device and imaging system

Citations (11)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US5086314A (en) * 1990-05-21 1992-02-04 Nikon Corporation Exposure control apparatus for camera
US6301440B1 (en) * 2000-04-13 2001-10-09 International Business Machines Corp. System and method for automatically setting image acquisition controls
US20020034384A1 (en) * 2000-09-18 2002-03-21 Mikhail Peter G. Location sensing camera
US20030007076A1 (en) * 2001-07-02 2003-01-09 Minolta Co., Ltd. Image-processing apparatus and image-quality control method
US20040004663A1 (en) * 2002-07-02 2004-01-08 Lightsurf Technologies, Inc. Imaging system providing automatic organization and processing of images based on location
US20050122405A1 (en) * 2003-12-09 2005-06-09 Voss James S. Digital cameras and methods using GPS/time-based and/or location data to provide scene selection, and dynamic illumination and exposure adjustment
US20050172147A1 (en) * 2004-02-04 2005-08-04 Eric Edwards Methods and apparatuses for identifying opportunities to capture content
US20050221841A1 (en) * 2004-03-31 2005-10-06 Piccionelli Gregory A Location-based control of functions of electronic devices
US20050275726A1 (en) * 2004-06-14 2005-12-15 Charles Abraham Method and apparatus for tagging digital photographs with geographic location data
US20060153469A1 (en) * 2005-01-11 2006-07-13 Eastman Kodak Company Image processing based on ambient air attributes
US20070255456A1 (en) * 2004-09-07 2007-11-01 Chisato Funayama Image Processing System and Method, and Terminal and Server Used for the Same

Family Cites Families (9)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP4499908B2 (en) * 2000-12-19 2010-07-14 オリンパス株式会社 Electronic camera system, electronic camera, server computer, and photographing condition correction method
KR100814426B1 (en) * 2001-07-14 2008-03-18 삼성전자주식회사 multi-channel image processer and system employing the same
JP2004064385A (en) * 2002-07-29 2004-02-26 Seiko Epson Corp Digital camera
JP4296385B2 (en) * 2002-09-26 2009-07-15 富士フイルム株式会社 Electronic camera
GB2403365B (en) 2003-06-27 2008-01-30 Hewlett Packard Development Co An autonomous camera having exchangeable behaviours
DE602006009191D1 (en) * 2005-07-26 2009-10-29 Canon Kk Imaging device and method
GB2428927A (en) * 2005-08-05 2007-02-07 Hewlett Packard Development Co Accurate positioning of a time lapse camera
JP2007282017A (en) * 2006-04-10 2007-10-25 Nec Commun Syst Ltd Portable mobile communication terminal with use limited picture photographing function, and picture photographing function use limiting method and program for the same
US9430587B2 (en) * 2006-06-05 2016-08-30 Qualcomm Incorporated Techniques for managing media content

Cited By (362)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US10862951B1 (en) 2007-01-05 2020-12-08 Snap Inc. Real-time display of multiple images
US11588770B2 (en) 2007-01-05 2023-02-21 Snap Inc. Real-time display of multiple images
US8817131B2 (en) * 2008-04-30 2014-08-26 Sony Corporation Information recording apparatus, image capturing apparatus, and information recording method for controlling recording of location information in generated images
US20110043658A1 (en) * 2008-04-30 2011-02-24 Sony Corporation Information recording apparatus, image capturing apparatus, information recording method, and program
US10999623B2 (en) 2011-07-12 2021-05-04 Snap Inc. Providing visual content editing functions
US11750875B2 (en) 2011-07-12 2023-09-05 Snap Inc. Providing visual content editing functions
US11451856B2 (en) 2011-07-12 2022-09-20 Snap Inc. Providing visual content editing functions
US10334307B2 (en) 2011-07-12 2019-06-25 Snap Inc. Methods and systems of providing visual content editing functions
US11182383B1 (en) 2012-02-24 2021-11-23 Placed, Llc System and method for data collection to validate location data
US11734712B2 (en) 2012-02-24 2023-08-22 Foursquare Labs, Inc. Attributing in-store visits to media consumption based on data collected from user devices
US9325862B2 (en) * 2012-03-27 2016-04-26 Sony Corporation Server, client terminal, system, and storage medium for capturing landmarks
US20130262565A1 (en) * 2012-03-27 2013-10-03 Sony Corporation Server, client terminal, system, and storage medium
US11925869B2 (en) 2012-05-08 2024-03-12 Snap Inc. System and method for generating and displaying avatars
US9792733B2 (en) 2012-08-22 2017-10-17 Snaps Media, Inc. Augmented reality virtual content platform apparatuses, methods and systems
US9721394B2 (en) 2012-08-22 2017-08-01 Snaps Media, Inc. Augmented reality virtual content platform apparatuses, methods and systems
US10169924B2 (en) 2012-08-22 2019-01-01 Snaps Media Inc. Augmented reality virtual content platform apparatuses, methods and systems
US10361900B2 (en) 2012-10-30 2019-07-23 Elwha Llc Methods and systems for managing data
US10091325B2 (en) 2012-10-30 2018-10-02 Elwha Llc Methods and systems for data services
US9948492B2 (en) 2012-10-30 2018-04-17 Elwha Llc Methods and systems for managing data
US20140122702A1 (en) * 2012-10-31 2014-05-01 Elwha Llc Methods and systems for monitoring and/or managing device data
US10069703B2 (en) * 2012-10-31 2018-09-04 Elwha Llc Methods and systems for monitoring and/or managing device data
US11252158B2 (en) 2012-11-08 2022-02-15 Snap Inc. Interactive user-interface to adjust access privileges
US9882907B1 (en) 2012-11-08 2018-01-30 Snap Inc. Apparatus and method for single action control of social network profile access
US10887308B1 (en) 2012-11-08 2021-01-05 Snap Inc. Interactive user-interface to adjust access privileges
US9886458B2 (en) 2012-11-26 2018-02-06 Elwha Llc Methods and systems for managing one or more services and/or device data
US10216957B2 (en) 2012-11-26 2019-02-26 Elwha Llc Methods and systems for managing data and/or services for devices
US11134046B2 (en) 2013-05-30 2021-09-28 Snap Inc. Apparatus and method for maintaining a message thread with opt-in permanence for entries
US9742713B2 (en) 2013-05-30 2017-08-22 Snap Inc. Apparatus and method for maintaining a message thread with opt-in permanence for entries
US10587552B1 (en) 2013-05-30 2020-03-10 Snap Inc. Apparatus and method for maintaining a message thread with opt-in permanence for entries
US9705831B2 (en) 2013-05-30 2017-07-11 Snap Inc. Apparatus and method for maintaining a message thread with opt-in permanence for entries
US11115361B2 (en) 2013-05-30 2021-09-07 Snap Inc. Apparatus and method for maintaining a message thread with opt-in permanence for entries
US10439972B1 (en) 2013-05-30 2019-10-08 Snap Inc. Apparatus and method for maintaining a message thread with opt-in permanence for entries
US11509618B2 (en) 2013-05-30 2022-11-22 Snap Inc. Maintaining a message thread with opt-in permanence for entries
US20150049211A1 (en) * 2013-08-19 2015-02-19 Lg Electronics Inc. Mobile terminal and control method thereof
US9538059B2 (en) * 2013-08-19 2017-01-03 Lg Electronics Inc. Mobile terminal and control method thereof
US20160248958A1 (en) * 2013-10-28 2016-08-25 Canon Kabushiki Kaisha Image capturing apparatus, external apparatus, image capturing system, method for controlling image capturing apparatus, computer program, and computer-readable storage medium
US10171747B2 (en) * 2013-10-28 2019-01-01 Canon Kabushiki Kaisha Image capturing apparatus, external apparatus, image capturing system, method for controlling image capturing apparatus, computer program, and computer-readable storage medium
US9083770B1 (en) 2013-11-26 2015-07-14 Snapchat, Inc. Method and system for integrating real time communication features in applications
US11102253B2 (en) 2013-11-26 2021-08-24 Snap Inc. Method and system for integrating real time communication features in applications
US9794303B1 (en) 2013-11-26 2017-10-17 Snap Inc. Method and system for integrating real time communication features in applications
US10069876B1 (en) 2013-11-26 2018-09-04 Snap Inc. Method and system for integrating real time communication features in applications
US11546388B2 (en) 2013-11-26 2023-01-03 Snap Inc. Method and system for integrating real time communication features in applications
US10681092B1 (en) 2013-11-26 2020-06-09 Snap Inc. Method and system for integrating real time communication features in applications
US9936030B2 (en) 2014-01-03 2018-04-03 Investel Capital Corporation User content sharing system and method with location-based external content integration
US9866999B1 (en) 2014-01-12 2018-01-09 Investment Asset Holdings Llc Location-based messaging
US10080102B1 (en) 2014-01-12 2018-09-18 Investment Asset Holdings Llc Location-based messaging
US10349209B1 (en) 2014-01-12 2019-07-09 Investment Asset Holdings Llc Location-based messaging
US10949049B1 (en) 2014-02-21 2021-03-16 Snap Inc. Apparatus and method for alternate channel communication initiated through a common message thread
US11463394B2 (en) 2014-02-21 2022-10-04 Snap Inc. Apparatus and method for alternate channel communication initiated through a common message thread
US10082926B1 (en) 2014-02-21 2018-09-25 Snap Inc. Apparatus and method for alternate channel communication initiated through a common message thread
US11463393B2 (en) 2014-02-21 2022-10-04 Snap Inc. Apparatus and method for alternate channel communication initiated through a common message thread
US10958605B1 (en) 2014-02-21 2021-03-23 Snap Inc. Apparatus and method for alternate channel communication initiated through a common message thread
US11902235B2 (en) 2014-02-21 2024-02-13 Snap Inc. Apparatus and method for alternate channel communication initiated through a common message thread
US10084735B1 (en) 2014-02-21 2018-09-25 Snap Inc. Apparatus and method for alternate channel communication initiated through a common message thread
US9237202B1 (en) 2014-03-07 2016-01-12 Snapchat, Inc. Content delivery network for ephemeral objects
US9407712B1 (en) 2014-03-07 2016-08-02 Snapchat, Inc. Content delivery network for ephemeral objects
US11743219B2 (en) 2014-05-09 2023-08-29 Snap Inc. Dynamic configuration of application component tiles
US9276886B1 (en) 2014-05-09 2016-03-01 Snapchat, Inc. Apparatus and method for dynamically configuring application component tiles
US10817156B1 (en) 2014-05-09 2020-10-27 Snap Inc. Dynamic configuration of application component tiles
US11310183B2 (en) 2014-05-09 2022-04-19 Snap Inc. Dynamic configuration of application component tiles
US9785796B1 (en) 2014-05-28 2017-10-10 Snap Inc. Apparatus and method for automated privacy protection in distributed images
US9396354B1 (en) 2014-05-28 2016-07-19 Snapchat, Inc. Apparatus and method for automated privacy protection in distributed images
US10990697B2 (en) 2014-05-28 2021-04-27 Snap Inc. Apparatus and method for automated privacy protection in distributed images
US10572681B1 (en) 2014-05-28 2020-02-25 Snap Inc. Apparatus and method for automated privacy protection in distributed images
US11921805B2 (en) 2014-06-05 2024-03-05 Snap Inc. Web document enhancement
US11625443B2 (en) 2014-06-05 2023-04-11 Snap Inc. Web document enhancement
US10524087B1 (en) 2014-06-13 2019-12-31 Snap Inc. Message destination list mechanism
US11317240B2 (en) 2014-06-13 2022-04-26 Snap Inc. Geo-location based event gallery
US9693191B2 (en) 2014-06-13 2017-06-27 Snap Inc. Prioritization of messages within gallery
US9430783B1 (en) 2014-06-13 2016-08-30 Snapchat, Inc. Prioritization of messages within gallery
US10182311B2 (en) 2014-06-13 2019-01-15 Snap Inc. Prioritization of messages within a message collection
US10200813B1 (en) 2014-06-13 2019-02-05 Snap Inc. Geo-location based event gallery
US10779113B2 (en) 2014-06-13 2020-09-15 Snap Inc. Prioritization of messages within a message collection
US9532171B2 (en) 2014-06-13 2016-12-27 Snap Inc. Geo-location based event gallery
US9825898B2 (en) 2014-06-13 2017-11-21 Snap Inc. Prioritization of messages within a message collection
US10623891B2 (en) 2014-06-13 2020-04-14 Snap Inc. Prioritization of messages within a message collection
US11166121B2 (en) 2014-06-13 2021-11-02 Snap Inc. Prioritization of messages within a message collection
US10659914B1 (en) 2014-06-13 2020-05-19 Snap Inc. Geo-location based event gallery
US9113301B1 (en) 2014-06-13 2015-08-18 Snapchat, Inc. Geo-location based event gallery
US9094137B1 (en) 2014-06-13 2015-07-28 Snapchat, Inc. Priority based placement of messages in a geo-location based event gallery
US10448201B1 (en) 2014-06-13 2019-10-15 Snap Inc. Prioritization of messages within a message collection
US10602057B1 (en) * 2014-07-07 2020-03-24 Snap Inc. Supplying content aware photo filters
US9407816B1 (en) 2014-07-07 2016-08-02 Snapchat, Inc. Apparatus and method for supplying content aware photo filters
US10432850B1 (en) 2014-07-07 2019-10-01 Snap Inc. Apparatus and method for supplying content aware photo filters
US11122200B2 (en) 2014-07-07 2021-09-14 Snap Inc. Supplying content aware photo filters
US10154192B1 (en) 2014-07-07 2018-12-11 Snap Inc. Apparatus and method for supplying content aware photo filters
US11595569B2 (en) 2014-07-07 2023-02-28 Snap Inc. Supplying content aware photo filters
US9225897B1 (en) * 2014-07-07 2015-12-29 Snapchat, Inc. Apparatus and method for supplying content aware photo filters
US11496673B1 (en) 2014-07-07 2022-11-08 Snap Inc. Apparatus and method for supplying content aware photo filters
US11849214B2 (en) * 2014-07-07 2023-12-19 Snap Inc. Apparatus and method for supplying content aware photo filters
US20230020575A1 (en) * 2014-07-07 2023-01-19 Snap Inc. Apparatus and method for supplying content aware photo filters
US10701262B1 (en) 2014-07-07 2020-06-30 Snap Inc. Apparatus and method for supplying content aware photo filters
US10348960B1 (en) * 2014-07-07 2019-07-09 Snap Inc. Apparatus and method for supplying content aware photo filters
WO2016014657A1 (en) * 2014-07-23 2016-01-28 Ebay Inc. Use of camera metadata for recommendations
US11704905B2 (en) 2014-07-23 2023-07-18 Ebay Inc. Use of camera metadata for recommendations
US10248862B2 (en) 2014-07-23 2019-04-02 Ebay Inc. Use of camera metadata for recommendations
US10055717B1 (en) 2014-08-22 2018-08-21 Snap Inc. Message processor with application prompts
US11017363B1 (en) 2014-08-22 2021-05-25 Snap Inc. Message processor with application prompts
US11625755B1 (en) 2014-09-16 2023-04-11 Foursquare Labs, Inc. Determining targeting information based on a predictive targeting model
US10423983B2 (en) 2014-09-16 2019-09-24 Snap Inc. Determining targeting information based on a predictive targeting model
US10824654B2 (en) 2014-09-18 2020-11-03 Snap Inc. Geolocation-based pictographs
US11281701B2 (en) 2014-09-18 2022-03-22 Snap Inc. Geolocation-based pictographs
US11741136B2 (en) 2014-09-18 2023-08-29 Snap Inc. Geolocation-based pictographs
US11216869B2 (en) 2014-09-23 2022-01-04 Snap Inc. User interface to augment an image using geolocation
US9537811B2 (en) 2014-10-02 2017-01-03 Snap Inc. Ephemeral gallery of ephemeral messages
US20170374003A1 (en) 2014-10-02 2017-12-28 Snapchat, Inc. Ephemeral gallery of ephemeral messages
US10958608B1 (en) 2014-10-02 2021-03-23 Snap Inc. Ephemeral gallery of visual media messages
US10284508B1 (en) 2014-10-02 2019-05-07 Snap Inc. Ephemeral gallery of ephemeral messages with opt-in permanence
US11012398B1 (en) 2014-10-02 2021-05-18 Snap Inc. Ephemeral message gallery user interface with screenshot messages
US11038829B1 (en) 2014-10-02 2021-06-15 Snap Inc. Ephemeral gallery of ephemeral messages with opt-in permanence
US11522822B1 (en) 2014-10-02 2022-12-06 Snap Inc. Ephemeral gallery elimination based on gallery and message timers
US10708210B1 (en) 2014-10-02 2020-07-07 Snap Inc. Multi-user ephemeral message gallery
US11411908B1 (en) 2014-10-02 2022-08-09 Snap Inc. Ephemeral message gallery user interface with online viewing history indicia
US10944710B1 (en) 2014-10-02 2021-03-09 Snap Inc. Ephemeral gallery user interface with remaining gallery time indication
US10476830B2 (en) 2014-10-02 2019-11-12 Snap Inc. Ephemeral gallery of ephemeral messages
US11855947B1 (en) 2014-10-02 2023-12-26 Snap Inc. Gallery of ephemeral messages
US9843720B1 (en) 2014-11-12 2017-12-12 Snap Inc. User interface for accessing media at a geographic location
US11956533B2 (en) 2014-11-12 2024-04-09 Snap Inc. Accessing media at a geographic location
US11190679B2 (en) 2014-11-12 2021-11-30 Snap Inc. Accessing media at a geographic location
US10616476B1 (en) 2014-11-12 2020-04-07 Snap Inc. User interface for accessing media at a geographic location
US10811053B2 (en) 2014-12-19 2020-10-20 Snap Inc. Routing messages by message parameter
US10514876B2 (en) 2014-12-19 2019-12-24 Snap Inc. Gallery of messages from individuals with a shared interest
US11250887B2 (en) 2014-12-19 2022-02-15 Snap Inc. Routing messages by message parameter
US11372608B2 (en) 2014-12-19 2022-06-28 Snap Inc. Gallery of messages from individuals with a shared interest
US9854219B2 (en) 2014-12-19 2017-12-26 Snap Inc. Gallery of videos set to an audio time line
US11803345B2 (en) 2014-12-19 2023-10-31 Snap Inc. Gallery of messages from individuals with a shared interest
US10311916B2 (en) 2014-12-19 2019-06-04 Snap Inc. Gallery of videos set to an audio time line
US11783862B2 (en) 2014-12-19 2023-10-10 Snap Inc. Routing messages by message parameter
US10580458B2 (en) 2014-12-19 2020-03-03 Snap Inc. Gallery of videos set to an audio time line
US9385983B1 (en) 2014-12-19 2016-07-05 Snapchat, Inc. Gallery of messages from individuals with a shared interest
US10157449B1 (en) 2015-01-09 2018-12-18 Snap Inc. Geo-location-based image filters
US11301960B2 (en) 2015-01-09 2022-04-12 Snap Inc. Object recognition based image filters
US10380720B1 (en) 2015-01-09 2019-08-13 Snap Inc. Location-based image filters
US11734342B2 (en) 2015-01-09 2023-08-22 Snap Inc. Object recognition based image overlays
US11962645B2 (en) 2015-01-13 2024-04-16 Snap Inc. Guided personal identity based actions
US11388226B1 (en) 2015-01-13 2022-07-12 Snap Inc. Guided personal identity based actions
US11249617B1 (en) 2015-01-19 2022-02-15 Snap Inc. Multichannel system
US10133705B1 (en) 2015-01-19 2018-11-20 Snap Inc. Multichannel system
US10416845B1 (en) 2015-01-19 2019-09-17 Snap Inc. Multichannel system
US10123166B2 (en) 2015-01-26 2018-11-06 Snap Inc. Content request by location
US11910267B2 (en) 2015-01-26 2024-02-20 Snap Inc. Content request by location
US10932085B1 (en) 2015-01-26 2021-02-23 Snap Inc. Content request by location
US11528579B2 (en) 2015-01-26 2022-12-13 Snap Inc. Content request by location
US10536800B1 (en) 2015-01-26 2020-01-14 Snap Inc. Content request by location
US10223397B1 (en) 2015-03-13 2019-03-05 Snap Inc. Social graph based co-location of network users
US11902287B2 (en) 2015-03-18 2024-02-13 Snap Inc. Geo-fence authorization provisioning
US10893055B2 (en) 2015-03-18 2021-01-12 Snap Inc. Geo-fence authorization provisioning
US10616239B2 (en) 2015-03-18 2020-04-07 Snap Inc. Geo-fence authorization provisioning
US11662576B2 (en) 2015-03-23 2023-05-30 Snap Inc. Reducing boot time and power consumption in displaying data content
US11320651B2 (en) 2015-03-23 2022-05-03 Snap Inc. Reducing boot time and power consumption in displaying data content
US10948717B1 (en) 2015-03-23 2021-03-16 Snap Inc. Reducing boot time and power consumption in wearable display systems
US10592574B2 (en) 2015-05-05 2020-03-17 Snap Inc. Systems and methods for automated local story generation and curation
US10135949B1 (en) 2015-05-05 2018-11-20 Snap Inc. Systems and methods for story and sub-story navigation
US11496544B2 (en) 2015-05-05 2022-11-08 Snap Inc. Story and sub-story navigation
US11392633B2 (en) 2015-05-05 2022-07-19 Snap Inc. Systems and methods for automated local story generation and curation
US10911575B1 (en) 2015-05-05 2021-02-02 Snap Inc. Systems and methods for story and sub-story navigation
US11449539B2 (en) 2015-05-05 2022-09-20 Snap Inc. Automated local story generation and curation
US9881094B2 (en) 2015-05-05 2018-01-30 Snap Inc. Systems and methods for automated local story generation and curation
CN106339022A (en) * 2015-07-09 2017-01-18 中华映管股份有限公司 Electronic Device And Multimedia Control Method Thereof
US9621819B2 (en) * 2015-07-09 2017-04-11 Chunghwa Picture Tubes, Ltd. Electronic device and multimedia control method thereof
US10993069B2 (en) 2015-07-16 2021-04-27 Snap Inc. Dynamically adaptive media content delivery
US11961116B2 (en) 2015-08-13 2024-04-16 Foursquare Labs, Inc. Determining exposures to content presented by physical objects
US10817898B2 (en) 2015-08-13 2020-10-27 Placed, Llc Determining exposures to content presented by physical objects
US10366543B1 (en) 2015-10-30 2019-07-30 Snap Inc. Image based tracking in augmented reality systems
US11769307B2 (en) 2015-10-30 2023-09-26 Snap Inc. Image based tracking in augmented reality systems
US11315331B2 (en) 2015-10-30 2022-04-26 Snap Inc. Image based tracking in augmented reality systems
US10102680B2 (en) 2015-10-30 2018-10-16 Snap Inc. Image based tracking in augmented reality systems
US10733802B2 (en) 2015-10-30 2020-08-04 Snap Inc. Image based tracking in augmented reality systems
US10657708B1 (en) 2015-11-30 2020-05-19 Snap Inc. Image and point cloud based tracking and in augmented reality systems
US11380051B2 (en) 2015-11-30 2022-07-05 Snap Inc. Image and point cloud based tracking and in augmented reality systems
US10997783B2 (en) 2015-11-30 2021-05-04 Snap Inc. Image and point cloud based tracking and in augmented reality systems
US11599241B2 (en) 2015-11-30 2023-03-07 Snap Inc. Network resource location linking and visual content sharing
US10474321B2 (en) 2015-11-30 2019-11-12 Snap Inc. Network resource location linking and visual content sharing
US10997758B1 (en) 2015-12-18 2021-05-04 Snap Inc. Media overlay publication system
US11830117B2 (en) 2015-12-18 2023-11-28 Snap Inc Media overlay publication system
US10354425B2 (en) 2015-12-18 2019-07-16 Snap Inc. Method and system for providing context relevant media augmentation
US11468615B2 (en) 2015-12-18 2022-10-11 Snap Inc. Media overlay publication system
US11023514B2 (en) 2016-02-26 2021-06-01 Snap Inc. Methods and systems for generation, curation, and presentation of media collections
US11889381B2 (en) 2016-02-26 2024-01-30 Snap Inc. Generation, curation, and presentation of media collections
US10834525B2 (en) 2016-02-26 2020-11-10 Snap Inc. Generation, curation, and presentation of media collections
US11611846B2 (en) 2016-02-26 2023-03-21 Snap Inc. Generation, curation, and presentation of media collections
US10679389B2 (en) 2016-02-26 2020-06-09 Snap Inc. Methods and systems for generation, curation, and presentation of media collections
US11197123B2 (en) 2016-02-26 2021-12-07 Snap Inc. Generation, curation, and presentation of media collections
US11631276B2 (en) 2016-03-31 2023-04-18 Snap Inc. Automated avatar generation
US11900418B2 (en) 2016-04-04 2024-02-13 Snap Inc. Mutable geo-fencing system
US10630894B2 (en) * 2016-06-01 2020-04-21 Canon Kabushiki Kaisha Notification system, wearable device, information processing apparatus, control method thereof, and computer-readable storage medium
US10638256B1 (en) 2016-06-20 2020-04-28 Pipbin, Inc. System for distribution and display of mobile targeted augmented reality content
US11201981B1 (en) 2016-06-20 2021-12-14 Pipbin, Inc. System for notification of user accessibility of curated location-dependent content in an augmented estate
US10839219B1 (en) 2016-06-20 2020-11-17 Pipbin, Inc. System for curation, distribution and display of location-dependent augmented reality content
US11876941B1 (en) 2016-06-20 2024-01-16 Pipbin, Inc. Clickable augmented reality content manager, system, and network
US11044393B1 (en) 2016-06-20 2021-06-22 Pipbin, Inc. System for curation and display of location-dependent augmented reality content in an augmented estate system
US11785161B1 (en) 2016-06-20 2023-10-10 Pipbin, Inc. System for user accessibility of tagged curated augmented reality content
US10805696B1 (en) 2016-06-20 2020-10-13 Pipbin, Inc. System for recording and targeting tagged content of user interest
US10992836B2 (en) 2016-06-20 2021-04-27 Pipbin, Inc. Augmented property system of curated augmented reality media elements
US10506371B2 (en) 2016-06-28 2019-12-10 Snap Inc. System to track engagement of media items
US10735892B2 (en) 2016-06-28 2020-08-04 Snap Inc. System to track engagement of media items
US10885559B1 (en) 2016-06-28 2021-01-05 Snap Inc. Generation, curation, and presentation of media collections with automated advertising
US10327100B1 (en) 2016-06-28 2019-06-18 Snap Inc. System to track engagement of media items
US10165402B1 (en) 2016-06-28 2018-12-25 Snap Inc. System to track engagement of media items
US11445326B2 (en) 2016-06-28 2022-09-13 Snap Inc. Track engagement of media items
US10430838B1 (en) 2016-06-28 2019-10-01 Snap Inc. Methods and systems for generation, curation, and presentation of media collections with automated advertising
US11640625B2 (en) 2016-06-28 2023-05-02 Snap Inc. Generation, curation, and presentation of media collections with automated advertising
US10785597B2 (en) 2016-06-28 2020-09-22 Snap Inc. System to track engagement of media items
US10219110B2 (en) 2016-06-28 2019-02-26 Snap Inc. System to track engagement of media items
US10387514B1 (en) 2016-06-30 2019-08-20 Snap Inc. Automated content curation and communication
US11080351B1 (en) 2016-06-30 2021-08-03 Snap Inc. Automated content curation and communication
US11895068B2 (en) 2016-06-30 2024-02-06 Snap Inc. Automated content curation and communication
US10348662B2 (en) 2016-07-19 2019-07-09 Snap Inc. Generating customized electronic messaging graphics
US11509615B2 (en) 2016-07-19 2022-11-22 Snap Inc. Generating customized electronic messaging graphics
US11816853B2 (en) 2016-08-30 2023-11-14 Snap Inc. Systems and methods for simultaneous localization and mapping
US11843456B2 (en) 2016-10-24 2023-12-12 Snap Inc. Generating and displaying customized avatars in media overlays
US11876762B1 (en) 2016-10-24 2024-01-16 Snap Inc. Generating and displaying customized avatars in media overlays
US11812160B2 (en) * 2016-11-01 2023-11-07 Snap Inc. Fast video capture and sensor adjustment
US20220046163A1 (en) * 2016-11-01 2022-02-10 Snap Inc. Fast video capture and sensor adjustment
US11750767B2 (en) 2016-11-07 2023-09-05 Snap Inc. Selective identification and order of image modifiers
US10623666B2 (en) 2016-11-07 2020-04-14 Snap Inc. Selective identification and order of image modifiers
US11233952B2 (en) 2016-11-07 2022-01-25 Snap Inc. Selective identification and order of image modifiers
US11397517B2 (en) 2016-12-09 2022-07-26 Snap Inc. Customized media overlays
US10203855B2 (en) 2016-12-09 2019-02-12 Snap Inc. Customized user-controlled media overlays
US10754525B1 (en) 2016-12-09 2020-08-25 Snap Inc. Customized media overlays
US11616745B2 (en) 2017-01-09 2023-03-28 Snap Inc. Contextual generation and selection of customized media content
US11870743B1 (en) 2017-01-23 2024-01-09 Snap Inc. Customized digital avatar accessories
US10915911B2 (en) 2017-02-03 2021-02-09 Snap Inc. System to determine a price-schedule to distribute media content
US10319149B1 (en) 2017-02-17 2019-06-11 Snap Inc. Augmented reality anamorphosis system
US11250075B1 (en) 2017-02-17 2022-02-15 Snap Inc. Searching social media content
US11720640B2 (en) 2017-02-17 2023-08-08 Snap Inc. Searching social media content
US11861795B1 (en) 2017-02-17 2024-01-02 Snap Inc. Augmented reality anamorphosis system
US11748579B2 (en) 2017-02-20 2023-09-05 Snap Inc. Augmented reality speech balloon system
US10614828B1 (en) 2017-02-20 2020-04-07 Snap Inc. Augmented reality speech balloon system
US11189299B1 (en) 2017-02-20 2021-11-30 Snap Inc. Augmented reality speech balloon system
US11670057B2 (en) 2017-03-06 2023-06-06 Snap Inc. Virtual vision system
US11037372B2 (en) 2017-03-06 2021-06-15 Snap Inc. Virtual vision system
US11961196B2 (en) 2017-03-06 2024-04-16 Snap Inc. Virtual vision system
US10887269B1 (en) 2017-03-09 2021-01-05 Snap Inc. Restricted group content collection
US10523625B1 (en) 2017-03-09 2019-12-31 Snap Inc. Restricted group content collection
US11258749B2 (en) 2017-03-09 2022-02-22 Snap Inc. Restricted group content collection
US10582277B2 (en) 2017-03-27 2020-03-03 Snap Inc. Generating a stitched data stream
US11297399B1 (en) 2017-03-27 2022-04-05 Snap Inc. Generating a stitched data stream
US10581782B2 (en) 2017-03-27 2020-03-03 Snap Inc. Generating a stitched data stream
US11558678B2 (en) 2017-03-27 2023-01-17 Snap Inc. Generating a stitched data stream
US11349796B2 (en) 2017-03-27 2022-05-31 Snap Inc. Generating a stitched data stream
US11170393B1 (en) 2017-04-11 2021-11-09 Snap Inc. System to calculate an engagement score of location based media content
US11195018B1 (en) 2017-04-20 2021-12-07 Snap Inc. Augmented reality typography personalization system
US10387730B1 (en) 2017-04-20 2019-08-20 Snap Inc. Augmented reality typography personalization system
US11842411B2 (en) 2017-04-27 2023-12-12 Snap Inc. Location-based virtual avatars
US11418906B2 (en) 2017-04-27 2022-08-16 Snap Inc. Selective location-based identity communication
US10963529B1 (en) 2017-04-27 2021-03-30 Snap Inc. Location-based search mechanism in a graphical user interface
US11385763B2 (en) 2017-04-27 2022-07-12 Snap Inc. Map-based graphical user interface indicating geospatial activity metrics
US11392264B1 (en) 2017-04-27 2022-07-19 Snap Inc. Map-based graphical user interface for multi-type social media galleries
US10952013B1 (en) 2017-04-27 2021-03-16 Snap Inc. Selective location-based identity communication
US11409407B2 (en) 2017-04-27 2022-08-09 Snap Inc. Map-based graphical user interface indicating geospatial activity metrics
US11782574B2 (en) 2017-04-27 2023-10-10 Snap Inc. Map-based graphical user interface indicating geospatial activity metrics
US11474663B2 (en) 2017-04-27 2022-10-18 Snap Inc. Location-based search mechanism in a graphical user interface
US11893647B2 (en) 2017-04-27 2024-02-06 Snap Inc. Location-based virtual avatars
US11556221B2 (en) 2017-04-27 2023-01-17 Snap Inc. Friend location sharing mechanism for social media platforms
US11451956B1 (en) 2017-04-27 2022-09-20 Snap Inc. Location privacy management on map-based social media platforms
US11232040B1 (en) 2017-04-28 2022-01-25 Snap Inc. Precaching unlockable data elements
US11675831B2 (en) 2017-05-31 2023-06-13 Snap Inc. Geolocation based playlists
US11533444B2 (en) * 2017-07-19 2022-12-20 Fujifilm Business Innovation Corp. Image processing device
US11475254B1 (en) 2017-09-08 2022-10-18 Snap Inc. Multimodal entity identification
US11335067B2 (en) 2017-09-15 2022-05-17 Snap Inc. Augmented reality system
US10740974B1 (en) 2017-09-15 2020-08-11 Snap Inc. Augmented reality system
US11721080B2 (en) 2017-09-15 2023-08-08 Snap Inc. Augmented reality system
US10499191B1 (en) 2017-10-09 2019-12-03 Snap Inc. Context sensitive presentation of content
US11617056B2 (en) 2017-10-09 2023-03-28 Snap Inc. Context sensitive presentation of content
US11006242B1 (en) 2017-10-09 2021-05-11 Snap Inc. Context sensitive presentation of content
US11030787B2 (en) 2017-10-30 2021-06-08 Snap Inc. Mobile-based cartographic control of display content
US11670025B2 (en) 2017-10-30 2023-06-06 Snap Inc. Mobile-based cartographic control of display content
US11265273B1 (en) 2017-12-01 2022-03-01 Snap, Inc. Dynamic media overlay with smart widget
US11558327B2 (en) 2017-12-01 2023-01-17 Snap Inc. Dynamic media overlay with smart widget
US11943185B2 (en) 2017-12-01 2024-03-26 Snap Inc. Dynamic media overlay with smart widget
US11017173B1 (en) 2017-12-22 2021-05-25 Snap Inc. Named entity recognition visual context and caption data
US11687720B2 (en) 2017-12-22 2023-06-27 Snap Inc. Named entity recognition visual context and caption data
US11487794B2 (en) 2018-01-03 2022-11-01 Snap Inc. Tag distribution visualization system
US10678818B2 (en) 2018-01-03 2020-06-09 Snap Inc. Tag distribution visualization system
US11507614B1 (en) 2018-02-13 2022-11-22 Snap Inc. Icon based tagging
US11841896B2 (en) 2018-02-13 2023-12-12 Snap Inc. Icon based tagging
US10885136B1 (en) 2018-02-28 2021-01-05 Snap Inc. Audience filtering system
US11523159B2 (en) 2018-02-28 2022-12-06 Snap Inc. Generating media content items based on location information
US10979752B1 (en) 2018-02-28 2021-04-13 Snap Inc. Generating media content items based on location information
US10327096B1 (en) 2018-03-06 2019-06-18 Snap Inc. Geo-fence selection system
US11044574B2 (en) 2018-03-06 2021-06-22 Snap Inc. Geo-fence selection system
US11570572B2 (en) 2018-03-06 2023-01-31 Snap Inc. Geo-fence selection system
US10524088B2 (en) 2018-03-06 2019-12-31 Snap Inc. Geo-fence selection system
US11722837B2 (en) 2018-03-06 2023-08-08 Snap Inc. Geo-fence selection system
US11491393B2 (en) 2018-03-14 2022-11-08 Snap Inc. Generating collectible items based on location information
US10933311B2 (en) 2018-03-14 2021-03-02 Snap Inc. Generating collectible items based on location information
US11163941B1 (en) 2018-03-30 2021-11-02 Snap Inc. Annotating a collection of media content items
US10779114B2 (en) 2018-04-18 2020-09-15 Snap Inc. Visitation tracking system
US10924886B2 (en) 2018-04-18 2021-02-16 Snap Inc. Visitation tracking system
US10219111B1 (en) 2018-04-18 2019-02-26 Snap Inc. Visitation tracking system
US11297463B2 (en) 2018-04-18 2022-04-05 Snap Inc. Visitation tracking system
US10448199B1 (en) 2018-04-18 2019-10-15 Snap Inc. Visitation tracking system
US11683657B2 (en) 2018-04-18 2023-06-20 Snap Inc. Visitation tracking system
US10681491B1 (en) 2018-04-18 2020-06-09 Snap Inc. Visitation tracking system
US11860888B2 (en) 2018-05-22 2024-01-02 Snap Inc. Event detection system
US11670026B2 (en) 2018-07-24 2023-06-06 Snap Inc. Conditional modification of augmented reality object
US10943381B2 (en) 2018-07-24 2021-03-09 Snap Inc. Conditional modification of augmented reality object
US11367234B2 (en) 2018-07-24 2022-06-21 Snap Inc. Conditional modification of augmented reality object
US10679393B2 (en) 2018-07-24 2020-06-09 Snap Inc. Conditional modification of augmented reality object
US10789749B2 (en) 2018-07-24 2020-09-29 Snap Inc. Conditional modification of augmented reality object
US11676319B2 (en) 2018-08-31 2023-06-13 Snap Inc. Augmented reality anthropomorphization system
US11450050B2 (en) 2018-08-31 2022-09-20 Snap Inc. Augmented reality anthropomorphization system
US10997760B2 (en) 2018-08-31 2021-05-04 Snap Inc. Augmented reality anthropomorphization system
US11611691B2 (en) 2018-09-11 2023-03-21 Profoto Aktiebolag Computer implemented method and a system for coordinating taking of a picture using a camera and initiation of a flash pulse of at least one flash device
US11455082B2 (en) 2018-09-28 2022-09-27 Snap Inc. Collaborative achievement interface
US11704005B2 (en) 2018-09-28 2023-07-18 Snap Inc. Collaborative achievement interface
US11799811B2 (en) 2018-10-31 2023-10-24 Snap Inc. Messaging and gaming applications communication platform
US11558709B2 (en) 2018-11-30 2023-01-17 Snap Inc. Position service to determine relative position to map features
US11199957B1 (en) 2018-11-30 2021-12-14 Snap Inc. Generating customized avatars based on location information
US11698722B2 (en) 2018-11-30 2023-07-11 Snap Inc. Generating customized avatars based on location information
US11812335B2 (en) 2018-11-30 2023-11-07 Snap Inc. Position service to determine relative position to map features
US11877211B2 (en) 2019-01-14 2024-01-16 Snap Inc. Destination sharing in location sharing system
US11751015B2 (en) 2019-01-16 2023-09-05 Snap Inc. Location-based context information sharing in a messaging system
US11294936B1 (en) 2019-01-30 2022-04-05 Snap Inc. Adaptive spatial density based clustering
US11693887B2 (en) 2019-01-30 2023-07-04 Snap Inc. Adaptive spatial density based clustering
US11863866B2 (en) 2019-02-01 2024-01-02 Profoto Aktiebolag Housing for an intermediate signal transmission unit and an intermediate signal transmission unit
US11809624B2 (en) 2019-02-13 2023-11-07 Snap Inc. Sleep detection in a location sharing system
WO2020168956A1 (en) * 2019-02-18 2020-08-27 华为技术有限公司 Method for photographing the moon and electronic device
US11500525B2 (en) 2019-02-25 2022-11-15 Snap Inc. Custom media overlay system
US11954314B2 (en) 2019-02-25 2024-04-09 Snap Inc. Custom media overlay system
US11574431B2 (en) 2019-02-26 2023-02-07 Snap Inc. Avatar based on weather
US11301117B2 (en) 2019-03-08 2022-04-12 Snap Inc. Contextual information in chat
US11868414B1 (en) 2019-03-14 2024-01-09 Snap Inc. Graph-based prediction for contact suggestion in a location sharing system
US11852554B1 (en) 2019-03-21 2023-12-26 Snap Inc. Barometer calibration in a location sharing system
US11740760B2 (en) 2019-03-28 2023-08-29 Snap Inc. Generating personalized map interface with enhanced icons
US11249614B2 (en) 2019-03-28 2022-02-15 Snap Inc. Generating personalized map interface with enhanced icons
US11361493B2 (en) 2019-04-01 2022-06-14 Snap Inc. Semantic texture mapping system
US11606755B2 (en) 2019-05-30 2023-03-14 Snap Inc. Wearable device location systems architecture
US11206615B2 (en) 2019-05-30 2021-12-21 Snap Inc. Wearable device location systems
US11963105B2 (en) 2019-05-30 2024-04-16 Snap Inc. Wearable device location systems architecture
US11785549B2 (en) 2019-05-30 2023-10-10 Snap Inc. Wearable device location systems
US11917495B2 (en) 2019-06-07 2024-02-27 Snap Inc. Detection of a physical collision between two client devices in a location sharing system
US11601783B2 (en) 2019-06-07 2023-03-07 Snap Inc. Detection of a physical collision between two client devices in a location sharing system
US11714535B2 (en) 2019-07-11 2023-08-01 Snap Inc. Edge gesture interface with smart interactions
US11821742B2 (en) 2019-09-26 2023-11-21 Snap Inc. Travel based notifications
US11218838B2 (en) 2019-10-31 2022-01-04 Snap Inc. Focused map-based context information surfacing
US11729343B2 (en) 2019-12-30 2023-08-15 Snap Inc. Including video feed in message thread
US11429618B2 (en) 2019-12-30 2022-08-30 Snap Inc. Surfacing augmented reality objects
US11128715B1 (en) 2019-12-30 2021-09-21 Snap Inc. Physical friend proximity in chat
US11893208B2 (en) 2019-12-31 2024-02-06 Snap Inc. Combined map icon with action indicator
US11343323B2 (en) 2019-12-31 2022-05-24 Snap Inc. Augmented reality objects registry
US11943303B2 (en) 2019-12-31 2024-03-26 Snap Inc. Augmented reality objects registry
US11228551B1 (en) 2020-02-12 2022-01-18 Snap Inc. Multiple gateway message exchange
US11888803B2 (en) 2020-02-12 2024-01-30 Snap Inc. Multiple gateway message exchange
US11765117B2 (en) 2020-03-05 2023-09-19 Snap Inc. Storing data based on device location
US11516167B2 (en) 2020-03-05 2022-11-29 Snap Inc. Storing data based on device location
US11619501B2 (en) 2020-03-11 2023-04-04 Snap Inc. Avatar based on trip
US11776256B2 (en) 2020-03-27 2023-10-03 Snap Inc. Shared augmented reality system
US11430091B2 (en) 2020-03-27 2022-08-30 Snap Inc. Location mapping for large scale augmented-reality
US11915400B2 (en) 2020-03-27 2024-02-27 Snap Inc. Location mapping for large scale augmented-reality
US11483267B2 (en) 2020-06-15 2022-10-25 Snap Inc. Location sharing using different rate-limited links
US11503432B2 (en) 2020-06-15 2022-11-15 Snap Inc. Scalable real-time location sharing framework
US11290851B2 (en) 2020-06-15 2022-03-29 Snap Inc. Location sharing using offline and online objects
US11314776B2 (en) 2020-06-15 2022-04-26 Snap Inc. Location sharing using friend list versions
US11676378B2 (en) 2020-06-29 2023-06-13 Snap Inc. Providing travel-based augmented reality content with a captured image
US11943192B2 (en) 2020-08-31 2024-03-26 Snap Inc. Co-location connection service
US11601888B2 (en) 2021-03-29 2023-03-07 Snap Inc. Determining location using multi-source geolocation data
US11606756B2 (en) 2021-03-29 2023-03-14 Snap Inc. Scheduling requests for location data
US11902902B2 (en) 2021-03-29 2024-02-13 Snap Inc. Scheduling requests for location data
US11645324B2 (en) 2021-03-31 2023-05-09 Snap Inc. Location-based timeline media content system
US11829834B2 (en) 2021-10-29 2023-11-28 Snap Inc. Extended QR code

Also Published As

Publication number Publication date
US8305452B2 (en) 2012-11-06
US20110228045A1 (en) 2011-09-22
WO2009085099A2 (en) 2009-07-09
WO2009085099A3 (en) 2009-08-27

Similar Documents

Publication Publication Date Title
US8305452B2 (en) Remote determination of image-acquisition settings and opportunities
US7929796B2 (en) Image processing system and method, and terminal and server used for the same
US8094963B2 (en) Imaging system, imaging condition setting method, terminal and server used for the same
US7391886B1 (en) Digital camera with image tracking system
US7805066B2 (en) System for guided photography based on image capturing device rendered user recommendations according to embodiments
CN109005366A (en) Camera module night scene image pickup processing method, device, electronic equipment and storage medium
US20200404160A1 (en) Modifying image parameters using wearable device input
US20060044422A1 (en) Image capture apparatus and control method therefor
US20050046706A1 (en) Image data capture method and apparatus
JP2002010135A (en) System and method of setting image acquisition controls for cameras
US8913150B2 (en) Dynamic image capture utilizing prior capture settings and user behaviors
CN110493524B (en) Photometric adjustment method, device and equipment and storage medium
JP2007258869A (en) Image trimming method and apparatus, and program
US20070230930A1 (en) Methods and systems for automatic image acquisition
CN106254755B (en) Photographic device, camera shooting control method and recording medium
CN110581950A (en) Camera, system and method for selecting camera settings
US20120249840A1 (en) Electronic camera
US20160156825A1 (en) Outdoor exposure control of still image capture
JP2005286379A (en) Photographing support system and photographing support method
JP2012085228A (en) Photographing condition setting device, imaging device, image processing device, and photographing condition setting program
JP2016085248A (en) Exposure computation device
CN112804463A (en) Exposure time control method, device, terminal and readable storage medium
JP2019212961A (en) Mobile unit, light amount adjustment method, program, and recording medium
WO2016159107A1 (en) Digital camera and recording card for digital camera
JP5397004B2 (en) camera

Legal Events

Date Code Title Description
AS Assignment

Owner name: EASTMAN KODAK COMPANY,NEW YORK

Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNORS:FREDLUND, JOHN R.;PILLMAN, BRUCE H.;GALLAGHER, ANDREW C.;AND OTHERS;SIGNING DATES FROM 20080218 TO 20080304;REEL/FRAME:020671/0126

STCB Information on status: application discontinuation

Free format text: ABANDONED -- FAILURE TO RESPOND TO AN OFFICE ACTION

AS Assignment

Owner name: EASTMAN KODAK COMPANY, NEW YORK

Free format text: PATENT RELEASE;ASSIGNORS:CITICORP NORTH AMERICA, INC.;WILMINGTON TRUST, NATIONAL ASSOCIATION;REEL/FRAME:029913/0001

Effective date: 20130201

Owner name: FPC INC., CALIFORNIA

Free format text: PATENT RELEASE;ASSIGNORS:CITICORP NORTH AMERICA, INC.;WILMINGTON TRUST, NATIONAL ASSOCIATION;REEL/FRAME:029913/0001

Effective date: 20130201

Owner name: KODAK (NEAR EAST), INC., NEW YORK

Free format text: PATENT RELEASE;ASSIGNORS:CITICORP NORTH AMERICA, INC.;WILMINGTON TRUST, NATIONAL ASSOCIATION;REEL/FRAME:029913/0001

Effective date: 20130201

Owner name: FAR EAST DEVELOPMENT LTD., NEW YORK

Free format text: PATENT RELEASE;ASSIGNORS:CITICORP NORTH AMERICA, INC.;WILMINGTON TRUST, NATIONAL ASSOCIATION;REEL/FRAME:029913/0001

Effective date: 20130201

Owner name: QUALEX INC., NORTH CAROLINA

Free format text: PATENT RELEASE;ASSIGNORS:CITICORP NORTH AMERICA, INC.;WILMINGTON TRUST, NATIONAL ASSOCIATION;REEL/FRAME:029913/0001

Effective date: 20130201

Owner name: PAKON, INC., INDIANA

Free format text: PATENT RELEASE;ASSIGNORS:CITICORP NORTH AMERICA, INC.;WILMINGTON TRUST, NATIONAL ASSOCIATION;REEL/FRAME:029913/0001

Effective date: 20130201

Owner name: KODAK REALTY, INC., NEW YORK

Free format text: PATENT RELEASE;ASSIGNORS:CITICORP NORTH AMERICA, INC.;WILMINGTON TRUST, NATIONAL ASSOCIATION;REEL/FRAME:029913/0001

Effective date: 20130201

Owner name: KODAK AVIATION LEASING LLC, NEW YORK

Free format text: PATENT RELEASE;ASSIGNORS:CITICORP NORTH AMERICA, INC.;WILMINGTON TRUST, NATIONAL ASSOCIATION;REEL/FRAME:029913/0001

Effective date: 20130201

Owner name: KODAK IMAGING NETWORK, INC., CALIFORNIA

Free format text: PATENT RELEASE;ASSIGNORS:CITICORP NORTH AMERICA, INC.;WILMINGTON TRUST, NATIONAL ASSOCIATION;REEL/FRAME:029913/0001

Effective date: 20130201

Owner name: LASER-PACIFIC MEDIA CORPORATION, NEW YORK

Free format text: PATENT RELEASE;ASSIGNORS:CITICORP NORTH AMERICA, INC.;WILMINGTON TRUST, NATIONAL ASSOCIATION;REEL/FRAME:029913/0001

Effective date: 20130201

Owner name: NPEC INC., NEW YORK

Free format text: PATENT RELEASE;ASSIGNORS:CITICORP NORTH AMERICA, INC.;WILMINGTON TRUST, NATIONAL ASSOCIATION;REEL/FRAME:029913/0001

Effective date: 20130201

Owner name: CREO MANUFACTURING AMERICA LLC, WYOMING

Free format text: PATENT RELEASE;ASSIGNORS:CITICORP NORTH AMERICA, INC.;WILMINGTON TRUST, NATIONAL ASSOCIATION;REEL/FRAME:029913/0001

Effective date: 20130201

Owner name: KODAK PORTUGUESA LIMITED, NEW YORK

Free format text: PATENT RELEASE;ASSIGNORS:CITICORP NORTH AMERICA, INC.;WILMINGTON TRUST, NATIONAL ASSOCIATION;REEL/FRAME:029913/0001

Effective date: 20130201

Owner name: KODAK PHILIPPINES, LTD., NEW YORK

Free format text: PATENT RELEASE;ASSIGNORS:CITICORP NORTH AMERICA, INC.;WILMINGTON TRUST, NATIONAL ASSOCIATION;REEL/FRAME:029913/0001

Effective date: 20130201

Owner name: EASTMAN KODAK INTERNATIONAL CAPITAL COMPANY, INC.,

Free format text: PATENT RELEASE;ASSIGNORS:CITICORP NORTH AMERICA, INC.;WILMINGTON TRUST, NATIONAL ASSOCIATION;REEL/FRAME:029913/0001

Effective date: 20130201

Owner name: KODAK AMERICAS, LTD., NEW YORK

Free format text: PATENT RELEASE;ASSIGNORS:CITICORP NORTH AMERICA, INC.;WILMINGTON TRUST, NATIONAL ASSOCIATION;REEL/FRAME:029913/0001

Effective date: 20130201

AS Assignment

Owner name: MONUMENT PEAK VENTURES, LLC, TEXAS

Free format text: RELEASE BY SECURED PARTY;ASSIGNOR:INTELLECTUAL VENTURES FUND 83 LLC;REEL/FRAME:064599/0304

Effective date: 20230728