US20070008321A1 - Identifying collection images with special events - Google Patents


Info

Publication number
US20070008321A1
US20070008321A1 (application US11/178,992)
Authority
US
United States
Prior art keywords
event
time
special event
special
images
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Abandoned
Application number
US11/178,992
Inventor
Andrew Gallagher
Samuel Fryer
Alexander Loui
Jason Oliver
Neal Eckhaus
Kenneth Parulski
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Intellectual Ventures Fund 83 LLC
Original Assignee
Eastman Kodak Co
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Priority to US11/178,992 priority Critical patent/US20070008321A1/en
Application filed by Eastman Kodak Co filed Critical Eastman Kodak Co
Assigned to EASTMAN KODAK COMPANY. ASSIGNMENT OF ASSIGNORS INTEREST (SEE DOCUMENT FOR DETAILS). Assignors: ECKHAUS, NEAL; OLIVER, JASON R.; PARULSKI, KENNETH A.; FRYER, SAMUEL M.; GALLAGHER, ANDREW C.; LOUI, ALEXANDER C.
Priority to JP2008521410A priority patent/JP5225082B2/en
Priority to PCT/US2006/024754 priority patent/WO2007008386A2/en
Priority to EP09176588A priority patent/EP2161670A1/en
Priority to EP10193979A priority patent/EP2287755A1/en
Priority to EP06785560A priority patent/EP1902392A2/en
Publication of US20070008321A1 publication Critical patent/US20070008321A1/en
Priority to US12/796,698 priority patent/US8717461B2/en
Priority to US12/983,904 priority patent/US8358358B2/en
Assigned to KODAK PHILIPPINES, LTD., QUALEX INC., LASER-PACIFIC MEDIA CORPORATION, EASTMAN KODAK COMPANY, PAKON, INC., CREO MANUFACTURING AMERICA LLC, KODAK (NEAR EAST), INC., EASTMAN KODAK INTERNATIONAL CAPITAL COMPANY, INC., FPC INC., KODAK AVIATION LEASING LLC, KODAK IMAGING NETWORK, INC., NPEC INC., KODAK PORTUGUESA LIMITED, KODAK AMERICAS, LTD., KODAK REALTY, INC., FAR EAST DEVELOPMENT LTD. PATENT RELEASE. Assignors: CITICORP NORTH AMERICA, INC.; WILMINGTON TRUST, NATIONAL ASSOCIATION
Assigned to INTELLECTUAL VENTURES FUND 83 LLC. ASSIGNMENT OF ASSIGNORS INTEREST (SEE DOCUMENT FOR DETAILS). Assignors: EASTMAN KODAK COMPANY
Priority to US14/177,395 priority patent/US9049388B2/en
Assigned to MONUMENT PEAK VENTURES, LLC. RELEASE BY SECURED PARTY (SEE DOCUMENT FOR DETAILS). Assignors: INTELLECTUAL VENTURES FUND 83 LLC

Classifications

    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N1/00 Scanning, transmission or reproduction of documents or the like, e.g. facsimile transmission; Details thereof
    • H04N1/32 Circuits or arrangements for control or supervision between transmitter and receiver or between image input and image output device, e.g. between a still-image camera and its memory or between a still-image camera and a printer device
    • H04N1/32101 Display, printing, storage or transmission of additional information, e.g. ID code, date and time or title
    • H04N1/32128 Display, printing, storage or transmission of additional information, e.g. ID code, date and time or title, attached to the image data, e.g. file header, transmitted message header, information on the same page or in the same computer file as the image
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F16/00 Information retrieval; Database structures therefor; File system structures therefor
    • G06F16/20 Information retrieval; Database structures therefor; File system structures therefor of structured data, e.g. relational data
    • G06F16/24 Querying
    • G06F16/245 Query processing
    • G06F16/2458 Special types of queries, e.g. statistical queries, fuzzy queries or distributed queries
    • G06F16/2477 Temporal data queries
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F16/00 Information retrieval; Database structures therefor; File system structures therefor
    • G06F16/50 Information retrieval; Database structures therefor; File system structures therefor of still image data
    • G06F16/58 Retrieval characterised by using metadata, e.g. metadata not derived from the content or metadata generated manually
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06V IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V20/00 Scenes; Scene-specific elements
    • G06V20/30 Scenes; Scene-specific elements in albums, collections or shared content, e.g. social network photos or video

Definitions

  • the present invention relates to an improved way to identify digital images from a collection, making use of special events.
  • images will be understood to include both still images and “videos”, which are collections of image frames, often having an associated audio stream. Therefore, an image collection can contain images or videos or both.
  • the average number of images captured with digital cameras per photographer is still increasing each year.
  • the organization and retrieval of images and videos is already a problem for the typical consumer.
  • the length of time spanned by a typical consumer's digital image collection is only a few years. The organization and retrieval problem will continue to grow as the length of time spanned by the average digital image and video collection increases.
  • An event consists of a set of images or videos related to a common event, for example “trip to the beach.”
  • Certain applications allow for viewing images on a timeline.
  • the images can be viewed or sorted into consecutive order based on the image capture time.
  • the application Picasa 2, distributed by Google, has a timeline view where groups or sets of images and videos are shown to the user and each image set has an associated time (e.g. “April 2005”).
  • the Adobe application Album 2.0 has a calendar view where calendar pages are shown and small versions of images captured on a specific calendar date are shown on that date.
  • the software groups images related by capture time into sets.
  • the sets of images are not labeled with meaningful names other than the capture date or date range.
  • the only calendar information used by these applications is the day, month, and year. They do not use any occasion (e.g. Thanksgiving) or appointment (e.g. vacation trip to Florida) information to label the images with meaningful names.
  • U.S. Pat. No. 6,108,640 describes a method for determining periodic occasions such as holidays. However, there is no description of assigning meaningful labels to images or sets of images.
  • Nakamura and Gibson describe a method of placing images into storage locations based on calendar information. Their method does not provide for automatic annotation of images.
  • Hooper and Mao describe a calendar-based image asset organization method. Their method allows people to indicate via a graphical user interface a date range of interest. Images captured during that date range are then retrieved for the user. The method does not provide for automatic annotation of images.
  • An object of the present invention is to provide an improved way of identifying digital images of interest from a collection of digital images.
  • This object is achieved by storing a collection of digital images or videos or both each having an associated capture time; comparing the associated capture time in the collection with a special event time to determine if a digital image or video in the collection is of interest, wherein the comparing step includes calculation of a special event time associated with a special event based on the calendar time associated with the special event and using such information to perform the comparison step; and associating digital images and videos of interest with the special event.
  • Another object of the present invention is to provide an improved way of labeling digital images captured by a digital capture device.
  • This object is achieved by transferring and storing calendar information in a digital camera device, the calendar information including occasion or appointment information, capturing and storing a digital image in the digital camera device, determining a capture time for the captured digital image, automatically comparing the capture time with the calendar information to determine a special event label, and storing the special event label in association with the captured digital image.
  • FIG. 1 is a schematic diagram of a computer system that can implement the present invention.
  • FIG. 2 is a flow chart of an embodiment of the present invention.
  • FIG. 3 is a more detailed flow chart of an embodiment of the present invention.
  • FIG. 4 is a graph representation of a calendar time (cross-hatched region) and a special event time (line) associated with the special event Christmas.
  • FIG. 5 is a flow chart of an embodiment of the present invention.
  • FIG. 6 is a flow chart of a further embodiment of the present invention.
  • FIG. 7 is a flow chart of yet another embodiment of the present invention.
  • FIG. 8 is a flow chart of another embodiment of the present invention.
  • FIG. 9 is a flow chart of a still further embodiment of the present invention.
  • FIG. 10 is a block diagram of a camera phone based imaging system that can implement the present invention.
  • FIG. 11 is a flow chart of yet another embodiment of the present invention.
  • FIG. 12 is a flow chart of a still further embodiment of the present invention.
  • FIG. 13 is a flow chart of an additional embodiment of the present invention.
  • the present invention can be implemented in computer hardware and computerized equipment.
  • the method can be performed in a digital camera (as will be described later in reference to FIG. 10 ), a digital printer, on an internet server, on a kiosk, and on a personal computer.
  • Referring to FIG. 1, there is illustrated a computer system for implementing the present invention.
  • Although the computer system is shown for the purpose of illustrating a preferred embodiment, the present invention is not limited to the computer system shown; it can be used with any electronic processing system, such as those found in digital cameras, home computers, kiosks, retail or wholesale photofinishing, or any other system for the processing of digital images.
  • the computer system includes a microprocessor-based unit 20 (also referred to herein as a digital image processor) for receiving and processing software programs and for performing other processing functions.
  • the digital image processor 20 processes images from image capture devices 10 such as cameras, scanners, or computer image generation software.
  • the digital image processor 20 can be used to process digital images to make adjustments for overall brightness, tone scale, image structure, etc. of digital images in a manner such that a pleasing looking image is produced by an image output device.
  • the digital image processor 20 interfaces with the general control computer 40 (also a microprocessor based unit) for exchanging data and commands.
  • the general control computer 40 and the digital image processor 20 can be two different microprocessors, or the functions of each can be performed by a single physical microprocessor.
  • the digital image processor 20 often outputs an image to an image output device 30, for example a printer, for displaying the image.
  • a display device 50 is electrically connected to the digital image processor 20 for displaying user-related information associated with the software, e.g., by means of a graphical user interface.
  • a keyboard 60 is also connected to the microprocessor based unit 20 via the general control computer 40 for permitting a user to input information to the software.
  • a mouse can be used for moving a selector on the display device 50 and for selecting an item on which the selector overlays, as is well known in the art.
  • Digital images and other data can also be stored on an offline memory device 70 such as an external hard drive, flash media, a drive that writes to CD-ROM or DVD media, or the like.
  • a compact disk-read only memory (CD-ROM), which typically includes software programs, is inserted into the general control computer 40 for providing a means of inputting the software programs and other information to the general control computer 40 and the digital image processor 20 .
  • a floppy disk can also include a software program, and is inserted into the general control computer 40 for inputting the software program.
  • the general control computer 40 can be programmed, as is well known in the art, for storing the software program internally.
  • the general control computer 40 can have a network connection, such as a telephone line or wireless connection, to an external network such as a local area network or the Internet ( 370 in FIG. 10 ).
  • Images can also be displayed on the display device 50 via a Flash EPROM memory card, such as the well-known PC Card, CompactFlash, SD, or Memory Stick cards.
  • the image output device 30 provides a final image.
  • the image output device 30 can be a printer or other output device that provides a paper or other hard copy final image.
  • the image output device 30 can also be an output device that provides the final image as a digital file.
  • the image output device 30 can also include combinations of output, such as a printed image and a digital file on a memory unit, such as a CD or DVD.
  • a digital image includes one or more digital image channels or color components.
  • Each digital image channel is a two-dimensional array of pixels.
  • Each pixel value relates to the amount of light received by the image capture device corresponding to the physical region of the pixel.
  • a digital image will often consist of red, green, and blue digital image channels.
  • Motion imaging applications can be thought of as a sequence of digital images.
  • Although a digital image channel is described as a two-dimensional array of pixel values arranged by rows and columns, those skilled in the art will recognize that the present invention can be applied to non-rectilinear arrays with equal effect.
  • the present invention can be implemented in a combination of software or hardware and is not limited to devices that are physically connected or located within the same physical location.
  • One or more of the devices illustrated in FIG. 1 can be located remotely and can be connected via a network.
  • One or more of the devices can be connected wirelessly, such as by a radio-frequency link, either directly or via a network.
  • the present invention can be employed in a variety of user contexts and environments.
  • Exemplary contexts and environments include, without limitation, wholesale digital photofinishing (which involves exemplary process steps or stages such as film in, digital processing, prints out), retail digital photofinishing (film in, digital processing, prints out), home printing (home scanned film or digital images, digital processing, prints out), desktop software (software that applies algorithms to digital prints to make them better—or even just to change them), digital fulfillment (digital images in—from media or over the web, digital processing, with images out—in digital form on media, digital form over the web, or printed on hard-copy prints), kiosks (digital or scanned input, digital processing, digital or hard copy output), mobile devices (e.g., PDA or cell phone that can be used as a processing unit, a display unit, or a unit to give processing instructions), and as a service offered via the World Wide Web.
  • the invention can stand alone or can be a component of a larger system solution.
  • human interfaces e.g., the scanning or input, the digital processing, the display to a user (if needed), the input of user requests or processing instructions (if needed), the output, can each be on the same or different devices and physical locations, and communication between the devices and locations can be via public or private network connections, or media based communication.
  • the method of the invention can be fully automatic, can have user input (be fully or partially manual), can have user or operator review to accept/reject the result, or can be assisted by metadata (metadata that can be user supplied, supplied by a measuring device (e.g. in a camera), or determined by an algorithm).
  • the algorithm(s) can interface with a variety of workflow user interface schemes.
  • An embodiment of the invention is illustrated in FIG. 2 .
  • a digital image and video collection 104 is input to a calendar label annotator 114 .
  • Special events from a personal calendar 146 are used by the calendar label annotator 114 to annotate the digital images and videos in the collection 104 .
  • the collection 104 can contain exclusively images, or exclusive videos, or some of each.
  • Resulting annotations 118 (equivalently called labels) are stored in a database 120 .
  • Each annotation 118 can be stored with the image (e.g. in the file header) or in a database 120 either located with the digital image or video or remotely (e.g. on a computer server).
  • the database 120 can be in any form.
  • the database 120 can be distributed across many files or few files. In either case, the database 120 can be queried 122 to find those images and videos 104 containing the query special event.
  • a query 122 for images of “Daytona Beach” returns query results 124 containing the set of digital images or videos 104 with a related annotation of “vacation to Daytona Beach.”
  • the query results 124 are the set of digital images and videos 104 associated with the query special event.
  • the digital image and videos from the collection 104 are passed to a capture time extractor 108 .
  • the capture time extractor 108 determines the time each digital image or video 104 was captured, and outputs image capture times 112 .
  • the image capture time 112 of the digital image or video 104 is determined by one of several methods by the capture time extractor 108 .
  • the capture time is embedded in the file header of the digital image or video 104 .
  • the EXIF image format (described at www.exif.org) allows the image capture device 10 to store information associated with the digital image or video 104 in the file header.
  • the “Date ⁇ Time” entry is associated with the date and time the image was captured.
  • the digital image or video 104 results from scanning film or prints and the image capture time 112 is determined by detection of the date exposed into the image area (as is often done at capture time), usually in the lower right corner of the image.
  • the date a photograph is printed is often printed on the back of the print.
  • some film systems, such as APS, record the capture date on the film itself, from which the image capture time 112 can be determined.
  • the capture time extractor 108 uses the most appropriate method for extracting the image capture time 112 of the image.
  • the source of the digital images and videos 104 is a digital camera, and the capture time extractor 108 extracts the capture time from the file information.
  • the image capture time 112 can be a precise minute in time, e.g. Dec. 9, 2000 at 10:00 AM. Or the image capture time 112 can be less precise, e.g. 2000 or December 2000.
  • the image capture time 112 can be in the form of a probability distribution function e.g. Dec. 9, 2000 ⁇ 2 days with 95% confidence.
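As an illustration of the file-header case described above, the EXIF “Date\Time” value is a fixed-format string that can be parsed directly. The function name and the fall-back-to-None behavior below are assumptions for the sketch, not details given in the patent:

```python
from datetime import datetime

# EXIF stores the "Date\Time" entry as "YYYY:MM:DD HH:MM:SS".
EXIF_FORMAT = "%Y:%m:%d %H:%M:%S"

def extract_capture_time(exif_datetime):
    """Parse an EXIF Date\\Time string into a datetime.

    Returns None when the value is absent or malformed, in which case a
    less precise source (file date, a date stamp scanned from the print)
    would be tried instead.
    """
    try:
        return datetime.strptime(exif_datetime.strip(), EXIF_FORMAT)
    except (AttributeError, ValueError):
        return None

capture = extract_capture_time("2000:12:09 10:00:00")  # Dec. 9, 2000, 10:00 AM
```
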
  • the image capture time 112 is input to the calendar label annotator 114 .
  • the personal calendar 146 is preferably a software application running on a computer that is useful for recording special events.
  • the special events can be input to the personal calendar 146 by the user or by others or other applications.
  • the special events recorded in the personal calendar 146 are of three types: occasions, appointments, and journal entries. Essentially, the special events are labels for blocks of time that are personalized to the user (or users) of the calendar.
  • Outlook by Microsoft and Notes by Lotus are both examples of calendar applications.
  • the personal calendar 146 can also be an application that operates on a handheld device, such as a personal digital assistant (PDA) or a cellular telephone camera, as will be described later in reference to FIG. 11 .
  • the personal calendar 146 can also be an application that runs on a server and is accessed by the user via the Internet.
  • the first type of the three special events recorded in a personal calendar 146 is an occasion 142 .
  • the occasion 142 is a periodically occurring special event, celebration, religious holiday (fixed or moveable), national holiday, festival, or the like.
  • occasions 142 can be computed in advance for any time period because mathematical formulas are used to determine the date of the occasion 142 in each subsequent year, as described by U.S. Pat. No. 6,108,640.
  • an occasion 142 is a name placed on an entire day. For example, every year Christmas is December 25.
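The patent does not reproduce the formulas of U.S. Pat. No. 6,108,640, but the idea can be illustrated with one fixed occasion (Christmas) and the widely published anonymous Gregorian computus for a moveable one (Easter); the function names are illustrative:

```python
from datetime import date

def christmas(year):
    # A fixed occasion: the same month and day every year.
    return date(year, 12, 25)

def easter(year):
    """Gregorian Easter via the anonymous (Meeus/Jones/Butcher) computus,
    one published example of the kind of formula alluded to above."""
    a = year % 19
    b, c = divmod(year, 100)
    d, e = divmod(b, 4)
    f = (b + 8) // 25
    g = (b - f + 1) // 3
    h = (19 * a + b - d - g + 15) % 30
    i, k = divmod(c, 4)
    l = (32 + 2 * e + 2 * i - h - k) % 7
    m = (a + 11 * h + 22 * l) // 451
    month, day = divmod(h + l - 7 * m + 114, 31)
    return date(year, month, day + 1)
```

Because both functions depend only on the year, occasions of this kind can be enumerated in advance for any time period, as the text notes.
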
  • the occasions 142 celebrated by, observed by, or relevant to a particular person depend on a number of factors, including geographical, religious, and political factors.
  • the user can indicate to the personal calendar 146 which occasions 142 are relevant for him or her via user input 150 .
  • the user can select relevant holidays from a list, or select between sets of holidays (e.g. Canadian holidays vs. American holidays), or indicate the geographic, religious, and political factors and have the computer guess at the set of occasions relevant to the user, which can then be refined through further user input 150 . While holidays such as Easter, Christmas, Hanukkah and the like are widely observed, most people also celebrate personal occasions 142 such as birthdays, wedding anniversaries, etc. These birthday and wedding anniversaries can be for the user's family and close friends.
  • the user indicates these personal occasions 142 via user input 150 to the personal calendar 146 .
  • the user can indicate special events by user inspection (i.e. selecting from lists of occasions presented to the user via the display device 50 as shown in FIG. 1 ) or automatically or both.
  • a second type of special event is an appointment 140 .
  • Appointment 140 is a special event describing something planned to take place in the future that is relevant for the user or users of the personal calendar 146 .
  • appointments 140 are not names placed on an entire day (as is the case with occasions).
  • an appointment 140 can be “doctor appointment at 2” or “Matthew's cub scout meeting at 6 PM” or also “meet Sarah at the zoo at 4 PM”.
  • the appointment 140 can span several days or weeks, such as “Camping vacation Aug. 9-14, 2004”.
  • the third type of special event is a journal entry 144 .
  • the journal entry 144 is an entry in the personal calendar 146 describing events that have occurred in the past (as of the time of entry of the journal entry 144 ).
  • the special event on Nov. 24, 2004 “Jonah first crawled today” is a journal entry because it describes events that already took place at the time of entry.
  • Some journal entries 144 are quite lengthy, such as a diary where a user enters a few paragraphs describing the events and reflections of the day. These diary entries are journal entries 144 because they are associated with a particular calendar time, and are related to events associated with that calendar time.
  • a modern form of the diary is a blog (or weblog) that allows users to keep an on-line diary on the Internet (e.g. www.blogger.com.)
  • Each special event has an associated calendar time.
  • the calendar time associated with an occasion 142 is a single day.
  • the calendar time associated with an appointment 140 can be a precise moment in time or a range of time.
  • the range of time can be any length; seconds, minutes, days or weeks long for example.
  • the calendar time associated with an appointment 140 can have only a starting time indicated (e.g. “doctor appointment at 2”).
  • the appointment 140 ending time is estimated (e.g. to be 2 hours long).
  • the journal entry's associated calendar time can be a precise moment in time or a range of time.
  • the calendar time associated with a special event can be in the form of a probability distribution function e.g. a normal distribution centered on Dec. 9, 2000 at 10:00 AM with standard deviation of 2 days.
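A minimal sketch of how a special event and its associated calendar time might be represented; the class name, field names, and the two-hour default for an appointment with only a starting time are illustrative assumptions based on the text:

```python
from dataclasses import dataclass
from datetime import datetime, timedelta
from typing import Optional

@dataclass
class SpecialEvent:
    """One entry in the personal calendar 146 (all names illustrative)."""
    name: str
    kind: str                       # "occasion", "appointment", or "journal"
    start: datetime                 # beginning of the associated calendar time
    end: Optional[datetime] = None  # None when only a starting time is known

    def calendar_span(self, default=timedelta(hours=2)):
        """Calendar time as a (start, end) range; an appointment with
        only a starting time gets an estimated ending time, per the text."""
        return self.start, self.end or self.start + default

# "doctor appointment at 2" has only a starting time.
appt = SpecialEvent("doctor appointment", "appointment",
                    datetime(2005, 8, 2, 14, 0))
```
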
  • the calendar label annotator 114 compares the image capture time 112 with the special event times from the personal calendar 146 . Special event times will be described further herein below.
  • the calendar label annotator 114 can reside on a camera, a personal computer, a server, a cellular telephone, a PDA, or the like.
  • the calendar label annotator 114 determines digital images and videos of interest from the collection 104 by comparing the associated image capture times 112 with the special event times associated with special events. Each digital image or video of interest is then associated with its corresponding special event, preferably by producing an annotation 118 stored in a database 120 indicating that the digital image or video 104 is associated with the special event.
  • FIG. 3 shows a more detailed view of the calendar label annotator 114 .
  • the event processor 152 processes the special events from the personal calendar 146 .
  • the special events from the personal calendar 146 each have an associated calendar time, as previously described.
  • the special events can have other information from user input 150 .
  • each special event can be labeled by the user to indicate whether or not it is an “imaging event”.
  • An imaging event is a special event that might correspond to digital images or videos in the user's digital image and video collection 104 . For example, the appointment “Vacation to the Beach” from Aug. 12-19, 2005 probably is an imaging event, while the appointment “dentist” at 4:00 Aug. 2, 2005 probably is not.
  • the event processor 152 determines a special event time for each special event.
  • the special event time is related to the calendar time associated with the special event, but generally has a broader time span to account for the fact that images related to a special event need not occur during the calendar time associated with the special event.
  • the special event time is necessary because it is common for images and videos related to a special event to be captured slightly before or after the calendar time associated with the special event. For example, a child's birthday occasion can be June 26, but the birthday party can be held on June 23. As another example, it is common that many images relate to Christmas even though captured before December 25. Images of setting up a Christmas tree, singing Christmas carols, or visiting St. Nicholas at the mall can occur days or weeks prior to the calendar time (Dec. 25) associated with the occasion.
  • the event processor 152 sets a special event time for each special event using user input 150 or a rule-based system.
  • the special event time can be a period of time or represented as a probability distribution function.
  • the user can indicate that the special event time associated with Christmas is the period of time beginning on Dec. 13 and ending on Dec. 26.
  • the user can also indicate that the special event time for all birthdays is the period of time from 3 days prior to 3 days following the calendar time of the birthday.
  • FIG. 4 shows an illustrative plot of the calendar time and the special event time associated with Christmas 2004.
  • the calendar time is shown as a hatched area and the special event time is shown with a line.
  • the special event time is computed as a time period beginning at time w−a and ending at time q+b, where w is the beginning of the calendar time associated with the special event (or the 5th percentile when the calendar time is a probability distribution function p(x)), and q is the end of the calendar time associated with the special event (or the 95th percentile when the calendar time is a probability distribution function p(x)).
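A sketch of this computation, assuming example paddings a and b; for the probability-distribution case, the 5th and 95th percentiles of the example normal distribution given earlier (centered on Dec. 9, 2000 at 10:00 AM, standard deviation 2 days) serve as w and q:

```python
from datetime import datetime, timedelta
from statistics import NormalDist

def special_event_time(w, q,
                       a=timedelta(days=3),
                       b=timedelta(days=1)):
    """Expand the calendar time [w, q] into the special event time
    [w - a, q + b]; the paddings a and b here are example values."""
    return w - a, q + b

# Calendar time given as a probability distribution function: take the
# 5th and 95th percentiles as w and q.
center = datetime(2000, 12, 9, 10, 0)
sigma_seconds = 2 * 24 * 3600                 # 2-day standard deviation
nd = NormalDist(mu=0.0, sigma=sigma_seconds)
w = center + timedelta(seconds=nd.inv_cdf(0.05))
q = center + timedelta(seconds=nd.inv_cdf(0.95))
start, end = special_event_time(w, q)
```
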
  • the special event time associated with a special event is calculated from the calendar time associated with the special event.
  • the collection of special event times is a list of special event times.
  • the event processor 152 also determines the likelihood that special events from the personal calendar 146 are imaging events as previously mentioned. This determination can be based on user input 150 where the user indicates whether an event is an imaging event or not. Furthermore, in the absence of user input 150 , the event processor 152 can automatically determine the likelihood that a special event is an imaging event. This is accomplished by collecting the image collections and personal calendars from a large number of people and manually determining the relationships between the images and videos in the collections 104 and the special events in the calendars. The likelihood that special events are imaging events can then be learned using well-known machine learning techniques. For example, it is extremely rare that images in a collection 104 correspond to appointments for medical treatment such as doctor and dentist visits.
  • the event processor 152 assigns a likelihood to each special event indicating the likelihood that the special event is an imaging event.
  • the likelihood can be a probability or can be a binary flag. This flag is used to improve the accuracy of the annotation provided by the present invention. For example, images that happen by coincidence to have an image capture time 112 also corresponding to a special event time would normally be incorrectly annotated with the special event name. For example, suppose a doctor appointment ended earlier than expected and the person then walked to a park and photographed ducks. The images of ducks would erroneously be associated with the annotation “Doctor Appointment” were it not for the event processor 152 determining that the doctor appointment is not an imaging event.
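A keyword-rule sketch of this determination as a binary flag; the keyword lists are illustrative stand-ins for what the text says would be learned from many users' calendars and collections:

```python
# Keywords suggesting a special event is, or is not, an "imaging event".
# These lists are assumptions for the sketch, not learned values.
NON_IMAGING = ("doctor", "dentist", "conference call")
IMAGING = ("vacation", "birthday", "wedding", "party", "trip")

def is_imaging_event(event_name):
    """Return True when the special event is likely to correspond to
    images or videos in the collection."""
    name = event_name.lower()
    if any(k in name for k in NON_IMAGING):
        return False
    if any(k in name for k in IMAGING):
        return True
    # Default: assume images may exist; the user can refine the flag.
    return True
```
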
  • the event processor 152 outputs a special event list 154 .
  • the special event list 154 is a list of special events including the name of the special event, the associated calendar time, and the associated special event time.
  • the special event list 154 also indicates the likelihood that each special event is an imaging event as previously mentioned. Additional information such as the people present at the special event, the location of the special event (e.g. my house, the park, Erie, Pa., etc.) can also be included for special events in the list.
  • the time comparer 156 compares the associated capture times for the digital images and videos in the collection 104 with a special event time or time period to determine digital images and videos of interest 158 .
  • Digital images and videos of interest 158 are those images and videos from the digital image and video collection having an image capture time 112 coincident with the special event time associated with a special event. For example, an image captured on Dec. 24, 2004 is an image of interest 158 associated with the special event occasion Christmas when the special event time associated with the occasion Christmas is as shown in FIG. 4 . The image would then receive the annotation 118 “Christmas” that would be stored in association with the image in a database 120 .
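A minimal sketch of the time comparer 156, assuming each special event carries a start/end time period and each image a single capture time (the field names and example data are assumptions for illustration):

```python
from datetime import datetime

def images_of_interest(images, events):
    """Pair each image with every special event whose time period
    contains the image's capture time."""
    matches = []
    for img in images:
        for ev in events:
            if ev["start"] <= img["capture_time"] <= ev["end"]:
                matches.append((img["name"], ev["name"]))
    return matches

# Illustrative data: an image captured on Dec. 24, 2004 falls inside an
# assumed special event time period for the occasion Christmas.
images = [{"name": "tree.jpg", "capture_time": datetime(2004, 12, 24, 15, 0)}]
events = [{"name": "Christmas",
           "start": datetime(2004, 12, 23, 0, 0),
           "end": datetime(2004, 12, 26, 23, 59)}]
```

Each matched pair would then yield an annotation 118 stored in association with the image in the database 120.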
  • the database 120 is searched by well-known techniques and the aforementioned image with image capture time 112 of Dec. 24, 2004 will be among the query results 124 .
  • the query results can then be shown on the display device 50 such as shown in FIG. 1 .
  • An annotation 118 associated with a digital image or video of interest 158 is special event identifier information and can contain multiple fields, such as: the name of the special event; the associated calendar time and the associated special event time; the number and identities of people present at the special event (e.g. the guest list for a birthday party); and the location of the special event (e.g. my house, the park, Erie, Pa., etc.).
  • the annotation 118 is information related to the special event. Note that the annotations 118 made automatically by comparing the image capture time to the special event time can be marked in the database 120 as being automatically generated. This allows a user to provide a query 122 and search the database 120 for matches considering just manually entered annotations 118 , just automatically generated annotations 118 , or combinations thereof as desired by the user.
  • Not all of the images and videos in the collection 104 will have capture times 112 that correspond to special event times associated with events in the special event list 154 .
  • not all of the digital images and videos from the collection 104 will be images and videos of interest 158 . These images can be shown to the user on the display device 50 such as shown in FIG. 1 for manual labeling.
  • the time comparer 156 can output a confidence score associated with each image of interest 158 that depends on both the image capture time 112 of the image of interest and the special event time of the special event.
  • the confidence score is the value of the p(x) evaluated where x is the image capture time 112 (or mean image capture time when the image capture time is a probability distribution function).
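One plausible form for p(x) is a Gaussian centered on the special event time; the disclosure leaves the shape of the distribution open, and the width parameter below is an assumption:

```python
import math
from datetime import datetime

def confidence_score(capture_time, event_time, sigma_hours=12.0):
    # Evaluate an assumed Gaussian probability density p(x) at the image
    # capture time, so images captured nearer the event time score higher.
    dt = abs((capture_time - event_time).total_seconds()) / 3600.0
    return (math.exp(-0.5 * (dt / sigma_hours) ** 2)
            / (sigma_hours * math.sqrt(2.0 * math.pi)))
```

When the image capture time is itself a probability distribution, its mean would be passed as `capture_time`, per the description above.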
  • An alternative embodiment is illustrated in FIG. 5 .
  • the digital image and video collection 104 is input to an event clusterer 162 for clustering the digital images and videos 104 into event clusters 164 which are mutually exclusive smaller sets of digital images and videos 104 according to image content and image capture times 112 .
  • Each set of digital images and videos 104 corresponds to a chronologically related segmentation of the digital image and video collection 104 .
  • each event cluster 164 contains all images from the digital image and video collection 104 captured after the earliest image or video in the event cluster and before the latest image in the event cluster 164 .
  • Commonly assigned U.S. Pat. No. 6,606,411 to Loui and Pavie describes the production of event clusters 164 from a set of digital images and videos 104 in detail.
  • the capture time extractor 108 then inputs each event cluster 164 and determines the image capture times 112 associated with each event cluster 164 .
  • the image capture time 112 for a particular event cluster 164 is a time period or time range spanning the time between the earliest and the latest image in the event cluster 164 .
  • the image capture times 112 and the pre-determined groups of images and videos called event clusters 164 are input to the calendar label annotator 114 along with the personal calendar 146 for producing annotations 118 that are associated with the event clusters 164 in a database 120 .
  • This alternative embodiment illustrates the utility of the present invention for annotating groups of images and videos in addition to the aforementioned utility of annotating single images and videos from a collection of digital images and videos 104 .
  • the comparing step performed by the time comparer 156 of FIG. 3 includes associating pre-determined groups of images with a special event by comparing the image capture times 112 associated with groups of images with the special event times.
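The group-level comparison can be sketched as computing a cluster's capture-time span and testing it for overlap with a special event's time period. The field names are assumptions, not from the disclosure:

```python
from datetime import datetime

def cluster_time_range(cluster):
    # The image capture time for an event cluster is the time range
    # spanning the earliest to the latest capture time in the cluster.
    times = [img["capture_time"] for img in cluster]
    return min(times), max(times)

def cluster_matches_event(cluster, event):
    # A cluster is associated with a special event when the cluster's
    # time range overlaps the event's time period.
    start, end = cluster_time_range(cluster)
    return start <= event["end"] and event["start"] <= end

# Illustrative cluster and event data.
cluster = [{"capture_time": datetime(2005, 4, 6, 10, 0)},
           {"capture_time": datetime(2005, 4, 6, 15, 0)}]
event = {"name": "Zoo Trip",
         "start": datetime(2005, 4, 6, 0, 0),
         "end": datetime(2005, 4, 6, 23, 59)}
```

A matching cluster would then receive the special event's annotation as a group, rather than image by image.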
  • A further alternative embodiment is illustrated in FIG. 6 .
  • the database 120 containing annotations 118 from the calendar label annotator 114 is queried 122 by a user to find images of interest.
  • the user can search for images of interest by entering the query 122 of “mom”.
  • the query 122 is input to a keyword expander 170 which expands the terms in the query 122 by adding related words.
  • the keyword expansion can be performed using techniques from the field of natural language processing.
  • the database WordNet, maintained by Princeton University and available on-line at http://wordnet.princeton.edu/, can be used to determine the word sense, part of speech, synonyms, hyponyms, hypernyms, etc.
  • the keyword expander 170 then outputs an expanded query 172 .
  • the expanded query 172 is “mom”, “mother”, and “parent”.
  • Each of the additional query words added by the keyword expander 170 has an associated keyword score based on the strength of the relationship between the additional query word and the original query 122 .
  • “mother” has a keyword score of 1.0 because it is a synonym for “mom” and “parent” has a keyword score of 0.4 because it has a related but not equivalent definition.
  • the expanded query 172 is used to search the database 120 for images and videos having annotations related with the expanded query 172 terms.
  • images and videos with the associated annotation “Mother's Day” would be detected and returned as the query results 124 for displaying to the user on the display device 50 such as shown in FIG. 1 .
  • Query results 124 can be sorted according to a relevance score, the keyword score, the capture time of the image or video, alphabetically according to the name of the image or video, or the like. A user can inspect the results and use manual tools to refine mistakes made by the automatic retrieval of the images and videos.
  • the query 122 for “animals” is processed by the keyword expander 170 to yield the expanded query “creatures”, “zoo”, and “farm”.
  • the database 120 is searched for relevant images and videos, images labeled “Zoo Trip Apr. 6, 2005” are found and returned as the query results 124 .
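The keyword expander 170 can be sketched with a small lookup table standing in for WordNet queries; the terms and keyword scores below mirror the "mom" and "animals" examples above but are otherwise assumed:

```python
# Expansion terms with keyword scores: 1.0 for synonyms, lower values
# for related but not equivalent words such as hypernyms.
EXPANSIONS = {
    "mom": [("mother", 1.0), ("parent", 0.4)],
    "animals": [("creatures", 1.0), ("zoo", 0.4), ("farm", 0.4)],
}

def expand_query(query):
    # The original query term always carries a keyword score of 1.0;
    # unknown terms are returned unexpanded.
    terms = [(query, 1.0)]
    terms.extend(EXPANSIONS.get(query.lower(), []))
    return terms
```

The expanded terms would then be matched against the annotations 118 in the database 120, and the keyword scores could contribute to sorting the query results 124.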
  • Annotations 118 stored in the database 120 are based on the special event names from the personal calendar 146 . Additional annotations can be generated from the special event names and stored in the database. Preferably, additional annotations are hypernyms of the special event names. For example, if the special event name is “dog show” then additional annotations could be “animal”, “creature” and “living thing”.
  • A further alternative embodiment is illustrated by FIG. 7 .
  • the digital image and video collection 104 is input to a person finder 106 for automatic detection of people in the images and videos.
  • the calendar label annotator 114 inputs personal features 110 , one set per person detected by the person finder 106 , and also inputs appearance models 116 1-N of N different persons of interest.
  • the calendar label annotator 114 also inputs image capture times 112 and special events from the personal calendar 146 .
  • the special events can contain the names (identities) of people present at a special event.
  • the calendar label annotator 114 then recognizes persons of interest from the persons found by the person finder 106 using the appearance models 116 , the image capture time 112 and the special events from the personal calendar 146 .
  • A further embodiment is shown in FIG. 8 .
  • the list of special events 154 is presented to the user via the display device 50 . The user then selects a subset of the displayed special events, thereby defining a list of special events of interest 180 .
  • the list of special events 154 can be displayed as a text list of the titles of the special events and the associated calendar times. Alternatively, the list of special events can be displayed as a calendar with the positions of the titles of the special events indicative of the calendar times associated with the special events. This type of calendar display is commonly known and practiced for calendar applications.
  • the title of a particular special event can be replaced by or augmented by an image or video of interest 158 associated with that particular special event.
  • the list of special events of interest 180 is then passed to the time comparer 156 for finding images and videos of interest 158 associated with the special events of interest 180 .
  • This interface allows the user to retrieve images of interest for annotation from only the special events of interest 180 rather than all of the special events 154 .
  • A still further embodiment is shown in FIG. 9 .
  • the personal calendar 146 is selected from among a set of personal calendars 242 by the calendar selector 244 .
  • the calendar selector 244 selects the personal calendar 146 for passing to the calendar label annotator 114 that corresponds to the identity of a photographer 232 .
  • the images and videos of the collection 104 are also analyzed by a photographer determiner 238 to determine the identity of the particular photographer for each image and video.
  • the identity of the photographer 232 can be stored in the “Camera Owner”, “Image Creator”, “Photographer”, or “Copyright” tags for example.
  • the identity of the photographer of an image or video can be entered manually before, during, or after capturing the image or video.
  • several cameras, e.g. as described in U.S. Pat. Application Publication 20020080256A1, contain means for extracting biometric information from the photographer 232 , identifying the photographer 232 , and then annotating the image with the identity of the photographer 232 .
  • the photographer determiner 238 discovers the identity of the photographer 232 and passes that information to the calendar selector 244 .
  • the photographer 232 sometimes cannot be identified by the photographer determiner 238 ; in that case the photographer 232 is “unknown”. For example, this situation can occur when a person who owns the camera is on vacation and asks a stranger to use her (the vacationing camera owner's) camera to capture an image of her in front of a landmark.
  • a camera such as described in U.S. Pat. Application Publication 20020080256A1 can only feasibly identify the photographer 232 from a small set of potential camera users (e.g. the primary user is probably the camera owner, and secondary users are friends and family of the camera owner) whose profiles are known by the camera. In this case, an image captured by a stranger using the camera would simply be identified by the photographer determiner 238 as having an “unknown” photographer 232 .
  • each member of a family can have a personal calendar 146 .
  • the selection of the personal calendar 146 from a set of personal calendars can be based on the identity of the photographer 232 of each image or video.
  • an image captured by Jim is input to the calendar label annotator 114 for comparing the image capture time with the special event times associated with Jim's calendar.
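The calendar selector 244 reduces to a lookup keyed on the photographer identity, with a fallback for an "unknown" photographer. The names, calendar contents, and empty-calendar fallback below are assumptions:

```python
# One personal calendar per family member; an empty calendar is
# returned when the photographer cannot be identified.
CALENDARS = {
    "Jim": [{"name": "Jim's soccer practice"}],
    "Ann": [{"name": "Ann's book club"}],
}

def select_calendar(photographer):
    # Select the personal calendar matching the photographer identity
    # reported by the photographer determiner.
    return CALENDARS.get(photographer, [])
```

The selected calendar is what would be passed to the calendar label annotator 114 for comparison against the image capture times.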
  • FIG. 10 is a block diagram of a digital camera phone 300 based imaging system that can implement the present invention.
  • the digital camera phone 300 is one type of image capture device 10 .
  • the digital camera phone 300 is a portable battery operated device, small enough to be easily handheld by a user when capturing and reviewing images.
  • the digital camera phone 300 produces digital images that are stored using the image/data memory 330 , which can be, for example, internal Flash EPROM memory, or a removable memory card.
  • Other types of digital image storage media such as magnetic hard drives, magnetic tape, or optical disks, can alternatively be used to provide the image/data memory 330 .
  • the digital camera phone 300 includes a lens 304 which focuses light from a scene (not shown) onto an image sensor array 314 of a CMOS image sensor 310 .
  • the image sensor array 314 can provide color image information using the well-known Bayer color filter pattern.
  • the image sensor array 314 is controlled by timing generator 312 , which also controls a flash 302 in order to illuminate the scene when the ambient illumination is low.
  • the image sensor array 314 can have, for example, 1280 columns × 960 rows of pixels.
  • the digital camera phone 300 can also store video clips, by summing multiple pixels of the image sensor array 314 together (e.g. summing pixels of the same color within each 4 column × 4 row area of the image sensor array 314 ) to create a lower resolution video image frame.
  • the video image frames are read from the image sensor array 314 at regular intervals, for example using a 15 frame per second readout rate.
  • the analog output signals from the image sensor array 314 are amplified and converted to digital data by the analog-to-digital (A/D) converter circuit 316 on the CMOS image sensor 310 .
  • the digital data is stored in a DRAM buffer memory 318 and subsequently processed by a digital processor 320 controlled by the firmware stored in firmware memory 328 , which can be flash EPROM memory.
  • the digital processor 320 includes a real-time clock 324 , which keeps the date and time even when the digital camera phone 300 and digital processor 320 are in their low power state.
  • the processed digital image files are stored in the image/data memory 330 .
  • the image/data memory 330 can also be used to store the user's personal calendar information, as will be described later in reference to FIG. 11 .
  • the image/data memory can also store other types of data, such as phone numbers, to-do lists, and the like.
  • the digital processor 320 performs color interpolation followed by color and tone correction, in order to produce rendered sRGB image data.
  • the digital processor 320 can also provide various image sizes selected by the user.
  • the rendered sRGB image data is then JPEG compressed and stored as a JPEG image file in the image/data memory 330 .
  • the JPEG file uses the so-called “Exif” image format described earlier. This format includes an Exif application segment that stores particular image metadata using various TIFF tags. Separate TIFF tags can be used, for example, to store the date and time the picture was captured, the lens f/number and other camera settings, and to store image captions.
  • the ImageDescription tag can be used to store labels, as will be described later in reference to FIG. 11 .
  • the real-time clock 324 provides a date/time value, which is stored as date/time metadata in each Exif image file.
  • the digital processor 320 also creates a low-resolution “thumbnail” size image, which can be created as described in commonly-assigned U.S. Pat. No. 5,164,831, entitled “Electronic Still Camera Providing Multi-Format Storage Of Full And Reduced Resolution Images” to Kuchta, et al., the disclosure of which is herein incorporated by reference.
  • the thumbnail image can be stored in RAM memory 322 and supplied to a color display 332 , which can be, for example, an active matrix LCD or organic light emitting diode (OLED). After images are captured, they can be quickly reviewed on the color LCD image display 332 by using the thumbnail image data.
  • the graphical user interface displayed on the color display 332 is controlled by user controls 334 .
  • the user controls 334 can include dedicated push buttons (e.g. a telephone keypad) to dial a phone number, a control to set the mode (e.g. “phone” mode, “calendar” mode, “camera” mode), a joystick controller that includes 4-way control (up, down, left, right) and a push-button center “OK” switch, or the like.
  • An audio codec 340 connected to the digital processor 320 receives an audio signal from a microphone 342 and provides an audio signal to a speaker 344 .
  • These components can be used both for telephone conversations and to record and playback an audio track, along with a video sequence or still image.
  • the speaker 344 can also be used to inform the user of an incoming phone call. This can be done using a standard ring tone stored in firmware memory 328 , or by using a custom ring-tone downloaded from the mobile phone network 358 and stored in the image/data memory 330 .
  • a vibration device (not shown) can be used to provide a silent (e.g. non-audible) notification of an incoming phone call.
  • a dock interface 362 can be used to connect the digital camera phone 300 to a dock/charger 364 , which is connected to the general control computer 40 .
  • the dock interface 362 may conform to, for example, the well-known USB interface specification.
  • the interface between the digital camera phone 300 and the image capture device 10 can be a wireless interface, such as the well-known Bluetooth wireless interface or the well-known 802.11b wireless interface.
  • the dock interface 362 can be used to download images from the image/data memory 330 to the general control computer 40 .
  • the dock interface 362 can also be used to transfer calendar information from the general control computer 40 to the image/data memory in the digital camera phone 300 .
  • the dock/charger 364 can also be used to recharge the batteries (not shown) in the digital camera phone 300 .
  • the digital processor 320 is coupled to a wireless modem 350 , which enables the digital camera phone 300 to transmit and receive information via an RF channel 352 .
  • the wireless modem 350 communicates over a radio frequency (e.g. wireless) link with a mobile phone network 358 , such as a 3GSM network.
  • the mobile phone network 358 communicates with a photo service provider 372 , which can store digital images uploaded from the digital camera phone 300 . These images can be accessed via the Internet 370 by other devices, including the general control computer 40 .
  • the mobile phone network 358 also connects to a standard telephone network (not shown) in order to provide normal telephone service.
  • FIG. 11 is a flow chart of yet another embodiment of the present invention.
  • This embodiment can use the digital camera phone 300 based imaging system described earlier in reference to FIG. 10 .
  • the calendar information is stored in the digital camera phone 300 and accessed in order to provide appropriate labels for digital images as they are captured.
  • special events are recorded in the personal calendar 146 . This can be done using general control computer 40 . As described earlier in reference to FIG. 2 , the special events recorded in the personal calendar 146 can include appointments 140 and occasions 142 . These special events are labels for blocks of time that are personalized to the user of the digital camera phone 300 .
  • the personal calendar 146 is transferred to the digital camera phone 300 .
  • the transfer can be accomplished by using the dock/recharger 364 and dock interface 362 to transfer the calendar data from a hard drive (not shown) of the general control computer 40 to the digital processor 320 in the digital camera phone 300 .
  • the digital processor 320 then stores the personal calendar 146 in the image/data memory 330 .
  • the digital camera phone 300 includes user controls 334 that enable the user to select various operating modes.
  • the digital camera phone 300 operates as a standard mobile phone.
  • the digital camera phone 300 displays calendar information on the color display 332 . This enables the user to review and modify the appointments 140 or occasions 142 stored as part of the personal calendar 146 in the image/data memory 330 .
  • the digital camera phone 300 operates as a still or video camera, in order to capture, display, and transfer images.
  • the user selects the camera mode.
  • the user composes the image(s) to be captured, using the color display 332 as a viewfinder, and presses one of the user controls 334 (e.g. a shutter button, not shown) to capture the image(s).
  • the image signals provided by the image sensor array 314 are converted to digital signals by A/D converter circuit 316 and stored in DRAM buffer memory 318 .
  • the digital processor 320 reads the value of the real time clock 324 after each image is captured, to determine the date and time the picture was taken. In block 412 , the digital processor 320 retrieves the calendar entry for the date/time provided by the real time clock 324 .
  • the digital processor 320 determines if the calendar entry for the current date/time corresponds to a special event time, as was described earlier in reference to FIG. 3 .
  • the digital processor 320 uses the calendar entry to create proposed image metadata. For example, if the calendar entry is “Matthew's Soccer game”, the proposed image metadata could be “Event: Soccer game, Subject: Matthew”.
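One heuristic for turning a possessive calendar entry into proposed metadata fields, matching the "Matthew's Soccer game" example; the parsing rule is an assumption, since the disclosure does not specify how the Event and Subject fields are derived:

```python
def propose_metadata(calendar_entry):
    # Split an entry of the form "<Subject>'s <Event>" into separate
    # Event and Subject fields; entries without a possessive are
    # proposed as the event name alone.
    if "'s " in calendar_entry:
        subject, event = calendar_entry.split("'s ", 1)
        return {"Event": event.capitalize(), "Subject": subject}
    return {"Event": calendar_entry}
```

The proposed fields would then be shown on the color display 332 for the user to approve, modify, or reject.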
  • the proposed image metadata is displayed on the color display 332 along with the image(s) captured in block 406 .
  • This enables the user to see the proposed image metadata, and check whether or not it is an appropriate label for the captured image(s).
  • the processor 320 displays a request for the user to approve the proposed metadata. This can be done, for example, by displaying the text “OK?” along with “yes” and “no” selectable responses, on the color display 332 .
  • the user selects either the “yes” or “no” response, using the user controls 334 . If the metadata is not appropriate, the user selects the “no” response. This can happen if the special event did not take place, for example if the soccer game was cancelled, so that the images correspond to a different type of event.
  • the digital processor 320 displays a user interface screen on the color display 332 which enables the user to modify the metadata. This can be done by simply deleting the metadata, or by selecting alternate metadata.
  • the alternate metadata can be selected from a list of frequently used labels (e.g. Science museum, playground) or can be manually entered text strings.
  • the approved (or modified) metadata can be automatically stored in the image files of all subsequent images taken of the same photo event (for example for all images taken until the camera is turned off), without repeating blocks 416 - 422 .
  • the digital processor 320 stores the metadata in the image file(s) of the captured image(s).
  • the metadata can be stored in the ImageDescription tag of the Exif file which contains the captured still image.
  • the image files are transferred to the database 120 .
  • This can be done, for example, by using the wireless modem 350 to transmit the image files over the mobile phone network 358 to the photo service provider 372 .
  • the photo service provider 372 can then store the image files, and enable them to be accessed by various computers, including general control computer 40 , over the Internet 370 .
  • the image files can also be accessed by the digital camera phone 300 , using the mobile phone network 358 .
  • the image files can be transferred to the general control computer 40 using the dock interface 362 and dock/recharger 364 .
  • the metadata in the image file such as the Date/Time metadata and the special event labels stored using the ImageDescription tag, can also be read from each image file and stored in a separate metadata database along with the image name, to enable more rapid searching.
  • the metadata of the database 120 is searched to locate images of interest. This can be accomplished by entering the query 122 , as described earlier in reference to FIG. 6 .
  • the images having metadata which best match the query 122 are displayed. If the images are stored in the general control computer 40 , they can be displayed on the display device 50 . Alternatively, if the images are stored by the photo service provider 372 , they can be transferred to the digital camera phone 300 using the mobile phone network 358 and displayed on the color display 332 .
  • the user can modify the metadata associated with particular photo events, in order to correct or augment the metadata labels.
  • the modified metadata labels are then stored in the database 120 .
  • FIG. 12 is a flow chart of a still further embodiment of the present invention.
  • different special events are associated with different lists of individuals (such as family, friends, teammates, work associates, etc.) whom the user would like to allow to access the images of that type of special event taken by the user.
  • the user may want to allow all of Sarah's relatives to view images of Sarah's birthday, and may want to allow all of Matthew's team mates to view images of Matthew's soccer game.
  • a personal calendar 146 is made accessible over a network. This can be done by enabling the photo service provider 372 to access the personal calendar 146 via the Internet 370 .
  • share lists for at least some of the special events recorded in the personal calendar 146 are stored. This can be done, for example, by enabling the user to select, from the user's share list already provided by the photo service provider 372 , those users that will be allowed to access images associated with different types of special events. For example, there might be a first list of family and friends that are allowed to access holiday and birthday images, a second list of teammates that are allowed to access images of team events, and a third list of work colleagues that are allowed to access images of work associated events.
  • an event-specific share list can be created automatically by adding all of the participants of a particular event recorded in a personal calendar (e.g. all of the people invited to a party or a meeting) to the share list for that special event.
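The automatic creation of an event-specific share list from the participants recorded in a personal calendar can be sketched as follows; the field names and example data are assumptions:

```python
def build_share_lists(calendar_events):
    # Every participant recorded for a special event (e.g. everyone
    # invited to a party or meeting) is added to that event's share
    # list; duplicates are removed and the list is sorted.
    return {ev["name"]: sorted(set(ev.get("participants", [])))
            for ev in calendar_events}

# Illustrative calendar entry with an invitee list.
party = {"name": "Sarah's birthday",
         "participants": ["Grandma", "Aunt Jo", "Grandma"]}
```

The resulting per-event lists would then be stored by the photo service provider 372 and proposed to the user when matching images are uploaded.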
  • the user selects the camera mode and in block 406 , the user composes the image(s) to be captured, as was described earlier in reference to FIG. 11 .
  • the digital processor 320 reads the value of the real time clock 324 after each image is captured, to determine the date and time the picture was taken.
  • the date/time is stored as metadata in association with the captured image(s), for example in the Date/Time TIFF tag of the Exif image file.
  • the digital processor 320 in the digital camera phone 300 initiates the transfer of the captured image(s) to the database. This can be done automatically after each image is captured or after a certain number of images are captured, or can be manually initiated by the user. Once initiated, the wireless modem 350 begins to transmit the image file(s) over the mobile phone network 358 to the photo service provider 372 .
  • the photo service provider 372 retrieves the calendar entry for the date/time stored in the transferred image file.
  • the service provider 372 determines if the calendar entry for the current date/time corresponds to a special event time, as was described earlier in reference to FIG. 11 .
  • the service provider 372 uses the photo event to create image metadata and a proposed “share list”, which is one of the share lists stored in block 403 .
  • For example, if the calendar entry is “Matthew's Soccer game”, the image metadata could be “Event: Soccer game, Subject: Matthew”, and the proposed share list could be the list of all of Matthew's teammates.
  • the share list is communicated to the digital camera phone 300 via the wireless modem 350 , and is displayed on the color display 332 along with the image(s) captured in block 406 .
  • This enables the user to decide whether or not to share the captured images with the share list associated with the identified special event.
  • the processor 320 displays a request for the user to approve the proposed share list. This can be done, for example, by displaying the text “OK?” along with “yes” and “no” selectable responses, on the color display 332 .
  • the user selects either the “yes” or “no” response, using the user controls 334 . If the share list is not appropriate, the user selects the “no” response. This can happen if the special event did not take place, for example if the soccer game was cancelled, so that the images correspond to a different type of event.
  • the digital processor 320 displays a user interface screen on the color display 332 which enables the user to modify the share list. This can be done by simply not allowing the images to be shared with anyone else, or by selecting an alternative share list.
  • the approved (or modified) share list can be automatically used for all subsequent images taken of the same photo event (for example for all images taken until the camera is turned off), without repeating blocks 414 - 423 .
  • the service provider 372 enables the uploaded images associated with the special event to be shared with those users on the share list.
  • the images can be shared using methods well-known in the prior art, for example by sending an email to each individual on the share list which contains a link to enable the images to be viewed using a web browser.
  • the individuals on any share list can also be given other types of authorization, for example to order prints of the transferred images, as described in commonly assigned U.S. Pat. No. 5,760,917 to Sheridan, the disclosure of which is herein incorporated by reference.
  • the metadata of the database 120 is searched to locate images of interest.
  • the search can be performed either by the user of the digital camera phone 300 , who has access to all of the images, or by individuals on one of the user's share lists, who have access to only those images associated with particular special events. This search can be accomplished by entering the query 122 , as described earlier in reference to FIG. 6 .
  • the images having metadata which best match the query 122 are displayed.
  • the images can be displayed on the display device 50 of the general control computer 40 or they can be transferred to the digital camera phone 300 using the mobile phone network 358 and displayed on the color display 332 .
  • FIG. 13 is a flow chart of an additional embodiment of the present invention.
  • the user can designate particular special events, such as important meetings, concerts, weddings, etc., as silent events during which the phone does not ring.
  • ring tones can be associated with other special events, such as a short excerpt from “Stars and Stripes Forever” for the 4th of July holiday, or “Happy birthday” for a birthday anniversary.
  • the personal calendar is checked to determine if the current date/time corresponds to a silent event. If it does, the vibration device is used to indicate to the user that there is an incoming call, instead of using a ring tone. If not, the personal calendar is checked to determine if a special ring tone should be used, instead of the default ring tone, to indicate the incoming call.
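The decision flow just described can be sketched as a small function; the event names and tone assignments below are assumptions for illustration:

```python
# Assumed silent events and event-specific ring tone assignments,
# standing in for the entries recorded in the personal calendar.
SILENT_EVENTS = {"staff meeting", "wedding"}
RING_TONES = {"birthday": "Happy Birthday"}

def choose_alert(current_event, silent_events=SILENT_EVENTS,
                 ring_tones=RING_TONES, default_tone="default ring tone"):
    # Silent events suppress the ring tone entirely in favor of the
    # vibration device; otherwise an event-specific ring tone is
    # preferred over the default ring tone.
    if current_event in silent_events:
        return "vibrate"
    return ring_tones.get(current_event, default_tone)
```

The same lookup order (silent event first, then special ring tone, then default) matches the check sequence described above.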
  • special events are recorded in the personal calendar 146 . This can be done using general control computer 40 . As described earlier in reference to FIG. 2 , the special events recorded in the personal calendar 146 can include appointments 140 and occasions 142 . These special events are labels for blocks of time that are personalized to the user of the digital camera phone 300 .
  • the user can record whether some special events are silent events, during which they do not want their phone to ring. These silent events may include particular work-related meetings or social events, such as weddings, concerts, etc., where a ringing phone would annoy others and embarrass the user.
  • the user can also specify that some recurring events (such as staff meetings or concerts) should always be silent events, so that they are automatically recorded as silent events whenever a new event of this type is added to their calendar in block 400.
  • the user can assign specific ring tones to specific special events. For example, the user can assign holiday ring tones (e.g. a Christmas theme, Thanksgiving theme, Halloween theme) to these holiday special events, a “Happy Birthday” song ring tone to birthday anniversary special events, etc.
  • the personal calendar 146 and the assigned ring tones are transferred from the general control computer 40 to the digital camera phone 300 , and stored in the image/data memory 330 .
  • the digital processor 320 detects an incoming phone call from the mobile phone network 358 via the wireless modem 350 .
  • the digital processor 320 reads the value of the real time clock 324 to determine the current date and time. In block 412 , the digital processor 320 retrieves the personal calendar entry for the date/time provided by the real time clock 324 .
  • the digital processor 320 determines if the calendar entry for the current date/time corresponds to a silent event recorded in block 436 .
  • the digital processor 320 determines if the current date/time corresponds to a special event which has been assigned a special ring tone.
  • the digital processor 320 in the digital camera phone 300 uses either the special ring tone determined in block 448 , or the default ring tone, to indicate to the user that there is an incoming phone call.
  • different ring tones can be assigned to different callers.
  • a particular special ring tone may be used only for certain callers.
  • the “Happy Birthday” ring tone can be used only for calls from a phone number associated with the person whose birthday anniversary is the special event in the user's personal calendar. This can remind the user to convey a “happy birthday” message to the caller.
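The silent-event and special-ring-tone logic described above can be sketched as follows. This is a minimal illustration only, not the embodiment's implementation: the `CalendarEvent` fields (`silent`, `ring_tone`) and the function name are hypothetical stand-ins for the personal calendar entries and block numbering of FIG. 13, and real device vibration/audio APIs are not modeled.

```python
from dataclasses import dataclass
from datetime import datetime
from typing import Optional

@dataclass
class CalendarEvent:
    """One special event from the personal calendar (illustrative fields)."""
    name: str
    start: datetime
    end: datetime
    silent: bool = False              # silent event: vibrate instead of ringing
    ring_tone: Optional[str] = None   # special ring tone assigned to this event

def alert_for_incoming_call(events, now, default_tone="default"):
    """Decide how to announce an incoming call at time `now`:
    a silent event wins (vibrate), then an event-specific ring tone,
    otherwise the default ring tone."""
    for event in events:
        if event.start <= now <= event.end:
            if event.silent:
                return "vibrate"
            if event.ring_tone is not None:
                return event.ring_tone
    return default_tone
```

A caller-specific rule (e.g. playing “Happy Birthday” only for the birthday person's phone number) could be added by also passing the caller ID into the decision, but that refinement is omitted here for brevity.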

Abstract

A method for associating event times or time periods with digital images in a collection for determining if a digital image is of interest includes storing a collection of digital images, each having an associated capture time; comparing the associated capture times in the collection with a special event time to determine if a digital image in the collection is of interest, wherein the comparing step includes calculation of a special event time associated with a special event based on the calendar time associated with the special event and using such information to perform the comparison; and associating digital images of interest with the special event.

Description

    CROSS REFERENCE TO RELATED APPLICATIONS
  • Reference is made to commonly assigned U.S. patent application Ser. No. 11/116,279, filed Apr. 28, 2005, entitled “Using Time in Recognizing Persons in Images” by Gallagher et al and U.S. patent application Ser. No. ______ filed Jun. 2, 2005, entitled “Using Photographer Identity to Classify Images” by Gallagher et al, the disclosures of which are incorporated herein.
  • FIELD OF THE INVENTION
  • The present invention relates to an improved way to identify digital images from a collection, making use of special events.
  • BACKGROUND OF THE INVENTION
  • With the advent of digital photography, consumers are amassing large media collections of digital images and videos. For purposes of this description, the term “images” will be understood to include both still images and “videos,” which are collections of image frames, often having an associated audio stream. Therefore, an image collection can contain images or videos or both. The average number of images captured with digital cameras per photographer is still increasing each year. As a consequence, the organization and retrieval of images and videos is already a problem for the typical consumer. Currently, the length of time spanned by a typical consumer's digital image collection is only a few years. The organization and retrieval problem will continue to grow as the length of time spanned by the average digital image and video collection increases.
  • The automatic organization of a media collection, either as an end in itself or for use in other applications, has been the subject of recent research. Relatively sophisticated image content analysis techniques have been used for image indexing and organization. For image indexing and retrieval applications, simple text analysis techniques have also been used on text or spoken annotations associated with individual images or videos. The recent research has involved a number of techniques and tools for automatic albuming of images and videos.
  • Date and time information from the camera has been used to perform event segmentation, as for example described in U.S. Pat. No. 6,606,411 by Loui and Pavie. An event consists of a set of images or videos related to a common event, for example “trip to the beach.”
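The idea of segmenting a collection into events by capture date and time can be illustrated with a simple gap-threshold rule: a new event starts whenever the time between consecutive images exceeds a threshold. This is only a sketch of the general technique; the cited patent's actual segmentation method is more sophisticated, and the 6-hour threshold here is an arbitrary assumption.

```python
from datetime import datetime, timedelta

def segment_events(capture_times, gap=timedelta(hours=6)):
    """Group capture times into events: after sorting, start a new
    event whenever the gap to the previous image exceeds `gap`.
    Returns a list of events, each a list of capture times."""
    if not capture_times:
        return []
    times = sorted(capture_times)
    events, current = [], [times[0]]
    for t in times[1:]:
        if t - current[-1] > gap:
            events.append(current)   # close the current event
            current = [t]
        else:
            current.append(t)
    events.append(current)
    return events
```

For example, two photos taken half an hour apart at the beach fall into one event, while a photo taken two days later starts a new event.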
  • U.S. Pat. No. 6,810,146 by Loui and Stent describes extracting certain types of information from spoken annotations, or the transcriptions of spoken annotations, associated with photographs, and then using the results to perform event segmentation, identification, and summarization.
  • Certain applications allow for viewing images on a timeline. In essence, the images can be viewed or sorted into consecutive order based on the image capture time. For example, the application Picasa 2, distributed by Google, has a timeline view where groups or sets of images and videos are shown to the user and each image set has an associated time (e.g. “April 2005”). In addition, the Adobe application Album 2.0 has a calendar view where calendar pages are shown and small versions of images captured on a specific calendar date are shown on that date. In each case, the software groups images related by capture time into sets. However, the sets of images are not labeled with meaningful names other than the capture date or date range. Thus, the only calendar information used by these applications is the day, month, and year. They do not use any occasion (e.g. Thanksgiving) or appointment (e.g. vacation trip to Florida) information to label the images with meaningful names.
  • Furthermore, U.S. Pat. No. 6,108,640 describes a method for determining periodic occasions such as holidays. However, there is no description of assigning meaningful labels to images or sets of images.
  • In UK Pat. Application GB2403304A, Rowe describes a method of labeling images with labels based on the image capture dates corresponding to national events, for later use in text-based search and retrieval of images. This method cannot account for the fact that many people do not observe many national holidays. For example, few people actually celebrate Groundhog Day. Subsequent searches by a user for images of “groundhogs” would return images captured on February 2. Furthermore, since many consumer images are taken on occasions that are not associated with national holidays (such as the birthdays of family members, or vacation trips), this method cannot provide useful labels for most consumer photos.
  • In U.S. Patent Application Publication US20040201740A1, Nakamura and Gibson describe a method of placing images into storage locations based on calendar information. Their method does not provide for automatic annotation of images.
  • In U.S. Patent Application Publication US20050044066A1, Hooper and Mao describe a calendar-based image asset organization method. Their method allows people to indicate, via a graphical user interface, a date range of interest. Images captured during that date range are then retrieved for the user. The method does not provide for automatic annotation of images.
  • SUMMARY OF THE INVENTION
  • An object of the present invention is to provide an improved way of identifying digital images of interest from a collection of digital images.
  • This object is achieved by storing a collection of digital images or videos or both, each having an associated capture time; comparing the associated capture times in the collection with a special event time to determine if a digital image or video in the collection is of interest, wherein the comparing step includes calculation of a special event time associated with a special event based on the calendar time associated with the special event and using such information to perform the comparison; and associating digital images and videos of interest with the special event.
  • Another object of the present invention is to provide an improved way of labeling digital images captured by a digital capture device.
  • This object is achieved by transferring and storing calendar information in a digital camera device, the calendar information including occasion or appointment information, capturing and storing a digital image in the digital camera device, determining a capture time for the captured digital image, automatically comparing the capture time with the calendar information to determine a special event label, and storing the special event label in association with the captured digital image.
  • It has been found that digital images of interest can be effectively searched in a collection of digital images using special event information that has associated event time or time period information.
  • BRIEF DESCRIPTION OF THE DRAWINGS
  • The above-mentioned and other features and objects of this invention and the manner of attaining them will become more apparent and the invention itself will be better understood by reference to the following description of an embodiment of the invention taken in conjunction with the accompanying figures wherein:
  • FIG. 1 is a schematic diagram of a computer system that can implement the present invention;
  • FIG. 2 is a flow chart of an embodiment of the present invention;
  • FIG. 3 is a more detailed flow chart of an embodiment of the present invention;
  • FIG. 4 is a graph representation of a calendar time (cross-hatched region) and a special event time (line) associated with the special event Christmas;
  • FIG. 5 is a flow chart of an embodiment of the present invention;
  • FIG. 6 is a flow chart of a further embodiment of the present invention;
  • FIG. 7 is a flow chart of yet another embodiment of the present invention;
  • FIG. 8 is a flow chart of another embodiment of the present invention;
  • FIG. 9 is a flow chart of a still further embodiment of the present invention;
  • FIG. 10 is a block diagram of a camera phone based imaging system that can implement the present invention;
  • FIG. 11 is a flow chart of yet another embodiment of the present invention;
  • FIG. 12 is a flow chart of a still further embodiment of the present invention; and
  • FIG. 13 is a flow chart of an additional embodiment of the present invention.
  • DETAILED DESCRIPTION OF THE INVENTION
  • In the following description, some embodiments of the present invention will be described as software programs. Those skilled in the art will readily recognize that the equivalent of such software can also be constructed in hardware. Because image manipulation algorithms and systems are well known, the present description will be directed in particular to algorithms and systems forming part of, or cooperating more directly with, the method in accordance with the present invention. Other aspects of such algorithms and systems, and hardware or software for producing and otherwise processing the image signals involved therewith, not specifically shown or described herein can be selected from such systems, algorithms, components, and elements known in the art. Given the description as set forth in the following specification, all software implementation thereof is conventional and within the ordinary skill in such arts.
  • The present invention can be implemented in computer hardware and computerized equipment. For example, the method can be performed in a digital camera (as will be described later in reference to FIG. 10), a digital printer, on an internet server, on a kiosk, and on a personal computer. Referring to FIG. 1, there is illustrated a computer system for implementing the present invention. Although the computer system is shown for the purpose of illustrating a preferred embodiment, the present invention is not limited to the computer system shown, but can be used on any electronic processing system such as found in digital cameras, home computers, kiosks, retail or wholesale photofinishing, or any other system for the processing of digital images. The computer system includes a microprocessor-based unit 20 (also referred to herein as a digital image processor) for receiving and processing software programs and for performing other processing functions. The digital image processor 20 processes images from image capture devices 10 such as cameras, scanners, or computer image generation software. The digital image processor 20 can be used to process digital images to make adjustments for overall brightness, tone scale, image structure, etc. of digital images in a manner such that a pleasing looking image is produced by an image output device. The digital image processor 20 interfaces with the general control computer 40 (also a microprocessor-based unit) for exchanging data and commands. The general control computer 40 and the digital image processor 20 can be two different microprocessors, or the functions of each can be performed by a single physical microprocessor. The digital image processor 20 often outputs an image to an image output device 30, for example a printer, for displaying the image.
A display device 50 is electrically connected to the digital image processor 20 for displaying user-related information associated with the software, e.g., by means of a graphical user interface. A keyboard 60 is also connected to the microprocessor based unit 20 via the general control computer 40 for permitting a user to input information to the software. As an alternative to using the keyboard 60 for input, a mouse can be used for moving a selector on the display device 50 and for selecting an item on which the selector overlays, as is well known in the art. Digital images and other data can also be stored on an offline memory device 70 such as an external hard drive, flash media, a drive that writes to CD-ROM or DVD media, or the like.
  • A compact disk-read only memory (CD-ROM), which typically includes software programs, is inserted into the general control computer 40 for providing a means of inputting the software programs and other information to the general control computer 40 and the digital image processor 20. In addition, a floppy disk can also include a software program, and is inserted into the general control computer 40 for inputting the software program. Still further, the general control computer 40 can be programmed, as is well known in the art, for storing the software program internally. The general control computer 40 can have a network connection, such as a telephone line or wireless connection, to an external network such as a local area network or the Internet (370 in FIG. 10).
  • Images can also be displayed on the display device 50 via a Flash EPROM memory card, such as the well-known PC Card, CompactFlash, SD, or Memory Stick cards.
  • The image output device 30 provides a final image. The image output device 30 can be a printer or other output device that provides a paper or other hard copy final image. The image output device 30 can also be an output device that provides the final image as a digital file. The image output device 30 can also include combinations of output, such as a printed image and a digital file on a memory unit, such as a CD or DVD.
  • A digital image includes one or more digital image channels or color components. Each digital image channel is a two-dimensional array of pixels. Each pixel value relates to the amount of light received by the imaging capture device corresponding to the physical region of the pixel. For color imaging applications, a digital image will often consist of red, green, and blue digital image channels. Motion imaging applications can be thought of as a sequence of digital images. Those skilled in the art will recognize that the present invention can be applied to, but is not limited to, a digital image channel for any of the herein-mentioned applications. Although a digital image channel is described as a two-dimensional array of pixel values arranged by rows and columns, those skilled in the art will recognize that the present invention can be applied to non-rectilinear arrays with equal effect. Those skilled in the art will also recognize that describing digital image processing steps herein below as replacing original pixel values with processed pixel values is functionally equivalent to describing the same processing steps as generating a new digital image with the processed pixel values while retaining the original pixel values.
  • It should also be noted that the present invention can be implemented in a combination of software or hardware and is not limited to devices that are physically connected or located within the same physical location. One or more of the devices illustrated in FIG. 1 can be located remotely and can be connected via a network. One or more of the devices can be connected wirelessly, such as by a radio-frequency link, either directly or via a network.
  • The present invention can be employed in a variety of user contexts and environments. Exemplary contexts and environments include, without limitation, wholesale digital photofinishing (which involves exemplary process steps or stages such as film in, digital processing, prints out), retail digital photofinishing (film in, digital processing, prints out), home printing (home scanned film or digital images, digital processing, prints out), desktop software (software that applies algorithms to digital prints to make them better—or even just to change them), digital fulfillment (digital images in—from media or over the web, digital processing, with images out—in digital form on media, digital form over the web, or printed on hard-copy prints), kiosks (digital or scanned input, digital processing, digital or hard copy output), mobile devices (e.g., PDA or cell phone that can be used as a processing unit, a display unit, or a unit to give processing instructions), and as a service offered via the World Wide Web.
  • In each case, the invention can stand alone or can be a component of a larger system solution. Furthermore, human interfaces, e.g., the scanning or input, the digital processing, the display to a user (if needed), the input of user requests or processing instructions (if needed), the output, can each be on the same or different devices and physical locations, and communication between the devices and locations can be via public or private network connections, or media based communication. Where consistent with the foregoing disclosure of the present invention, the method of the invention can be fully automatic, can have user input (be fully or partially manual), can have user or operator review to accept/reject the result, or can be assisted by metadata (metadata that can be user supplied, supplied by a measuring device (e.g. in a camera), or determined by an algorithm). Moreover, the algorithm(s) can interface with a variety of workflow user interface schemes.
  • The invention is inclusive of combinations of the embodiments described herein. References to “a particular embodiment” and the like refer to features that are present in at least one embodiment of the invention. Separate references to “an embodiment” or “particular embodiments” or the like do not necessarily refer to the same embodiment or embodiments; however, such embodiments are not mutually exclusive, unless so indicated or as are readily apparent to one of skill in the art.
  • An embodiment of the invention is illustrated in FIG. 2. A collection of digital images and videos 104 is input to a calendar label annotator 114. It should be understood that references herein to digital images also include digital videos, which are collections of digital images. Special events from a personal calendar 146 are used by the calendar label annotator 114 to annotate the digital images and videos in the collection 104. The collection 104 can contain exclusively images, or exclusively videos, or some of each. Resulting annotations 118 (equivalently called labels) are stored in a database 120. Each annotation 118 can be stored with the image (e.g. in the file header) or in a database 120, either located with the digital image or video or remotely (e.g. on a computer server). The database 120 can be in any form. In addition, the database 120 can be distributed across many files or few files. Consequently, the database 120 can be queried 122 to find those images and videos 104 matching the query special event. For example, a query 122 for images of “Daytona Beach” returns query results 124 containing the set of digital images or videos 104 with a related annotation of “vacation to Daytona Beach.” The query results 124 are the set of digital images and videos 104 associated with the query special event.
  • The digital images and videos from the collection 104 are passed to a capture time extractor 108. The capture time extractor 108 determines the time each digital image or video 104 was captured, and outputs image capture times 112. The image capture time 112 of the digital image or video 104 is determined by one of several methods by the capture time extractor 108. Typically the capture time is embedded in the file header of the digital image or video 104. For example, the EXIF image format (described at www.exif.org) allows the image capture device 10 to store information associated with the digital image or video 104 in the file header. The “Date\Time” entry is associated with the date and time the image was captured. In some cases, the digital image or video 104 results from scanning film or prints, and the image capture time 112 is determined by detection of the date exposed into the image area (as is often done at capture time), usually in the lower right corner of the image. The date a photograph is printed is often printed on the back of the print. Alternatively, some film systems (such as APS) contain a magnetic layer in the film for storing information such as the capture date. The capture time extractor 108 uses the most appropriate method for extracting the image capture time 112 of the image. Preferably, the source of the digital images and videos 104 is a digital camera, and the capture time extractor 108 extracts the capture time from the file information.
  • Note that the image capture time 112 can be a precise minute in time, e.g. Dec. 9, 2000 at 10:00 AM. Or the image capture time 112 can be less precise, e.g. 2000 or December 2000. The image capture time 112 can be in the form of a probability distribution function e.g. Dec. 9, 2000±2 days with 95% confidence. The image capture time 112 is input to the calendar label annotator 114.
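As an illustration of one job of the capture time extractor 108: the EXIF date/time entry is conventionally stored as a string of the form "YYYY:MM:DD HH:MM:SS". A minimal parser might look like the following sketch, which assumes a well-formed header string and simply returns None when the field is missing or garbled (the function name is illustrative, not part of the embodiment).

```python
from datetime import datetime

# Conventional EXIF date/time string layout, e.g. "2000:12:09 10:00:00"
EXIF_FORMAT = "%Y:%m:%d %H:%M:%S"

def parse_exif_capture_time(date_time_str):
    """Parse an EXIF-style date/time string into a datetime,
    or return None if the field is absent or malformed."""
    try:
        return datetime.strptime(date_time_str.strip(), EXIF_FORMAT)
    except (AttributeError, ValueError):
        return None
```

Returning None rather than raising lets a caller fall back to the other sources mentioned above (a date exposed into the image area, the back of a print, or an APS magnetic layer).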
  • The personal calendar 146 is preferably a software application running on a computer that is useful for recording special events. For example, the special events can be input to the personal calendar 146 by the user or by others or other applications. The special events recorded in the personal calendar 146 are of three types: occasions, appointments, and journal entries. Essentially, the special events are labels for blocks of time that are personalized to the user (or users) of the calendar. For example, Outlook by Microsoft and Notes by Lotus are both examples of calendar applications. The personal calendar 146 can also be an application that operates on a handheld device, such as a personal digital assistant (PDA) or a cellular telephone camera, as will be described later in reference to FIG. 11. The personal calendar 146 can also be an application that runs on a server and is accessed by the user via the Internet.
  • The first type of the three special events recorded in a personal calendar 146 is an occasion 142. The occasion 142 is a periodically occurring special event, celebration, religious holiday (fixed or moveable), national holiday, festival, or the like. In general, occasions 142 can be computed in advance for any time period because mathematical formulas are used to determine the date of the occasion 142 in each subsequent year, as described by U.S. Pat. No. 6,108,640. In general, an occasion 142 is a name placed on an entire day. For example, every year Christmas is December 25. The occasions 142 celebrated by, observed by, or relevant to a particular person depend on a number of factors, including geographical, religious, and political factors. The user can indicate to the personal calendar 146 which occasions 142 are relevant for him or her via user input 150. The user can select relevant holidays from a list, or select between sets of holidays (e.g. Canadian holidays vs. American holidays), or indicate the geographic, religious, and political factors and have the computer guess at the set of occasions relevant to the user, which can then be refined through further user input 150. While holidays such as Easter, Christmas, Hanukkah, and the like are widely observed, most people also celebrate personal occasions 142 such as birthday anniversaries, wedding anniversaries, etc. These birthday and wedding anniversaries can be for the user's family and close friends. The user indicates these personal occasions 142 via user input 150 to the personal calendar 146. The user can indicate special events by user inspection (i.e. selecting from lists of occasions presented to the user via the display device 50 as shown in FIG. 1) or automatically or both.
  • A second type of special event is an appointment 140. Appointment 140 is a special event describing something planned to take place in the future that is relevant for the user or users of the personal calendar 146. In general, appointments 140 are not names placed on an entire day (as is the case with occasions). For example, an appointment 140 can be “doctor appointment at 2” or “Matthew's cub scout meeting at 6 PM” or also “meet Sarah at the zoo at 4 PM”. The appointment 140 can span several days or weeks, such as “Camping vacation Aug. 9-14, 2004”.
  • The third type of special event is a journal entry 144. The journal entry 144 is an entry to the personal calendar 146 describing events that have occurred in the past (as of the time of entry of the journal entry 144.) For example, the special event on Nov. 24, 2004 “Jonah first crawled today” is a journal entry because it describes events that already took place at the time of entry. Some journal entries 144 are quite lengthy, such as a diary where a user enters a few paragraphs describing the events and reflections of the day. These diary entries are journal entries 144 because they are associated with a particular calendar time, and are related to events associated with that calendar time. A modern form of the diary is a blog (or weblog) that allows users to keep an on-line diary on the Internet (e.g. www.blogger.com.)
  • Each special event has an associated calendar time. As previously stated, the calendar time associated with an occasion 142 is a single day. The calendar time associated with an appointment 140 can be a precise moment in time or a range of time. The range of time can be any length: seconds, minutes, days, or weeks long, for example. In fact, the calendar time associated with an appointment 140 can have only a starting time indicated (e.g. “doctor appointment at 2”). In this case, the appointment 140 ending time is estimated (e.g. to be 2 hours long). The journal entry's associated calendar time can be a precise moment in time or a range of time. The calendar time associated with a special event can be in the form of a probability distribution function, e.g. a normal distribution centered on Dec. 9, 2000 at 10:00 AM with a standard deviation of 2 days.
  • The calendar label annotator 114 compares the image capture time 112 with the special event times from the personal calendar 146. Special event times will be described further herein below. The calendar label annotator 114 can reside on a camera, a personal computer, a server, a cellular telephone, a PDA, or the like. The calendar label annotator 114 determines digital images and videos of interest from the collection 104 by comparing the associated image capture times 112 with the special event times associated with special events. Each digital image or video of interest is then associated with its corresponding special event, preferably by producing an annotation 118 stored in a database 120 indicating that the digital image or video 104 is associated with the special event.
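The comparison performed by the calendar label annotator 114 can be sketched as an interval test between each image capture time and each special event time. This is a minimal illustration under simplifying assumptions: special event times are represented as plain (label, start, end) tuples, and the probability-distribution forms of capture times and event times described elsewhere in this specification are ignored.

```python
from datetime import datetime

def annotate_collection(capture_times, special_events):
    """capture_times: list of datetimes, one per image in the collection.
    special_events: list of (label, start, end) tuples bounding each
    special event time. Returns {image_index: [labels]} containing only
    the images of interest (those matching at least one special event)."""
    annotations = {}
    for i, t in enumerate(capture_times):
        labels = [label for (label, start, end) in special_events
                  if start <= t <= end]
        if labels:
            annotations[i] = labels
    return annotations
```

An image captured on Dec. 20, 2004 would thus be annotated with a “Christmas 2004” special event whose special event time spans Dec. 13 through Dec. 26.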
  • FIG. 3 shows a more detailed view of the calendar label annotator 114. The event processor 152 processes the special events from the personal calendar 146. The special events from the personal calendar 146 each have an associated calendar time, as previously described. In addition, the special events can have other information from user input 150. For example, each special event can have labeled whether it is an “imaging event” or not by the user. An imaging event is a special event that might correspond to digital images or videos 104 in the user's digital image and video collection 104. For example, the appointment “Vacation to the Beach” from Aug. 12-19, 2005 probably is an imaging event, while the appointment “dentist” at 4:00 Aug. 2, 2005 probably is not.
  • The event processor 152 determines a special event time for each special event. The special event time is related to the calendar time associated with the special event, but generally has a broader time span to account for the fact that images related to a special event need not occur during the calendar time associated with the special event. The special event time is necessary because it is common for images and videos related to a special event to occur slightly before or after the calendar time associated with the special event. For example, a child's birthday occasion can be June 26, but the birthday party can be held on June 23. As another example, it is common that many images relate to Christmas even though captured before December 25. Images of setting up a Christmas tree, singing Christmas carols, or visiting St. Nicolas at the mall can occur days or weeks prior to the calendar time (Dec. 25) associated with the special event occasion of Christmas. The event processor 152 sets a special event time for each special event using user input 150 or a rule-based system. As with the image capture time 112 and the calendar time associated with special events, the special event time can be a period of time or represented as a probability distribution function. For example, the user can indicate that the special event time associated with Christmas is the period of time beginning on Dec. 13 and ending on Dec. 26. The user can also indicate that the special event time for all birthdays is the period of time from 3 days prior to 3 days following the calendar time of the birthday. For example, FIG. 4 shows an illustrative plot of the calendar time and the special event time associated with Christmas 2004. The calendar time is shown as a hatched area and the special event time is shown with a line. If the user does not indicate special event times, then the following default rules are used to determine the special event time from the calendar time for a special event:
    Appointments: a = b = 0.1 (q-w)
    Journal Entries: a = b = 0
    Occasions:
    Birthdays/anniversaries a = b = 3 days
    Christmas a = 20 days, b = 1 day
    Other occasions a = b = 1 day

    The special event time is computed as a time period beginning at time w−a, and ending at time q+b, where w is the beginning of the calendar time associated with the special event (or the time at which the cumulative distribution of p(x) reaches 5% when the calendar time is a probability distribution function p(x)), and q is the end of the calendar time associated with the special event (or the time at which the cumulative distribution of p(x) reaches 95% when the calendar time is a probability distribution function p(x)).
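The default rules above can be sketched as follows. This is a minimal illustration, assuming calendar times are given as datetime values; the function and variable names are illustrative, not from the original:

```python
from datetime import datetime, timedelta

def special_event_time(w, q, a, b):
    """Expand a calendar time [w, q] into the special event time
    [w - a, q + b].  When the calendar time is a probability
    distribution function p(x), w and q would instead be the times at
    which the cumulative distribution of p(x) reaches 5% and 95%."""
    return (w - a, q + b)

# Christmas: calendar time is the day of Dec. 25; a = 20 days, b = 1 day.
w = datetime(2004, 12, 25, 0, 0)
q = datetime(2004, 12, 25, 23, 59)
start, end = special_event_time(w, q, timedelta(days=20), timedelta(days=1))
```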
  • By inspection of the equations, it can be seen that the special event time associated with a special event is calculated from the calendar time associated with the special event.
  • The collection of special event times, whether automatically generated or manually entered, is a list of special event times.
  • Referring again to FIG. 3, the event processor 152 also determines the likelihood that special events from the personal calendar 146 are imaging events as previously mentioned. This determination can be based on user input 150 where the user indicates whether an event is an imaging event or not. Furthermore, in the absence of user input 150, the event processor 152 can automatically determine the likelihood that a special event is an imaging event. This is accomplished by collecting the image collections and personal calendars from a large number of people and manually determining the relationships between images and videos in the collection 104 and special events in the calendar. The likelihood that special events are imaging events can therefore be learned in such fashion using well-known machine learning techniques. For example, it is extremely rare that images in a collection 104 correspond to appointments for medical treatment such as doctor and dentist visits. Therefore, the event processor 152 assigns a likelihood to each special event indicating the likelihood that the special event is an imaging event. The likelihood can be a probability or can be a binary flag. This flag is used to improve the accuracy of the annotation provided by the present invention. For example, images that happen by coincidence to have an image capture time 112 also corresponding to a special event time would normally be incorrectly annotated with the special event name. For example, suppose a doctor appointment ended earlier than expected, and the person then walked to a park and photographed ducks. The images of ducks would erroneously be associated with the annotation "Doctor Appointment" were it not that the event processor 152 determined that the doctor appointment is not an imaging event.
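The imaging-event likelihood can be pictured as a lookup of learned values. In the sketch below, the table entries and the 0.5 threshold are hypothetical stand-ins; the actual values would be learned from many users' collections and calendars:

```python
# Hypothetical likelihoods (not from the original) that each kind of
# special event is an imaging event; in practice these values would be
# learned from many users' image collections and personal calendars.
LEARNED_LIKELIHOOD = {
    "birthday": 0.90,
    "wedding": 0.95,
    "doctor appointment": 0.02,
    "dentist appointment": 0.01,
}

def imaging_event_likelihood(event_name, threshold=0.5):
    """Return (probability, binary imaging-event flag) for a special event."""
    p = LEARNED_LIKELIHOOD.get(event_name.lower(), 0.5)  # unknown events: neutral
    return p, p >= threshold
```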
  • The event processor 152 outputs a special event list 154. The special event list 154 is a list of special events including the name of the special event, the associated calendar time, and the associated special event time. The special event list 154 also indicates the likelihood that each special event is an imaging event as previously mentioned. Additional information such as the people present at the special event, the location of the special event (e.g. my house, the park, Erie, Pa., etc.) can also be included for special events in the list.
  • The time comparer 156 compares the associated capture times for the digital images and videos in the collection 104 with a special event time or time period to determine digital images and videos of interest 158. Digital images and videos of interest 158 are those images and videos from the digital image and video collection having an image capture time 112 coincident with the special event time associated with a special event. For example, an image captured on Dec. 24, 2004 is an image of interest 158 associated with the special event occasion Christmas when the special event time associated with the occasion Christmas is as shown in FIG. 4. The image would then receive the annotation 118 “Christmas” that would be stored in association with the image in a database 120. Then, when a user places a query 122 for images of “Christmas”, the database 120 is searched by well-known techniques and the aforementioned image with image capture time 112 of Dec. 24, 2004 will be among the query results 124. The query results can then be shown on the display device 50 such as shown in FIG. 1.
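The comparison performed by the time comparer 156 reduces to an interval test, sketched below with illustrative names; the event list here would come from the special event list 154, restricted to imaging events:

```python
from datetime import datetime

def images_of_interest(capture_times, special_events):
    """Map each image whose capture time falls within a special event
    time to the names of the matching special events.
    capture_times: {image_id: datetime}
    special_events: [(name, window_start, window_end)]"""
    annotations = {}
    for image_id, t in capture_times.items():
        for name, start, end in special_events:
            if start <= t <= end:
                annotations.setdefault(image_id, []).append(name)
    return annotations

# The Christmas special event time of FIG. 4: Dec. 13 through Dec. 26.
events = [("Christmas", datetime(2004, 12, 13), datetime(2004, 12, 26, 23, 59))]
captures = {"img_001": datetime(2004, 12, 24, 10, 30)}
```

An image captured Dec. 24, 2004 thus receives the annotation "Christmas", matching the example in the text.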
  • An annotation 118 associated with a digital image or video of interest 158 is special event identifier information and can contain multiple fields, such as: the name of the special event, the associated calendar time, the associated special event time, the number and identities of people present at the special event (e.g. the guest list for a birthday party), and the location of the special event (e.g. my house, the park, Erie, Pa., etc.). The annotation 118 is information related to the special event. Note that the annotations 118 made automatically by comparing the image capture time to the special event time can be marked in the database 120 as being automatically generated. This allows a user to provide a query 122 and search the database 120 for matches considering just manually entered annotations 118, just automatically generated annotations 118, or combinations thereof as desired by the user.
  • Not all of the images and videos in the collection 104 will have capture times 112 that correspond to special event times associated with events in the special event list 154. In other words, not all of the digital images and videos from the collection 104 will be images and videos of interest 158. These images can be shown to the user on the display device 50 such as shown in FIG. 1 for manual labeling.
  • The time comparer 156 can output a confidence score associated with each image of interest 158 that depends on both the image capture time 112 of the image of interest and the special event time of the special event. Preferably, the confidence score is the value of p(x) evaluated at x equal to the image capture time 112 (or the mean image capture time when the image capture time is a probability distribution function).
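As one concrete form, assuming the special event time p(x) is modeled as a Gaussian probability distribution function (the text leaves the form of p(x) open), the confidence score is simply that pdf evaluated at the capture time:

```python
import math

def confidence_score(capture_time, event_mean, event_stddev):
    """Gaussian pdf p(x) for the special event time, evaluated at the
    image capture time (all times in hours relative to some epoch).
    The Gaussian form of p(x) is an illustrative assumption."""
    z = (capture_time - event_mean) / event_stddev
    return math.exp(-0.5 * z * z) / (event_stddev * math.sqrt(2.0 * math.pi))
```

A capture time near the center of the special event time yields a higher confidence score than one near its edge.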
  • An alternative embodiment is illustrated in FIG. 5. In this embodiment, the digital image and video collection 104 is input to an event clusterer 162 for clustering the digital images and videos 104 into event clusters 164 which are mutually exclusive smaller sets of digital images and videos 104 according to image content and image capture times 112. Each set of digital images and videos 104 corresponds to a chronologically related segmentation of the digital image and video collection 104. In other words, each event cluster 164 contains all images from the digital image and video collection 104 captured after the earliest image or video in the event cluster and before the latest image in the event cluster 164. Commonly assigned U.S. Pat. No. 6,606,411 to Loui and Pavie, incorporated herein by reference, describes the production of event clusters 164 from a set of digital images and videos 104 in detail.
  • The capture time extractor 108 then inputs each event cluster 164 and determines the image capture times 112 associated with each event cluster 164. Preferably, the image capture time 112 for a particular event cluster 164 is a time period or time range spanning the time between the earliest and the latest image in the event cluster 164. The image capture times 112 and the pre-determined groups of images and videos called event clusters 164 are input to the calendar label annotator 114 along with the personal calendar 146 for producing annotations 118 that are associated with the event clusters 164 in a database 120. This alternative embodiment illustrates the utility of the present invention for annotating groups of images and videos in addition to the aforementioned utility of annotating single images and videos from a collection of digital images and videos 104. In this embodiment, the comparing step performed by the time comparer 156 of FIG. 3 includes associating pre-determined groups of images with a special event by comparing the image capture times 112 associated with groups of images with the special event times.
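For this embodiment, the group-level comparison becomes an overlap test between the cluster's capture-time range and each special event time. A sketch with assumed names:

```python
from datetime import datetime

def cluster_capture_time(cluster_times):
    """The image capture time of an event cluster: the span from its
    earliest to its latest image or video."""
    return min(cluster_times), max(cluster_times)

def annotate_cluster(cluster_times, special_events):
    """Annotate a whole cluster with every special event whose special
    event time overlaps the cluster's capture-time range.
    special_events: [(name, window_start, window_end)]"""
    c_start, c_end = cluster_capture_time(cluster_times)
    return [name for name, start, end in special_events
            if start <= c_end and c_start <= end]
```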
  • A further alternative embodiment is illustrated in FIG. 6. In this embodiment, the database 120 containing annotations 118 from the calendar label annotator 114 is queried 122 by a user to find images of interest. For example, the user can search for images of interest by entering the query 122 of "mom". The query 122 is input to a keyword expander 170 which expands the terms in the query 122 by adding related words. The keyword expansion can be performed using techniques from the field of natural language processing. For example, the database WordNet, maintained by Princeton University and available on-line at http://wordnet.princeton.edu/, can be used to determine the word sense, part of speech, synonyms, hyponyms, hypernyms, etc. The keyword expander 170 then outputs an expanded query 172. For example, when the query 122 is "mom" the expanded query 172 is "mom", "mother", and "parent". Each of the additional query words added by the keyword expander 170 has an associated keyword score based on the strength of the relationship between the additional query word and the original query 122. For example, "mother" has a keyword score of 1.0 because it is a synonym for "mom", and "parent" has a keyword score of 0.4 because it has a related but not equivalent definition. The expanded query 172 is used to search the database 120 for images and videos having annotations related to the expanded query 172 terms. Continuing with the example, images and videos with the associated annotation "Mother's Day" would be detected and returned as the query results 124 for displaying to the user on the display device 50 such as shown in FIG. 1. Query results 124 can be sorted according to a relevance score, the keyword score, the capture time of the image or video, alphabetically according to the name of the image or video, or the like. A user can inspect the results and use manual tools to correct mistakes made by the automatic retrieval of the images and videos.
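A toy version of the keyword expander 170 is shown below. The hand-built RELATED table is a hypothetical stand-in for WordNet lookups (synonyms scoring near 1.0, weaker relations lower); a real system would query WordNet instead:

```python
# Stand-in for a WordNet-style lexical database (hypothetical entries).
RELATED = {
    "mom": [("mother", 1.0), ("parent", 0.4)],
    "animals": [("creatures", 1.0), ("zoo", 0.6), ("farm", 0.6)],
}

def expand_query(query):
    """Return the expanded query as (term, keyword score) pairs, with
    the original query term first at full score."""
    expanded = [(query.lower(), 1.0)]
    expanded.extend(RELATED.get(query.lower(), []))
    return expanded
```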
  • In another example, the query 122 for “animals” is processed by the keyword expander 170 to yield the expanded query “creatures”, “zoo”, and “farm”. When the database 120 is searched for relevant images and videos, images labeled “Zoo Trip Apr. 6, 2005” are found and returned as the query results 124.
  • Annotations 118 stored in the database 120 are based on the special event names from the personal calendar 146. Additional annotations can be generated to store in the database from the special event names. Preferably, additional annotations are hypernyms of the special event names. For example, if the special event name is "dog show" then additional annotations could be "animal", "creature" and "living thing".
  • A further alternative embodiment is illustrated by FIG. 7. The digital image and video collection 104 is input to a person finder 106 for automatic detection of people in the images and videos. The calendar label annotator 114 inputs personal features 110, one set per person detected by the person finder 106, and also inputs appearance models 116, one for each of N different persons of interest. The calendar label annotator 114 also inputs image capture times 112 and special events from the personal calendar 146. As described hereinabove, the special events can contain the names (identities) of people present at a special event. The calendar label annotator 114 then recognizes persons of interest from the persons found by the person finder 106 using the appearance models 116, the image capture time 112 and the special events from the personal calendar 146.
  • When the special event contains a list of names of people present at a special event, then it is more likely than normal that these people will appear in images or videos captured at the event (i.e. images and videos of interest 158 for that particular event). Thus, in recognizing people in images and videos of interest for a particular event, greater weight (probability) is placed on people who have both appearance models 116 and were present at the special event of interest. This embodiment produces more accurate recognition of people in a digital image and video collection 104.
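One way to realize this weighting, under an assumed multiplicative scheme the text does not prescribe, is to boost each appearance-model match score for people recorded as present at the special event:

```python
def recognize_person(appearance_scores, guest_list, boost=2.0):
    """Pick the most likely identity: appearance-model match scores are
    up-weighted for people on the special event's guest list.
    appearance_scores: {name: match score from the appearance models}
    The boost factor of 2.0 is an illustrative assumption."""
    weighted = {name: score * (boost if name in guest_list else 1.0)
                for name, score in appearance_scores.items()}
    return max(weighted, key=weighted.get)
```

A slightly weaker appearance match can thus win when that person was known to be present at the event.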
  • A further embodiment is shown in FIG. 8. In this embodiment, the list of special events 154 is presented to the user via the display device 50. The user then selects a subset of the displayed special events, thereby defining a list of special events of interest 180. The list of special events 154 can be displayed as a text list of the titles of the special events and the associated calendar times. Alternatively, the list of special events can be displayed as a calendar with the positions of the titles of the special events indicative of the calendar times associated with the special events. This type of calendar display is commonly known and practiced for calendar applications. The title of a particular special event can be replaced by or augmented by an image or video of interest 158 associated with that particular special event. The list of special events of interest 180 is then passed to the time comparer 156 for finding images and videos of interest 158 associated with the special events of interest 180. This interface allows the user to retrieve images of interest for annotation from only the special events of interest 180 rather than all of the special events 154.
  • A still further embodiment is shown in FIG. 9. The personal calendar 146 is selected from among a set of personal calendars 242 by the calendar selector 244. The calendar selector 244 selects the personal calendar 146 for passing to the calendar label annotator 114 that corresponds to the identity of a photographer 232.
  • The images and videos of the collection 104 are also analyzed by a photographer determiner 238 to determine the identity of the particular photographer for each image and video. When the EXIF file format is used, the identity of the photographer 232 can be stored in the "Camera Owner", "Image Creator", "Photographer", or "Copyright" tags for example. The identity of the photographer of an image or video can be entered manually before, during, or after capturing the image or video. Furthermore, several cameras (e.g. in U.S. Pat. Application Publication 20020080256A1) have been described that contain means for extracting biometric information from the photographer 232, identifying the photographer 232, and then annotating the image with the identity of the photographer 232. In any case, the photographer determiner 238 discovers the identity of the photographer 232 and passes that information to the calendar selector 244. In some cases, the photographer 232 cannot be identified by the photographer determiner 238. In this case, the photographer 232 is "unknown". For example, this situation can occur when a person who owns the camera is on vacation and asks a stranger to use her (the vacationing camera owner's) camera to capture an image of her in front of a landmark. A camera such as described in U.S. Pat. Application Publication 20020080256A1 can only feasibly identify the photographer 232 from a small set of potential camera users (e.g. the primary user is probably the camera owner, and secondary users are friends and family of the camera owner) whose profiles are known by the camera. In this case, an image captured by a stranger using the camera would simply be identified by the photographer determiner 238 as having an "unknown" photographer 232.
  • For example, each member of a family can have a personal calendar 146. The selection of the personal calendar 146 from a set of personal calendars can be based on the identity of the photographer 232 of each image or video. For example, an image captured by Jim is input to the calendar label annotator 114 for comparing the image capture time with the special event times associated with Jim's calendar.
  • FIG. 10 is a block diagram of a digital camera phone 300 based imaging system that can implement the present invention. The digital camera phone 300 is one type of image capture device 10. Preferably, the digital camera phone 300 is a portable battery operated device, small enough to be easily handheld by a user when capturing and reviewing images. The digital camera phone 300 produces digital images that are stored using the image/data memory 330, which can be, for example, internal Flash EPROM memory, or a removable memory card. Other types of digital image storage media, such as magnetic hard drives, magnetic tape, or optical disks, can alternatively be used to provide the image/data memory 330.
  • The digital camera phone 300 includes a lens 304 which focuses light from a scene (not shown) onto an image sensor array 314 of a CMOS image sensor 310. The image sensor array 314 can provide color image information using the well-known Bayer color filter pattern. The image sensor array 314 is controlled by timing generator 312, which also controls a flash 302 in order to illuminate the scene when the ambient illumination is low. The image sensor array 314 can have, for example, 1280 columns×960 rows of pixels.
  • In some embodiments, the digital camera phone 300 can also store video clips, by summing multiple pixels of the image sensor array 314 together (e.g. summing pixels of the same color within each 4 column×4 row area of the image sensor array 314) to create a lower resolution video image frame. The video image frames are read from the image sensor array 314 at regular intervals, for example using a 15 frame per second readout rate.
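The pixel-summing step can be sketched on a single sensor color plane as follows. This is a pure-Python illustration; a real implementation would sum only same-color sites of the Bayer pattern, in hardware:

```python
def bin_pixels(plane, block=4):
    """Sum each block x block area of one color plane to produce a
    lower-resolution video frame (assumes the plane's dimensions are
    divisible by block)."""
    rows, cols = len(plane), len(plane[0])
    return [[sum(plane[r + dr][c + dc]
                 for dr in range(block) for dc in range(block))
             for c in range(0, cols, block)]
            for r in range(0, rows, block)]
```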
  • The analog output signals from the image sensor array 314 are amplified and converted to digital data by the analog-to-digital (A/D) converter circuit 316 on the CMOS image sensor 310. The digital data is stored in a DRAM buffer memory 318 and subsequently processed by a digital processor 320 controlled by the firmware stored in firmware memory 328, which can be flash EPROM memory. The digital processor 320 includes a real-time clock 324, which keeps the date and time even when the digital camera phone 300 and digital processor 320 are in their low power state.
  • The processed digital image files are stored in the image/data memory 330. The image/data memory 330 can also be used to store the user's personal calendar information, as will be described later in reference to FIG. 11. The image/data memory can also store other types of data, such as phone numbers, to-do lists, and the like.
  • In the still image mode, the digital processor 320 performs color interpolation followed by color and tone correction, in order to produce rendered sRGB image data. The digital processor 320 can also provide various image sizes selected by the user. The rendered sRGB image data is then JPEG compressed and stored as a JPEG image file in the image/data memory 330. The JPEG file uses the so-called “Exif” image format described earlier. This format includes an Exif application segment that stores particular image metadata using various TIFF tags. Separate TIFF tags can be used, for example, to store the date and time the picture was captured, the lens f/number and other camera settings, and to store image captions. In particular, the ImageDescription tag can be used to store labels, as will be described later in reference to FIG. 11. The real-time clock 324 provides a date/time value, which is stored as date/time metadata in each Exif image file.
  • The digital processor 320 also creates a low-resolution “thumbnail” size image, which can be created as described in commonly-assigned U.S. Pat. No. 5,164,831, entitled “Electronic Still Camera Providing Multi-Format Storage Of Full And Reduced Resolution Images” to Kuchta, et al., the disclosure of which is herein incorporated by reference. The thumbnail image can be stored in RAM memory 322 and supplied to a color display 332, which can be, for example, an active matrix LCD or organic light emitting diode (OLED). After images are captured, they can be quickly reviewed on the color LCD image display 332 by using the thumbnail image data.
  • The graphical user interface displayed on the color display 332 is controlled by user controls 334. The user controls 334 can include dedicated push buttons (e.g. a telephone keypad) to dial a phone number, a control to set the mode (e.g. "phone" mode, "calendar" mode, "camera" mode), a joystick controller that includes 4-way control (up, down, left, right) and a push-button center "OK" switch, or the like.
  • An audio codec 340 connected to the digital processor 320 receives an audio signal from a microphone 342 and provides an audio signal to a speaker 344. These components can be used both for telephone conversations and to record and playback an audio track, along with a video sequence or still image. The speaker 344 can also be used to inform the user of an incoming phone call. This can be done using a standard ring tone stored in firmware memory 328, or by using a custom ring-tone downloaded from the mobile phone network 358 and stored in the image/data memory 330. In addition, a vibration device (not shown) can be used to provide a silent (e.g. non audible) notification of an incoming phone call.
  • A dock interface 362 can be used to connect the digital camera phone 300 to a dock/charger 364, which is connected to the general control computer 40. The dock interface 362 may conform to, for example, the well-known USB interface specification. Alternatively, the interface between the digital camera phone 300 and the general control computer 40 can be a wireless interface, such as the well-known Bluetooth wireless interface or the well-known 802.11b wireless interface. The dock interface 362 can be used to download images from the image/data memory 330 to the general control computer 40. The dock interface 362 can also be used to transfer calendar information from the general control computer 40 to the image/data memory in the digital camera phone 300. The dock/charger 364 can also be used to recharge the batteries (not shown) in the digital camera phone 300.
  • The digital processor 320 is coupled to a wireless modem 350, which enables the digital camera phone 300 to transmit and receive information via an RF channel 352. The wireless modem 350 communicates over a radio frequency (e.g. wireless) link with a mobile phone network 358, such as a 3GSM network. The mobile phone network 358 communicates with a photo service provider 372, which can store digital images uploaded from the digital camera phone 300. These images can be accessed via the Internet 370 by other devices, including the general control computer 40. The mobile phone network 358 also connects to a standard telephone network (not shown) in order to provide normal telephone service.
  • FIG. 11 is a flow chart of yet another embodiment of the present invention. This embodiment can use the digital camera phone 300 based imaging system described earlier in reference to FIG. 10. In this embodiment, the calendar information is stored in the digital camera phone 300 and accessed in order to provide appropriate labels for digital images as they are captured.
  • In block 400, special events are recorded in the personal calendar 146. This can be done using general control computer 40. As described earlier in reference to FIG. 2, the special events recorded in the personal calendar 146 can include appointments 140 and occasions 142. These special events are labels for blocks of time that are personalized to the user of the digital camera phone 300.
  • In block 402, the personal calendar 146 is transferred to the digital camera phone 300. The transfer can be accomplished by using the dock/charger 364 and dock interface 362 to transfer the calendar data from a hard drive (not shown) of the general control computer 40 to the digital processor 320 in the digital camera phone 300. The digital processor 320 then stores the personal calendar 146 in the image/data memory 330.
  • The digital camera phone 300 includes user controls 334 that enable the user to select various operating modes. In the "phone" mode, the digital camera phone 300 operates as a standard mobile phone. In the "calendar" mode, the digital camera phone 300 displays calendar information on the color display 332. This enables the user to review and modify the appointments 140 or occasions 142 stored as part of the personal calendar 146 in the image/data memory 330. In the "camera" mode, the digital camera phone 300 operates as a still or video camera, in order to capture, display, and transfer images.
  • In block 404, the user selects the camera mode. In block 406, the user composes the image(s) to be captured, using the color display 332 as a viewfinder, and presses one of the user controls 334 (e.g. a shutter button, not shown) to capture the image(s). The image signals provided by the image sensor array 314 are converted to digital signals by A/D converter circuit 316 and stored in DRAM buffer memory 318.
  • In block 408, the digital processor 320 reads the value of the real time clock 324 after each image is captured, to determine the date and time the picture was taken. In block 412, the digital processor 320 retrieves the calendar entry for the date/time provided by the real time clock 324.
  • In block 414, the digital processor 320 determines if the calendar entry for the current date/time corresponds to a special event time, as was described earlier in reference to FIG. 3.
  • In block 416, the digital processor 320 uses the calendar entry to create proposed image metadata. For example, if the calendar entry is “Matthew's Soccer game”, the proposed image metadata could be “Event: Soccer game, Subject: Matthew”.
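The mapping from a calendar entry to proposed metadata could be a simple possessive-pattern heuristic like the one below. The parsing rule is an illustrative guess; the text does not specify how the fields are derived:

```python
import re

def propose_metadata(calendar_entry):
    """Split a possessive calendar entry such as "Matthew's Soccer game"
    into Event/Subject fields; other entries become the event name alone."""
    match = re.match(r"(?P<subject>\w+)'s\s+(?P<event>.+)", calendar_entry)
    if match:
        return {"Event": match.group("event"), "Subject": match.group("subject")}
    return {"Event": calendar_entry}
```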
  • In block 418, the proposed image metadata is displayed on the color display 332 along with the image(s) captured in block 406. This enables the user to see the proposed image metadata, and check whether or not it is an appropriate label for the captured image(s). In addition to displaying the proposed image metadata, the processor 320 displays a request for the user to approve the proposed metadata. This can be done, for example, by displaying the text “OK?” along with “yes” and “no” selectable responses, on the color display 332.
  • In block 420, the user selects either the “yes” or “no” response, using the user controls 334. If the metadata is not appropriate, the user selects the “no” response. This can happen if the special event did not take place, for example if the soccer game was cancelled, so that the images correspond to a different type of event.
  • In block 422, if the user does not “OK” the metadata, (“no” to block 420) the digital processor 320 displays a user interface screen on the color display 332 which enables the user to modify the metadata. This can be done by simply deleting the metadata, or by selecting alternate metadata. The alternate metadata can be selected from a list of frequently used labels (e.g. Science museum, playground) or can be manually entered text strings.
  • It should be noted that once the user has approved or modified the metadata for one image (using blocks 412-422) in order to create an acceptable label for the images, there is normally no need to repeat these steps during the same photo event. Thus, the approved (or modified) metadata can be automatically stored in the image files of all subsequent images taken of the same photo event (for example for all images taken until the camera is turned off), without repeating blocks 416-422.
  • In block 424, if the user “OKs” the metadata, (“yes” to block 420), or if the user provides modified metadata in block 422, the digital processor 320 stores the metadata in the image file(s) of the captured image(s). For example, the metadata can be stored in the ImageDescription tag of the Exif file which contains the captured still image.
  • In block 426, the image files are transferred to the database 120. This can be done, for example, by using the wireless modem 350 to transmit the image files over the mobile phone network 358 to the photo service provider 372. The photo service provider 372 can then store the image files, and enable them to be accessed by various computers, including general control computer 40, over the Internet 370. The image files can also be accessed by the digital camera phone 300, using the mobile phone network 358. Alternatively, the image files can be transferred to the general control computer 40 using the dock interface 362 and dock/recharger 364. The metadata in the image file, such as the Date/Time metadata and the special event labels stored using the ImageDescription tag, can also be read from each image file and stored in a separate metadata database along with the image name, to enable more rapid searching.
  • In block 430, the metadata of the database 120 is searched to locate images of interest. This can be accomplished by entering the query 122, as described earlier in reference to FIG. 6.
  • In block 432, the images having metadata which best match the query 122 are displayed. If the images are stored in the general control computer 40, they can be displayed on the display device 50. Alternatively, if the images are stored by the photo service provider 372, they can be transferred to the digital camera phone 300 using the mobile phone network 358 and displayed on the color display 332.
  • In block 434, the user can modify the metadata associated with particular photo events, in order to correct or augment the metadata labels. The modified metadata labels are then stored in the database 120.
  • FIG. 12 is a flow chart of a still further embodiment of the present invention. In this embodiment, different special events are associated with different lists of individuals (such as family, friends, teammates, work associates, etc.) whom the user would like to allow to access the images of that type of special event taken by the user. For example, the user may want to allow all of Sarah's relatives to view images of Sarah's birthday, and may want to allow all of Matthew's teammates to view images of Matthew's soccer game.
  • In block 401, special events are recorded in a personal calendar 146, as described earlier in reference to block 400 of FIG. 11. In addition, the personal calendar 146 is made accessible over a network. This can be done by enabling the photo service provider 372 to access the personal calendar 146 via the Internet 370.
  • In block 403, share lists for at least some of the special events recorded in the personal calendar 146 are stored. This can be done, for example, by enabling the user to select, from the user's share list already provided by the photo service provider 372, those users that will be allowed to access images associated with different types of special events. For example, there might be a first list of family and friends that are allowed to access holiday and birthday images, a second list of teammates that are allowed to access images of team events, and a third list of work colleagues that are allowed to access images of work associated events. In addition, an event-specific share list can be created automatically by adding all of the participants of a particular event recorded in a personal calendar (e.g. all of the people invited to a party or a meeting) to the share list for that special event.
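Combining a stored type-based share list with an event's recorded participants can be sketched as follows (function and list names are assumed for illustration):

```python
def build_share_list(event_type, type_share_lists, participants=()):
    """Propose a share list for a special event: the stored list for this
    type of special event merged with everyone recorded as a participant
    of this particular event (e.g. everyone invited to a party)."""
    proposed = set(type_share_lists.get(event_type, []))
    proposed.update(participants)
    return sorted(proposed)
```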
  • In block 404, the user selects the camera mode and in block 406, the user composes the image(s) to be captured, as was described earlier in reference to FIG. 11.
  • In block 408, the digital processor 320 reads the value of the real time clock 324 after each image is captured, to determine the date and time the picture was taken. In block 409, the date/time is stored as metadata in association with the captured image(s), for example in the Date/Time TIFF tag of the Exif image file.
  • In block 411, the digital processor 320 in the digital camera phone 300 initiates the transfer of the captured image(s) to the database. This can be done automatically after each image is captured or after a certain number of images are captured, or can be manually initiated by the user. Once initiated, the wireless modem 350 begins to transmit the image file(s) over the mobile phone network 358 to the photo service provider 372.
  • In block 412, the photo service provider 372 retrieves the calendar entry for the date/time stored in the transferred image file.
  • In block 414, the service provider 372 determines if the calendar entry for the current date/time corresponds to a special event time, as was described earlier in reference to FIG. 11.
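The comparison in blocks 412-414 amounts to checking whether a capture time falls within the time span of a special event recorded in the personal calendar. A minimal sketch, with hypothetical class and function names:

```python
from dataclasses import dataclass
from datetime import datetime

@dataclass
class CalendarEntry:
    label: str
    start: datetime
    end: datetime
    is_special: bool  # e.g. an occasion or appointment likely to be photographed

def find_special_event(calendar, capture_time):
    """Return the special event whose time span contains the capture time
    (blocks 412-414), or None if the capture time matches no special event."""
    for entry in calendar:
        if entry.is_special and entry.start <= capture_time <= entry.end:
            return entry
    return None

calendar = [CalendarEntry("Matthew's Soccer game",
                          datetime(2005, 7, 9, 10, 0),
                          datetime(2005, 7, 9, 12, 0), True)]
event = find_special_event(calendar, datetime(2005, 7, 9, 10, 45))
print(event.label if event else "no special event")  # Matthew's Soccer game
```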
  • In block 417, the service provider 372 uses the photo event to create image metadata and a proposed “share list”, which is one of the share lists stored in block 403. For example, if the calendar entry is “Matthew's Soccer game”, the image metadata could be “Event: Soccer game, Subject: Matthew”, and the proposed share list could be the list of all of Matthew's teammates.
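Block 417 can be sketched as splitting a possessive calendar label into subject and event type, then looking up the share list stored in block 403 for that event type. The parsing rule and all names below are illustrative assumptions, not the patent's specified implementation:

```python
def propose_metadata_and_share_list(calendar_label, share_lists):
    """Derive image metadata from a calendar entry label and select the
    stored share list for that event type (block 417)."""
    subject, _, event = calendar_label.partition("'s ")
    if not event:                       # label without a possessive, e.g. "4th of July"
        subject, event = None, calendar_label
    metadata = {"Event": event, "Subject": subject}
    return metadata, share_lists.get(event, [])

# Share lists stored in block 403, keyed by event type (hypothetical data).
share_lists = {"Soccer game": ["teammate1", "teammate2"]}
meta, proposed = propose_metadata_and_share_list("Matthew's Soccer game", share_lists)
print(meta)      # {'Event': 'Soccer game', 'Subject': 'Matthew'}
print(proposed)  # ['teammate1', 'teammate2']
```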
  • In block 419, the share list is communicated to the digital camera phone 300 via the wireless modem 350, and is displayed on the color display 332 along with the image(s) captured in block 406. This enables the user to decide whether or not to share the captured images with the share list associated with the identified special event. In addition to displaying the share list (which can list individuals or the name of the group), the processor 320 displays a request for the user to approve the proposed share list. This can be done, for example, by displaying the text “OK?” along with “yes” and “no” selectable responses, on the color display 332.
  • In block 421, the user selects either the “yes” or “no” response, using the user controls 334. If the share list is not appropriate, the user selects the “no” response. This can happen if the special event did not take place, for example if the soccer game was cancelled, so that the images correspond to a different type of event.
  • In block 423, if the user does not “OK” the share list, (“no” to block 421) the digital processor 320 displays a user interface screen on the color display 332 which enables the user to modify the share list. This can be done by simply not allowing the images to be shared with anyone else, or by selecting an alternative share list.
  • It should be noted that once the user has approved or modified the share list for one image (using blocks 412-423), there is normally no need to repeat these steps during the same photo event. Thus, the approved (or modified) share list can be automatically used for all subsequent images taken of the same photo event (for example for all images taken until the camera is turned off), without repeating blocks 414-423.
  • In block 425, if the user “OKs” the share list, (“yes” to block 421), or if the user provides a modified share list in block 423, the service provider 372 enables the uploaded images associated with the special event to be shared with those users on the share list. The images can be shared using methods well-known in the prior art, for example by sending an email to each individual on the share list which contains a link to enable the images to be viewed using a web browser. The individuals on any share list can also be given other types of authorization, for example to order prints of the transferred images, as described in commonly assigned U.S. Pat. No. 5,760,917 to Sheridan, the disclosure of which is herein incorporated by reference.
  • In block 430, the metadata of the database 120 is searched to locate images of interest. The search can be performed either by the user of the digital camera phone 300, who has access to all of the images, or by individuals on one of the user's share lists, who have access to only those images associated with particular special events. This search can be accomplished by entering the query 122, as described earlier in reference to FIG. 6.
  • In block 432, the images having metadata which best match the query 122 are displayed. The images can be displayed on the display device 50 of the general control computer 40 or they can be transferred to the digital camera phone 300 using the mobile phone network 358 and displayed on the color display 332.
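Blocks 430-432 can be sketched as ranking images by how many query terms occur in their metadata. This scoring rule is one plausible reading of "best match"; the patent does not specify the matching algorithm, and all names below are hypothetical:

```python
def search_images(images, query):
    """Rank images by the number of query terms found in their metadata
    (blocks 430-432); images with no matching term are excluded."""
    terms = [t.lower() for t in query.split()]

    def score(image):
        text = " ".join(str(v) for v in image["metadata"].values()).lower()
        return sum(term in text for term in terms)

    matches = [img for img in images if score(img) > 0]
    return sorted(matches, key=score, reverse=True)

images = [
    {"file": "img1.jpg", "metadata": {"Event": "Soccer game", "Subject": "Matthew"}},
    {"file": "img2.jpg", "metadata": {"Event": "Birthday", "Subject": "Jonah"}},
]
results = search_images(images, "Matthew soccer")
print([img["file"] for img in results])  # ['img1.jpg']
```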
  • FIG. 13 is a flow chart of an additional embodiment of the present invention. In this embodiment, particular special events (such as important meetings, concerts, weddings, etc.) are recorded as silent events in the personal calendar. In addition, ring tones can be associated with other special events, such as a short excerpt from “Stars and Stripes forever” for the 4th of July holiday, or “Happy birthday” for a birthday anniversary. When an incoming phone call is received by the user's digital camera phone, the personal calendar is checked to determine if the current date/time corresponds to a silent event. If it does, the vibration device is used to indicate to the user that there is an incoming call, instead of using a ring tone. If not, the personal calendar is checked to determine if a special ring tone should be used, instead of the default ring tone, to indicate the incoming call.
  • In block 400, special events are recorded in the personal calendar 146. This can be done using general control computer 40. As described earlier in reference to FIG. 2, the special events recorded in the personal calendar 146 can include appointments 140 and occasions 142. These special events are labels for blocks of time that are personalized to the user of the digital camera phone 300.
  • In block 436, the user can record whether some special events are silent events, during which they do not want their phone to ring. These silent events may include particular work-related meetings or social events, such as weddings, concerts, etc., where a ringing phone would annoy others and embarrass the user. The user can also specify that some recurring events (such as staff meetings or concerts) should always be silent events, so that they are automatically recorded as silent events whenever a new event of this type is added to their calendar in block 400.
  • In block 438, the user can assign specific ring tones to specific special events. For example, the user can assign holiday ring tones (e.g. a Christmas theme, Thanksgiving theme, Halloween theme) to these holiday special events, a “Happy Birthday” song ring tone to birthday anniversary special events, etc.
  • In block 440, the personal calendar 146 and the assigned ring tones are transferred from the general control computer 40 to the digital camera phone 300, and stored in the image/data memory 330.
  • In block 442, the digital processor 320 detects an incoming phone call from the mobile phone network 358 via the wireless modem 350.
  • In block 408, the digital processor 320 reads the value of the real time clock 324 to determine the current date and time. In block 412, the digital processor 320 retrieves the personal calendar entry for the date/time provided by the real time clock 324.
  • In block 444, the digital processor 320 determines if the calendar entry for the current date/time corresponds to a silent event recorded in block 436.
  • In block 446, if the current date/time corresponds to a silent event (yes to block 444), the vibration device described earlier in reference to FIG. 10 is used to indicate that there is an incoming call.
  • In block 448, if the current date/time does not correspond to a silent event (no to block 444), the digital processor 320 determines if the current date/time corresponds to a special event which has been assigned a special ring tone.
  • In block 450, the digital processor 320 in the digital camera phone 300 uses either the special ring tone determined in block 448, or the default ring tone, to indicate to the user that there is an incoming phone call.
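The decision flow of blocks 444-450 can be summarized in a single function: vibrate during a silent event, otherwise play the event-specific ring tone if one was assigned, otherwise the default tone. A minimal sketch with hypothetical names and example data:

```python
def incoming_call_indication(calendar_entry, silent_events, ring_tones,
                             default_tone="default"):
    """Choose how to signal an incoming call (blocks 444-450):
    vibrate for a silent event (block 446), else ring with the special
    tone assigned in block 438, else ring with the default tone."""
    if calendar_entry in silent_events:
        return ("vibrate", None)
    return ("ring", ring_tones.get(calendar_entry, default_tone))

silent = {"Staff meeting"}
tones = {"4th of July": "Stars and Stripes excerpt"}
print(incoming_call_indication("Staff meeting", silent, tones))  # ('vibrate', None)
print(incoming_call_indication("4th of July", silent, tones))    # ('ring', 'Stars and Stripes excerpt')
print(incoming_call_indication("Lunch", silent, tones))          # ('ring', 'default')
```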
  • It should be noted that in some embodiments, different ring tones can be assigned to different callers. In this case, a particular special ring tone may be used only for certain callers. For example, the “Happy birthday” ring tone can be used only for calls from a phone number associated with the person whose birthday anniversary is the special event in the user's personal calendar. This can remind the user to convey a “happy birthday” message to the caller.
  • The invention has been described in detail with particular reference to certain preferred embodiments thereof, but it will be understood that variations and modifications can be effected within the spirit and scope of the invention.
  • PARTS LIST
    • 10 image capture device
    • 20 microprocessor unit (aka digital image processor)
    • 30 image output device
    • 40 general control computer
    • 50 display device
    • 60 keyboard
    • 104 digital image and video (also referred to as “collection”)
    • 106 person finder
    • 108 capture time extractor
    • 110 personal features
    • 112 image capture time
    • 114 calendar label annotator
    • 116 appearance models
    • 118 annotations
    • 120 database
    • 122 queried or query
    • 124 query results
    • 140 appointment
    • 142 occasion
    • 144 journal entry
    • 146 personal calendar
    • 150 user input
    • 152 event processor
    • 154 special event list
    • 156 time comparer
    • 158 digital images and videos of interest
    • 162 event clusterer
    • 164 event clusters
    • 170 keyword expander
    • 172 expanded query
    • 180 events of interest
    • 232 photographer
    • 238 photographer determiner
    • 242 personal calendars
    • 244 calendar selector
    • 300 digital camera phone
    • 302 flash
    • 304 lens
    • 310 CMOS image sensor
    • 312 timing generator
    • 314 image sensor array
    • 316 A/D converter circuit
    • 318 DRAM buffer memory
    • 320 digital processor
    • 322 RAM memory
    • 324 real-time clock
    • 328 firmware memory
    • 330 image/data memory
    • 332 color display
    • 334 user controls
    • 340 audio codec
    • 342 microphone
    • 344 speaker
    • 350 wireless modem
    • 352 RF channel
    • 358 phone network
    • 362 dock interface
    • 364 dock/charger
    • 370 Internet
    • 372 service provider
    • 400 block
    • 401 block
    • 402 block
    • 403 block
    • 404 block
    • 406 block
    • 408 block
    • 409 block
    • 411 block
    • 412 block
    • 414 block
    • 416 block
    • 417 block
    • 418 block
    • 419 block
    • 420 block
    • 421 block
    • 422 block
    • 423 block
    • 424 block
    • 425 block
    • 426 block
    • 430 block
    • 432 block
    • 434 block
    • 436 block
    • 438 block
    • 440 block
    • 442 block
    • 444 block
    • 446 block
    • 448 block
    • 450 block

Claims (28)

1. A method for associating event times or time periods with digital images in a collection for determining if a digital image is of interest, comprising:
a) storing a collection of digital images each having an associated capture time;
b) comparing the associated capture time in the collection with a special event time to determine if a digital image in the collection is of interest, wherein the comparing step includes calculation of a special event time associated with a special event based on the calendar time associated with the special event and using such information to perform the comparison step; and
c) associating digital images of interest with the special event.
2. The method of claim 1, further including:
d) labeling a captured digital image with special event identifier information.
3. The method of claim 1, wherein the comparing step further includes collecting a list of special events and related special event time for each collected special event, respectively.
4. The method of claim 1, wherein the associating step includes storing in a database digital images of interest with a special event.
5. The method of claim 1, wherein the comparing step includes user entry of a special event time associated with a special event and using such information to perform the comparison step.
6. The method of claim 1, wherein the comparing step includes associating pre-determined groups of images with a special event.
7. The method of claim 1, wherein a special event is identified from a journal entry, an appointment, or an occasion or a combination thereof.
8. The method of claim 1, wherein a special event is identified by user inspection or automatically or both.
9. The method of claim 1, further including:
d) supplying a query;
e) retrieving images and videos using the special event and the query; and
f) displaying the retrieved images and videos on a display.
10. A method for associating event times or time periods with digital images in a collection for determining if a digital image is of interest, comprising:
a) storing a collection of digital images each having an associated capture time;
b) clustering chronologically related digital images into one or more event clusters;
c) comparing the associated capture times of images in the event clusters with a special event time to determine if an event cluster is of interest; and
d) associating an event cluster of interest with the special event.
11. A method for associating event times or time periods with digital images in a collection for determining if a digital image is of interest, comprising:
a) storing a collection of digital images each having an associated capture time;
b) providing a list of events with associated event times from a personal calendar;
c) classifying an event as a special event;
d) comparing the associated capture times in the collection with an event time associated with the special event to determine if a digital image in the collection is of interest; and
e) associating digital images and videos of interest with the special event.
12. The method of claim 11, wherein the classification of an event as a special event includes determining a likelihood that images will be captured at the event.
13. The method of claim 11, wherein the classification of an event as a special event includes using user input indicating whether images will be captured at the event.
14. A method for associating event times or time periods with digital images in a collection for determining if a digital image is of interest, comprising:
a) storing a collection of digital images each having an associated capture time and an associated photographer;
b) providing a personal calendar of special events with the associated photographer;
c) comparing the associated capture times in the collection with a special event time to determine if a digital image in the collection is of interest; and
d) associating digital images of interest with the special event.
15. The method of claim 14, further including:
e) labeling digital images of interest with special event identifier information associated with the special event.
16. The method of claim 14, wherein the comparing step further includes collecting a list of special events and related special event time for each collected special event, respectively.
17. The method of claim 14, wherein the associating step includes storing in a database digital images of interest with a special event.
18. The method of claim 14, wherein the comparing step includes user entry of a special event time associated with a special event and using such information to perform the comparison step.
19. The method of claim 14, wherein the comparing step includes calculation of a special event time associated with a special event based on the calendar time associated with the special event and using such information to perform the comparison step.
20. A method for labeling digital images captured using a digital capture device, comprising:
a) receiving and storing personal calendar information in a digital capture device, the personal calendar information including occasion or appointment information;
b) capturing and storing a digital image in the digital capture device;
c) determining a capture time for the captured digital image;
d) automatically comparing the capture time with the personal calendar information to determine a special event label; and
e) storing the special event label in association with the captured digital image.
21. The method of claim 20, further including displaying the special event label on a display of the digital capture device.
22. The method of claim 21, further including providing a user interface screen on the display to enable a user to approve the special event label.
23. The method of claim 21, further including transferring the captured digital image and the special event label over a wireless network.
24. A method for enabling a plurality of different users to access particular digital images in a collection of digital images, comprising:
a) storing data listing a plurality of special events and the subset of the plurality of different users associated with each of the special events;
b) storing a collection of digital images each having an associated capture time;
c) determining a particular special event for at least one associated capture time; and
d) enabling the users associated with the particular special event to access the digital image having the associated capture time.
25. The method of claim 24, further including displaying a list of the users associated with the particular special event on a display.
26. The method of claim 25, further including providing a user interface screen on the display to enable the list of users to be approved.
27. A method for indicating an incoming phone call in a mobile phone, comprising:
a) storing personal calendar information in the mobile phone, the personal calendar information including occasion or appointment information and information indicating that at least one occasion or appointment is a silent event;
b) detecting an incoming phone call from a mobile phone network;
c) determining a current date and time;
d) comparing the current date and time with the personal calendar information to determine whether the current date and time corresponds to the silent event;
e) if the current date and time corresponds to the silent event, providing a non-audible indication of the incoming call; and
f) if the current date and time does not correspond to the silent event, providing an audible indication of the incoming call.
28. The method of claim 27 further including:
g) storing, in the personal calendar, information associating a custom ring tone with one of the occasions or appointments; h) comparing the current date and time with the personal calendar information to determine whether the current date and time corresponds to the occasion or appointment associated with the custom ring tone; and
i) if the current date and time corresponds to the occasion or appointment associated with the custom ring tone, using the custom ring tone to provide the audible indication of the incoming call.
US11/178,992 2005-07-11 2005-07-11 Identifying collection images with special events Abandoned US20070008321A1 (en)

Priority Applications (9)

Application Number Priority Date Filing Date Title
US11/178,992 US20070008321A1 (en) 2005-07-11 2005-07-11 Identifying collection images with special events
JP2008521410A JP5225082B2 (en) 2005-07-11 2006-06-26 Identifying aggregate images using special events
PCT/US2006/024754 WO2007008386A2 (en) 2005-07-11 2006-06-26 Identifying collection images with special events
EP09176588A EP2161670A1 (en) 2005-07-11 2006-06-26 Controlling the incoming call indication function of a mobile phone
EP10193979A EP2287755A1 (en) 2005-07-11 2006-06-26 Identifying collection images with special events
EP06785560A EP1902392A2 (en) 2005-07-11 2006-06-26 Identifying collection images with special events
US12/796,698 US8717461B2 (en) 2005-07-11 2010-06-09 Identifying collection images with special events
US12/983,904 US8358358B2 (en) 2005-07-11 2011-01-04 Identifying collection images with special events
US14/177,395 US9049388B2 (en) 2005-07-11 2014-02-11 Methods and systems for annotating images based on special events

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
US11/178,992 US20070008321A1 (en) 2005-07-11 2005-07-11 Identifying collection images with special events

Related Child Applications (2)

Application Number Title Priority Date Filing Date
US12/796,698 Continuation US8717461B2 (en) 2005-07-11 2010-06-09 Identifying collection images with special events
US12/983,904 Division US8358358B2 (en) 2005-07-11 2011-01-04 Identifying collection images with special events

Publications (1)

Publication Number Publication Date
US20070008321A1 true US20070008321A1 (en) 2007-01-11

Family

ID=37007617

Family Applications (4)

Application Number Title Priority Date Filing Date
US11/178,992 Abandoned US20070008321A1 (en) 2005-07-11 2005-07-11 Identifying collection images with special events
US12/796,698 Expired - Fee Related US8717461B2 (en) 2005-07-11 2010-06-09 Identifying collection images with special events
US12/983,904 Expired - Fee Related US8358358B2 (en) 2005-07-11 2011-01-04 Identifying collection images with special events
US14/177,395 Expired - Fee Related US9049388B2 (en) 2005-07-11 2014-02-11 Methods and systems for annotating images based on special events

Family Applications After (3)

Application Number Title Priority Date Filing Date
US12/796,698 Expired - Fee Related US8717461B2 (en) 2005-07-11 2010-06-09 Identifying collection images with special events
US12/983,904 Expired - Fee Related US8358358B2 (en) 2005-07-11 2011-01-04 Identifying collection images with special events
US14/177,395 Expired - Fee Related US9049388B2 (en) 2005-07-11 2014-02-11 Methods and systems for annotating images based on special events

Country Status (4)

Country Link
US (4) US20070008321A1 (en)
EP (3) EP2161670A1 (en)
JP (1) JP5225082B2 (en)
WO (1) WO2007008386A2 (en)

Cited By (137)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20060221779A1 (en) * 2005-03-15 2006-10-05 Fuji Photo Film Co., Ltd. Album generating apparatus, album generating method and program
US20060220983A1 (en) * 2005-03-15 2006-10-05 Fuji Photo Film Co., Ltd. Album creating apparatus, album generating method and program
US20060244765A1 (en) * 2005-04-28 2006-11-02 Fuji Photo Film Co., Ltd. Album creating apparatus, album creating method and program
US20060271855A1 (en) * 2005-05-27 2006-11-30 Microsoft Corporation Operating system shell management of video files
US20070033109A1 (en) * 2005-08-05 2007-02-08 Microsoft Corporation Informal trust relationship to facilitate data sharing
US20070061759A1 (en) * 2005-08-05 2007-03-15 Realnetworks, Inc., System and method for chronologically presenting data
US20070124333A1 (en) * 2005-11-29 2007-05-31 General Instrument Corporation Method and apparatus for associating metadata with digital photographs
US20070158405A1 (en) * 2005-12-22 2007-07-12 Samsung Electronics Co., Ltd. Method and apparatus for managing content in a portable terminal
US20070208860A1 (en) * 2006-03-02 2007-09-06 Zellner Samuel N User specific data collection
US20070208861A1 (en) * 2006-03-02 2007-09-06 Zellner Samuel N User preference interpretation
US20070293265A1 (en) * 2006-06-20 2007-12-20 Nokia Corporation System, device, method, and computer program product for annotating media files
US20070297786A1 (en) * 2006-06-22 2007-12-27 Eli Pozniansky Labeling and Sorting Items of Digital Data by Use of Attached Annotations
US20080021928A1 (en) * 2006-07-24 2008-01-24 Yagnik Jay N Method and apparatus for automatically annotating images
US20080059618A1 (en) * 2006-08-30 2008-03-06 Research In Motion Limited, Automatic attachment of image and/or audio records to electronic calendar meeting event record in portable wireless devices
US20080133526A1 (en) * 2006-12-05 2008-06-05 Palm, Inc. Method and system for processing images using time and location filters
US20080229186A1 (en) * 2007-03-14 2008-09-18 Microsoft Corporation Persisting digital ink annotations as image metadata
US20080244006A1 (en) * 2007-03-27 2008-10-02 Sholem Weisner Method and apparatus for a digital leg history
US20080243868A1 (en) * 2007-03-27 2008-10-02 Sholem Weisner Method and apparatus for a digital leg history
US20080270914A1 (en) * 2007-04-30 2008-10-30 Microsoft Corporation Event highlighting and differentiation view
US20080304808A1 (en) * 2007-06-05 2008-12-11 Newell Catherine D Automatic story creation using semantic classifiers for digital assets and associated metadata
US20090019013A1 (en) * 2007-06-29 2009-01-15 Allvoices, Inc. Processing a content item with regard to an event
US20090030911A1 (en) * 2007-07-26 2009-01-29 Oracle International Corporation Mobile multimedia proxy database
US20090132489A1 (en) * 2007-11-15 2009-05-21 Transcend Information , Inc. Method for managing digital photograph, apparatus for displaying digital photograph, and method for playing the same
US20090150342A1 (en) * 2007-12-05 2009-06-11 International Business Machines Corporation Computer Method and Apparatus for Tag Pre-Search in Social Software
US20090276535A1 (en) * 2002-08-20 2009-11-05 Microsoft Corporation Media streaming of web content data
US7639943B1 (en) * 2005-11-15 2009-12-29 Kalajan Kevin E Computer-implemented system and method for automated image uploading and sharing from camera-enabled mobile devices
US20100030755A1 (en) * 2007-04-10 2010-02-04 Olaworks Inc. Method for inferring personal relationship by using readable data, and method and system for attaching tag to digital data by using the readable data
US20100054600A1 (en) * 2008-08-28 2010-03-04 Microsoft Corporation Tagging Images With Labels
US20100054601A1 (en) * 2008-08-28 2010-03-04 Microsoft Corporation Image Tagging User Interface
US20100114986A1 (en) * 2002-10-16 2010-05-06 Microsoft Corporation Navigating media content by groups
US20100114856A1 (en) * 2008-10-31 2010-05-06 Canon Kabushiki Kaisha Information search apparatus, information search method, and storage medium
US20100115399A1 (en) * 2007-03-15 2010-05-06 Koninklijke Philips Electronics N.V. Method and apparatus for generating an album of images
US20100121852A1 (en) * 2008-11-11 2010-05-13 Samsung Electronics Co., Ltd Apparatus and method of albuming content
WO2010061345A1 (en) * 2008-11-26 2010-06-03 Nokia Corporation An Apparatus and Method for Copying Data to a File
US20100191728A1 (en) * 2009-01-23 2010-07-29 James Francis Reilly Method, System Computer Program, and Apparatus for Augmenting Media Based on Proximity Detection
US20100199227A1 (en) * 2009-02-05 2010-08-05 Jun Xiao Image collage authoring
US20100223302A1 (en) * 2004-10-29 2010-09-02 Microsoft Corporation Features such as titles, transitions, and/or effects which vary according to positions
US20100235366A1 (en) * 2009-03-13 2010-09-16 Microsoft Corporation Data file aggregation with respect to user specific temporal range
US20100269062A1 (en) * 2009-04-15 2010-10-21 International Business Machines, Corpoation Presenting and zooming a set of objects within a window
US20110101270A1 (en) * 2008-04-22 2011-05-05 Atsutaka Manabe Liquid-crystalline medium
EP2218020A4 (en) * 2007-12-03 2011-06-08 Yahoo Inc Associating metadata with media objects using time
US20110161068A1 (en) * 2009-12-29 2011-06-30 Dynavox Systems, Llc System and method of using a sense model for symbol assignment
US20110206284A1 (en) * 2010-02-23 2011-08-25 Madirakshi Das Adaptive event timeline in consumer image collections
US20110211813A1 (en) * 2010-02-26 2011-09-01 Research In Motion Limited Enhanced banner advertisements
EP2369530A1 (en) * 2010-02-26 2011-09-28 Research In Motion Limited Enhanced banner advertisements
US20110252081A1 (en) * 2010-04-08 2011-10-13 Microsoft Corporation Metadata subscription registry
US20110261995A1 (en) * 2010-04-27 2011-10-27 Cok Ronald S Automated template layout system
US20110261994A1 (en) * 2010-04-27 2011-10-27 Cok Ronald S Automated template layout method
US20110269435A1 (en) * 2010-04-30 2011-11-03 Tim Dieckman Automatic iconic display of calendar events on computing devices by inspecting events text
US8055080B2 (en) 2005-03-15 2011-11-08 Fujifilm Corporation Album generating apparatus, album generating method and computer readable medium
US8098896B2 (en) 2005-03-15 2012-01-17 Fujifilm Corporation Album generating apparatus, album generating method and computer readable medium
US20120027303A1 (en) * 2010-07-27 2012-02-02 Eastman Kodak Company Automated multiple image product system
US20120030575A1 (en) * 2010-07-27 2012-02-02 Cok Ronald S Automated image-selection system
WO2012018517A1 (en) 2010-07-26 2012-02-09 Eastman Kodak Company Automatic digital camera photography mode selection
WO2012027178A1 (en) * 2010-08-25 2012-03-01 Eastman Kodak Company Detecting recurring events in consumer image collections
US20120094720A1 (en) * 2010-10-14 2012-04-19 Wonsik Choi Mobile terminal and displaying method thereof
US20120098837A1 (en) * 2010-10-20 2012-04-26 Luckybite Llp Apparatus for augmenting a handheld device
US20120102431A1 (en) * 2010-10-26 2012-04-26 Marc Krolczyk Digital media frame providing customized content
EP2521979A1 (en) * 2010-01-08 2012-11-14 Telefonaktiebolaget LM Ericsson (publ) A method and apparatus for social tagging of media files
US20130038756A1 (en) * 2011-08-08 2013-02-14 Samsung Electronics Co., Ltd. Life-logging and memory sharing
US20130058577A1 (en) * 2011-09-07 2013-03-07 Peter O. Stubler Event classification method for related digital images
US20130058542A1 (en) * 2011-09-07 2013-03-07 Peter O. Stubler Event classification method using lit candle detection
US20130058583A1 (en) * 2011-09-07 2013-03-07 Andrew Charles Gallagher Event classification method using light source detection
US20130111373A1 (en) * 2011-05-07 2013-05-02 Ryouichi Kawanishi Presentation content generation device, presentation content generation method, presentation content generation program, and integrated circuit
US20130185635A1 (en) * 2012-01-17 2013-07-18 Ron Barzel Presenting images from slow image-event stream
US8526925B2 (en) 2006-03-02 2013-09-03 At&T Intellectual Property I, L.P. Environment independent user preference communication
US8570375B1 (en) * 2007-12-04 2013-10-29 Stoplift, Inc. Method and apparatus for random-access review of point of sale transactional video
WO2013170023A1 (en) 2012-05-11 2013-11-14 Intellectual Ventures Fund 83 Llc Photo -album generaton from a photo-collection based on a scenario and on the statistical distribution of the types of photos.
WO2013177515A1 (en) * 2012-05-24 2013-11-28 Nant Holdings Ip, Llc Event archiving, systems and methods
WO2014014588A1 (en) * 2012-07-20 2014-01-23 Intel Corporation Calendar-aware devices
US20140025755A1 (en) * 2012-07-20 2014-01-23 Google Inc. Inferring events based on mob source video
US8644702B1 (en) 2005-12-28 2014-02-04 Xi Processing L.L.C. Computer-implemented system and method for notifying users upon the occurrence of an event
US20140079322A1 (en) * 2012-09-14 2014-03-20 Fujifilm Corporation Image synthesizing system, image processing apparatus, and image processing method
US8761523B2 (en) 2011-11-21 2014-06-24 Intellectual Ventures Fund 83 Llc Group method for making event-related media collection
US20140176419A1 (en) * 2012-12-21 2014-06-26 Nokia Corporation Method and apparatus for sharing content
US8831360B2 (en) 2011-10-21 2014-09-09 Intellectual Ventures Fund 83 Llc Making image-based product from digital image collection
US20140258297A1 (en) * 2013-03-07 2014-09-11 Shahram Davari Automatic grouping of photos into folders and naming the photo folders
US20140280561A1 (en) * 2013-03-15 2014-09-18 Fujifilm North America Corporation System and method of distributed event based digital image collection, organization and sharing
US20150081369A1 (en) * 2012-04-27 2015-03-19 Blackberry Limited Systems and Methods for Providing Files in Relation to a Calendar Event
US20150286722A1 (en) * 2014-04-07 2015-10-08 Sony Corporation Tagging of documents and other resources to enhance their searchability
US20150326778A1 (en) * 2012-12-05 2015-11-12 Tue FRELTOFT Photo survey
US20150324431A1 (en) * 2011-07-13 2015-11-12 Linkedin Corporation Method and system for semantic search against a document collection
US20150331930A1 (en) * 2014-05-16 2015-11-19 Here Global B.V. Method and apparatus for classification of media based on metadata
US20150356121A1 (en) * 2014-06-04 2015-12-10 Commachine, Inc. Position location-enabled, event-based, photo sharing software and service
US20150363409A1 (en) * 2014-06-11 2015-12-17 Kodak Alaris Inc. Method for creating view-based representations from multimedia collections
US20160050704A1 (en) * 2014-08-12 2016-02-18 Lyve Minds, Inc. Image linking and sharing
US20160175008A1 (en) * 2014-12-18 2016-06-23 Medtronic, Inc. Open channel implant tool with additional lumen and implant techniques utilizing such tools
US20160275418A1 (en) * 2012-06-22 2016-09-22 California Institute Of Technology Systems and Methods for the Determining Annotator Performance in the Distributed Annotation of Source Data
US9460205B2 (en) 2012-07-20 2016-10-04 Google Inc. Crowdsourced video collaboration
US20170039428A1 (en) * 2015-08-07 2017-02-09 Canon Kabushiki Kaisha Information processing method, information processing apparatus, and non-transitory computer-readable storage medium
US9584834B1 (en) 2012-06-25 2017-02-28 Google Inc. Video broadcasting with geolocation
US9582482B1 (en) 2014-07-11 2017-02-28 Google Inc. Providing an annotation linking related entities in onscreen content
US20170127019A1 (en) * 2012-11-26 2017-05-04 Intouch Technologies, Inc. Enhanced video interaction for a user interface of a telepresence network
US9703541B2 (en) 2015-04-28 2017-07-11 Google Inc. Entity action suggestion on a mobile device
US20170286808A1 (en) * 2013-08-07 2017-10-05 Google Inc. Systems and methods for inferential sharing of photos
US9965559B2 (en) 2014-08-21 2018-05-08 Google Llc Providing automatic actions for mobile onscreen content
US20180203825A1 (en) * 2017-01-16 2018-07-19 Seiko Epson Corporation Electronic apparatus, electronic system, method of controlling electronic apparatus, and computer-readable recording medium
WO2018145015A1 (en) 2017-02-06 2018-08-09 Kodak Alaris Inc. Method for creating audio tracks for accompanying visual imagery
US10055390B2 (en) 2015-11-18 2018-08-21 Google Llc Simulated hyperlinks on a mobile device based on user intent and a centered selection of text
US10073584B2 (en) 2016-06-12 2018-09-11 Apple Inc. User interfaces for retrieving contextually relevant media content
TWI637347B (en) * 2014-07-31 2018-10-01 Samsung Electronics Co., Ltd. Method and device for providing image
US10157217B2 (en) 2012-05-18 2018-12-18 California Institute Of Technology Systems and methods for the distributed categorization of source data
US10157455B2 (en) 2014-07-31 2018-12-18 Samsung Electronics Co., Ltd. Method and device for providing image
US10178527B2 (en) 2015-10-22 2019-01-08 Google Llc Personalized entity repository
US10277714B2 (en) * 2017-05-10 2019-04-30 Facebook, Inc. Predicting household demographics based on image data
US10282562B1 (en) 2015-02-24 2019-05-07 ImageKeeper LLC Secure digital data collection
US10296166B2 (en) 2010-01-06 2019-05-21 Apple Inc. Device, method, and graphical user interface for navigating and displaying content in context
US10324973B2 (en) 2016-06-12 2019-06-18 Apple Inc. Knowledge graph metadata network based on notable moments
US10328576B2 (en) 2012-05-22 2019-06-25 Intouch Technologies, Inc. Social behavior rules for a medical telepresence robot
US10362219B2 (en) 2016-09-23 2019-07-23 Apple Inc. Avatar creation and editing
US10409858B2 (en) 2013-08-02 2019-09-10 Shoto, Inc. Discovery and sharing of photos between devices
US10429871B2 (en) 2012-07-14 2019-10-01 Causam Energy, Inc. Method and apparatus for actively managing electric power supply for an electric power grid
WO2019217202A1 (en) * 2018-05-07 2019-11-14 Apple, Inc. Automatic digital asset sharing suggestions
US10535005B1 (en) 2016-10-26 2020-01-14 Google Llc Providing contextual actions for mobile onscreen content
US10572132B2 (en) 2015-06-05 2020-02-25 Apple Inc. Formatting content for a reduced-size user interface
US10592750B1 (en) * 2015-12-21 2020-03-17 Amazon Technologies, Inc. Video rule engine
US10595072B2 (en) * 2015-08-31 2020-03-17 Orcam Technologies Ltd. Systems and methods for recognizing faces using non-facial information
US10591921B2 (en) 2011-01-28 2020-03-17 Intouch Technologies, Inc. Time-dependent navigation of telepresence robots
US10803135B2 (en) 2018-09-11 2020-10-13 Apple Inc. Techniques for disambiguating clustered occurrence identifiers
US10846343B2 (en) 2018-09-11 2020-11-24 Apple Inc. Techniques for disambiguating clustered location identifiers
US10892052B2 (en) 2012-05-22 2021-01-12 Intouch Technologies, Inc. Graphical user interfaces including touchpad driving interfaces for telemedicine devices
US10915868B2 (en) 2013-06-17 2021-02-09 Microsoft Technology Licensing, Llc Displaying life events while navigating a calendar
US10970646B2 (en) 2015-10-01 2021-04-06 Google Llc Action suggestions for user-selected content
US11086935B2 (en) 2018-05-07 2021-08-10 Apple Inc. Smart updates from historical database changes
US11212416B2 (en) 2018-07-06 2021-12-28 ImageKeeper LLC Secure digital media capture and analysis
US11237696B2 (en) 2016-12-19 2022-02-01 Google Llc Smart assist for repeated actions
US11243996B2 (en) * 2018-05-07 2022-02-08 Apple Inc. Digital asset search user interface
US11257044B2 (en) * 2017-06-20 2022-02-22 Microsoft Technology Licensing, Llc Automatic association and sharing of photos with calendar events
US11334209B2 (en) 2016-06-12 2022-05-17 Apple Inc. User interfaces for retrieving contextually relevant media content
US11449788B2 (en) 2017-03-17 2022-09-20 California Institute Of Technology Systems and methods for online annotation of source data using skill estimation
US11468198B2 (en) 2020-04-01 2022-10-11 ImageKeeper LLC Secure digital media authentication and analysis
US11481854B1 (en) 2015-02-23 2022-10-25 ImageKeeper LLC Property measurement with automated document production
US20220382811A1 (en) * 2021-06-01 2022-12-01 Apple Inc. Inclusive Holidays
US11553105B2 (en) 2020-08-31 2023-01-10 ImageKeeper, LLC Secure document certification and execution system
US11671493B2 (en) * 2019-12-23 2023-06-06 Apple Inc. Timeline generation
US20230249631A1 (en) * 2012-09-28 2023-08-10 Digital Ally, Inc. Portable video and imaging system
US11782575B2 (en) 2018-05-07 2023-10-10 Apple Inc. User interfaces for sharing contextually relevant media content

Families Citing this family (40)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20090266718A1 (en) 2006-04-10 2009-10-29 Jing Lin Correction of Oxygen Effect in Test Sensor Using Reagents
AU2006249239B2 (en) * 2006-12-07 2010-02-18 Canon Kabushiki Kaisha A method of ordering and presenting images with smooth metadata transitions
US8515460B2 (en) * 2007-02-12 2013-08-20 Microsoft Corporation Tagging data utilizing nearby device information
CN101426082A (en) * 2007-10-31 2009-05-06 深圳富泰宏精密工业有限公司 Calendar display system and method having photo album function
US20100103463A1 (en) * 2008-10-28 2010-04-29 Dhiraj Joshi Determining geographic location of a scanned image
US9565217B2 (en) * 2009-12-31 2017-02-07 Bce Inc. Method, system, network and computer-readable media for controlling outgoing telephony calls
US10602241B2 (en) * 2009-12-31 2020-03-24 Bce Inc. Method, system, network and computer-readable media for controlling outgoing telephony calls to cause initiation of call features
US8531992B2 (en) * 2009-12-31 2013-09-10 Bce Inc. Method, system, network and computer-readable media for controlling outgoing telephony calls to convey media messages to source devices
US20110164739A1 (en) * 2009-12-31 2011-07-07 Bce Inc. Method, call processing system and computer-readable media for conveying an audio stream to a source device during an outgoing call
US9152707B2 (en) * 2010-01-04 2015-10-06 Martin Libich System and method for creating and providing media objects in a navigable environment
US20110283172A1 (en) * 2010-05-13 2011-11-17 Tiny Prints, Inc. System and method for an online memories and greeting service
US20120023454A1 (en) * 2010-07-20 2012-01-26 Sap Ag Schedule management using linked events
US8947547B1 (en) 2010-09-12 2015-02-03 Thomas Nathan Millikan Context and content based automated image and media sharing
JP4940345B2 (en) * 2010-10-29 2012-05-30 株式会社東芝 Electronic apparatus and image processing method
US8577965B2 (en) * 2011-02-25 2013-11-05 Blackberry Limited Knowledge base broadcasting
JP5779938B2 (en) * 2011-03-29 2015-09-16 ソニー株式会社 Playlist creation device, playlist creation method, and playlist creation program
US8831352B2 (en) * 2011-04-04 2014-09-09 Microsoft Corporation Event determination from photos
US9074901B1 (en) * 2011-09-22 2015-07-07 Google Inc. System and method for automatically generating an electronic journal
WO2013159176A1 (en) * 2012-04-27 2013-10-31 Research In Motion Limited Systems and methods for establishing and using a personal linking graph
EP2858536B1 (en) 2012-06-12 2020-11-25 Snap-On Incorporated Auditing and forensics for automated tool control systems
US8788587B2 (en) * 2012-06-15 2014-07-22 Be Labs, Llc System, method, and product for capturing memories
JP6004807B2 (en) * 2012-07-24 2016-10-12 キヤノン株式会社 Image processing apparatus, control method thereof, and program
CN103685714B (en) * 2012-09-26 2016-08-03 Huawei Technologies Co., Ltd. Terminal log generation method and terminal
US20140176661A1 (en) * 2012-12-21 2014-06-26 G. Anthony Reina System and method for surgical telementoring and training with virtualized telestration and haptic holograms, including metadata tagging, encapsulation and saving multi-modal streaming medical imagery together with multi-dimensional [4-d] virtual mesh and multi-sensory annotation in standard file formats used for digital imaging and communications in medicine (dicom)
US9648129B2 (en) * 2013-03-13 2017-05-09 Facebook, Inc. Image filtering based on social context
US9892172B2 (en) 2013-03-15 2018-02-13 Dropbox, Inc. Date and time handling
US20140287779A1 (en) 2013-03-22 2014-09-25 aDesignedPath for UsabilitySolutions, LLC System, method and device for providing personalized mobile experiences at multiple locations
US9503532B2 (en) * 2013-09-03 2016-11-22 Western Digital Technologies, Inc. Rediscovery of past data
US10013639B1 (en) 2013-12-16 2018-07-03 Amazon Technologies, Inc. Analyzing digital images based on criteria
US9690771B2 (en) * 2014-05-30 2017-06-27 Nuance Communications, Inc. Automated quality assurance checks for improving the construction of natural language understanding systems
US10078781B2 (en) * 2014-06-13 2018-09-18 Google Llc Automatically organizing images
CN105590306B (en) * 2014-10-20 2018-01-16 Hangzhou Denghong Technology Co., Ltd. Photograph diary
CN105808542B (en) * 2014-12-29 2019-12-24 Lenovo (Beijing) Co., Ltd. Information processing method and information processing apparatus
US9836650B2 (en) 2015-02-09 2017-12-05 Empire Technology Development Llc Identification of a photographer based on an image
US10142795B2 (en) 2015-02-16 2018-11-27 Tourblend Innovations, Llc Providing digital content for multiple venues
US10817563B2 (en) 2015-02-16 2020-10-27 Tourblend Innovations, Llc Providing location based content to mobile devices
US9658837B1 (en) * 2015-11-06 2017-05-23 Sentry Insurance a Mutual Company Integration of independent platforms
US11170035B2 (en) * 2019-03-29 2021-11-09 Snap Inc. Context based media curation
US20230036109A1 (en) * 2020-02-27 2023-02-02 Panasonic Intellectual Property Management Co., Ltd. Image processing device and image processing method
CN112788266B (en) * 2020-10-28 2022-04-15 Shenzhen Zhenyi Technology Co., Ltd. Video recording method, system, terminal equipment and computer readable storage medium

Citations (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US6108640A (en) * 1997-01-14 2000-08-22 Slotznick; Benjamin System for calculating occasion dates and converting between different calendar systems, and intelligent agent for using same
US20030050982A1 (en) * 2001-09-13 2003-03-13 Chang Sam M. Automatic annotation of audio and/or visual data
US6606411B1 (en) * 1998-09-30 2003-08-12 Eastman Kodak Company Method for automatically classifying images into events
US20030184653A1 (en) * 2002-03-29 2003-10-02 Akito Ohkubo Method, apparatus, and program for classifying images
US20040135904A1 (en) * 2002-12-27 2004-07-15 Kazuo Shiota Image sorting method, device, and program
US20040201740A1 (en) * 2002-03-15 2004-10-14 Canon Kabushiki Kaisha Automatic determination of image storage location
US6810146B2 (en) * 2001-06-01 2004-10-26 Eastman Kodak Company Method and system for segmenting and identifying events in images using spoken annotations
US7415662B2 (en) * 2000-01-31 2008-08-19 Adobe Systems Incorporated Digital media management apparatus and methods

Family Cites Families (31)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US5164831A (en) 1990-03-15 1992-11-17 Eastman Kodak Company Electronic still camera providing multi-format storage of full and reduced resolution images
US6442527B1 (en) * 1995-03-17 2002-08-27 Kathy A. Worthington System and method for personalized and customized time management
US20030118216A1 (en) * 1996-09-04 2003-06-26 Goldberg David A. Obtaining person-specific images in a public venue
US5760917A (en) 1996-09-16 1998-06-02 Eastman Kodak Company Image distribution method and system
US20020087546A1 (en) * 2000-01-31 2002-07-04 Michael Slater Apparatus, methods, and systems for digital photo management
US20020052225A1 (en) * 2000-11-01 2002-05-02 Davis Derek L. Feature timer functionality for a wireless communication unit
US6930707B2 (en) 2000-12-22 2005-08-16 International Business Machines Corporation Digital camera apparatus with biometric capability
US20020101519A1 (en) * 2001-01-29 2002-08-01 Myers Jeffrey S. Automatic generation of information identifying an object in a photographic image
US20020140820A1 (en) * 2001-03-29 2002-10-03 Borden George R. Calendar based photo browser
US7301569B2 (en) * 2001-09-28 2007-11-27 Fujifilm Corporation Image identifying apparatus and method, order processing apparatus, and photographing system and method
US7102670B2 (en) * 2001-10-31 2006-09-05 Hewlett-Packard Development Company, L.P. Bookmarking captured digital images at an event to all present devices
JP2003333497A (en) * 2002-05-14 2003-11-21 Nikon Gijutsu Kobo:Kk Device and program for image management
US20040203644A1 (en) * 2002-06-13 2004-10-14 Anders Randal Alan Customized notification
US6907238B2 (en) 2002-08-30 2005-06-14 Qualcomm Incorporated Beacon for locating and tracking wireless terminals
US20040078389A1 (en) * 2002-10-17 2004-04-22 Hamilton David O. System and method for locating images
US20040075752A1 (en) * 2002-10-18 2004-04-22 Eastman Kodak Company Correlating asynchronously captured event data and images
US7158689B2 (en) * 2002-11-25 2007-01-02 Eastman Kodak Company Correlating captured images and timed event data
US6990333B2 (en) * 2002-11-27 2006-01-24 Microsoft Corporation System and method for timed profile changes on a mobile device
JP2004214759A (en) * 2002-12-27 2004-07-29 Fuji Photo Film Co Ltd Method and apparatus of classifying image, and program
US8026970B2 (en) * 2003-02-27 2011-09-27 Casio Computer Co., Ltd. Image reproduction apparatus capable of simultaneously reproducing plurality of images
GB2403304A (en) 2003-06-25 2004-12-29 Canon Kk Image and date information processing
US7398479B2 (en) 2003-08-20 2008-07-08 Acd Systems, Ltd. Method and system for calendar-based image asset organization
US7840892B2 (en) * 2003-08-29 2010-11-23 Nokia Corporation Organization and maintenance of images using metadata
US20050075096A1 (en) * 2003-10-03 2005-04-07 Aljuraid Nassir Abdulrahman GSM phone applet and method for controlling prayer timings
US7636733B1 (en) * 2003-10-03 2009-12-22 Adobe Systems Incorporated Time-based image management
US8046330B2 (en) * 2003-11-11 2011-10-25 Fujifilm Corporation Image accumulation device and image accumulation method
US20050206751A1 (en) * 2004-03-19 2005-09-22 Eastman Kodak Company Digital video system for assembling video sequences
US7576772B2 (en) * 2004-03-31 2009-08-18 Fotomedia Technologies, Llc Method for specifying image handling for images on a portable device
US20060139709A1 (en) * 2004-12-29 2006-06-29 Louis Bifano System and method for automatically sorting digital photographs
US7853100B2 (en) * 2006-08-08 2010-12-14 Fotomedia Technologies, Llc Method and system for photo planning and tracking
JP2008090885A (en) 2006-09-29 2008-04-17 Oki Electric Ind Co Ltd Semiconductor integrated device

Cited By (270)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20090276535A1 (en) * 2002-08-20 2009-11-05 Microsoft Corporation Media streaming of web content data
US8200772B2 (en) 2002-08-20 2012-06-12 Richard William Saunders Media streaming of web content data
US7991803B2 (en) 2002-10-16 2011-08-02 Microsoft Corporation Navigating media content by groups
US8886685B2 (en) 2002-10-16 2014-11-11 Microsoft Corporation Navigating media content by groups
US20100114986A1 (en) * 2002-10-16 2010-05-06 Microsoft Corporation Navigating media content by groups
US20100223302A1 (en) * 2004-10-29 2010-09-02 Microsoft Corporation Features such as titles, transitions, and/or effects which vary according to positions
US9445016B2 (en) 2004-10-29 2016-09-13 Microsoft Technology Licensing, Llc Features such as titles, transitions, and/or effects which vary according to positions
US8098896B2 (en) 2005-03-15 2012-01-17 Fujifilm Corporation Album generating apparatus, album generating method and computer readable medium
US8086612B2 (en) 2005-03-15 2011-12-27 Fujifilm Corporation Album generating apparatus, album generating method and program
US8055080B2 (en) 2005-03-15 2011-11-08 Fujifilm Corporation Album generating apparatus, album generating method and computer readable medium
US20060221779A1 (en) * 2005-03-15 2006-10-05 Fuji Photo Film Co., Ltd. Album generating apparatus, album generating method and program
US8631322B2 (en) * 2005-03-15 2014-01-14 Fujifilm Corporation Album creating apparatus facilitating appropriate image allocation, album generating method and program
US20060220983A1 (en) * 2005-03-15 2006-10-05 Fuji Photo Film Co., Ltd. Album creating apparatus, album generating method and program
US8423559B2 (en) 2005-03-15 2013-04-16 Fujifilm Corporation Album generating apparatus, album generating method and program
US8856149B2 (en) 2005-03-15 2014-10-07 Fujifilm Corporation Album generating apparatus, album generating method and program
US8260827B2 (en) 2005-03-15 2012-09-04 Fujifilm Corporation Album generating apparatus, album generating method and program
US7908547B2 (en) * 2005-04-28 2011-03-15 Fujifilm Corporation Album creating apparatus, album creating method and program
US20060244765A1 (en) * 2005-04-28 2006-11-02 Fuji Photo Film Co., Ltd. Album creating apparatus, album creating method and program
US20060271855A1 (en) * 2005-05-27 2006-11-30 Microsoft Corporation Operating system shell management of video files
US20070061759A1 (en) * 2005-08-05 2007-03-15 Realnetworks, Inc. System and method for chronologically presenting data
US20070033109A1 (en) * 2005-08-05 2007-02-08 Microsoft Corporation Informal trust relationship to facilitate data sharing
US7853483B2 (en) * 2005-08-05 2010-12-14 Microsoft Corporation Medium and system for enabling content sharing among participants associated with an event
US7639943B1 (en) * 2005-11-15 2009-12-29 Kalajan Kevin E Computer-implemented system and method for automated image uploading and sharing from camera-enabled mobile devices
US20070124333A1 (en) * 2005-11-29 2007-05-31 General Instrument Corporation Method and apparatus for associating metadata with digital photographs
US20070158405A1 (en) * 2005-12-22 2007-07-12 Samsung Electronics Co., Ltd. Method and apparatus for managing content in a portable terminal
US8644702B1 (en) 2005-12-28 2014-02-04 Xi Processing L.L.C. Computer-implemented system and method for notifying users upon the occurrence of an event
US9667581B2 (en) 2005-12-28 2017-05-30 Gula Consulting Limited Liability Company Computer-implemented system and method for notifying users upon the occurrence of an event
US9173009B2 (en) 2005-12-28 2015-10-27 Gula Consulting Limited Liability Company Computer-implemented system and method for notifying users upon the occurrence of an event
US9385984B2 (en) 2005-12-28 2016-07-05 Gula Consulting Limited Liability Company Computer-implemented system and method for notifying users upon the occurrence of an event
US20070208860A1 (en) * 2006-03-02 2007-09-06 Zellner Samuel N User specific data collection
US8526925B2 (en) 2006-03-02 2013-09-03 At&T Intellectual Property I, L.P. Environment independent user preference communication
US20070208861A1 (en) * 2006-03-02 2007-09-06 Zellner Samuel N User preference interpretation
US8375283B2 (en) * 2006-06-20 2013-02-12 Nokia Corporation System, device, method, and computer program product for annotating media files
US20070293265A1 (en) * 2006-06-20 2007-12-20 Nokia Corporation System, device, method, and computer program product for annotating media files
US8301995B2 (en) * 2006-06-22 2012-10-30 Csr Technology Inc. Labeling and sorting items of digital data by use of attached annotations
US20070297786A1 (en) * 2006-06-22 2007-12-27 Eli Pozniansky Labeling and Sorting Items of Digital Data by Use of Attached Annotations
US20080021928A1 (en) * 2006-07-24 2008-01-24 Yagnik Jay N Method and apparatus for automatically annotating images
US8065313B2 (en) * 2006-07-24 2011-11-22 Google Inc. Method and apparatus for automatically annotating images
US20080059618A1 (en) * 2006-08-30 2008-03-06 Research In Motion Limited Automatic attachment of image and/or audio records to electronic calendar meeting event record in portable wireless devices
US20080133526A1 (en) * 2006-12-05 2008-06-05 Palm, Inc. Method and system for processing images using time and location filters
US9665597B2 (en) * 2006-12-05 2017-05-30 Qualcomm Incorporated Method and system for processing images using time and location filters
US20080229186A1 (en) * 2007-03-14 2008-09-18 Microsoft Corporation Persisting digital ink annotations as image metadata
US20100115399A1 (en) * 2007-03-15 2010-05-06 Koninklijke Philips Electronics N.V. Method and apparatus for generating an album of images
US10380202B2 (en) 2007-03-27 2019-08-13 Sholem Weisner Physical location history with URL and positioning system
US10565271B2 (en) 2007-03-27 2020-02-18 Sholem Weisner Method and system governing interaction between URL-possessing elements of a mobile web
US10685068B2 (en) 2007-03-27 2020-06-16 Sholem Weisner Targeting individuals for advertising using digital physical location histories
US20080243868A1 (en) * 2007-03-27 2008-10-02 Sholem Weisner Method and apparatus for a digital leg history
US20080244006A1 (en) * 2007-03-27 2008-10-02 Sholem Weisner Method and apparatus for a digital leg history
US10572554B2 (en) 2007-03-27 2020-02-25 Sholem Weisner Method and system governing interaction between URL-possessing element of a physical web that includes a mobile web
US10642911B2 (en) 2007-03-27 2020-05-05 Sholem Weisner Enhancing digital search results for a business in a target geographic area using URLs of location histories
US10146871B2 (en) * 2007-03-27 2018-12-04 Sholem Weisner Method and apparatus for a digital leg history
US10860667B2 (en) 2007-03-27 2020-12-08 Sholem Weisner Physical location history with key data using positioning system
WO2008118126A1 (en) * 2007-03-27 2008-10-02 Sholem Weisner Method and apparatus for a digital leg history
US11163839B2 (en) * 2007-03-27 2021-11-02 Sholem Weisner Mobile communication device with location histories configured to link individual member to vendor members of network
US10394906B2 (en) 2007-03-27 2019-08-27 Sholem Weisner Physical location history with digital member entries or location history entries
US10565270B2 (en) 2007-03-27 2020-02-18 Sholem Weisner Method and system governing interaction between URL-possessing elements of a physical web that includes a mobile web
US10642910B2 (en) 2007-03-27 2020-05-05 Sholem Weisner Accumulation of location history based on digital member entries from multiple devices of a mobile web
US10394904B2 (en) 2007-03-27 2019-08-27 Sholem Weisner Physical location history with advertising
US10394905B2 (en) 2007-03-27 2019-08-27 Sholem Weisner Method and apparatus for a digital leg history
US20100030755A1 (en) * 2007-04-10 2010-02-04 Olaworks Inc. Method for inferring personal relationship by using readable data, and method and system for attaching tag to digital data by using the readable data
US8402380B2 (en) 2007-04-30 2013-03-19 Microsoft Corporation Event highlighting and differentiation view
US20080270914A1 (en) * 2007-04-30 2008-10-30 Microsoft Corporation Event highlighting and differentiation view
US20080304808A1 (en) * 2007-06-05 2008-12-11 Newell Catherine D Automatic story creation using semantic classifiers for digital assets and associated metadata
US8934717B2 (en) 2007-06-05 2015-01-13 Intellectual Ventures Fund 83 Llc Automatic story creation using semantic classifiers for digital assets and associated metadata
US9535911B2 (en) * 2007-06-29 2017-01-03 Pulsepoint, Inc. Processing a content item with regard to an event
US20090019013A1 (en) * 2007-06-29 2009-01-15 Allvoices, Inc. Processing a content item with regard to an event
US9201880B2 (en) 2007-06-29 2015-12-01 Allvoices, Inc. Processing a content item with regard to an event and a location
US9146922B2 (en) * 2007-07-26 2015-09-29 Oracle International Corporation Mobile multimedia proxy database
US20090030911A1 (en) * 2007-07-26 2009-01-29 Oracle International Corporation Mobile multimedia proxy database
US20090132489A1 (en) * 2007-11-15 2009-05-21 Transcend Information , Inc. Method for managing digital photograph, apparatus for displaying digital photograph, and method for playing the same
US10353943B2 (en) * 2007-12-03 2019-07-16 Oath Inc. Computerized system and method for automatically associating metadata with media objects
US20170011034A1 (en) * 2007-12-03 2017-01-12 Yahoo! Inc. Computerized system and method for automatically associating metadata with media objects
EP2218020A4 (en) * 2007-12-03 2011-06-08 Yahoo Inc Associating metadata with media objects using time
US8570375B1 (en) * 2007-12-04 2013-10-29 Stoplift, Inc. Method and apparatus for random-access review of point of sale transactional video
US8019772B2 (en) * 2007-12-05 2011-09-13 International Business Machines Corporation Computer method and apparatus for tag pre-search in social software
US20090150342A1 (en) * 2007-12-05 2009-06-11 International Business Machines Corporation Computer Method and Apparatus for Tag Pre-Search in Social Software
US20110101270A1 (en) * 2008-04-22 2011-05-05 Atsutaka Manabe Liquid-crystalline medium
US20100054601A1 (en) * 2008-08-28 2010-03-04 Microsoft Corporation Image Tagging User Interface
US8867779B2 (en) 2008-08-28 2014-10-21 Microsoft Corporation Image tagging user interface
US9020183B2 (en) 2008-08-28 2015-04-28 Microsoft Technology Licensing, Llc Tagging images with labels
US8396246B2 (en) 2008-08-28 2013-03-12 Microsoft Corporation Tagging images with labels
US20150016691A1 (en) * 2008-08-28 2015-01-15 Microsoft Corporation Image Tagging User Interface
US20100054600A1 (en) * 2008-08-28 2010-03-04 Microsoft Corporation Tagging Images With Labels
US20100114856A1 (en) * 2008-10-31 2010-05-06 Canon Kabushiki Kaisha Information search apparatus, information search method, and storage medium
US20100121852A1 (en) * 2008-11-11 2010-05-13 Samsung Electronics Co., Ltd Apparatus and method of albuming content
EP2187322A1 (en) * 2008-11-11 2010-05-19 Samsung Electronics Co., Ltd. Apparatus and method of albuming content
WO2010061345A1 (en) * 2008-11-26 2010-06-03 Nokia Corporation An Apparatus and Method for Copying Data to a File
US20100191728A1 (en) * 2009-01-23 2010-07-29 James Francis Reilly Method, System, Computer Program, and Apparatus for Augmenting Media Based on Proximity Detection
US9152292B2 (en) * 2009-02-05 2015-10-06 Hewlett-Packard Development Company, L.P. Image collage authoring
US20100199227A1 (en) * 2009-02-05 2010-08-05 Jun Xiao Image collage authoring
US20100235366A1 (en) * 2009-03-13 2010-09-16 Microsoft Corporation Data file aggregation with respect to user specific temporal range
US20100269062A1 (en) * 2009-04-15 2010-10-21 International Business Machines Corporation Presenting and zooming a set of objects within a window
US9335916B2 (en) * 2009-04-15 2016-05-10 International Business Machines Corporation Presenting and zooming a set of objects within a window
US20110161068A1 (en) * 2009-12-29 2011-06-30 Dynavox Systems, Llc System and method of using a sense model for symbol assignment
US11592959B2 (en) 2010-01-06 2023-02-28 Apple Inc. Device, method, and graphical user interface for navigating and displaying content in context
US11099712B2 (en) 2010-01-06 2021-08-24 Apple Inc. Device, method, and graphical user interface for navigating and displaying content in context
US10296166B2 (en) 2010-01-06 2019-05-21 Apple Inc. Device, method, and graphical user interface for navigating and displaying content in context
US10732790B2 (en) 2010-01-06 2020-08-04 Apple Inc. Device, method, and graphical user interface for navigating and displaying content in context
EP2521979A1 (en) * 2010-01-08 2012-11-14 Telefonaktiebolaget LM Ericsson (publ) A method and apparatus for social tagging of media files
EP2521979A4 (en) * 2010-01-08 2014-12-17 Ericsson Telefon Ab L M A method and apparatus for social tagging of media files
US8718386B2 (en) * 2010-02-23 2014-05-06 Intellectual Ventures Fund 83 Llc Adaptive event timeline in consumer image collections
US20110206284A1 (en) * 2010-02-23 2011-08-25 Madirakshi Das Adaptive event timeline in consumer image collections
US8798445B2 (en) 2010-02-26 2014-08-05 Blackberry Limited Enhanced banner advertisements
US20110211813A1 (en) * 2010-02-26 2011-09-01 Research In Motion Limited Enhanced banner advertisements
EP2369530A1 (en) * 2010-02-26 2011-09-28 Research In Motion Limited Enhanced banner advertisements
US20110252081A1 (en) * 2010-04-08 2011-10-13 Microsoft Corporation Metadata subscription registry
US20110261995A1 (en) * 2010-04-27 2011-10-27 Cok Ronald S Automated template layout system
US20110261994A1 (en) * 2010-04-27 2011-10-27 Cok Ronald S Automated template layout method
US8406461B2 (en) * 2010-04-27 2013-03-26 Intellectual Ventures Fund 83 Llc Automated template layout system
US8406460B2 (en) * 2010-04-27 2013-03-26 Intellectual Ventures Fund 83 Llc Automated template layout method
US20110269435A1 (en) * 2010-04-30 2011-11-03 Tim Dieckman Automatic iconic display of calendar events on computing devices by inspecting events text
WO2012018517A1 (en) 2010-07-26 2012-02-09 Eastman Kodak Company Automatic digital camera photography mode selection
US9686469B2 (en) 2010-07-26 2017-06-20 Apple Inc. Automatic digital camera photography mode selection
US8970720B2 (en) 2010-07-26 2015-03-03 Apple Inc. Automatic digital camera photography mode selection
CN102959944A (en) * 2010-07-26 2013-03-06 柯达公司 Automatic digital camera photography mode selection
US9270882B2 (en) 2010-07-26 2016-02-23 Apple Inc. System and method for contextual digital photography mode selection
US20120027303A1 (en) * 2010-07-27 2012-02-02 Eastman Kodak Company Automated multiple image product system
US20120030575A1 (en) * 2010-07-27 2012-02-02 Cok Ronald S Automated image-selection system
US8811755B2 (en) 2010-08-25 2014-08-19 Apple Inc. Detecting recurring events in consumer image collections
US8634662B2 (en) * 2010-08-25 2014-01-21 Apple Inc. Detecting recurring events in consumer image collections
WO2012027178A1 (en) * 2010-08-25 2012-03-01 Eastman Kodak Company Detecting recurring events in consumer image collections
US20120051644A1 (en) * 2010-08-25 2012-03-01 Madirakshi Das Detecting recurring events in consumer image collections
CN103069420A (en) * 2010-08-25 2013-04-24 伊斯曼柯达公司 Detecting recurring events in consumer image collections
US9336242B2 (en) * 2010-10-14 2016-05-10 Lg Electronics Inc. Mobile terminal and displaying method thereof
US20120094720A1 (en) * 2010-10-14 2012-04-19 Wonsik Choi Mobile terminal and displaying method thereof
US20120098837A1 (en) * 2010-10-20 2012-04-26 Luckybite Llp Apparatus for augmenting a handheld device
US20120102431A1 (en) * 2010-10-26 2012-04-26 Marc Krolczyk Digital media frame providing customized content
US10591921B2 (en) 2011-01-28 2020-03-17 Intouch Technologies, Inc. Time-dependent navigation of telepresence robots
US11468983B2 (en) 2011-01-28 2022-10-11 Teladoc Health, Inc. Time-dependent navigation of telepresence robots
US20130111373A1 (en) * 2011-05-07 2013-05-02 Ryouichi Kawanishi Presentation content generation device, presentation content generation method, presentation content generation program, and integrated circuit
US9710518B2 (en) * 2011-07-13 2017-07-18 Linkedin Corporation Method and system for semantic search against a document collection
US20150324431A1 (en) * 2011-07-13 2015-11-12 Linkedin Corporation Method and system for semantic search against a document collection
US20130038756A1 (en) * 2011-08-08 2013-02-14 Samsung Electronics Co., Ltd. Life-logging and memory sharing
US8634661B2 (en) * 2011-09-07 2014-01-21 Intellectual Ventures Fund 83 Llc Event classification method using light source detection
US8634660B2 (en) * 2011-09-07 2014-01-21 Intellectual Ventures Fund 83 Llc Event classification method using lit candle detection
US20130058577A1 (en) * 2011-09-07 2013-03-07 Peter O. Stubler Event classification method for related digital images
US20130058542A1 (en) * 2011-09-07 2013-03-07 Peter O. Stubler Event classification method using lit candle detection
US20130058583A1 (en) * 2011-09-07 2013-03-07 Andrew Charles Gallagher Event classification method using light source detection
US8831360B2 (en) 2011-10-21 2014-09-09 Intellectual Ventures Fund 83 Llc Making image-based product from digital image collection
US8761523B2 (en) 2011-11-21 2014-06-24 Intellectual Ventures Fund 83 Llc Group method for making event-related media collection
US8707152B2 (en) * 2012-01-17 2014-04-22 Apple Inc. Presenting images from slow image-event stream
US9672194B2 (en) * 2012-01-17 2017-06-06 Apple Inc. Presenting images from slow image-event stream
US20130185635A1 (en) * 2012-01-17 2013-07-18 Ron Barzel Presenting images from slow image-event stream
US20140189505A1 (en) * 2012-01-17 2014-07-03 Apple Inc. Presenting Images From Slow Image-Event Stream
US20150081369A1 (en) * 2012-04-27 2015-03-19 Blackberry Limited Systems and Methods for Providing Files in Relation to a Calendar Event
US10475000B2 (en) * 2012-04-27 2019-11-12 Blackberry Limited Systems and methods for providing files in relation to a calendar event
WO2013170023A1 (en) 2012-05-11 2013-11-14 Intellectual Ventures Fund 83 Llc Photo-album generation from a photo-collection based on a scenario and on the statistical distribution of the types of photos
US8917943B2 (en) 2012-05-11 2014-12-23 Intellectual Ventures Fund 83 Llc Determining image-based product from digital image collection
US10157217B2 (en) 2012-05-18 2018-12-18 California Institute Of Technology Systems and methods for the distributed categorization of source data
US10780582B2 (en) 2012-05-22 2020-09-22 Intouch Technologies, Inc. Social behavior rules for a medical telepresence robot
US11515049B2 (en) 2012-05-22 2022-11-29 Teladoc Health, Inc. Graphical user interfaces including touchpad driving interfaces for telemedicine devices
US11453126B2 (en) 2012-05-22 2022-09-27 Teladoc Health, Inc. Clinical workflows utilizing autonomous and semi-autonomous telemedicine devices
US10328576B2 (en) 2012-05-22 2019-06-25 Intouch Technologies, Inc. Social behavior rules for a medical telepresence robot
US11628571B2 (en) 2012-05-22 2023-04-18 Teladoc Health, Inc. Social behavior rules for a medical telepresence robot
US10892052B2 (en) 2012-05-22 2021-01-12 Intouch Technologies, Inc. Graphical user interfaces including touchpad driving interfaces for telemedicine devices
US10540319B2 (en) 2012-05-24 2020-01-21 Nant Holdings Ip, Llc Event archiving, systems and methods
WO2013177515A1 (en) * 2012-05-24 2013-11-28 Nant Holdings Ip, Llc Event archiving, systems and methods
US11061855B2 (en) 2012-05-24 2021-07-13 Nant Holdings Ip, Llc Event archiving, systems and methods
US10133742B2 (en) 2012-05-24 2018-11-20 Nant Holdings Ip, Llc Event archiving, systems and methods
US9898701B2 (en) * 2012-06-22 2018-02-20 California Institute Of Technology Systems and methods for the determining annotator performance in the distributed annotation of source data
US20160275418A1 (en) * 2012-06-22 2016-09-22 California Institute Of Technology Systems and Methods for the Determining Annotator Performance in the Distributed Annotation of Source Data
US9877059B1 (en) 2012-06-25 2018-01-23 Google Inc. Video broadcasting with geolocation
US9584834B1 (en) 2012-06-25 2017-02-28 Google Inc. Video broadcasting with geolocation
US9788063B1 (en) 2012-06-25 2017-10-10 Google Inc. Video broadcasting with geolocation
US11126213B2 (en) 2012-07-14 2021-09-21 Causam Enterprises, Inc. Method and apparatus for actively managing electric power supply for an electric power grid
US10429871B2 (en) 2012-07-14 2019-10-01 Causam Energy, Inc. Method and apparatus for actively managing electric power supply for an electric power grid
US11782470B2 (en) 2012-07-14 2023-10-10 Causam Enterprises, Inc. Method and apparatus for actively managing electric power supply for an electric power grid
US11625058B2 (en) 2012-07-14 2023-04-11 Causam Enterprises, Inc. Method and apparatus for actively managing electric power supply for an electric power grid
US10768654B2 (en) 2012-07-14 2020-09-08 Causam Energy, Inc. Method and apparatus for actively managing electric power supply for an electric power grid
US9460205B2 (en) 2012-07-20 2016-10-04 Google Inc. Crowdsourced video collaboration
US10692539B2 (en) 2012-07-20 2020-06-23 Google Llc Crowdsourced video collaboration
US9100273B2 (en) 2012-07-20 2015-08-04 Intel Corporation Calendar-aware devices
WO2014014588A1 (en) * 2012-07-20 2014-01-23 Intel Corporation Calendar-aware devices
US20140025755A1 (en) * 2012-07-20 2014-01-23 Google Inc. Inferring events based on mob source video
US9235760B2 (en) * 2012-09-14 2016-01-12 Fujifilm Corporation Image synthesizing system, image processing apparatus, and image processing method
US20140079322A1 (en) * 2012-09-14 2014-03-20 Fujifilm Corporation Image synthesizing system, image processing apparatus, and image processing method
US20230249631A1 (en) * 2012-09-28 2023-08-10 Digital Ally, Inc. Portable video and imaging system
US11910128B2 (en) 2012-11-26 2024-02-20 Teladoc Health, Inc. Enhanced video interaction for a user interface of a telepresence network
US10334205B2 (en) * 2012-11-26 2019-06-25 Intouch Technologies, Inc. Enhanced video interaction for a user interface of a telepresence network
US20170127019A1 (en) * 2012-11-26 2017-05-04 Intouch Technologies, Inc. Enhanced video interaction for a user interface of a telepresence network
US10924708B2 (en) 2012-11-26 2021-02-16 Teladoc Health, Inc. Enhanced video interaction for a user interface of a telepresence network
US9565334B2 (en) * 2012-12-05 2017-02-07 Aspekt R&D A/S Photo survey using smart device with camera
US20150326778A1 (en) * 2012-12-05 2015-11-12 Tue FRELTOFT Photo survey
US20140176419A1 (en) * 2012-12-21 2014-06-26 Nokia Corporation Method and apparatus for sharing content
US9075432B2 (en) * 2012-12-21 2015-07-07 Nokia Technologies Oy Method and apparatus for sharing content
US20140258297A1 (en) * 2013-03-07 2014-09-11 Shahram Davari Automatic grouping of photos into folders and naming the photo folders
US20140280561A1 (en) * 2013-03-15 2014-09-18 Fujifilm North America Corporation System and method of distributed event based digital image collection, organization and sharing
US10915868B2 (en) 2013-06-17 2021-02-09 Microsoft Technology Licensing, Llc Displaying life events while navigating a calendar
US10409858B2 (en) 2013-08-02 2019-09-10 Shoto, Inc. Discovery and sharing of photos between devices
US20170286808A1 (en) * 2013-08-07 2017-10-05 Google Inc. Systems and methods for inferential sharing of photos
US11301729B2 (en) 2013-08-07 2022-04-12 Google Llc Systems and methods for inferential sharing of photos
US9978001B2 (en) * 2013-08-07 2018-05-22 Google Llc Systems and methods for inferential sharing of photos
US10643110B2 (en) 2013-08-07 2020-05-05 Google Llc Systems and methods for inferential sharing of photos
US20150286722A1 (en) * 2014-04-07 2015-10-08 Sony Corporation Tagging of documents and other resources to enhance their searchability
US10503773B2 (en) * 2014-04-07 2019-12-10 Sony Corporation Tagging of documents and other resources to enhance their searchability
US20150331930A1 (en) * 2014-05-16 2015-11-19 Here Global B.V. Method and apparatus for classification of media based on metadata
US20150356121A1 (en) * 2014-06-04 2015-12-10 Commachine, Inc. Position location-enabled, event-based, photo sharing software and service
US11170037B2 (en) * 2014-06-11 2021-11-09 Kodak Alaris Inc. Method for creating view-based representations from multimedia collections
US20150363409A1 (en) * 2014-06-11 2015-12-17 Kodak Alaris Inc. Method for creating view-based representations from multimedia collections
US9824079B1 (en) 2014-07-11 2017-11-21 Google Llc Providing actions for mobile onscreen content
US9762651B1 (en) * 2014-07-11 2017-09-12 Google Inc. Redaction suggestion for sharing screen content
US9582482B1 (en) 2014-07-11 2017-02-28 Google Inc. Providing an annotation linking related entities in onscreen content
US11704136B1 (en) 2014-07-11 2023-07-18 Google Llc Automatic reminders in a mobile environment
US10080114B1 (en) 2014-07-11 2018-09-18 Google Llc Detection and ranking of entities from mobile onscreen content
US9798708B1 (en) 2014-07-11 2017-10-24 Google Inc. Annotating relevant content in a screen capture image
US10592261B1 (en) 2014-07-11 2020-03-17 Google Llc Automating user input from onscreen content
US9916328B1 (en) 2014-07-11 2018-03-13 Google Llc Providing user assistance from interaction understanding
US11907739B1 (en) 2014-07-11 2024-02-20 Google Llc Annotating screen content in a mobile environment
US9788179B1 (en) 2014-07-11 2017-10-10 Google Inc. Detection and ranking of entities from mobile onscreen content
US10652706B1 (en) 2014-07-11 2020-05-12 Google Llc Entity disambiguation in a mobile environment
US11573810B1 (en) 2014-07-11 2023-02-07 Google Llc Sharing screen content in a mobile environment
US10248440B1 (en) 2014-07-11 2019-04-02 Google Llc Providing a set of user input actions to a mobile device to cause performance of the set of user input actions
US10963630B1 (en) 2014-07-11 2021-03-30 Google Llc Sharing screen content in a mobile environment
US9886461B1 (en) 2014-07-11 2018-02-06 Google Llc Indexing mobile onscreen content
US10244369B1 (en) 2014-07-11 2019-03-26 Google Llc Screen capture image repository for a user
US9811352B1 (en) 2014-07-11 2017-11-07 Google Inc. Replaying user input actions using screen capture images
US11347385B1 (en) 2014-07-11 2022-05-31 Google Llc Sharing screen content in a mobile environment
US10491660B1 (en) 2014-07-11 2019-11-26 Google Llc Sharing screen content in a mobile environment
US10733716B2 (en) 2014-07-31 2020-08-04 Samsung Electronics Co., Ltd. Method and device for providing image
US10157455B2 (en) 2014-07-31 2018-12-18 Samsung Electronics Co., Ltd. Method and device for providing image
TWI637347B (en) * 2014-07-31 2018-10-01 三星電子股份有限公司 Method and device for providing image
US20160050704A1 (en) * 2014-08-12 2016-02-18 Lyve Minds, Inc. Image linking and sharing
US9965559B2 (en) 2014-08-21 2018-05-08 Google Llc Providing automatic actions for mobile onscreen content
US20160175008A1 (en) * 2014-12-18 2016-06-23 Medtronic, Inc. Open channel implant tool with additional lumen and implant techniques utilizing such tools
US11481854B1 (en) 2015-02-23 2022-10-25 ImageKeeper LLC Property measurement with automated document production
US11550960B2 (en) 2015-02-24 2023-01-10 ImageKeeper LLC Secure digital data collection
US11227070B2 (en) 2015-02-24 2022-01-18 ImageKeeper LLC Secure digital data collection
US10282562B1 (en) 2015-02-24 2019-05-07 ImageKeeper LLC Secure digital data collection
US9703541B2 (en) 2015-04-28 2017-07-11 Google Inc. Entity action suggestion on a mobile device
US10572132B2 (en) 2015-06-05 2020-02-25 Apple Inc. Formatting content for a reduced-size user interface
US20170039428A1 (en) * 2015-08-07 2017-02-09 Canon Kabushiki Kaisha Information processing method, information processing apparatus, and non-transitory computer-readable storage medium
US10965975B2 (en) * 2015-08-31 2021-03-30 Orcam Technologies Ltd. Systems and methods for recognizing faces using non-facial information
US10595072B2 (en) * 2015-08-31 2020-03-17 Orcam Technologies Ltd. Systems and methods for recognizing faces using non-facial information
US10970646B2 (en) 2015-10-01 2021-04-06 Google Llc Action suggestions for user-selected content
US11089457B2 (en) 2015-10-22 2021-08-10 Google Llc Personalized entity repository
US10178527B2 (en) 2015-10-22 2019-01-08 Google Llc Personalized entity repository
US11716600B2 (en) 2015-10-22 2023-08-01 Google Llc Personalized entity repository
US10733360B2 (en) 2015-11-18 2020-08-04 Google Llc Simulated hyperlinks on a mobile device
US10055390B2 (en) 2015-11-18 2018-08-21 Google Llc Simulated hyperlinks on a mobile device based on user intent and a centered selection of text
US10592750B1 (en) * 2015-12-21 2020-03-17 Amazon Technologies, Inc. Video rule engine
US10324973B2 (en) 2016-06-12 2019-06-18 Apple Inc. Knowledge graph metadata network based on notable moments
US11334209B2 (en) 2016-06-12 2022-05-17 Apple Inc. User interfaces for retrieving contextually relevant media content
US10891013B2 (en) 2016-06-12 2021-01-12 Apple Inc. User interfaces for retrieving contextually relevant media content
US11681408B2 (en) 2016-06-12 2023-06-20 Apple Inc. User interfaces for retrieving contextually relevant media content
US10073584B2 (en) 2016-06-12 2018-09-11 Apple Inc. User interfaces for retrieving contextually relevant media content
US11941223B2 (en) 2016-06-12 2024-03-26 Apple Inc. User interfaces for retrieving contextually relevant media content
US10362219B2 (en) 2016-09-23 2019-07-23 Apple Inc. Avatar creation and editing
US10535005B1 (en) 2016-10-26 2020-01-14 Google Llc Providing contextual actions for mobile onscreen content
US11734581B1 (en) 2016-10-26 2023-08-22 Google Llc Providing contextual actions for mobile onscreen content
US11237696B2 (en) 2016-12-19 2022-02-01 Google Llc Smart assist for repeated actions
US11860668B2 (en) 2016-12-19 2024-01-02 Google Llc Smart assist for repeated actions
US20180203825A1 (en) * 2017-01-16 2018-07-19 Seiko Epson Corporation Electronic apparatus, electronic system, method of controlling electronic apparatus, and computer-readable recording medium
WO2018145015A1 (en) 2017-02-06 2018-08-09 Kodak Alaris Inc. Method for creating audio tracks for accompanying visual imagery
US11449788B2 (en) 2017-03-17 2022-09-20 California Institute Of Technology Systems and methods for online annotation of source data using skill estimation
US10277714B2 (en) * 2017-05-10 2019-04-30 Facebook, Inc. Predicting household demographics based on image data
US11257044B2 (en) * 2017-06-20 2022-02-22 Microsoft Technology Licensing, Llc Automatic association and sharing of photos with calendar events
EP3791288A1 (en) * 2018-05-07 2021-03-17 Apple Inc. Automatic digital asset sharing suggestions
US11782575B2 (en) 2018-05-07 2023-10-10 Apple Inc. User interfaces for sharing contextually relevant media content
WO2019217202A1 (en) * 2018-05-07 2019-11-14 Apple, Inc. Automatic digital asset sharing suggestions
US11243996B2 (en) * 2018-05-07 2022-02-08 Apple Inc. Digital asset search user interface
US11086935B2 (en) 2018-05-07 2021-08-10 Apple Inc. Smart updates from historical database changes
US11212416B2 (en) 2018-07-06 2021-12-28 ImageKeeper LLC Secure digital media capture and analysis
US10846343B2 (en) 2018-09-11 2020-11-24 Apple Inc. Techniques for disambiguating clustered location identifiers
US11775590B2 (en) 2018-09-11 2023-10-03 Apple Inc. Techniques for disambiguating clustered location identifiers
US10803135B2 (en) 2018-09-11 2020-10-13 Apple Inc. Techniques for disambiguating clustered occurrence identifiers
US11671493B2 (en) * 2019-12-23 2023-06-06 Apple Inc. Timeline generation
US11468198B2 (en) 2020-04-01 2022-10-11 ImageKeeper LLC Secure digital media authentication and analysis
US11838475B2 (en) 2020-08-31 2023-12-05 ImageKeeper LLC Secure document certification and execution system
US11553105B2 (en) 2020-08-31 2023-01-10 ImageKeeper, LLC Secure document certification and execution system
US20220382811A1 (en) * 2021-06-01 2022-12-01 Apple Inc. Inclusive Holidays

Also Published As

Publication number Publication date
US8358358B2 (en) 2013-01-22
US9049388B2 (en) 2015-06-02
WO2007008386A3 (en) 2007-05-24
US20110099478A1 (en) 2011-04-28
US20100245625A1 (en) 2010-09-30
EP1902392A2 (en) 2008-03-26
JP5225082B2 (en) 2013-07-03
US20140160315A1 (en) 2014-06-12
EP2161670A1 (en) 2010-03-10
US8717461B2 (en) 2014-05-06
WO2007008386A2 (en) 2007-01-18
JP2009500982A (en) 2009-01-08
EP2287755A1 (en) 2011-02-23

Similar Documents

Publication Publication Date Title
US9049388B2 (en) Methods and systems for annotating images based on special events
US8315463B2 (en) User interface for face recognition
US7663671B2 (en) Location based image classification with map segmentation
US9904723B2 (en) Event based metadata synthesis
JP5570079B2 (en) Data processing apparatus and data processing method
KR100641791B1 (en) Tagging Method and System for Digital Data
US20040264810A1 (en) System and method for organizing images
US20060259511A1 (en) Media object organization across information management services
US20060044635A1 (en) Image file processing method and related technique thereof
US20030033296A1 (en) Digital media management apparatus and methods
US10503777B2 (en) Method and device relating to information management
JP2013225327A (en) Camera user input based image value index
US20080085032A1 (en) Supplying digital images from a collection
JP4638366B2 (en) Representative image selection device and representative image selection method
JP2012068805A (en) Electronic mail apparatus, electronic mail system, program, and mail text creation method

Legal Events

Date Code Title Description
AS Assignment

Owner name: EASTMAN KODAK COMPANY, NEW YORK

Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNORS:GALLAGHER, ANDREW C.;FRYER, SAMUEL M.;LOUI, ALEXANDER C.;AND OTHERS;REEL/FRAME:016752/0493;SIGNING DATES FROM 20050621 TO 20050711

STCB Information on status: application discontinuation

Free format text: ABANDONED -- FAILURE TO RESPOND TO AN OFFICE ACTION

AS Assignment

Owner name: FAR EAST DEVELOPMENT LTD., NEW YORK

Free format text: PATENT RELEASE;ASSIGNORS:CITICORP NORTH AMERICA, INC.;WILMINGTON TRUST, NATIONAL ASSOCIATION;REEL/FRAME:029913/0001

Effective date: 20130201

Owner name: FPC INC., CALIFORNIA

Free format text: PATENT RELEASE;ASSIGNORS:CITICORP NORTH AMERICA, INC.;WILMINGTON TRUST, NATIONAL ASSOCIATION;REEL/FRAME:029913/0001

Effective date: 20130201

Owner name: KODAK PHILIPPINES, LTD., NEW YORK

Free format text: PATENT RELEASE;ASSIGNORS:CITICORP NORTH AMERICA, INC.;WILMINGTON TRUST, NATIONAL ASSOCIATION;REEL/FRAME:029913/0001

Effective date: 20130201

Owner name: CREO MANUFACTURING AMERICA LLC, WYOMING

Free format text: PATENT RELEASE;ASSIGNORS:CITICORP NORTH AMERICA, INC.;WILMINGTON TRUST, NATIONAL ASSOCIATION;REEL/FRAME:029913/0001

Effective date: 20130201

Owner name: KODAK REALTY, INC., NEW YORK

Free format text: PATENT RELEASE;ASSIGNORS:CITICORP NORTH AMERICA, INC.;WILMINGTON TRUST, NATIONAL ASSOCIATION;REEL/FRAME:029913/0001

Effective date: 20130201

Owner name: NPEC INC., NEW YORK

Free format text: PATENT RELEASE;ASSIGNORS:CITICORP NORTH AMERICA, INC.;WILMINGTON TRUST, NATIONAL ASSOCIATION;REEL/FRAME:029913/0001

Effective date: 20130201

Owner name: QUALEX INC., NORTH CAROLINA

Free format text: PATENT RELEASE;ASSIGNORS:CITICORP NORTH AMERICA, INC.;WILMINGTON TRUST, NATIONAL ASSOCIATION;REEL/FRAME:029913/0001

Effective date: 20130201

Owner name: KODAK AMERICAS, LTD., NEW YORK

Free format text: PATENT RELEASE;ASSIGNORS:CITICORP NORTH AMERICA, INC.;WILMINGTON TRUST, NATIONAL ASSOCIATION;REEL/FRAME:029913/0001

Effective date: 20130201

Owner name: KODAK PORTUGUESA LIMITED, NEW YORK

Free format text: PATENT RELEASE;ASSIGNORS:CITICORP NORTH AMERICA, INC.;WILMINGTON TRUST, NATIONAL ASSOCIATION;REEL/FRAME:029913/0001

Effective date: 20130201

Owner name: EASTMAN KODAK INTERNATIONAL CAPITAL COMPANY, INC.,

Free format text: PATENT RELEASE;ASSIGNORS:CITICORP NORTH AMERICA, INC.;WILMINGTON TRUST, NATIONAL ASSOCIATION;REEL/FRAME:029913/0001

Effective date: 20130201

Owner name: LASER-PACIFIC MEDIA CORPORATION, NEW YORK

Free format text: PATENT RELEASE;ASSIGNORS:CITICORP NORTH AMERICA, INC.;WILMINGTON TRUST, NATIONAL ASSOCIATION;REEL/FRAME:029913/0001

Effective date: 20130201

Owner name: KODAK AVIATION LEASING LLC, NEW YORK

Free format text: PATENT RELEASE;ASSIGNORS:CITICORP NORTH AMERICA, INC.;WILMINGTON TRUST, NATIONAL ASSOCIATION;REEL/FRAME:029913/0001

Effective date: 20130201

Owner name: KODAK (NEAR EAST), INC., NEW YORK

Free format text: PATENT RELEASE;ASSIGNORS:CITICORP NORTH AMERICA, INC.;WILMINGTON TRUST, NATIONAL ASSOCIATION;REEL/FRAME:029913/0001

Effective date: 20130201

Owner name: PAKON, INC., INDIANA

Free format text: PATENT RELEASE;ASSIGNORS:CITICORP NORTH AMERICA, INC.;WILMINGTON TRUST, NATIONAL ASSOCIATION;REEL/FRAME:029913/0001

Effective date: 20130201

Owner name: EASTMAN KODAK COMPANY, NEW YORK

Free format text: PATENT RELEASE;ASSIGNORS:CITICORP NORTH AMERICA, INC.;WILMINGTON TRUST, NATIONAL ASSOCIATION;REEL/FRAME:029913/0001

Effective date: 20130201

Owner name: KODAK IMAGING NETWORK, INC., CALIFORNIA

Free format text: PATENT RELEASE;ASSIGNORS:CITICORP NORTH AMERICA, INC.;WILMINGTON TRUST, NATIONAL ASSOCIATION;REEL/FRAME:029913/0001

Effective date: 20130201

AS Assignment

Owner name: INTELLECTUAL VENTURES FUND 83 LLC, NEVADA

Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNOR:EASTMAN KODAK COMPANY;REEL/FRAME:031108/0430

Effective date: 20130201

AS Assignment

Owner name: MONUMENT PEAK VENTURES, LLC, TEXAS

Free format text: RELEASE BY SECURED PARTY;ASSIGNOR:INTELLECTUAL VENTURES FUND 83 LLC;REEL/FRAME:064599/0304

Effective date: 20230728