WO2002057959A2 - Digital media management apparatus and methods - Google Patents

Digital media management apparatus and methods

Info

Publication number
WO2002057959A2
WO2002057959A2 PCT/US2002/001530 US0201530W
Authority
WO
WIPO (PCT)
Prior art keywords
objects
distribution
match group
computer program
program product
Prior art date
Application number
PCT/US2002/001530
Other languages
French (fr)
Other versions
WO2002057959A3 (en)
Inventor
Kenneth Rothmuller
Laurie Vertelney
Michael Slater
Bernard Peuto
Original Assignee
Adobe Systems Incorporated
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Priority claimed from US09/774,523 (external-priority patent US20020087546A1)
Application filed by Adobe Systems Incorporated
Publication of WO2002057959A2
Publication of WO2002057959A3

Classifications

    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06F: ELECTRIC DIGITAL DATA PROCESSING
    • G06F16/00: Information retrieval; Database structures therefor; File system structures therefor
    • G06F16/50: Information retrieval; Database structures therefor; File system structures therefor of still image data
    • G06F16/58: Retrieval characterised by using metadata, e.g. metadata not derived from the content or metadata generated manually
    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06F: ELECTRIC DIGITAL DATA PROCESSING
    • G06F16/00: Information retrieval; Database structures therefor; File system structures therefor
    • G06F16/50: Information retrieval; Database structures therefor; File system structures therefor of still image data
    • G06F16/54: Browsing; Visualisation therefor

Definitions

  • the present invention relates to methods and apparatus for storing, cataloguing, managing, organizing, finding and displaying objects such as digital images.
  • the invention includes methods for associating ("tagging") fields of text and numeric data ("metadata”) with individual objects such as images or photos, storing the objects and associated metadata as records in a relational database, and selecting, sorting, organizing and finding the objects based on their tagged metadata content.
  • Default metadata tags can be specified, and new metadata tags can be defined and created through a tag editor by naming the tag, selecting its tag type, optionally selecting a graphical icon that represents the tag, and filling in any remaining fields or attributes that are unique to and define the tag type.
  • Tags can be readily associated with an object by adding a record containing the tag information or metadata to a database, and relating the tagged metadata record to a database record containing the object or a pointer to the object.
  • Tags can also be graphically associated with an object by, for example, dragging and dropping a graphical icon representing the tag onto a graphical representation of the object. In the latter case, database records containing the tag metadata are automatically created and related to the database record containing the target object or a pointer to the target object.
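The record relationships described above could be sketched as follows. This is an illustrative Python/sqlite3 example only; the table names, column names, and helper function are assumptions for exposition, not drawn from the patent.

```python
import sqlite3

# Illustrative schema: the database stores a pointer (file path) to each
# object rather than the object itself, plus tag records and a relation
# linking tags to objects. All names here are hypothetical.
conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE objects (id INTEGER PRIMARY KEY, path TEXT, taken_at TEXT);
    CREATE TABLE tags (id INTEGER PRIMARY KEY, name TEXT, category TEXT);
    CREATE TABLE object_tags (object_id INTEGER REFERENCES objects(id),
                              tag_id INTEGER REFERENCES tags(id));
""")

def tag_object(conn, object_id, tag_id):
    """Relate a tag's metadata record to an object's record -- the step
    a drag-and-drop of a tag icon onto a photo would trigger."""
    conn.execute("INSERT INTO object_tags VALUES (?, ?)", (object_id, tag_id))

conn.execute("INSERT INTO objects VALUES (1, 'photos/legoland.jpg', '2001-06-18')")
conn.execute("INSERT INTO tags VALUES (1, 'Lori R.', 'people')")
tag_object(conn, 1, 1)
```

A join table like `object_tags` lets many tags relate to one object and one tag relate to many objects, which is what the tagging model above requires.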
  • these search criteria can include, but are not limited to, the date and time the photos were taken, textual information that is associated with the photos such as the names of the people who are in the photos or the places or events where the photos were taken, designations of the photos as favorite photos, and designation of the photos as photos that have been printed, shared with others, or archived on a certain date.
  • the matching objects can be viewed or arranged according to the degree to which they have associated metadata that matches the search criteria.
  • objects that match all of the search criteria can be displayed first, followed by objects that match one or more of the search criteria, and finally by objects that match none of the search criteria.
  • Objects in the different match groups can be differentiated from one another in the display area by visual cues, such as being displayed in front of different background colors or patterns.
  • objects matching all of the search criteria can be displayed in front of a white background, while objects matching some of the search criteria can be displayed in front of a blue background, and objects matching none of the search criteria can be displayed in front of a gray background.
  • the distribution of the objects stored in the database can be displayed as a histogram along a timeline.
  • Time bands can be set along the timeline to indicate a time period that can be used to search for matching objects in the database, or to limit the search results for a given tag search to objects having temporal metadata within the indicated time period.
  • the timeline displays not only the temporal distribution of all objects in the database over the indicated time period, but also the temporal distribution of all objects in the database matching the specified tag search criteria over the indicated time period.
  • the temporal distribution of objects in the database can be represented in a calendar view such that the days of the calendar indicate the number of objects having metadata associated with a given day of the week in a given week of the month.
  • the calendar view can also be used to limit the search results for a tag search, in which case the calendar view will indicate all of the days of the month associated with objects that match all of the tagged search criteria, match some of the tagged search criteria, and match none of the tagged search criteria.
  • Fig. 1 illustrates one embodiment of a user interface for a computer program product in accordance with the present invention.
  • Fig. 2 illustrates an image displayed with its associated metadata, including its tags, in accordance with the present invention.
  • Fig. 3 illustrates a timeline view of the data in accordance with the present invention.
  • Fig. 4 illustrates a calendar view of the data in accordance with the present invention.
  • Fig. 5 illustrates a map view of the data in accordance with the present invention.
  • Fig. 6 illustrates the display of different media types that are stored in accordance with the present invention.
  • the present invention provides a method for users to organize and find digital images and photos by tagging them.
  • Before being tagged, photos must be imported into a database where photographic metadata or information about the photos can be stored. While entire photos can be stored in the database, it is generally more efficient to store pointers to the photos rather than the photos themselves.
  • Photos can be imported into the database from any of a number of devices or sources including, but not limited to, a digital camera, a flash memory device, a hard disk drive, a floppy drive, a CD-ROM, or a networked computer or file server.
  • Once imported into the database, the photos can be tagged with one or more objects containing metadata that identifies the unique or important properties of the photo, such as when or where the photo was taken, or who or what is the subject of the photo.
  • tags 350 can be applied to photos by dragging and dropping graphical icons representing the tags onto one or more photos 1-4 that are displayed in an image area 100.
  • the database record that contains a pointer to the photo is updated to contain or point to metadata that is associated with the tag that has been dropped onto the photo.
  • This metadata can include when the photo was taken, where it was taken, the nature of the event at which it was taken, the subject of the photo, and whether the user considers the photo one of his or her favorites.
  • photos with specific tags or combinations of tags can be readily found in the database by searching the database for all records that contain the same metadata as the metadata that is associated with the one or more search tags.
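The kind of query described above, finding every record that carries all of a set of search tags, could be sketched as follows. The schema and names are illustrative assumptions, not the patent's own.

```python
import sqlite3

# Hypothetical tag-search sketch over a pointer-based photo database.
conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE objects (id INTEGER PRIMARY KEY, path TEXT);
    CREATE TABLE tags (id INTEGER PRIMARY KEY, name TEXT);
    CREATE TABLE object_tags (object_id INTEGER, tag_id INTEGER);
    INSERT INTO objects VALUES (1, 'photos/legoland.jpg'), (2, 'photos/zoo.jpg');
    INSERT INTO tags VALUES (1, 'Lori R.'), (2, 'San Diego');
    INSERT INTO object_tags VALUES (1, 1), (1, 2), (2, 2);
""")

def find_by_tags(conn, tag_names):
    """Return paths of objects tagged with *all* of the named tags."""
    placeholders = ",".join("?" * len(tag_names))
    rows = conn.execute(f"""
        SELECT o.path FROM objects o
        JOIN object_tags ot ON ot.object_id = o.id
        JOIN tags t ON t.id = ot.tag_id
        WHERE t.name IN ({placeholders})
        GROUP BY o.id
        HAVING COUNT(DISTINCT t.id) = ?""",
        (*tag_names, len(tag_names))).fetchall()
    return [path for (path,) in rows]
```

The `GROUP BY`/`HAVING` clause keeps only objects whose matched-tag count equals the number of search tags, i.e. objects matching every criterion.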
  • Tags can be created and modified in a tag editor.
  • the tag editor allows a user to specify a tag name and tag type, and to enter metadata in the form of tag attributes that can be stored in tags of the specified tag type.
  • tags can be divided into one or more tag categories. For example, in one embodiment tags are divided into people, events, places, and miscellaneous tag categories. Tags in the different tag categories generally have different tag attributes that distinguish them from tags in the other tag categories.
  • a tag's attributes do not need to be filled in to associate a tag with a photo.
  • the tag itself is a form of metadata that can be associated with the photo, regardless of whether the tag's possible attributes are also associated with the photo. However, when a tag's attributes are completely or partially filled in, more metadata is associated with the tagged photo, thereby making the photo easier to search for and find.
  • the people tag category includes default tag types for family and friends, and can be customized to include other groups of people such as business associates, classmates, co-workers, and neighbors, and particular individuals such as a spouse, daughter, or friend.
  • Tags in the people category can contain attributes such as a person's name, sex, birthdate, anniversary, postal and/or email address(es), phone number(s), a sharing profile specifying which if any pictures can be shared with the people associated with the tag, and the relationships between the people associated with the tag and other tagged individuals.
  • the events tag category includes default tag types for parties and vacations, and can be customized to include tag types for particular types of events such as concerts, plays, shows and sporting events, and for particular events such as the 2002 Boston Marathon.
  • tags in the events category can include pre-defined calendar events such as New Year's Eve, and customized calendar events such as birthdays and anniversaries.
  • tags in the event tag category can contain attributes corresponding to the names, locations, and dates of the underlying events associated with the tags.
  • the places tag category can be customized to include tag types for particular places such as a home, an office, an art museum, or a vacation destination.
  • Tags in the places tag category can contain attributes corresponding to specific locations that are associated with photos, including the name of the location (e.g., The Metropolitan Opera House), the names of the city, state, country and region of the world in which the photos were taken or which are the subject of the photos, and the geographical coordinates (e.g., longitude and latitude) for those places.
  • The miscellaneous tag category is a customizable catchall for tags that cannot be easily grouped into a meaningful global category with other tags.
  • Miscellaneous tag types include tags for an apartment or home search, tags for artistic photos, and tags for particular cars or types of cars.
  • Miscellaneous tags can contain attributes corresponding to the name of the subject of the photo, and where and when the photo was taken.
  • the metadata that is associated with a photo can be viewed and edited directly by displaying the photo together with its associated metadata.
  • Fig. 2 shows a photo entitled "Lori on the road at Legoland" associated with a customized people tag, Lori R., and a customized places tag, San Diego. The tags and title indicate this is a photo of Lori R. taken on a trip to Legoland in San Diego, CA. This photo can be retrieved from the database in any number of different ways, together with different photos that are related to this photo in different ways, as discussed below.
  • photos in the database that have been tagged with one or more tags can be searched for and sorted by querying the database for all photos having tags that match one or more search tags or the metadata contained within the one or more search tags.
  • These metadata can include, but are not limited to, data indicating whether photos are favorites; frequently viewed; similar to currently selected photos; untagged; taken on a particular day or recurring event; shared with or received from certain people; imported from certain places; and printed or exported on certain dates.
  • the metadata can include the subject of the photo, whether a person, place, or event, as well as the place and/or event at which the photo was taken.
  • For example, the photo of Lori R. in Legoland can be retrieved from the database by querying the database for all photos tagged with a Lori R. tag. This search will pull up all photos of Lori R., including the Legoland photo, regardless of where the photos were taken.
  • the Legoland photo can be retrieved by searching the database for all photos tagged with a San Diego tag. This search will pull up all photos taken in or of San Diego, including the Legoland photo, regardless of who is in the photo.
  • the Legoland photo can be retrieved by searching the database for all photos tagged with both a Lori R. tag and a San Diego tag. This search will pull up all photos taken in or of San Diego that include Lori R, including the Legoland photo.
  • the database search for photos that match certain tags or groups of tags can be graphically constructed by dragging various icons representative of tags 350 into a graphical query builder or lens 220, and searching the database for records with matching tags or metadata.
  • search criteria are applied to the photos in the database, the order in which the photos are displayed is updated so that "best match” photos or photos that match all of the search criteria are displayed at the top of an image area 100 in front of a first background color or pattern, while "close match” photos that match one or more but not all of the search criteria are displayed after the "best match” photos and are visually distinguished from them by, for example, being displayed in front of a second background color or pattern, and "no match" photos that fail to match any of the search criteria are displayed at the bottom of the image area in front of a third background color or pattern.
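The three-way ordering described above can be sketched as a simple partition. Here a photo is modeled as a name plus a set of tag names, and the criteria as a list of tag names; all names are illustrative assumptions.

```python
def match_groups(photos, criteria):
    """Partition photos by how many of the search criteria their tags
    satisfy (hypothetical helper, not the patent's implementation)."""
    best, close, no_match = [], [], []
    for name, tags in photos:
        hits = sum(1 for c in criteria if c in tags)
        if hits == len(criteria):
            best.append(name)       # shown first, e.g. white background
        elif hits > 0:
            close.append(name)      # shown next, e.g. blue background
        else:
            no_match.append(name)   # shown last, e.g. gray background
    return best, close, no_match

photos = [("legoland", {"Lori R.", "San Diego"}),
          ("zoo", {"San Diego"}),
          ("office", {"New York"})]
best, close, no_match = match_groups(photos, ["Lori R.", "San Diego"])
```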
  • the easiest search to conduct on tagged photos is a search for photos taken on a certain date, or within a certain period of time.
  • Information indicating the date and time a photo was taken is often automatically associated with the photo when the photo is created or when the photo is scanned. If the photo is created on a digital camera, the camera will generally tag the photo with the date and time the photo was taken. If the photo is scanned into a digital scanner, the scanner will generally tag the photo with the date and time it was scanned. If for any reason neither the digital camera nor the digital scanner tags the photo with the date and time information, the database will tag the photo with that information when it is first imported.
  • the temporal metadata associated with the photos can be used to present a histogram of photos in the form of a timeline 250 as shown in Fig. 1.
  • the timeline 250 can show the number of photos taken as a function of time over some period of time that can range from the time the first photo in the database was taken to the present.
  • the timeline 250 can be used by itself, or with other tags 350 to specify the criteria used to search for matching photos.
  • the timeline includes adjustable time bands 251 that can be moved to allow timeline 250 to specify the time period that is used to find matching photos.
  • the adjustable time bands 251 can be moved to find all photos in the database that are tagged with a date or timestamp that falls within the range indicated by the adjustable time bands 251. Photos falling within this range are designated "best match" photos, and can be viewed as such in image area 100.
  • the timeline 250 can be used by itself to find all photos taken between Jan. 1, 2000 and Feb. 28, 2000 by moving the adjustable time bands 251 to these two respective dates.
  • the photos in the database that have been tagged with a timestamp falling between these two dates can be retrieved from the database, and displayed in the "best match" section of image area 100.
  • the timeline 250 can be used with other metadata to limit search tag results. For example, if the adjustable time bands 251 of timeline 250 indicate the period of interest extends from Jan. 1, 2000 to Feb. 28, 2000, searching the database for all photos having a San Diego tag will return the photo "Lori on the road at Legoland” as a "best match” photo, and display the photo in image area 100, only if the photo was taken sometime between Jan. 1, 2000 and Feb. 28, 2000. If the photo was taken outside of this time period, it would only appear as a "close match" photo in image area 100.
  • the timeline displays the total number of photos in the database per unit time period in a first color which may be a solid color, and the total number of photos in the database that match the tagged search criteria as "best" or "close” matches in a second color which may be a hatched pattern or color.
  • the timeline 250 shown in Fig. 3 does not display the exact number of photos taken during a given period of time, but rather displays a vertical bar graph with bar heights that are representative of the number of photos taken during a given period of time normalized to the average number of photos taken during all such similar periods of time in the database.
  • the displayed vertical bar can have a height of 0 when no photos have been taken during that period; 1 when one to five photos have been taken during that period; 2 when the normalized number of photos taken during that period was up to 50% of the average number of photos taken during all time periods; 3 when the normalized number of photos taken during that period was between 50% and 80% of the average number of photos taken during all time periods; 4 when the normalized number of photos taken during that period was between 80% and 120% of the average number of photos taken during all time periods; 5 when the normalized number of photos taken during that period was between 120% and 150% of the average number of photos taken during all time periods; 6 when the normalized number of photos taken during that period was between 150% and 200% of the average number of photos taken during all time periods; and 7 when the normalized number of photos taken during that period was more than 200% of the average number of photos taken during all time periods.
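The height scheme above can be written out as a small mapping function. The boundary handling (whether the 50%, 80%, 120%, 150%, and 200% endpoints are inclusive) is an interpretation; the text does not pin it down.

```python
def bar_height(count, average):
    """Map a period's photo count to a timeline bar height of 0-7,
    normalized to the average count across all periods (one reading
    of the scheme described above)."""
    if count == 0:
        return 0
    if count <= 5:
        return 1
    pct = 100.0 * count / average
    if pct <= 50:
        return 2
    if pct <= 80:
        return 3
    if pct <= 120:
        return 4
    if pct <= 150:
        return 5
    if pct <= 200:
        return 6
    return 7
```

Note that the absolute rule (1 to 5 photos gives height 1) takes precedence here over the normalized rules, which is one way to reconcile the two kinds of thresholds the text mixes.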
  • photos taken on a particular day or during a particular month can also be found by displaying the photos in a 2-D histogram or scatter plot such as the calendar view shown in the figure.
  • the calendar view displays all of the photos that have been taken, scanned, or imported into the database on any day in a given month as a function of the day of the week the photos were taken, and the week in the month. If a particular day of the month is selected in the calendar view, all photos taken on that day can be retrieved from the database as "best match" photos.
  • Fig. 4 shows that during the month of June 2001, two sets of photos were taken. The first set contains a single photo taken on June 8, while the second set contains 10 photos taken on June 18. By selecting the June 18 calendar day, the 10 photos taken on June 18 are selected as the "best match" photos, and can be displayed in image area 100.
  • the calendar view can also display the results of a tag search in the month-at- a-glance mode.
  • each day in the calendar can indicate not only whether any photos were taken on that day, but whether the photos taken on that day fall into the "best match", "close match", or "no match" group with respect to the tagged search criteria. For example, if the Legoland photo described in Fig. 2 was one of ten photos of Lori R. taken in San Diego on June 18, 2001, and a search were done for all photos having a San Diego tag, then the June 18, 2001 square in Fig. 4 would indicate that day as having photos in the "best match" group. If, however, a search were done for all photos having a New York tag, the June 18, 2001 square in Fig. 4 would indicate that day as having photos in the "no match" group.
  • calendar days containing one or more photos in the "best match” group can be presented as white squares
  • calendar days containing one or more photos in the "close match” group and no photos in the "best match” group can be presented as blue squares
  • calendar days containing no photos in either the "best match” or "close match” groups can be presented as gray squares.
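The three color rules above reduce to a small precedence function. The representation (a set of match-group labels for the photos falling on a day) is an illustrative assumption.

```python
def day_color(groups):
    """Color for a calendar day given the set of match groups
    ('best', 'close', 'no') of the photos taken that day: best match
    wins over close match, which wins over no match (per the rules
    above). A day with no photos at all also renders gray here."""
    if "best" in groups:
        return "white"
    if "close" in groups:
        return "blue"
    return "gray"
```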
  • data can be searched for and displayed in an alternate 2-D histogram or scatter plot such as a map view.
  • the place tag metadata is used to display the geographic distribution of photos in the database.
  • the map view can be used to search for photos either by itself, or in conjunction with one or more tag searches. If the map view is used by itself to search for photos, icons representing the places where photos in the database have been taken are displayed on a map such as the world map shown in Fig. 5. When a location on the map is selected, photos taken in that location can be retrieved from the database as photos in the "best match" group. For example, if the location of Kenya on the map in Fig. 5 is selected, photos taken in Kenya can be selected from the database, and can be displayed in image area 100 as "best match" photos.
  • the map view can also be used in conjunction with a tag search.
  • the map view will display not only the distribution of photos as a function of geographic location, but whether the photos taken at the various geographic locations fall in the "best match", "close match”, or "no match” group with respect to the tagged search criteria. For example, if a search for all photos having an African tag were displayed in the map view, the map view would indicate that photos taken from the Kenya Safari fall into the "best match” group, while all of the other photos shown in Fig. 5 fall into the "no match” group.
  • the particular group into which a set of photos taken from a given location falls can be indicated on the map using the same color based indication scheme used to indicate matching photo groups that are displayed in image area 100.
  • locations containing one or more photos in the "best match” group can be presented as a white area
  • locations containing one or more photos in the "close match” group and no photos in the "best match” group can be presented as a blue area
  • locations containing no photos in either the "best match” or "close match” groups can be presented as a gray area.
  • the map view can be varied in size and shape to accommodate the geographic extent of the photos that are either in the database or that are responsive to a tag search conducted on the photos in the database. Thus, if a map view is used to display or further search among database photos having a North American tag, the map view can be limited to a view of the North American continent.
  • the map view can also be varied in size and shape by selecting particular regions of a map, such as the western region of the United States, or by zooming in and out of the currently displayed map region. Photos corresponding to particular locations within a map view, like San Diego, can be geographically found directly from the map view.
  • photos matching a given set of tags can be selected as a group, and various photo management functions such as printing, sharing, or exporting the photos to a slide show or to a photo album can be performed on the group.
  • all photos in the "best match” group are selected when selecting photos that match a given set of tag search criteria.
  • the default can be changed such that all photos in both the "close match” and "best match” groups are selected when selecting photos that match a given set of tag search criteria.
  • the invention can equally be used to manage, catalogue, search for and find other types of digital media such as video files, audio files, photo slide shows, and photo albums.
  • These different types of media can be distinguished from one another with a tag of tag type media.
  • the media tag when applied to a media object, can graphically indicate the type of media object that is stored in the database.
  • a video file 600 can be stored in the database and identified by displaying its first frame together with an overlaid video file icon.
  • an audio file 610 can be stored in the database and identified by displaying the title of the audio file together with an audio file icon. Audio files can be associated with and stored as a component part of a slide show or photo album, and can be played as a soundtrack whenever the slide show or photo album is viewed. Slide shows such as slide show 620, and photo albums such as photo album 630, can also be stored in the database and iconically identified as shown in Fig. 6. Each of these objects can be tagged, searched for, and manipulated using the same tools that are used to tag, search for, and manipulate digital photos, as previously discussed.
  • Apparatus of the invention can be implemented in a computer program product tangibly embodied in a machine-readable storage device for execution by a programmable processor; and method steps of the invention can be performed by a programmable processor executing a program of instructions to perform functions of the invention by operating on input data and generating output.
  • the invention can be implemented advantageously in one or more computer programs that are executable on a programmable system including at least one programmable processor coupled to receive data and instructions from, and to transmit data and instructions to, a data storage system, at least one input device, and at least one output device.
  • Each computer program can be implemented in a high-level procedural or object-oriented programming language, or in assembly or machine language if desired; and in any case, the language can be a compiled or interpreted language.
  • Suitable processors can include both general and special purpose microprocessors.
  • a processor will receive instructions and data from a read-only memory and/or a random access memory.
  • a computer will include one or more mass storage devices for storing data files; such devices include magnetic disks, such as internal hard disks and removable disks; magneto-optical disks; and optical disks.
  • Storage devices suitable for tangibly embodying computer program instructions and data include all forms of non-volatile memory, including by way of example semiconductor memory devices, such as EPROM, EEPROM, and flash memory devices; magnetic disks such as internal hard disks and removable disks; magneto-optical disks; and CD-ROM disks. Any of the foregoing can be supplemented by, or incorporated in, application-specific integrated circuits (ASICs).

Abstract

Methods and apparatus for managing, finding and displaying objects such as digital images. Objects are tagged ('associated') with descriptive textual and numeric data ('metadata'), and stored in a relational database from which they can be selected, sorted, and found. Tags can be defined by name, tag type, and associated attributes. Objects can be tagged by dropping a tag onto the object, or relating a database record for the tag to a database record for the object. Tagged objects can be searched for and displayed according to the degree to which their metadata matches the search criteria. Visual cues can indicate whether displayed objects match all, some but not all, or none of the search criteria. Database object distributions can be displayed as histograms or scatter plots, including timelines, calendars or maps. Object distributions can be used to search for objects or to limit search results for a previous search.

Description

DIGITAL MEDIA MANAGEMENT APPARATUS AND METHODS
Background
With the advent of digital photography and the world-wide-web, there has been an exponential growth in the creation and storage of digital photographic images. As the number of digital photographs taken and stored has grown, so too has the need for a convenient method of archiving, cataloguing, searching, and retrieving them. Modern methods of archiving and storing digital images typically require users to remember large amounts of information merely to locate photos that are of particular interest to them. For example, many users currently store their digital images in the hierarchical, directory-based file system structure that is native to personal computers. To find particular photos stored in such a hierarchical directory tree or structure, users must know the full pathname to the directory in which their photographs are stored.
There are other disadvantages to storing digital photographs in a hierarchical, directory-based file system. For example, cataloguing and storing groups of photos by categories such as vacation photos or wedding photos requires creating different directories for each of the desired categories. This further increases the amount of information that must be remembered in order to locate desired photos. In addition, in order to store photos in two or more overlapping categories, such as photos that include your favorite aunt and photos from your cousin's wedding, users must either store duplicate photographs, or master the concepts of directory trees and file pointers. While these are not difficult concepts for sophisticated computer users, they can be troublesome for less sophisticated users, thereby limiting the useful ways these users can store and retrieve digital photographs and photographic information.
Summary
The present invention relates to methods and apparatus for storing, cataloguing, managing, organizing, finding and displaying objects such as digital images. The invention includes methods for associating ("tagging") fields of text and numeric data ("metadata") with individual objects such as images or photos, storing the objects and associated metadata as records in a relational database, and selecting, sorting, organizing and finding the objects based on their tagged metadata content. Default metadata tags can be specified, and new metadata tags can be defined and created through a tag editor by naming the tag, selecting its tag type, optionally selecting a graphical icon that represents the tag, and filling in any remaining fields or attributes that are unique to and define the tag type. Tags can be readily associated with an object by adding a record containing the tag information or metadata to a database, and relating the tagged metadata record to a database record containing the object or a pointer to the object. Tags can also be graphically associated with an object by, for example, dragging and dropping a graphical icon representing the tag onto a graphical representation of the object. In the latter case, database records containing the tag metadata are automatically created and related to the database record containing the target object or a pointer to the target object.
Once objects have been tagged with metadata, they can be searched for according to one or more tagged search criteria. When the objects to be searched for are photos, these search criteria can include, but are not limited to, the date and time the photos were taken, textual information that is associated with the photos such as the names of the people who are in the photos or the places or events where the photos were taken, designations of the photos as favorite photos, and designations of the photos as photos that have been printed, shared with others, or archived on a certain date.
When a database is searched for objects that match one or more tagged search criteria, the matching objects can be viewed or arranged according to the degree to which they have associated metadata that matches the search criteria. In particular, objects that match all of the search criteria can be displayed first, followed by objects that match one or more of the search criteria, and finally by objects that match none of the search criteria. Objects in the different match groups can be differentiated from one another in the display area by visual cues, such as being displayed in front of different background colors or patterns. Thus, objects matching all of the search criteria can be displayed in front of a white background, while objects matching some of the search criteria can be displayed in front of a blue background, and objects matching none of the search criteria can be displayed in front of a gray background. The distribution of the objects stored in the database can be displayed as a histogram along a timeline. Time bands can be set along the timeline to indicate a time period that can be used to search for matching objects in the database, or to limit the search results for a given tag search to objects having temporal metadata within the indicated time period. When the timeline is used to limit the search results for a tag search, the timeline displays not only the temporal distribution of all objects in the database over the indicated time period, but also the temporal distribution of all objects in the database matching the specified tag search criteria over the indicated time period.
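For illustration only, the match-group logic described above can be sketched in Python; the function name and the representation of tags as string sets are assumptions of this sketch, not part of the disclosed embodiment:

```python
def match_group(photo_tags, search_tags):
    """Classify a photo's tag set against a set of search criteria.

    Returns "best match" if all criteria are met, "close match" if
    at least one is met, and "no match" otherwise.
    """
    matched = photo_tags & search_tags  # criteria this photo satisfies
    if matched == search_tags:
        return "best match"
    if matched:
        return "close match"
    return "no match"
```

The display order and background color (white, blue, or gray in the example above) then follow directly from the returned group.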
In addition to timelines, the temporal distribution of objects in the database can be represented in a calendar view such that the days of the calendar indicate the number of objects having metadata associated with a given day of the week in a given week of the month. The calendar view can also be used to limit the search results for a tag search, in which case the calendar view will indicate all of the days of the month associated with objects that match all of the tagged search criteria, match some of the tagged search criteria, and match none of the tagged search criteria.
The details of one or more embodiments of the invention are set forth in the accompanying drawings and the description below. Other objects, features, and advantages of the invention will be apparent from the description and drawings, and the claims.
Brief Description of Drawings
Fig. 1 illustrates one embodiment of a user interface for a computer program product in accordance with the present invention.
Fig. 2 illustrates an image displayed with its associated metadata, including its tags, in accordance with the present invention.
Fig. 3 illustrates a timeline view of the data in accordance with the present invention.
Fig. 4 illustrates a calendar view of the data in accordance with the present invention.
Fig. 5 illustrates a map view of the data in accordance with the present invention.
Fig. 6 illustrates the display of different media types that are stored in accordance with the present invention.
Detailed Description
The present invention provides a method for users to organize and find digital images and photos by tagging them. Before being tagged, photos must be imported into a database where photographic metadata or information about the photos can be stored. While entire photos can be stored in the database, it is generally more efficient to store pointers to photos in the database rather than the photos themselves. Photos can be imported into the database from any of a number of devices or sources including, but not limited to, a digital camera, a flash memory device, a hard disk drive, a floppy drive, a CD-ROM, or a networked computer or file server. Once imported into the database, the photos can be tagged with one or more objects containing metadata that identifies the unique or important properties of the photo such as when or where the photo was taken, or who or what is the subject of the photo.
As shown in Fig. 1, in one embodiment tags 350 can be applied to photos by dragging and dropping graphical icons representing the tags onto one or more photos 1-4 that are displayed in an image area 100. When a tag is dropped onto a photo, the database record that contains a pointer to the photo is updated to contain or point to metadata that is associated with the tag that has been dropped onto the photo. This metadata can include when the photo was taken, where it was taken, the nature of the event at which it was taken, the subject of the photo, and whether the user considers the photo one of his or her favorites. Once tagged, photos with specific tags or combinations of tags can be readily found in the database by searching the database for all records that contain the same metadata as the metadata that is associated with the one or more search tags.
Tags, and the metadata they contain, can be created and modified in a tag editor. The tag editor allows a user to specify a tag name and tag type, and to enter metadata in the form of tag attributes that can be stored in tags of the specified tag type. For convenience, tags can be divided into one or more tag categories. For example, in one embodiment tags are divided into people, events, places and miscellaneous tag categories. Tags in the different tag categories generally have different tag attributes that distinguish them from tags in other tag categories. In general, a tag's attributes do not need to be filled in to associate a tag with a photo. The tag itself is a form of metadata that can be associated with the photo, regardless of whether the tag's possible attributes are also associated with the photo. However, when a tag's attributes are completely or partially filled in, more metadata is associated with the tagged photo, thereby making the photo easier to search for and find.
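One way to model such a tag is as a small record whose attributes may be left empty, as described above. The field names in this Python sketch are illustrative assumptions, not the patent's database schema:

```python
from dataclasses import dataclass, field
from typing import Optional

@dataclass
class Tag:
    """Illustrative tag record; field names are assumptions for this sketch."""
    name: str                    # e.g. "Lori R."
    category: str                # "people", "events", "places", or "miscellaneous"
    icon: Optional[str] = None   # optional graphical icon used for drag-and-drop
    attributes: dict = field(default_factory=dict)  # optional; may remain empty

# A tag is valid with no attributes filled in; adding them later simply
# associates more searchable metadata with any photos carrying the tag.
lori = Tag(name="Lori R.", category="people")
lori.attributes["birthdate"] = "1970-01-01"
```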
The people tag category includes default tag types for family and friends, and can be customized to include other groups of people such as business associates, classmates, co-workers, and neighbors, and particular individuals such as a spouse, daughter, or friend. Tags in the people category can contain attributes such as a person's name, sex, birthdate, anniversary, postal and/or email address(es), phone number(s), a sharing profile specifying which if any pictures can be shared with the people associated with the tag, and the relationships between the people associated with the tag and other tagged individuals.
The events tag category includes default tag types for parties and vacations, and can be customized to include tag types for particular types of events such as concerts, plays, shows and sporting events, and for particular events such as the 2002 Boston Marathon. In addition, tags in the events category can include pre-defined calendar events such as New Years Eve, and customized calendar events such as birthdays and anniversaries. Tags in the event tag category can contain attributes corresponding to the names, locations, and dates of the underlying events associated with the tags.
The places tag category can be customized to include tag types for particular places such as a home, an office, an art museum, or a vacation destination. Tags in the places tag category can contain attributes corresponding to specific locations that are associated with photos, including the name of the location (e.g., The Metropolitan Opera House), the names of the city, state, country and region of the world in which the photos were taken or which are the subject of the photos, and the geographical coordinates (e.g., longitude and latitude) for those places.
Finally, the miscellaneous tag category serves as a customizable catchall for tags that cannot be easily grouped into a meaningful global category with other tags. Examples of miscellaneous tag types include tags for an apartment or home search, tags for artistic photos, and tags for particular cars or types of cars. Miscellaneous tags can contain attributes corresponding to the name of the subject of the photo, and where and when the photo was taken.
As shown in Fig. 2, the metadata that is associated with a photo can be viewed and edited directly by displaying the photo together with its associated metadata. Fig. 2 shows a photo entitled "Lori on the road at Legoland" associated with a customized people tag, Lori R., and a customized places tag, San Diego. The tags and title indicate this is a photo of Lori R. taken on a trip to Legoland in San Diego, CA. This photo can be retrieved from the database in any number of different ways, together with different photos that are related to this photo in different ways, as discussed below.
In general, photos in the database that have been tagged with one or more tags can be searched for and sorted by querying the database for all photos having tags that match one or more search tags or the metadata contained within the one or more search tags. These metadata can include, but are not limited to, data indicating whether photos are favorites; frequently viewed; similar to currently selected photos; untagged; taken on a particular day or recurring event; shared with or received from certain people; imported from certain places; and printed or exported on certain dates. In addition, the metadata can include the subject of the photo, whether a person, place, or event; as well as the place and/or event at which the photo was taken. For example, the photo of Lori R. in Legoland can be retrieved from the database by querying the database for all photos tagged with a Lori R. tag. This search will pull up all photos of Lori R., including the Legoland photo, regardless of where the photos were taken. Alternatively, the Legoland photo can be retrieved by searching the database for all photos tagged with a San Diego tag. This search will pull up all photos taken in or of San Diego, including the Legoland photo, regardless of who is in the photo. Finally, the Legoland photo can be retrieved by searching the database for all photos tagged with both a Lori R. tag and a San Diego tag. This search will pull up all photos taken in or of San Diego that include Lori R., including the Legoland photo.
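The conjunctive tag query in the Lori R. example can be sketched as follows. The in-memory record format is an assumption of this sketch; the embodiment stores the records in a relational database:

```python
def find_tagged(photos, *search_tags):
    """Return photos whose tags include every search tag (a 'best match' query)."""
    wanted = set(search_tags)
    return [p for p in photos if wanted <= p["tags"]]

photos = [
    {"title": "Lori on the road at Legoland", "tags": {"Lori R.", "San Diego"}},
    {"title": "Harbor at sunset",             "tags": {"San Diego"}},
    {"title": "Lori at home",                 "tags": {"Lori R."}},
]

# Querying by both tags returns only photos of Lori R. taken in or of San Diego.
both = find_tagged(photos, "Lori R.", "San Diego")
```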
The database search for photos that match certain tags or groups of tags can be graphically constructed by dragging various icons representative of tags 350 into a graphical query builder or lens 220, and searching the database for records with matching tags or metadata. When search criteria are applied to the photos in the database, the order in which the photos are displayed is updated so that "best match" photos or photos that match all of the search criteria are displayed at the top of an image area 100 in front of a first background color or pattern, while "close match" photos that match one or more but not all of the search criteria are displayed after the "best match" photos and are visually distinguished from them by, for example, being displayed in front of a second background color or pattern, and "no match" photos that fail to match any of the search criteria are displayed at the bottom of the image area in front of a third background color or pattern.
Perhaps the easiest search to conduct on tagged photos is a search for photos taken on a certain date, or within a certain period of time. As previously mentioned, among the metadata that can be stored with a photo is information indicating the date and time a photo was taken. This information is often automatically associated with a photo when the photo is created or when the photo is scanned into a digital scanner. If the photo is created on a digital camera, the camera will generally tag the photo with the date and time the photo was taken. If the photo is scanned into a digital scanner, the scanner will generally tag the photo with the date and time it was scanned. If for any reason neither the digital camera nor the digital scanner tags the photo with the date and time information, the database will tag the photo with the information when it is first imported.
As shown in Fig. 3, when photos are imported into a database, the temporal metadata associated with the photos can be used to present a histogram of photos in the form of a timeline 250 as shown in Fig. 1. The timeline 250 can show the number of photos taken as a function of time over some period of time that can range from the time the first photo in the database was taken to the present. The timeline 250 can be used by itself, or with other tags 350 to specify the criteria used to search for matching photos. The timeline includes adjustable time bands 251 that can be moved to allow timeline 250 to specify the time period that is used to find matching photos.
When the timeline 250 is used by itself to search for matching photos, the adjustable time bands 251 can be moved to find all photos in the database that are tagged with a date or timestamp that falls within the range indicated by the adjustable time bands 251. Photos falling within this range are designated "best match" photos, and can be viewed as such in image area 100. For example, the timeline 250 can be used by itself to find all photos taken between Jan. 1, 2000 and Feb. 28, 2000 by moving the adjustable time bands 251 to these two respective dates. The photos in the database that have been tagged with a timestamp falling between these two dates can be retrieved from the database, and displayed in the "best match" section of image area 100.
In addition to finding photos according to their timestamp, the timeline 250 can be used with other metadata to limit search tag results. For example, if the adjustable time bands 251 of timeline 250 indicate the period of interest extends from Jan. 1, 2000 to Feb. 28, 2000, searching the database for all photos having a San Diego tag will return the photo "Lori on the road at Legoland" as a "best match" photo, and display the photo in image area 100, only if the photo was taken sometime between Jan. 1, 2000 and Feb. 28, 2000. If the photo was taken outside of this time period, it would only appear as a "close match" photo in image area 100. When tag searches are conducted in conjunction with timeline 250, the timeline displays the total number of photos in the database per unit time period in a first color which may be a solid color, and the total number of photos in the database that match the tagged search criteria as "best" or "close" matches in a second color which may be a hatched pattern or color.
In one embodiment, the timeline 250 shown in Fig. 3 does not display the exact number of photos taken during a given period of time, but rather displays a vertical bar graph with bar heights that are representative of the number of photos taken during a given period of time normalized to the average number of photos taken during all such similar periods of time in the database. For example, for a given period of time, the displayed vertical bar can have a height of 0 when no photos have been taken during that period; 1 when one to five photos have been taken during that period; 2 when the normalized number of photos taken during that period was up to 50% of the average number of photos taken during all time periods; 3 when the normalized number of photos taken during that period was between 50% and 80% of the average number of photos taken during all time periods; 4 when the normalized number of photos taken during that period was between 80% and 120% of the average number of photos taken during all time periods; 5 when the normalized number of photos taken during that period was between 120% and 150% of the average number of photos taken during all time periods; 6 when the normalized number of photos taken during that period was between 150% and 200% of the average number of photos taken during all time periods; and 7 when the normalized number of photos taken during that period was more than 200% of the average number of photos taken during all time periods.
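The bucketing rule of this embodiment can be written out directly as a function. The function name is illustrative, and since the text does not say which bucket a boundary percentage falls into, this sketch assumes boundary values belong to the lower bucket:

```python
def bar_height(count, average):
    """Map a time period's photo count to a timeline bar height of 0-7.

    `average` is the mean photo count across all comparable periods.
    Per the described embodiment, the one-to-five-photo rule takes
    precedence over the percentage buckets; boundary percentages are
    assigned to the lower bucket (an assumption of this sketch).
    """
    if count == 0:
        return 0
    if count <= 5:
        return 1
    pct = 100.0 * count / average  # count normalized to the overall average
    if pct <= 50:
        return 2
    if pct <= 80:
        return 3
    if pct <= 120:
        return 4
    if pct <= 150:
        return 5
    if pct <= 200:
        return 6
    return 7
```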
As shown in Fig. 4, in addition to timeline 250, photos taken on a particular day or during a particular month can also be found by displaying the photos in a 2-D histogram or scatter plot such as the calendar view shown in the figure. The calendar view displays all of the photos that have been taken, scanned, or imported into the database on any day in a given month as a function of the day of the week the photos were taken, and the week in the month. If a particular day of the month is selected in the calendar view, all photos taken on that day can be retrieved from the database as "best match" photos. For example, Fig. 4 shows that during the month of June, 2001 two sets of photos were taken. The first set contains a single photo taken on June 8, while the second set contains 10 photos taken on June 18. By selecting the June 18 calendar day, the 10 photos taken on June 18 are selected as the "best match" photos, and can be displayed in image area 100.
The calendar view can also display the results of a tag search in the month-at-a-glance mode. When so used, each day in the calendar can indicate not only whether any photos were taken on that day, but whether the photos taken on that day fall into the "best match", "close match", or "no match" group with respect to the tagged search criteria. For example, if the Legoland photo described in Fig. 2 was one of ten photos of Lori R. taken in San Diego on June 18, 2001, and a search were done for all photos having a San Diego tag, then the June 18, 2001 square in Fig. 4 would indicate that day as having photos in the "best match" group. If, however, a search were done for all photos having a New York tag, the June 18, 2001 square in Fig. 4 would indicate that day as having photos in the "no match" group. Finally, if a search were done for all photos having a New York tag and a Lori R. tag, the June 18, 2001 square in Fig. 4 would indicate that day as having photos in the "close match" group.
The particular group into which a set of photos taken on a given calendar day falls can be indicated on the calendar using the same color-based indication scheme used to indicate matching photo groups that are displayed in the viewing area. Thus, calendar days containing one or more photos in the "best match" group can be presented as white squares, while calendar days containing one or more photos in the "close match" group and no photos in the "best match" group can be presented as blue squares, and calendar days containing no photos in either the "best match" or "close match" groups can be presented as gray squares.
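The day-coloring precedence just described can be expressed as a small lookup; the function name is illustrative and the color names are taken from the example in the text:

```python
def day_color(groups):
    """Choose a calendar-day background color from its photos' match groups.

    `groups` is the set of match groups ("best match", "close match",
    "no match") present among the photos taken on that day.
    """
    if "best match" in groups:
        return "white"  # any best-match photo takes precedence
    if "close match" in groups:
        return "blue"   # close matches only
    return "gray"       # no photos, or only non-matching photos
```

The same precedence scheme applies to coloring the geographic areas of the map view described below.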
As shown in Fig. 5, in addition to the timeline 250 and calendar views, data can be searched for and displayed in an alternate 2-D histogram or scatter plot such as a map view. In the map view, the place tag metadata is used to display the geographic distribution of photos in the database. Like the timeline 250 and calendar views, the map view can be used to search for photos either by itself, or in conjunction with one or more tag searches. If the map view is used by itself to search for photos, icons representing the places where photos in the database have been taken are displayed on a map such as the world map shown in Fig. 5. When a location on the map is selected, photos taken in that location can be retrieved from the database as photos in the "best match" group. For example, if the location of Kenya on the map in Fig. 5 is selected, photos taken in Kenya can be selected from the database, and can be displayed in image area 100 as "best match" photos.
The map view can also be used in conjunction with a tag search. When so used, the map view will display not only the distribution of photos as a function of geographic location, but whether the photos taken at the various geographic locations fall in the "best match", "close match", or "no match" group with respect to the tagged search criteria. For example, if a search for all photos having an African tag were displayed in the map view, the map view would indicate that photos taken from the Kenya Safari fall into the "best match" group, while all of the other photos shown in Fig. 5 fall into the "no match" group. As with the calendar and timeline 250 views, the particular group into which a set of photos taken from a given location falls can be indicated on the map using the same color-based indication scheme used to indicate matching photo groups that are displayed in image area 100. Thus, locations containing one or more photos in the "best match" group can be presented as a white area, while locations containing one or more photos in the "close match" group and no photos in the "best match" group can be presented as a blue area, and locations containing no photos in either the "best match" or "close match" groups can be presented as a gray area.
The map view can be varied in size and shape to accommodate the geographic extent of the photos that are either in the database or that are responsive to a tag search conducted on the photos in the database. Thus, if a map view is used to display or further search among database photos having a North American tag, the map view can be limited to a view of the North American continent. The map view can also be varied in size and shape by selecting particular regions of a map, such as the western region of the United States, or by zooming in and out of the currently displayed map region. Photos corresponding to particular locations within a map view, like San Diego, can be geographically found directly from the map view.
Once photos matching a given set of tags are found, they can be selected as a group, and various photo management functions such as printing, sharing, or exporting the photos to a slide show or to a photo album can be performed on the group. As a default, all photos in the "best match" group are selected when selecting photos that match a given set of tag search criteria. However, the default can be changed such that all photos in both the "close match" and "best match" groups are selected when selecting photos that match a given set of tag search criteria.
It should be noted that while the invention has been described in terms of managing, cataloguing, searching, and finding digital images and photographs, the invention can equally be used to manage, catalogue, search for and find other types of digital media such as video files, audio files, photo slide shows, and photo albums. These different types of media can be distinguished from one another with a tag of tag type media. The media tag, when applied to a media object, can graphically indicate the type of media object that is stored in the database.
As shown in Fig. 6, a video file 600 can be stored in the database and identified by displaying its first frame together with an overlaid video file icon.
Similarly, an audio file 610 can be stored in the database and identified by displaying the title of the audio file together with an audio file icon. Audio files can be associated with and stored as a component part of a slide show or photo album, and can be played as a soundtrack whenever the slide show or photo album is viewed. Slide shows such as slide show 620, and photo albums such as photo album 630 can also be stored in the database, and iconically identified as shown in Fig. 6. Each of these objects can be tagged, searched for, and manipulated using the same tools that are used to tag, search for, and manipulate digital photos, as previously discussed.
While the invention has been described as a computer program or algorithm, the invention can be implemented in digital electronic circuitry, or in computer hardware, firmware, software, or in any combinations of them. Apparatus of the invention can be implemented in a computer program product tangibly embodied in a machine-readable storage device for execution by a programmable processor; and method steps of the invention can be performed by a programmable processor executing a program of instructions to perform functions of the invention by operating on input data and generating output. The invention can be implemented advantageously in one or more computer programs that are executable on a programmable system including at least one programmable processor coupled to receive data and instructions from, and to transmit data and instructions to, a data storage system, at least one input device, and at least one output device. Each computer program can be implemented in a high-level procedural or object-oriented programming language, or in assembly or machine language if desired; and in any case, the language can be a compiled or interpreted language. Suitable processors can include both general and special purpose microprocessors. Generally, a processor will receive instructions and data from a read-only memory and/or a random access memory. Generally, a computer will include one or more mass storage devices for storing data files; such devices include magnetic disks, such as internal hard disks and removable disks; magneto-optical disks; and optical disks. Storage devices suitable for tangibly embodying computer program instructions and data include all forms of non-volatile memory, including by way of example semiconductor memory devices, such as EPROM, EEPROM, and flash memory devices; magnetic disks such as internal hard disks and removable disks; magneto-optical disks; and CD-ROM disks.
Any of the foregoing can be supplemented by, or incorporated in, application-specific integrated circuits (ASICs).
While the invention has been described in terms of particular embodiments, it should be understood that other embodiments are possible as would be apparent to one of ordinary skill in the art. Accordingly, these and other embodiments are within the scope of the following claims.
What is claimed is:


1. A method for graphically constructing a database query, comprising: receiving a collection of objects, wherein each object in the collection is associated with metadata that describes one or more attributes of the object; generating a visual representation of a distribution of objects in the collection as a function of at least a portion of the descriptive metadata associated with the objects; and receiving user input defining a selection in the visual representation of the distribution of objects to construct a database query.
2. The method of claim 1, wherein the step of generating a visual representation of a distribution of objects comprises generating a histogram representing the distribution of objects as a function of at least a portion of the descriptive metadata associated with the objects.
3. The method of claim 2, wherein the step of generating a histogram representing the distribution of objects as a function of at least a portion of the descriptive metadata associated with the objects comprises generating a timeline representing the distribution of objects as a function of temporal metadata associated with the objects.
4. The method of claim 3, wherein the temporal metadata is a date or timestamp associated with each of the objects.
5. The method of claim 2, wherein the step of generating a histogram representing the distribution of objects as a function of at least a portion of the descriptive metadata associated with the objects comprises generating a map showing the spatial distribution of the objects.
6. The method of claim 2, wherein the step of generating a histogram representing the distribution of objects as a function of at least a portion of the descriptive metadata associated with the objects comprises generating a calendar showing the temporal distribution of the objects.
7. The method of claim 1, wherein the step of generating a visual representation of a distribution of objects as a function of at least a portion of the descriptive metadata associated with the objects comprises generating a scatter plot representing the distribution of objects as a function of at least a portion of the descriptive metadata associated with the objects.
8. The method of claim 7, wherein the step of generating a scatter plot representing the distribution of objects as a function of at least a portion of the descriptive metadata associated with the objects comprises generating a map showing the spatial distribution of the objects.
9. The method of claim 7, wherein the step of generating a scatter plot representing the distribution of objects as a function of at least a portion of the descriptive metadata associated with the objects comprises generating a calendar showing the temporal distribution of the objects.
10. The method of claim 1, wherein the step of receiving user input defining a selection of the visual representation of the distribution of objects to construct a database query comprises receiving user input selecting a portion of the visual representation of the distribution of objects.
11. The method of claim 1, wherein the step of receiving user input defining a selection of the visual representation of the distribution of objects to construct a database query comprises receiving user input limiting the display range of the visual representation of the distribution of objects.
12. A method for managing a collection of objects, comprising: using one or more search tags to query a database for objects having metadata that matches the metadata associated with the one or more search tags; categorizing the collection of objects in the database into at least a best match group and a no match group according to the results of the search tag query; and displaying a representation of the objects from both the best match group and the no match group while distinguishing the objects in the best match group from the objects in the no match group.
13. The method of claim 12, wherein the step of distinguishing the objects in the best match group from the objects in the no match group comprises providing a visual cue to distinguish the objects in the best match group from the objects in the no match group.
14. The method of claim 13, wherein providing a visual cue to distinguish the objects in the best match group from the objects in the no match group comprises displaying the objects in the best match group in front of a background having a first background color, and displaying the objects in the no match group in front of a background having a second background color.
15. The method of claim 12, further comprising categorizing the objects into a close match group; and displaying the objects from the best match group, close match group, and no match group while distinguishing the objects in each group from the objects in each of the remaining groups.
16. The method of claim 12, further comprising generating a visual representation of a distribution of the objects in each of the best match and no match groups as a function of at least a portion of the descriptive metadata associated with the objects in each of the groups.
17. The method of claim 16, wherein the step of generating a visual representation of a distribution of the objects in each of the best match groups and no match groups further comprises displaying a visual representation of a distribution of the objects in the best match group and no match group on a histogram.
18. The method of claim 17, wherein the histogram is a timeline representing a temporal distribution of the objects in the best match group and no match group.
19. The method of claim 17, wherein the histogram is a map showing the spatial distribution of the objects in the best match group and no match group.
20. The method of claim 17, wherein the histogram is a calendar showing the temporal distribution of the objects in the best match group and no match group.
21. The method of claim 16, wherein the step of generating a visual representation of a distribution of the objects in each of the best match groups and no match groups further comprises displaying a visual representation of a distribution of the objects in the best match group and no match group on a scatter plot.
22. The method of claim 21, wherein the step of generating a scatter plot representing the distribution of objects in the best match and no match groups comprises generating a map showing the spatial distribution of objects in the best match and no match groups.
23. The method of claim 21, wherein the step of generating a scatter plot representing the distribution of objects in the best match and no match groups comprises generating a calendar showing the distribution of objects in the best match and no match groups as a function of date.
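The grouping described in claims 12–23 — querying by search tags and splitting the collection into best match, close match, and no match groups — can be illustrated with a minimal sketch. All object fields and names here (`tags`, `id`, `categorize`) are hypothetical, not taken from the patent's implementation:

```python
# Hypothetical object records: each carries descriptive metadata tags.
photos = [
    {"id": 1, "tags": {"beach", "family"}},
    {"id": 2, "tags": {"beach"}},
    {"id": 3, "tags": {"mountains"}},
]

def categorize(objects, search_tags):
    """Split a collection into best/close/no match groups for a tag query."""
    best, close, none = [], [], []
    for obj in objects:
        hits = len(obj["tags"] & search_tags)
        if hits == len(search_tags):
            best.append(obj)   # every search tag matched
        elif hits > 0:
            close.append(obj)  # some, but not all, search tags matched
        else:
            none.append(obj)   # no search tag matched
    return best, close, none

best, close, none = categorize(photos, {"beach", "family"})
```

In a display layer, the three lists would then be rendered together with a visual cue per group (for instance, the differing background colors of claims 14 and 45), rather than hiding the non-matching objects.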
24. A method for displaying a collection of objects, comprising: using one or more search tags to query a database for objects having metadata that matches the metadata associated with the one or more search tags; and displaying a visual representation of a distribution of the objects responsive to the search tag query as a function of at least a portion of the metadata.
25. The method of claim 24, wherein the visual representation of the distribution of the objects responsive to the search tag query is a histogram.
26. The method of claim 25, wherein the histogram is a timeline representing a temporal distribution of the objects responsive to the search tag query.
27. The method of claim 25, wherein the histogram is a map showing the spatial distribution of the objects in the best match group and no match group.
28. The method of claim 25, wherein the histogram is a calendar showing the temporal distribution of the objects in the best match group and no match group.
29. The method of claim 24, wherein the visual representation of the distribution of the objects responsive to the search tag query is a scatter plot.
30. The method of claim 29, wherein the scatter plot is a calendar representing the distribution of the objects responsive to the search tag query as a function of date.
31. The method of claim 29, wherein the scatter plot is a map representing a spatial distribution of the objects responsive to the search tag query.
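Claims 24–31 cover displaying a distribution of query results as a function of metadata, e.g. a timeline histogram over capture dates. A sketch of the bucketing step that would feed such a histogram, with hypothetical field names (`taken`, `id`):

```python
from collections import Counter
from datetime import date

# Hypothetical query results, each with a capture date in its metadata.
results = [
    {"id": 1, "taken": date(2001, 1, 5)},
    {"id": 2, "taken": date(2001, 1, 20)},
    {"id": 3, "taken": date(2001, 3, 2)},
]

def timeline_histogram(objects):
    """Bucket objects by (year, month); each bucket is one histogram bar."""
    return Counter((o["taken"].year, o["taken"].month) for o in objects)

bins = timeline_histogram(results)
# Two objects fall in January 2001, one in March 2001.
```

The same bucketing idea generalizes to the calendar view (bucket by day) and the map view (bucket by location metadata) named in claims 27–28.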
32. A computer program product for graphically constructing a database query, the computer program product comprising instructions operable to cause a programmable processor to: receive a collection of objects, wherein each object in the collection is associated with metadata that describes one or more attributes of the object; generate a visual representation of a distribution of objects in the collection as a function of at least a portion of the descriptive metadata associated with the objects; and receive user input defining a selection in the visual representation of the distribution of objects to construct a database query.
33. The computer program product of claim 32, wherein the instructions to generate a visual representation of a distribution of objects comprise instructions to generate a histogram representing the distribution of objects as a function of at least a portion of the descriptive metadata associated with the objects.
34. The computer program product of claim 33, wherein the instructions to generate a histogram representing the distribution of objects as a function of at least a portion of the descriptive metadata associated with the objects comprise instructions to generate a timeline representing the distribution of objects as a function of temporal metadata associated with the objects.
35. The computer program product of claim 34, wherein the temporal metadata is a date or timestamp associated with each of the objects.
36. The computer program product of claim 33, wherein the instructions to generate a histogram representing the distribution of objects as a function of at least a portion of the descriptive metadata associated with the objects comprise instructions to generate a map showing the spatial distribution of the objects.
37. The computer program product of claim 33, wherein the instructions to generate a histogram representing the distribution of objects as a function of at least a portion of the descriptive metadata associated with the objects comprise instructions to generate a calendar showing the temporal distribution of the objects.
38. The computer program product of claim 32, wherein the instructions to generate a visual representation of a distribution of objects as a function of at least a portion of the descriptive metadata associated with the objects comprise instructions to generate a scatter plot representing the distribution of objects as a function of at least a portion of the descriptive metadata associated with the objects.
39. The computer program product of claim 38, wherein the instructions to generate a scatter plot representing the distribution of objects as a function of at least a portion of the descriptive metadata associated with the objects comprise instructions to generate a map showing the spatial distribution of the objects.
40. The computer program product of claim 38, wherein the instructions to generate a scatter plot representing the distribution of objects as a function of at least a portion of the descriptive metadata associated with the objects comprise instructions to generate a calendar showing the temporal distribution of the objects.
41. The computer program product of claim 32, wherein the instructions to receive user input defining a selection of the visual representation of the distribution of objects to construct a database query comprise instructions to receive user input selecting a portion of the visual representation of the distribution of objects.
42. The computer program product of claim 32, wherein the instructions to receive user input defining a selection of the visual representation of the distribution of objects to construct a database query comprise instructions to receive user input limiting the display range of the visual representation of the distribution of objects.
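Claims 32–42 describe constructing a database query from a user selection in the distribution view, e.g. dragging across a span of the timeline. A minimal sketch of translating such a selection into a filter predicate (names and data shapes are assumptions for illustration):

```python
from datetime import date

def selection_to_query(start, end):
    """Translate a drag-selection on the timeline into a date-range predicate."""
    return lambda obj: start <= obj["taken"] <= end

collection = [
    {"id": 1, "taken": date(2001, 1, 5)},
    {"id": 2, "taken": date(2001, 6, 9)},
]

# A selection covering Q1 2001 becomes a query that matches only object 1.
query = selection_to_query(date(2001, 1, 1), date(2001, 3, 31))
in_range = [o for o in collection if query(o)]
```

Limiting the display range of the visualization (claim 42) works the same way: the new range bounds become the predicate, and the view is redrawn over the filtered result set.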
43. A computer program product for managing a collection of objects, the computer program product comprising instructions operable to cause a programmable processor to: use one or more search tags to query a database for objects having metadata that matches the metadata associated with the one or more search tags; categorize the collection of objects in the database into at least a best match group and a no match group according to the results of the search tag query; and display a representation of the objects from both the best match group and the no match group while distinguishing the objects in the best match group from the objects in the no match group.
44. The computer program product of claim 43, wherein the instructions to distinguish the objects in the best match group from the objects in the no match group comprise instructions to provide a visual cue to distinguish the objects in the best match group from the objects in the no match group.
45. The computer program product of claim 44, wherein the instructions to provide a visual cue to distinguish the objects in the best match group from the objects in the no match group comprise instructions to display the objects in the best match group in front of a background having a first background color, and to display the objects in the no match group in front of a background having a second background color.
46. The computer program product of claim 43, further comprising instructions operable to cause the programmable processor to categorize the objects into a close match group; and to display the objects from the best match group, close match group, and no match group while distinguishing the objects in each group from the objects in each of the remaining groups.
47. The computer program product of claim 43, further comprising instructions operable to cause a programmable processor to generate a visual representation of a distribution of the objects in each of the best match and no match groups as a function of at least a portion of the descriptive metadata associated with the objects in each of the groups.
48. The computer program product of claim 47, wherein the instructions to generate a visual representation of a distribution of the objects in each of the best match groups and no match groups further comprise instructions to display a visual representation of a distribution of the objects in the best match group and no match group on a histogram.
49. The computer program product of claim 48, wherein the histogram is a timeline representing a temporal distribution of the objects in the best match group and no match group.
50. The computer program product of claim 48, wherein the histogram is a map showing the spatial distribution of the objects in the best match group and no match group.
51. The computer program product of claim 48, wherein the histogram is a calendar showing the temporal distribution of the objects in the best match group and no match group.
52. The computer program product of claim 47, wherein the instructions to generate a visual representation of a distribution of the objects in each of the best match groups and no match groups further comprise instructions to display a visual representation of a distribution of the objects in the best match group and no match group on a scatter plot.
53. The computer program product of claim 52, wherein the instructions to generate a scatter plot representing the distribution of objects in the best match and no match groups comprise instructions to generate a map showing the spatial distribution of objects in the best match and no match groups.
54. The computer program product of claim 52, wherein the instructions to generate a scatter plot representing the distribution of objects in the best match and no match groups comprise instructions to generate a calendar showing the distribution of objects in the best match and no match groups as a function of date.
55. A computer program product for displaying a collection of objects, the computer program product comprising instructions operable to cause a programmable processor to: use one or more search tags to query a database for objects having metadata that matches the metadata associated with the one or more search tags; and display a visual representation of a distribution of the objects responsive to the search tag query as a function of at least a portion of the metadata.
56. The computer program product of claim 55, wherein the visual representation of the distribution of the objects responsive to the search tag query is a histogram.
57. The computer program product of claim 56, wherein the histogram is a timeline representing a temporal distribution of the objects responsive to the search tag query.
58. The computer program product of claim 56, wherein the histogram is a map showing the spatial distribution of the objects in the best match group and no match group.
59. The computer program product of claim 56, wherein the histogram is a calendar showing the temporal distribution of the objects in the best match group and no match group.
60. The computer program product of claim 55, wherein the visual representation of the distribution of the objects responsive to the search tag query is a scatter plot.
61. The computer program product of claim 60, wherein the scatter plot is a calendar representing the distribution of the objects responsive to the search tag query as a function of date.
62. The computer program product of claim 60, wherein the scatter plot is a map representing a spatial distribution of the objects responsive to the search tag query.
PCT/US2002/001530 2001-01-16 2002-01-16 Digital media management apparatus and methods WO2002057959A2 (en)

Applications Claiming Priority (8)

Application Number Priority Date Filing Date Title
US26189701P 2001-01-16 2001-01-16
US60/261,897 2001-01-16
US17937901P 2001-01-31 2001-01-31
US09/774,523 2001-01-31
US60/179,379 2001-01-31
US09/774,523 US20020087546A1 (en) 2000-01-31 2001-01-31 Apparatus, methods, and systems for digital photo management
US99741201A 2001-11-30 2001-11-30
US09/997,412 2001-11-30

Publications (2)

Publication Number Publication Date
WO2002057959A2 true WO2002057959A2 (en) 2002-07-25
WO2002057959A3 WO2002057959A3 (en) 2003-03-06

Family

ID=27497334

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/US2002/001530 WO2002057959A2 (en) 2001-01-16 2002-01-16 Digital media management apparatus and methods

Country Status (1)

Country Link
WO (1) WO2002057959A2 (en)

Cited By (41)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
GB2394811A (en) * 2002-10-17 2004-05-05 Hewlett Packard Development Co A method for locating images
EP1416398A1 (en) * 2002-11-01 2004-05-06 Abb Research Ltd. Method and system for selecting objects in a software system
WO2004059532A3 (en) * 2002-12-30 2004-10-28 British Telecomm Data retrieval method and apparatus
WO2005010775A1 (en) * 2003-07-17 2005-02-03 Ivis Group Limited Improved search engine
WO2005031601A1 (en) * 2003-10-02 2005-04-07 Nokia Corporation Method for clustering and querying media items
EP1531411A1 (en) * 2003-11-17 2005-05-18 Nokia Corporation Time bar navigation in a media diary application
EP1531403A2 (en) * 2003-11-17 2005-05-18 Nokia Corporation Bookmarking and annotating in a media diary application
EP1531598A2 (en) 2003-11-17 2005-05-18 Nokia Corporation Speed browsing of media items in a media diary application
EP1533714A3 (en) * 2003-11-17 2005-08-17 Nokia Corporation Multimedia diary application for use with a digital device
WO2005076156A1 (en) * 2004-02-09 2005-08-18 Nokia Corporation Representation of media items in a media file management application for use with a digital device
EP1531404A3 (en) * 2003-11-17 2005-08-24 Nokia Corporation Topographic presentation of media files in a diary application
EP1455518A3 (en) * 2003-03-04 2006-03-15 Ricoh Company, Ltd. Image forming apparatus and image processing apparatus
EP1513080A3 (en) * 2003-08-29 2006-04-05 Nokia Corporation Organization and maintenance using metadata
EP1667033A1 (en) * 2004-10-07 2006-06-07 Sony Corporation Content management system, content management method, and computer program
EP1679879A2 (en) 2005-01-07 2006-07-12 Apple Computer, Inc. Image management tool with calendar interface
WO2006077512A1 (en) * 2005-01-20 2006-07-27 Koninklijke Philips Electronics N.V. A user interface for browsing image
US7109848B2 (en) 2003-11-17 2006-09-19 Nokia Corporation Applications and methods for providing a reminder or an alert to a digital media capture device
EP1956501A1 (en) 2002-09-13 2008-08-13 British Telecommunications Public Limited Company Media article composition
WO2008118298A1 (en) * 2007-03-26 2008-10-02 Eastman Kodak Company Digital object information via category-based histograms
CN100424613C (en) * 2004-12-16 2008-10-08 国际商业机器公司 Method and system for conveying a changing local time zone in an electronic calendar
US7634158B2 (en) 2005-01-28 2009-12-15 Canon Kabushiki Kaisha Image processing apparatus, control method therefor, computer program, and computer-readable storage medium
US7711315B2 (en) 2003-10-27 2010-05-04 Nokia Corporation Method and mobile terminal for accessing a service portal via bi-directional network
CN1855272B (en) * 2005-04-19 2010-05-12 株式会社日立制作所 Recording and reproducing apparatus, and recording and reproducing method
WO2010054119A2 (en) * 2008-11-07 2010-05-14 Yahoo! Inc. Image relevance by identifying experts
EP2192498A1 (en) * 2008-11-28 2010-06-02 Sony Corporation Image processing apparatus, image displaying method, and image displaying program
CN101751468A (en) * 2008-12-10 2010-06-23 三星电子株式会社 Method and apparatus for searching contents
CN101764965A (en) * 2009-12-21 2010-06-30 康佳集团股份有限公司 Method for recording diaries on television
AU2006200426B2 (en) * 2003-07-17 2010-07-29 Ivis Group Limited Improved search engine
US7774718B2 (en) 2003-12-17 2010-08-10 Nokia Corporation Time handle in a media diary application for accessing media files
US7788267B2 (en) 2007-02-26 2010-08-31 Seiko Epson Corporation Image metadata action tagging
EP2315137A1 (en) * 2009-10-26 2011-04-27 DAD Solutions Limited Searching of databases
US8037105B2 (en) 2004-03-26 2011-10-11 British Telecommunications Public Limited Company Computer apparatus
EP2407896A1 (en) * 2010-07-16 2012-01-18 Research In Motion Limited Systems and methods of user interface for image display
EP2458513A1 (en) * 2010-11-26 2012-05-30 HTC Corporation Note management methods and systems
EP1990744B1 (en) * 2007-05-09 2013-01-23 Research In Motion Limited User interface for editing photo tags
US8375283B2 (en) * 2006-06-20 2013-02-12 Nokia Corporation System, device, method, and computer program product for annotating media files
EP2690588A1 (en) * 2012-07-24 2014-01-29 Samsung Electronics Co., Ltd Function based on a cloud service
US8667384B2 (en) 2007-05-09 2014-03-04 Blackberry Limited User interface for editing photo tags
US9122645B1 (en) 2006-12-20 2015-09-01 Qurio Holdings, Inc. Method and system for tagging within virtual groups
WO2015200120A1 (en) * 2014-06-27 2015-12-30 Amazon Technologies, Inc. System, method and apparatus for organizing photographs stored on a mobile computing device
EP1876526B1 (en) * 2006-05-01 2019-11-06 Sony Corporation Information processing apparatus, information processing method, information processing program, and mobile terminal device

Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US5241671A (en) * 1989-10-26 1993-08-31 Encyclopaedia Britannica, Inc. Multimedia search system using a plurality of entry path means which indicate interrelatedness of information
US5734888A (en) * 1993-06-04 1998-03-31 International Business Machines Corporation Apparatus and method of modifying a database query
US5844572A (en) * 1995-06-07 1998-12-01 Binaryblitz Method and apparatus for data alteration by manipulation of representational graphs
US5898431A (en) * 1996-12-31 1999-04-27 International Business Machines Corporation Database graphical user interface with calendar view
US6085205A (en) * 1997-11-12 2000-07-04 Ricoh Company Limited Calendar incorporating document retrieval interface
US6408301B1 (en) * 1999-02-23 2002-06-18 Eastman Kodak Company Interactive image storage, indexing and retrieval system

Family Cites Families (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP3630721B2 (en) * 1994-07-13 2005-03-23 キヤノン株式会社 Multimedia data processing method, multimedia data processing device, attribute information registration device, and attribute information registration method

Patent Citations (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US5241671A (en) * 1989-10-26 1993-08-31 Encyclopaedia Britannica, Inc. Multimedia search system using a plurality of entry path means which indicate interrelatedness of information
US5241671C1 (en) * 1989-10-26 2002-07-02 Encyclopaedia Britannica Educa Multimedia search system using a plurality of entry path means which indicate interrelatedness of information
US5734888A (en) * 1993-06-04 1998-03-31 International Business Machines Corporation Apparatus and method of modifying a database query
US5844572A (en) * 1995-06-07 1998-12-01 Binaryblitz Method and apparatus for data alteration by manipulation of representational graphs
US5898431A (en) * 1996-12-31 1999-04-27 International Business Machines Corporation Database graphical user interface with calendar view
US6085205A (en) * 1997-11-12 2000-07-04 Ricoh Company Limited Calendar incorporating document retrieval interface
US6408301B1 (en) * 1999-02-23 2002-06-18 Eastman Kodak Company Interactive image storage, indexing and retrieval system

Non-Patent Citations (6)

* Cited by examiner, † Cited by third party
Title
AHLBERG C: "SPOTFIRE: AN INFORMATION EXPLORATION ENVIRONMENT" SIGMOD RECORD, ASSOCIATION FOR COMPUTING MACHINERY, NEW YORK, US, vol. 25, no. 4, 1 December 1996 (1996-12-01), pages 25-29, XP000702066 *
BEAZA-YATES, R., RIBEIRO-NETO, B. (EDS.): "Modern Information retrieval" , ADDISON WESLEY , USA XP002210866 ISBN: 0-201-39829-X * Chapter 10: User Interfaces and Visualization by Marti A. Hearst page 257, line 1 -page 339, last line *
HEARST, MARTI A.: "Next Generation Web Search: Setting Our Sites" BULLETIN OF THE TECHNICAL COMMITTEE ON DATA ENGINEERING, vol. 23, no. 3, September 2000 (2000-09), pages 38-48, XP002210864 USA *
PATENT ABSTRACTS OF JAPAN vol. 1996, no. 06, 28 June 1996 (1996-06-28) & JP 08 030763 A (CANON INC), 2 February 1996 (1996-02-02) *
ROTH S F ET AL: "Toward an information visualization workspace: combining multiple means of expression" HUMAN-COMPUTER INTERACTION, 1997, LAWRENCE ERLBAUM ASSOCIATES, USA, vol. 12, no. 1-2, pages 131-185, XP002210865 ISSN: 0737-0024 *
SHNEIDERMAN, BEN: "Designing the user interface: Strategies for effective human-computer interaction" , ADDISON WESLEY , USA XP002210867 ISBN: 0-201-69497-2 * Chapter 15: Information Search and Visualization page 519, line 1 -page 521, last line page 526, line 3 -page 541, line 8 figures B2-B6 *

Cited By (71)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US8838590B2 (en) 2002-09-13 2014-09-16 British Telecommunications Public Limited Company Automatic media article composition using previously written and recorded media object relationship data
EP1956501A1 (en) 2002-09-13 2008-08-13 British Telecommunications Public Limited Company Media article composition
GB2394811A (en) * 2002-10-17 2004-05-05 Hewlett Packard Development Co A method for locating images
EP1416398A1 (en) * 2002-11-01 2004-05-06 Abb Research Ltd. Method and system for selecting objects in a software system
WO2004059532A3 (en) * 2002-12-30 2004-10-28 British Telecomm Data retrieval method and apparatus
US7578441B2 (en) 2002-12-30 2009-08-25 British Telecommunications Plc Data retrieval method and apparatus
EP1761024A1 (en) * 2003-03-04 2007-03-07 Ricoh Company, Ltd. Image forming apparatus and image processing apparatus
EP1455518A3 (en) * 2003-03-04 2006-03-15 Ricoh Company, Ltd. Image forming apparatus and image processing apparatus
WO2005010775A1 (en) * 2003-07-17 2005-02-03 Ivis Group Limited Improved search engine
US8005815B2 (en) 2003-07-17 2011-08-23 Ivis Group Limited Search engine
AU2006200426B2 (en) * 2003-07-17 2010-07-29 Ivis Group Limited Improved search engine
US7739257B2 (en) 2003-07-17 2010-06-15 Ivis Group Limited Search engine
GB2420647A (en) * 2003-07-17 2006-05-31 Ivis Group Ltd Improved search engine
EP1513080A3 (en) * 2003-08-29 2006-04-05 Nokia Corporation Organization and maintenance using metadata
US7840892B2 (en) 2003-08-29 2010-11-23 Nokia Corporation Organization and maintenance of images using metadata
US7313574B2 (en) 2003-10-02 2007-12-25 Nokia Corporation Method for clustering and querying media items
USRE43260E1 (en) 2003-10-02 2012-03-20 Nokia Corporation Method for clustering and querying media items
WO2005031601A1 (en) * 2003-10-02 2005-04-07 Nokia Corporation Method for clustering and querying media items
US7711315B2 (en) 2003-10-27 2010-05-04 Nokia Corporation Method and mobile terminal for accessing a service portal via bi-directional network
EP1531403A3 (en) * 2003-11-17 2006-10-18 Nokia Corporation Bookmarking and annotating in a media diary application
CN100412859C (en) * 2003-11-17 2008-08-20 诺基亚公司 Time bar navigation in a media diary application
KR100706186B1 (en) * 2003-11-17 2007-04-11 노키아 코포레이션 Time bar navigation in a media diary application
KR100711821B1 (en) * 2003-11-17 2007-05-02 노키아 코포레이션 Media diary application for use with digital device
US7109848B2 (en) 2003-11-17 2006-09-19 Nokia Corporation Applications and methods for providing a reminder or an alert to a digital media capture device
KR100746603B1 (en) * 2003-11-17 2007-08-06 노키아 코포레이션 Speed browsing of media items in a media diary application
EP1531411A1 (en) * 2003-11-17 2005-05-18 Nokia Corporation Time bar navigation in a media diary application
US8990255B2 (en) * 2003-11-17 2015-03-24 Nokia Corporation Time bar navigation in a media diary application
EP1531404A3 (en) * 2003-11-17 2005-08-24 Nokia Corporation Topographic presentation of media files in a diary application
EP1531598A3 (en) * 2003-11-17 2006-12-27 Nokia Corporation Speed browsing of media items in a media diary application
EP1531403A2 (en) * 2003-11-17 2005-05-18 Nokia Corporation Bookmarking and annotating in a media diary application
EP1533714A3 (en) * 2003-11-17 2005-08-17 Nokia Corporation Multimedia diary application for use with a digital device
EP1531598A2 (en) 2003-11-17 2005-05-18 Nokia Corporation Speed browsing of media items in a media diary application
US7774718B2 (en) 2003-12-17 2010-08-10 Nokia Corporation Time handle in a media diary application for accessing media files
CN1930568B (en) * 2004-02-09 2010-11-03 诺基亚公司 Representation of media items in a media file management application for use with a digital device
WO2005076156A1 (en) * 2004-02-09 2005-08-18 Nokia Corporation Representation of media items in a media file management application for use with a digital device
JP2007524168A (en) * 2004-02-09 2007-08-23 ノキア コーポレイション Media item representation in media file management applications for use on digital devices
US8037105B2 (en) 2004-03-26 2011-10-11 British Telecommunications Public Limited Company Computer apparatus
US9690787B2 (en) 2004-10-07 2017-06-27 Saturn Licensing Llc Contents management system, contents management method, and computer program
EP1667033A1 (en) * 2004-10-07 2006-06-07 Sony Corporation Content management system, content management method, and computer program
CN100424613C (en) * 2004-12-16 2008-10-08 国际商业机器公司 Method and system for conveying a changing local time zone in an electronic calendar
EP1679879A3 (en) * 2005-01-07 2007-05-09 Apple Computer, Inc. Image management tool with calendar interface
US7643706B2 (en) 2005-01-07 2010-01-05 Apple Inc. Image management tool with calendar interface
EP1679879A2 (en) 2005-01-07 2006-07-12 Apple Computer, Inc. Image management tool with calendar interface
WO2006077512A1 (en) * 2005-01-20 2006-07-27 Koninklijke Philips Electronics N.V. A user interface for browsing image
US7949209B2 (en) 2005-01-28 2011-05-24 Canon Kabushiki Kaisha Image processing apparatus, control method therefor, computer program, and computer-readable storage medium
US7756362B2 (en) 2005-01-28 2010-07-13 Canon Kabushiki Kaisha Image processing apparatus, control method therefor, computer program, and computer-readable storage medium
US7634158B2 (en) 2005-01-28 2009-12-15 Canon Kabushiki Kaisha Image processing apparatus, control method therefor, computer program, and computer-readable storage medium
CN1855272B (en) * 2005-04-19 2010-05-12 株式会社日立制作所 Recording and reproducing apparatus, and recording and reproducing method
EP1876526B1 (en) * 2006-05-01 2019-11-06 Sony Corporation Information processing apparatus, information processing method, information processing program, and mobile terminal device
US8375283B2 (en) * 2006-06-20 2013-02-12 Nokia Corporation System, device, method, and computer program product for annotating media files
US9122645B1 (en) 2006-12-20 2015-09-01 Qurio Holdings, Inc. Method and system for tagging within virtual groups
US7788267B2 (en) 2007-02-26 2010-08-31 Seiko Epson Corporation Image metadata action tagging
WO2008118298A1 (en) * 2007-03-26 2008-10-02 Eastman Kodak Company Digital object information via category-based histograms
US8019155B2 (en) 2007-03-26 2011-09-13 Eastman Kodak Company Digital object information via category-based histograms
EP1990744B1 (en) * 2007-05-09 2013-01-23 Research In Motion Limited User interface for editing photo tags
US8667384B2 (en) 2007-05-09 2014-03-04 Blackberry Limited User interface for editing photo tags
WO2010054119A2 (en) * 2008-11-07 2010-05-14 Yahoo! Inc. Image relevance by identifying experts
WO2010054119A3 (en) * 2008-11-07 2010-07-29 Yahoo! Inc. Image relevance by identifying experts
US8988347B2 (en) 2008-11-28 2015-03-24 Sony Corporation Image processing apparatus, image displaying method, and image displaying program
EP2192498A1 (en) * 2008-11-28 2010-06-02 Sony Corporation Image processing apparatus, image displaying method, and image displaying program
CN101751468A (en) * 2008-12-10 2010-06-23 三星电子株式会社 Method and apparatus for searching contents
EP2315137A1 (en) * 2009-10-26 2011-04-27 DAD Solutions Limited Searching of databases
CN101764965B (en) * 2009-12-21 2015-04-29 康佳集团股份有限公司 Method for recording diaries on television
CN101764965A (en) * 2009-12-21 2010-06-30 康佳集团股份有限公司 Method for recording diaries on television
CN102393846A (en) * 2010-07-16 2012-03-28 捷讯研究有限公司 Systems and methods of user interface for image display
EP2407896A1 (en) * 2010-07-16 2012-01-18 Research In Motion Limited Systems and methods of user interface for image display
EP2458513A1 (en) * 2010-11-26 2012-05-30 HTC Corporation Note management methods and systems
US9208222B2 (en) 2010-11-26 2015-12-08 Htc Corporation Note management methods and systems
EP2690588A1 (en) * 2012-07-24 2014-01-29 Samsung Electronics Co., Ltd Function based on a cloud service
WO2015200120A1 (en) * 2014-06-27 2015-12-30 Amazon Technologies, Inc. System, method and apparatus for organizing photographs stored on a mobile computing device
AU2015280393B2 (en) * 2014-06-27 2018-03-01 Amazon Technologies, Inc. System, method and apparatus for organizing photographs stored on a mobile computing device

Also Published As

Publication number Publication date
WO2002057959A3 (en) 2003-03-06

Similar Documents

Publication Publication Date Title
US8229931B2 (en) Digital media management apparatus and methods
US7415662B2 (en) Digital media management apparatus and methods
WO2002057959A2 (en) Digital media management apparatus and methods
US11636150B2 (en) Method and apparatus for managing digital files
US6948124B2 (en) Graphical user interface utilizing three-dimensional scatter plots for visual navigation of pictures in a picture database
US7636733B1 (en) Time-based image management
US7398479B2 (en) Method and system for calendar-based image asset organization
US7286723B2 (en) System and method for organizing images
US20060259511A1 (en) Media object organization across information management services
US20050050043A1 (en) Organization and maintenance of images using metadata
US20050108644A1 (en) Media diary incorporating media and timeline views
US20080098316A1 (en) User Interface for Browsing Image
US20060155761A1 (en) Enhanced organization and retrieval of digital images
US20030088582A1 (en) Visual history multi-media database software
JP2003298991A (en) Image arranging method and apparatus, and program
JP2004213129A (en) Method, device and program for classifying picture
JP2004120420A (en) Image adjusting device and program
JP2003099434A (en) Electronic album device
US20070168386A1 (en) Device and method for managing multimedia content in portable digital apparatus

Legal Events

Date Code Title Description
AK Designated states

Kind code of ref document: A2

Designated state(s): JP

AL Designated countries for regional patents

Kind code of ref document: A2

Designated state(s): AT BE CH CY DE DK ES FI FR GB GR IE IT LU MC NL PT SE TR

121 Ep: the epo has been informed by wipo that ep was designated in this application
DFPE Request for preliminary examination filed prior to expiration of 19th month from priority date (pct application filed before 20040101)
AK Designated states

Kind code of ref document: A3

Designated state(s): JP

AL Designated countries for regional patents

Kind code of ref document: A3

Designated state(s): AT BE CH CY DE DK ES FI FR GB GR IE IT LU MC NL PT SE TR

122 Ep: pct application non-entry in european phase
NENP Non-entry into the national phase in:

Ref country code: JP

WWW Wipo information: withdrawn in national office

Country of ref document: JP