US20100287476A1 - System and interface for mixing media content


Info

Publication number
US20100287476A1
Authority
US
United States
Prior art keywords
stream
visual
icon
display
audio
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Abandoned
Application number
US12/777,168
Inventor
Ryutaro Sakai
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Sony Corp
Sony Electronics Inc
Original Assignee
Sony Corp
Sony Electronics Inc
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Sony Corp and Sony Electronics Inc
Priority to US12/777,168
Publication of US20100287476A1
Current status: Abandoned

Classifications

    • G - PHYSICS
    • G11 - INFORMATION STORAGE
    • G11B - INFORMATION STORAGE BASED ON RELATIVE MOVEMENT BETWEEN RECORD CARRIER AND TRANSDUCER
    • G11B 27/00 - Editing; Indexing; Addressing; Timing or synchronising; Monitoring; Measuring tape travel
    • G11B 27/10 - Indexing; Addressing; Timing or synchronising; Measuring tape travel
    • G11B 27/34 - Indicating arrangements

Landscapes

  • User Interface Of Digital Computer (AREA)

Abstract

A visual display interface includes an associated display icon. Dragging the display icon across the visual display interface causes the mixing of media streams. The output is a new mix of the various media streams.

Description

    CROSS-REFERENCE TO RELATED APPLICATION(S)
  • This document is a continuation of, and claims priority from, U.S. patent application Ser. No. 11/385,469, entitled “System and Method for Mixing Media Content,” filed on Mar. 21, 2006, which is commonly owned and hereby incorporated by reference in its entirety.
  • BACKGROUND OF THE INVENTION
  • 1. Field of the Invention
  • The field of the invention relates to providing media services and more specifically to mixing media content for these services.
  • 2. Discussion of the Related Art
  • Various types of media provide different content in varied formats to users. For example, movies, DVDs, video cassettes, audio cassettes, audio compact discs, and digital photographs provide various types of audio and visual content to today's consumers.
  • The different types of media may be created in a number of ways. For instance, two or more existing media streams are sometimes combined or mixed together to create a new media stream. The mixing of the different streams of information often creates a product having desirable characteristics with a unique appeal to various types of consumers.
  • Previous systems and approaches provided for the mixing of audio and visual streams of information. For instance, previous video editors provided for the mixing of different video streams, such as movies or video tape. Unfortunately, these editors often required complex user interfaces that relied upon multiple, non-intuitive instruction sequences to facilitate the mixing. Consequently, the user frequently was forced to consult complicated manuals or seek outside advice in order to complete the mixing correctly.
  • All these problems with previous approaches meant that mixing required a substantial amount of time and effort. In addition, if performed in a commercial setting, worker efficiency was often significantly reduced. Expensive worker training was also often needed in order to operate the editor properly and effectively. Even if the editor were used in a non-commercial environment, the amount of time and effort required to create desirable results frequently led to user frustration with the system and the mixing experience.
  • SUMMARY OF THE INVENTION
  • A system and method are provided that allow for the convenient and intuitive mixing of audio and visual streams of information. An interface is provided that allows a user to easily, quickly, and correctly mix audio and visual content as desired. The approaches described herein provide the desired results without having to utilize complex editing programs, usage manuals, or training courses.
  • In many of these embodiments, a visual display interface is provided on a display device. The visual display interface includes an associated display icon that is moveable across the interface. In this regard, the display icon is dragged across the visual display interface so as to cause the mixing of the media streams. The mixing results in a new media stream that comprises a custom mix of the input media streams.
  • Any number or type of media may be mixed. For example, a first audio stream may be mixed with a second audio stream and a first visual stream may be mixed with a second visual stream. Other types and combinations of media may also be mixed.
  • Movement of the icon in particular directions on the interface affects how the various media streams are applied to the final mixed product. In addition, the relative position of the icon on the interface affects the percentage of a particular media stream in the mix. In one example, dragging the icon in a generally vertical direction causes the mixing of the first and second audio streams. In another example, dragging the display icon in a generally horizontal direction causes the mixing of the first and second visual streams. In still another example, the icon is dragged in both the vertical and horizontal directions. In this case, dragging the icon in the vertical direction causes the mixing of the first and second audio streams and dragging the icon in the horizontal direction causes the mixing of the first and second visual streams.
  • In many of these embodiments, a visual overlap slider is also provided. In one example, the visual overlap slider controls the degree of overlap between the two media streams, for instance, the first visual stream and the second visual stream mentioned previously. In one preferred approach, the slider may provide a range of overlap of the two video streams from no overlap (where the video images are side-by-side), to total overlap (where the video images are one on top of the other).
  • The display icon can be moved either automatically or manually. For instance, the movement of the display icon can also be programmed ahead of time if automatic movement is desired. On the other hand, a user may manually drag the icon, for instance, by using their hands and fingers.
  • Thus, different media streams are conveniently mixed together using an intuitive user interface thereby providing a unique and desirable resultant media mix. The approaches described herein do not require the consultation of complex manuals or other sources of information in order to efficiently perform the mixing. In addition, since the approaches herein are intuitive for the user, training to perform the mixing is significantly reduced or eliminated.
  • BRIEF DESCRIPTION OF THE DRAWINGS
  • FIG. 1 is a diagram of a system for mixing audio and visual content according to the present invention.
  • FIG. 2 is a diagram of an icon for allowing the mixing of audio and visual content according to the present invention.
  • FIG. 3 is a block diagram of a system for mixing audio and visual content according to the present invention.
  • FIG. 4 is a flowchart of one approach for mixing audio and visual content according to the present invention.
  • FIG. 5 is a diagram showing one example of an interface for mixing audio and visual content according to the present invention.
  • FIG. 6 is a diagram showing one example of an interface for mixing audio and visual content according to the present invention.
  • FIG. 7 is a diagram showing one example of an interface for mixing audio and visual content according to the present invention.
  • FIG. 8 is a diagram showing one example of an interface for mixing audio and visual content according to the present invention.
  • FIG. 9 is a diagram showing one interface for mixing audio and visual content according to the present invention.
  • Skilled artisans will appreciate that elements in the figures are illustrated for simplicity and clarity and have not necessarily been drawn to scale. For example, the dimensions and/or relative positioning of some of the elements in the figures may be exaggerated relative to other elements to help to improve understanding of various embodiments of the present invention. It will further be appreciated that certain actions and/or steps may be described or depicted in a particular order of occurrence while those skilled in the art will understand that such specificity with respect to sequence is not actually required. It will also be understood that the terms and expressions used herein have the ordinary meaning as is accorded to such terms and expressions with respect to their corresponding respective areas of inquiry and study except where specific meanings have otherwise been set forth herein.
  • DETAILED DESCRIPTION
  • Referring now to FIG. 1, one example of a user interface 100 for facilitating the mixing of media streams (e.g., audio and visual content) is described. The user interface 100 may be any type of visual interface that allows for the mixing of media streams. In one example, the user interface 100 may utilize a computer screen, touch screen and/or soft keys to provide a graphical user interface to a user. In this regard, the user interface 100 may be associated with any suitable device such as a personal computer, cellular phone, pager, or personal digital assistant. Alternatively, the user interface 100 may be associated with a separate, dedicated device that is only used for mixing.
  • In the example of FIG. 1, the user interface 100 comprises media stream display areas 102, 104, 106, and 108. The display areas 102, 104, 106 and 108 may be of any size or dimension so that they fit within the boundaries of the interface. Preferably, the display areas 102, 104, 106, and 108 are provided in a visually pleasing arrangement that is conducive to viewing and interaction for users of the system.
  • Each of the display areas 102, 104, 106, and 108 provides information concerning a particular media stream and/or the media stream itself. For instance, when the media stream is a movie, the name of the movie may be displayed along with the movie itself. On the other hand, when the media stream is an audio track (such as a musical recording), only the name of the artist and song may be displayed. In other examples, the album jacket cover art, photo of an artist, or video of an artist could also be displayed.
  • The media streams that are mixed can provide any type of content. For example, the audio streams can be music recordings, voice recordings, or any type of audio information. Similarly, the visual content can originate from any visual source such as movies, television programs, home video recordings, digital photographs, other static images, or any other type of video recording. Other examples of audio and visual content are possible.
  • In the example shown in FIG. 1, the display area 102 displays information concerning a first audio stream (i.e., a first song title) and the display area 104 displays information concerning a second audio stream (i.e., a second song title). On the other hand, display area 106 displays information concerning a first visual stream (i.e., a first movie title and a first movie), and the display area 108 displays information concerning a second visual stream of information (i.e., a second movie title and a second movie).
  • An icon 110 is moved by a user 111 across the interface (e.g., across a display). The icon 110 is a visual computer graphic image of suitable dimensions so as to be easily seen and dragged by the user. In this example, the movement of the icon 110 is accomplished by touching the icon with the fingers of the user and then subsequently dragging the icon 110 in a particular movement across the user interface 100. In alternative examples, the icon 110 can be moved by any suitable means such as by a stylus, pointer, keypad, voice activation device, or any other suitable mechanism.
  • Movement of icon 110 causes the mixing of the media streams. In the example of FIG. 1, a first audio stream is mixed with a second audio stream and a first visual stream is mixed with a second visual stream. As shown, the display area 102 is on the top of the interface and opposite the display area 104 (on the bottom). Similarly, display area 106 is on the left side of the interface while the display area 108 is positioned on the right side of the display interface. Although the audio and visual display areas are positioned opposite in this example, other placements are possible.
  • Movement of the icon 110 between the opposite display areas causes the mixing of the media with the amount of a particular media in the mix related to the relative positioning of the icon 110. For example, with the icon 110 positioned closer to the top, the media associated with the display area 102 is a higher percentage of the mix. On the other hand, with the icon positioned near the bottom of the interface, a higher percentage of the media associated with display area 104 is used in the mix.
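The patent does not specify how the icon's position translates into percentages; one minimal Python sketch of the linear mapping implied above follows. The function name, the convention that y = 0 is the top of the interface, and the 0-to-100 percentage scale are all illustrative assumptions.

```python
def audio_mix_from_vertical(y, height):
    """Map the icon's vertical position to audio mix percentages.

    y = 0 corresponds to the top of the interface (display area 102);
    y = height corresponds to the bottom (display area 104).
    Returns (pct_top_stream, pct_bottom_stream), which sum to 100.
    """
    y = min(max(y, 0.0), height)       # clamp to the interface bounds
    pct_bottom = 100.0 * y / height    # nearer the bottom -> more of area 104's stream
    return 100.0 - pct_bottom, pct_bottom
```

Under this sketch, an icon at the very top yields 100 percent of the top stream, the midpoint yields an even 50/50 mix, and the bottom yields 100 percent of the bottom stream.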
  • A visual overlap slider 112 is also provided to indicate the degree of overlap between media streams, for instance, the visual streams. In this regard, full screen overlap may be used (one image directly over the other image). In another example, no overlap is used (the images are side-by-side). In another example, partial overlap is used (the images have portions that overlap).
  • In addition, a horizontal position limit slider 114 indicates the amount of movement of the icon 110 allowed in the horizontal direction. A vertical position limit slider 116 may indicate the amount of movement allowed by the icon 110 in the vertical direction. Other examples of sliders and function keys may also be provided.
  • In one example of the operation of the system of FIG. 1, the icon 110 is dragged across the user interface 100 so as to cause the mixing of a plurality of media streams. The dragging of the icon 110 creates a new mix of the various media streams.
  • As mentioned, movement of the icon 110 in particular directions affects the amount of a particular media stream that is used in the final mix. In one example, dragging the icon 110 in a generally vertical direction causes the mixing of the first and second audio streams. In another example, dragging the icon 110 in a generally horizontal direction causes the mixing of the first and second visual streams. In still another example, the icon 110 is dragged in both the vertical and horizontal directions. In this case, dragging the icon 110 in the vertical direction causes the mixing of the first and second audio streams and dragging the icon 110 in a generally horizontal direction causes the mixing of the first and second visual streams.
  • The visual overlap slider 112 controls the degree of overlap between the first visual stream and the second visual stream. In this case, the overlap slider is in a far left position causing no overlap to occur (the images are side-by-side).
  • The icon 110 can be moved either automatically or manually. The movement of the display icon can also be programmed ahead of time in some of these examples. In other examples, a user manually drags the icon, for instance, by using their hands and/or fingers.
  • Referring now to FIG. 2, one example of an icon 200 as used on a display 202 is described. The icon 200 is moved across the display 202 to increase or decrease the percentage or amount of a particular track in a resultant mix. Each stream (or track of a stream) has an associated label (e.g., Track A, Track B, Track C, and Track D) that is provided on the display (or screen) 202. Moving the icon 200 generally towards the label increases the percentage of the stream (or track) associated with the label in the final mix while moving the icon 200 generally away from the label decreases the percentage of the stream (or track) associated with the label.
  • The icon 200 includes informational areas that show the relative percentage of a media stream in the mix. In the example of FIG. 2, the informational areas are a digital readout of the percentage of a media stream in the mix. Alternatively, the informational areas may be omitted from the icon 200 or positioned elsewhere on the screen 202.
  • In the example of FIG. 2, Tracks A and B are video streams and Tracks C and D are audio streams. Moving the icon 200 leftward increases the percentage of Track A while decreasing the percentage of Track B in the mix. Conversely, moving the icon 200 rightward across the screen 202 causes the percentage of Track A to decrease and the percentage of Track B to increase.
  • Moving the icon 200 upward causes the percentage of Track C in the mix to increase and the percentage of Track D to decrease. On the other hand, moving the icon 200 downward causes the percentage of Track C to decrease while the percentage of Track D increases. Moving the icon 200 to the middle of the screen 202 results in roughly equal percentages of Tracks A, B, C, and D being used in the resultant mix. In this particular example, opacity of the video streams and volume of the audio streams change in the mix.
  • Now turning to other examples, moving the icon 200 to a position 204 on the display causes Track A to become 100 percent of the video portion of the mix with track B being 0 percent. As for the audio portion of the mix, with the icon 200 at position 204, Track C is 50 percent of the mix and Track D is 50 percent of the mix.
  • Moving the icon 200 to position 206 causes the video portion to consist of 0 percent from Track A and 100 percent from Track B. At position 206, Track C is 50 percent of the audio portion of the mix, while Track D is 50 percent.
  • Moving the icon 200 to position 208 causes Track A to become 50 percent of the video portion of the mix with Track B being 50 percent. As for the audio portion of the mix, Track C is 100 percent and Track D is 0 percent of the mix.
  • Moving the icon to position 210 causes Track A to become 50 percent of the video portion of the mix with Track B being 50 percent. Track C is 0 percent of the audio portion of the mix and Track D is 100 percent.
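The four worked examples above suggest an independent linear mapping on each axis. A hedged Python sketch follows, in which the function name, the coordinate convention, and the display dimensions are assumptions; positions 204, 206, 208, and 210 are modeled as the edge midpoints of a 400-by-400 display.

```python
def track_percentages(x, y, width, height):
    """Map an icon position to (A, B, C, D) mix percentages.

    Horizontal axis (video): x = 0 (far left) gives 100% Track A;
    x = width (far right) gives 100% Track B.
    Vertical axis (audio): y = 0 (top) gives 100% Track C;
    y = height (bottom) gives 100% Track D.
    """
    x = min(max(x, 0.0), width)
    y = min(max(y, 0.0), height)
    pct_b = 100.0 * x / width     # rightward movement favors Track B
    pct_d = 100.0 * y / height    # downward movement favors Track D
    return 100.0 - pct_b, pct_b, 100.0 - pct_d, pct_d
```

With this mapping, the left-edge midpoint reproduces position 204 (Track A at 100 percent, Tracks C and D at 50 percent each), and the right, top, and bottom edge midpoints reproduce positions 206, 208, and 210 respectively.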
  • Referring now to FIG. 3, one example of a system for facilitating the intuitive mixing of different media streams is described. A mixing device 300 includes a screen 302, user input indicators 303, a controller 304, an icon 305, a receive and transmit (RX and TX) circuit 306, an antenna 318, and a memory 309. A viewing device 322 comprises an antenna 324, a controller 325, and a screen 320.
  • At the mixing device 300, the screen 302 is coupled to the controller 304. The controller 304 is coupled to the user input indicators 303, the memory 309, the RX and TX circuit 306, and the media streams 326, 328, 330, and 332, and is communicatively coupled to the icon 305 via the screen 302. The RX and TX circuit 306 is coupled to the antenna 318.
  • At the viewing device 322, the antenna 324 is coupled to the controller 325. The controller 325 is coupled to the screen 320.
  • The screen 302 presents a visual interface including the icon 305. For example, the icon illustrated in FIG. 1 or 2 may be used. The controller 304 receives media streams 326, 328, 330 and 332 (and/or different tracks from these streams), converts them into an appropriate format, and stores these in the memory 309. For example, media stream 326 may be a first movie; media stream 328 may be a second movie; media stream 330 may be a first audio track (e.g., a song or portion from a song); and media stream 332 may be a second audio track.
  • The controller 304 also determines and provides information on the screen 302 concerning the different streams so that a user can perform mixing. Thus, the controller 304 may show the titles and images of the movies on portions of the screen 302. In addition, the controller 304 may provide the titles of the audio tracks on other portions of the screen 302. It will be appreciated that the number and types of media, any information concerning the media, and the exact positioning of the information concerning the media on the screen 302 may vary according to the needs of the user and the limits of the system. The controller 304 receives an indication of movement of the icon 305 and is programmed to selectively cause the mixing of a plurality of media streams based upon the indication of the movement of the display icon.
  • User input indicators 303 may also be provided. For example, these may include an overlap slider (for controlling the amount of overlap of streams); horizontal and vertical position sliders (to control the extent of horizontal and vertical movement of the icon); a record button (to record movement of the icon 305 and store this in the memory 309); and an auto mix button (to supply a controller generated automatic movement for the icon 305).
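The record button's behavior can be pictured with a small sketch that stores timestamped icon positions for later playback. The class name, the tuple layout, and the in-memory list are assumptions; the patent states only that icon movement is recorded and stored in the memory 309.

```python
class IconRecorder:
    """Record timestamped icon positions for later playback (a sketch;
    the patent does not specify a storage format)."""

    def __init__(self):
        self.path = []  # list of (timestamp, x, y) tuples

    def record(self, t, x, y):
        """Append one observed icon position."""
        self.path.append((t, x, y))

    def playback(self):
        """Yield the recorded positions in the order they were captured."""
        for t, x, y in self.path:
            yield t, x, y
```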
  • In many situations, it is desirable that the media be displayed or the final mix viewed on a screen larger than the screen 302. For example, the user may not wish to look at a small screen, but may wish to look at the display and watch and hear the mixing results on a larger device such as a large screen television or the like. In this regard, the controller 304 transmits the mix to the RX and TX circuit 306. The RX and TX circuit 306 converts the information into a format and form that facilitates transmission.
  • Thereafter, the antenna 318 transmits the mix to the second antenna 324 at a viewing device 322 so that the information can be displayed on a larger screen at the viewing device 322. Specifically, the controller 325 receives the information from the antenna, formats the information for display, and forwards the information to the display 320 for viewing.
  • Referring now to FIG. 4, one example of an approach for mixing media content interactively is described. This example assumes that first and second audio tracks are mixed together and first and second visual tracks are mixed together on a graphical user interface. An icon is provided and moved to facilitate the mixing of the audio and visual tracks. In this example, moving the icon in a generally horizontal direction across the user interface causes the first and second visual tracks to be mixed and moving the icon in the vertical direction across the interface causes the first and second audio tracks to be mixed.
  • As mentioned previously, the icon may be a visual symbol on the face of a video screen that is moved by the hand of a user. Alternatively, a stylus or similar mechanism may be used to move the icon. In still another example, computer keys may be used to program the position of the icon. Other mechanisms and approaches for providing for icon movement are possible.
  • At step 402, an indication of the direction and amount of the movement of the icon is received. The indication may be in the form of an electrical signal that includes components indicative of the amount of vertical and horizontal movement of the icon across the interface. The generation of the electrical signal is accomplished using techniques and approaches that are known to those skilled in the art and will not be discussed in greater detail herein.
  • At step 404, the amount (i.e., percentage) of each visual stream to be applied to the mix is determined by processing the signal. For example, the components of the electrical signal indicating the horizontal position are extracted, and an amount (i.e., percentage) of each visual track to be applied to the mix is determined based upon the determined horizontal position of the icon.
  • At step 406, the amount (i.e., percentage) of each audio stream to be applied to the mix is determined by processing the signal. For example, the components of the electrical signal indicating the vertical position are extracted, and an amount (percentage) of each audio track to be applied to the mix is determined based upon the vertical position of the icon. At step 408, the first and second visual streams are mixed together according to the percentages determined at step 404. At step 410, the first and second audio streams are mixed according to the percentages determined at step 406.
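Since the example of FIG. 2 notes that opacity of the video streams and volume of the audio streams change in the mix, steps 408 and 410 can be sketched as weighted blends. In this Python illustration, frames are modeled as flat lists of pixel intensities and audio tracks as lists of samples; these representations and the function names are assumptions, not the patent's implementation.

```python
def mix_frames(frame_a, frame_b, pct_a):
    """Blend two video frames by opacity; pct_a is Track A's percentage."""
    w = pct_a / 100.0
    return [w * pa + (1.0 - w) * pb for pa, pb in zip(frame_a, frame_b)]


def mix_audio(samples_c, samples_d, pct_c):
    """Blend two audio tracks by volume; pct_c is Track C's percentage."""
    w = pct_c / 100.0
    return [w * sc + (1.0 - w) * sd for sc, sd in zip(samples_c, samples_d)]
```

For instance, a 50/50 audio setting halves each track's contribution, while a 100 percent video setting passes Track A's frame through unchanged.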
  • At step 412, any other inputs are received that affect the presentation of the final mix to the users. For example, an overlap slider may indicate the degree of overlap of the first and second visual streams. Other examples of sliders and function keys may also be used and their inputs processed.
  • At step 414, the mixed resultant stream of information is presented to the user on the visual interface for viewing. The resultant stream may also be stored in memory or any other storage media or device for future use.
  • Referring now to FIGS. 5-9, further examples of graphical user interfaces are described. In each case, two video streams (tracks) A and B and two audio streams (tracks) C and D are being mixed. The interfaces are video screens and display a moveable graphical icon. In these examples, movement of the icon is accomplished by having a user touch the icon with their hand or finger and manually drag the icon to the desired position on the interface. The icon has associated digital display areas that show the percentage of a particular track being used in the resultant mix.
  • Referring now specifically to FIG. 5, an example of a user interface 500 that facilitates mixing of media streams is described. The user interface 500 comprises an icon 502 and an overlap slider 504. The icon 502 is positioned at the center left of the user interface 500 and indicates that for the audio mix, track C constitutes 50 percent and track D constitutes 50 percent of the audio mix. As for the video mix, the icon 502 shows that track A constitutes 100 percent of the mix and track B constitutes 0 percent of the video mix.
  • The overlap slider 504 determines the degree of visual overlap of the visual streams. For example, full screen overlap may be used (one image directly over the other image) when the slider 504 is positioned at the far right of its range. In another example, no overlap is provided (the images are side-by-side) when the slider 504 is positioned at the far left of its range. In another example, partial overlap is provided (the images have portions that overlap) when the slider 504 is positioned in between the extremes of its allowed movement.
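One way to picture the slider's effect is as a horizontal offset between the two video windows. In this sketch the function name, pixel units, and the linear offset rule are assumptions; the patent specifies only the two extremes (side-by-side and fully overlapped).

```python
def video_layout(slider, width):
    """Compute left-edge x offsets for two video windows of equal width.

    slider = 0.0 -> no overlap (windows side by side);
    slider = 1.0 -> full overlap (one window directly over the other).
    """
    slider = min(max(slider, 0.0), 1.0)
    offset_b = int(round((1.0 - slider) * width))  # second window slides left as overlap grows
    return 0, offset_b
```

At the midpoint of the slider's range, the second window overlaps half of the first, matching the "partial overlap" case described above.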
  • Referring now specifically to FIG. 6, another example of a user interface 600 is described. The user interface 600 comprises an icon 602 and an overlap slider 604. The icon 602 is positioned at the top center of the user interface 600 and indicates that for the audio mix, track C constitutes 100 percent and track D constitutes 0 percent of the audio mix. As for the video mix, the icon 602 shows that track A constitutes 50 percent and track B constitutes 50 percent of the video mix. The overlap slider 604 is positioned at the far left of its sliding range and indicates that there is no overlap of the video segments (i.e., the video segments are placed one next to the other).
  • Referring now specifically to FIG. 7, another example of a user interface 700 is described. The user interface 700 comprises an icon 702 and an overlap slider 704. The icon 702 is positioned at the center right of the user interface 700 and indicates that for the audio mix, track C constitutes 51 percent and track D constitutes 49 percent of the audio mix. As for the video portion of the mix, the icon 702 shows that track A constitutes 0 percent and track B constitutes 100 percent of the video mix. The overlap slider 704 is positioned at the far left of its sliding range and indicates that there is no overlap of the video segments (i.e., the video segments are placed one next to the other).
  • Referring now specifically to FIG. 8, another example of a user interface 800 is described. The user interface 800 comprises an icon 802 and a visual overlap slider 804. The icon 802 is positioned at the center left of the interface 800 and indicates that for the audio mix, track C constitutes 50 percent and track D constitutes 50 percent of the audio mix. As for the video mix, the icon 802 shows that track A constitutes 100 percent and track B constitutes 0 percent of the video mix. The overlap slider 804 is positioned at the far left of its sliding range and indicates that there is no overlap of the video segments (i.e., the video segments are placed one next to the other).
  • Referring now specifically to FIG. 9, another example of a user interface 900 is described. The user interface 900 comprises an icon 902, an overlap slider 904, a horizontal position slider 906, a vertical position slider 908, a speed slider 910, a record button 912, and an automix button 914. The icon 902 is positioned at the top center of the interface 900 and indicates that for the audio mix, track C constitutes 18 percent and track D constitutes 82 percent of the audio mix. As for the video mix, the icon 902 shows that track A constitutes 88 percent and track B constitutes 12 percent of the video mix.
  • The overlap slider 904 is positioned at the far right side of its sliding range and indicates that there is full overlap of the video segments (i.e., the video segments are placed one on top of the other). In addition, the horizontal position slider 906 restricts the movement of the icon 902 in the horizontal direction. In this case, the position of the horizontal position slider 906 indicates that full horizontal movement of the icon 902 across the interface 900 is permitted. Furthermore, the vertical position slider 908 restricts movement of the icon 902 in the vertical direction. In this example, the position of the vertical position slider 908 indicates that full vertical movement of the icon 902 is permitted.
  • The speed slider 910 determines how fast an icon can be moved across the display (e.g., when in playback mode). As mentioned, the position of the icon 902 defines how the media is mixed. Consequently, the degree of the mix changes quite rapidly when the icon is set to move fast by the speed slider 910. In contrast, the degree of the mix changes slowly when the icon 902 is set to move slowly by the speed slider 910.
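The speed limit can be sketched as a per-step clamp on how far the icon may travel toward its next recorded position during playback. The function name, the per-step update model, and the units are illustrative assumptions; the patent does not define them.

```python
def step_toward(current, target, max_speed):
    """Advance one coordinate toward a target, moving at most
    max_speed units per playback step (the speed slider's setting)."""
    delta = target - current
    if abs(delta) <= max_speed:
        return target                 # close enough: snap to the target
    return current + max_speed if delta > 0 else current - max_speed
```

A low max_speed makes the icon, and therefore the mix, change gradually; a high max_speed lets it jump to each recorded position almost immediately, matching the rapid mix changes described above.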
  • The record button 912 may be used to record movements of the icon for future playback. Finally, the auto mix 914 button may generate an automatic (e.g., random) mixing of media streams.
  • Thus, audio and visual content are mixed together using intuitive approaches where a human user can quickly and conveniently mix audio and visual content as desired. The approaches described herein are easy to use and understand and do not require extensive training or consultation with outside informational sources such as user manuals.
Those skilled in the art will recognize that a wide variety of modifications, alterations, and combinations can be made with respect to the above described embodiments without departing from the spirit and scope of the invention, and that such modifications, alterations, and combinations are to be viewed as being within the scope of the invention.

Claims (14)

1. An interface for mixing audio and visual streams, comprising:
a display having a plurality of display areas adapted to display information about a plurality of media streams; and
an icon visible on the display adapted to be moveable across the display,
wherein a position of the icon on the display determines a mix of the plurality of media streams including an amount of each of the plurality of media streams in the mix.
2. The interface of claim 1, wherein the plurality of media streams comprises a first audio stream, a second audio stream, a first visual stream, and a second visual stream.
3. The interface of claim 2, further comprising a first display area for displaying the first visual stream and a second display area for displaying the second visual stream.
4. The interface of claim 3, further comprising a visual overlap slider to control a degree of overlap of the first visual stream and the second visual stream on the first display area and the second display area.
5. The interface of claim 2, further comprising an auto mix button for generating a random movement of the icon.
6. A system for mixing audio and visual streams, comprising:
a display screen;
an icon associated with the display screen having a plurality of display areas adapted to display information about a plurality of media streams; and
a controller in communication with the icon, the controller adapted to receive an indication of a movement of the icon and programmed to mix a plurality of media streams based upon the indication,
wherein the position of the icon indicates an amount of each of the plurality of media streams in the mix.
7. The system of claim 6, wherein the plurality of media streams comprises a first audio stream, a second audio stream, a first visual stream and a second visual stream.
8. The system of claim 7, wherein the controller is programmed to cause the mixing of the first audio stream and the second audio stream as the display icon is moved in a generally vertical direction.
9. The system of claim 7, wherein the controller is programmed to cause the mixing of the first visual stream and the second visual stream as the display icon is moved in a generally horizontal direction.
10. The system of claim 7, wherein the controller is programmed to cause the mixing of the first audio stream and the second audio stream as the display icon is moved in a generally vertical direction and to cause the mixing of the first visual stream and the second visual stream as the display icon is moved in a generally horizontal direction.
11. The system of claim 7, further comprising a visual overlap slider to control a degree of overlap of the first visual stream and the second visual stream.
12. The system of claim 7, wherein the first visual stream and the second visual stream are selected from a group consisting of: a first video stream, a second video stream, a first static image and a second static image.
13. An interface for mixing audio and visual streams, comprising:
a display having a plurality of display areas adapted to display information about a plurality of media streams;
an icon visible on the display adapted to be moveable across the display;
a first display area for displaying the first visual stream and a second display area for displaying the second visual stream;
a visual overlap slider to control a degree of overlap of the first visual stream and the second visual stream on the first display area and the second display area; and
an auto mix button for generating a random movement of the icon,
wherein a position of the icon on the display determines a mix of the plurality of media streams including an amount of each of the plurality of media streams in the mix,
wherein the plurality of media streams comprises a first audio stream, a second audio stream, a first visual stream, and a second visual stream.
14. The system of claim 6, further comprising a visual overlap slider to control a degree of overlap of the first visual stream and the second visual stream,
wherein the plurality of media streams comprises a first audio stream, a second audio stream, a first visual stream, and a second visual stream,
wherein the controller is programmed to cause the mixing of the first audio stream and the second audio stream as the display icon is moved in a generally vertical direction,
wherein the controller is programmed to cause the mixing of the first visual stream and the second visual stream as the display icon is moved in a generally horizontal direction,
wherein the controller is programmed to cause the mixing of the first audio stream and the second audio stream as the display icon is moved in a generally vertical direction and to cause the mixing of the first visual stream and the second visual stream as the display icon is moved in a generally horizontal direction, and
wherein the first visual stream and the second visual stream are selected from a group consisting of: a first video stream, a second video stream, a first static image, and a second static image.
US12/777,168 2006-03-21 2010-05-10 System and interface for mixing media content Abandoned US20100287476A1 (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
US12/777,168 US20100287476A1 (en) 2006-03-21 2010-05-10 System and interface for mixing media content

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
US11/385,469 US7774706B2 (en) 2006-03-21 2006-03-21 System and method for mixing media content
US12/777,168 US20100287476A1 (en) 2006-03-21 2010-05-10 System and interface for mixing media content

Related Parent Applications (1)

Application Number Title Priority Date Filing Date
US11/385,469 Continuation US7774706B2 (en) 2006-03-21 2006-03-21 System and method for mixing media content

Publications (1)

Publication Number Publication Date
US20100287476A1 true US20100287476A1 (en) 2010-11-11

Family

ID=38535053

Family Applications (2)

Application Number Title Priority Date Filing Date
US11/385,469 Expired - Fee Related US7774706B2 (en) 2006-03-21 2006-03-21 System and method for mixing media content
US12/777,168 Abandoned US20100287476A1 (en) 2006-03-21 2010-05-10 System and interface for mixing media content

Family Applications Before (1)

Application Number Title Priority Date Filing Date
US11/385,469 Expired - Fee Related US7774706B2 (en) 2006-03-21 2006-03-21 System and method for mixing media content

Country Status (1)

Country Link
US (2) US7774706B2 (en)


Families Citing this family (22)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US8930002B2 (en) * 2006-10-11 2015-01-06 Core Wireless Licensing S.A.R.L. Mobile communication terminal and method therefor
US8678896B2 (en) 2007-06-14 2014-03-25 Harmonix Music Systems, Inc. Systems and methods for asynchronous band interaction in a rhythm action game
EP2206539A1 (en) 2007-06-14 2010-07-14 Harmonix Music Systems, Inc. Systems and methods for simulating a rock band experience
US20090213140A1 (en) * 2008-02-26 2009-08-27 Masaru Ito Medical support control system
US8663013B2 (en) 2008-07-08 2014-03-04 Harmonix Music Systems, Inc. Systems and methods for simulating a rock band experience
US8219536B2 (en) 2008-11-25 2012-07-10 At&T Intellectual Property I, L.P. Systems and methods to select media content
US8527877B2 (en) * 2008-11-25 2013-09-03 At&T Intellectual Property I, L.P. Systems and methods to select media content
US8156435B2 (en) 2008-11-25 2012-04-10 At&T Intellectual Property I, L.P. Systems and methods to select media content
US8449360B2 (en) 2009-05-29 2013-05-28 Harmonix Music Systems, Inc. Displaying song lyrics and vocal cues
US8465366B2 (en) 2009-05-29 2013-06-18 Harmonix Music Systems, Inc. Biasing a musical performance input to a part
WO2011056657A2 (en) 2009-10-27 2011-05-12 Harmonix Music Systems, Inc. Gesture-based user interface
US9981193B2 (en) 2009-10-27 2018-05-29 Harmonix Music Systems, Inc. Movement based recognition and evaluation
US8636572B2 (en) 2010-03-16 2014-01-28 Harmonix Music Systems, Inc. Simulating musical instruments
US8562403B2 (en) 2010-06-11 2013-10-22 Harmonix Music Systems, Inc. Prompting a player of a dance game
US20110306397A1 (en) 2010-06-11 2011-12-15 Harmonix Music Systems, Inc. Audio and animation blending
US9358456B1 (en) 2010-06-11 2016-06-07 Harmonix Music Systems, Inc. Dance competition game
US9024166B2 (en) 2010-09-09 2015-05-05 Harmonix Music Systems, Inc. Preventing subtractive track separation
KR101832463B1 (en) * 2010-12-01 2018-02-27 엘지전자 주식회사 Method for controlling a screen display and display apparatus thereof
US20140173519A1 (en) * 2011-05-24 2014-06-19 Nokia Corporation Apparatus with an audio equalizer and associated method
KR101875743B1 (en) * 2012-01-10 2018-07-06 엘지전자 주식회사 Mobile terminal and control method therof
US9696884B2 (en) * 2012-04-25 2017-07-04 Nokia Technologies Oy Method and apparatus for generating personalized media streams
JP7039179B2 (en) * 2017-04-13 2022-03-22 キヤノン株式会社 Information processing equipment, information processing system, information processing method and program

Citations (26)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US5212733A (en) * 1990-02-28 1993-05-18 Voyager Sound, Inc. Sound mixing device
US5559301A (en) * 1994-09-15 1996-09-24 Korg, Inc. Touchscreen interface having pop-up variable adjustment displays for controllers and audio processing systems
US5999173A (en) * 1992-04-03 1999-12-07 Adobe Systems Incorporated Method and apparatus for video editing with video clip representations displayed along a time line
US6031529A (en) * 1997-04-11 2000-02-29 Avid Technology Inc. Graphics design software user interface
US6154600A (en) * 1996-08-06 2000-11-28 Applied Magic, Inc. Media editor for non-linear editing system
US6281885B1 (en) * 1997-10-24 2001-08-28 Sony United Kingdom Limited Audio processing
US20020109710A1 (en) * 1998-12-18 2002-08-15 Parkervision, Inc. Real time video production system and method
US6490359B1 (en) * 1992-04-27 2002-12-03 David A. Gibson Method and apparatus for using visual images to mix sound
US20040030425A1 (en) * 2002-04-08 2004-02-12 Nathan Yeakel Live performance audio mixing system with simplified user interface
US20040199395A1 (en) * 2003-04-04 2004-10-07 Egan Schulz Interface for providing modeless timelines based selection of an audio or video file
US20050024488A1 (en) * 2002-12-20 2005-02-03 Borg Andrew S. Distributed immersive entertainment system
US20050259532A1 (en) * 2004-05-13 2005-11-24 Numark Industries, Llc. All-in-one disc jockey media player with fixed storage drive and mixer
US6983420B1 (en) * 1999-03-02 2006-01-03 Hitachi Denshi Kabushiki Kaisha Motion picture information displaying method and apparatus
US20060022956A1 (en) * 2003-09-02 2006-02-02 Apple Computer, Inc. Touch-sensitive electronic apparatus for media applications, and methods therefor
US20060026521A1 (en) * 2004-07-30 2006-02-02 Apple Computer, Inc. Gestures for touch sensitive input devices
US20060059426A1 (en) * 2004-09-15 2006-03-16 Sony Corporation Image processing apparatus, method, and program, and program storage medium
US20060055700A1 (en) * 2004-04-16 2006-03-16 Niles Gregory E User interface for controlling animation of an object
US7019205B1 (en) * 1999-10-14 2006-03-28 Sony Computer Entertainment Inc. Entertainment system, entertainment apparatus, recording medium, and program
US20060168521A1 (en) * 2003-06-13 2006-07-27 Fumio Shimizu Edition device and method
US7123728B2 (en) * 2001-08-15 2006-10-17 Apple Computer, Inc. Speaker equalization tool
US20060251260A1 (en) * 2005-04-05 2006-11-09 Yamaha Corporation Data processing apparatus and parameter generating apparatus applied to surround system
US7158844B1 (en) * 1999-10-22 2007-01-02 Paul Cancilla Configurable surround sound system
US20070236475A1 (en) * 2006-04-05 2007-10-11 Synaptics Incorporated Graphical scroll wheel
US7325199B1 (en) * 2000-10-04 2008-01-29 Apple Inc. Integrated time line for editing
US20080165141A1 (en) * 2007-01-05 2008-07-10 Apple Inc. Gestures for controlling, manipulating, and editing of media files using touch sensitive devices
US7653883B2 (en) * 2004-07-30 2010-01-26 Apple Inc. Proximity detector in handheld device

Family Cites Families (14)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US5524195A (en) 1993-05-24 1996-06-04 Sun Microsystems, Inc. Graphical user interface for interactive television with an animated agent
US5952995A (en) 1997-02-10 1999-09-14 International Business Machines Corporation Scroll indicating cursor
US6259436B1 (en) 1998-12-22 2001-07-10 Ericsson Inc. Apparatus and method for determining selection of touchable items on a computer touchscreen by an imprecise touch
US6504530B1 (en) 1999-09-07 2003-01-07 Elo Touchsystems, Inc. Touch confirming touchscreen utilizing plural touch sensors
US6714215B1 (en) 2000-05-19 2004-03-30 Microsoft Corporation System and method for displaying media interactively on a video display device
GB2370739A (en) 2000-12-27 2002-07-03 Nokia Corp Flashlight cursor for set-top boxes
US7480864B2 (en) 2001-10-12 2009-01-20 Canon Kabushiki Kaisha Zoom editor
US6710754B2 (en) 2001-11-29 2004-03-23 Palm, Inc. Moveable output device
US7219308B2 (en) 2002-06-21 2007-05-15 Microsoft Corporation User interface for media player program
KR100688414B1 (en) 2003-04-25 2007-03-02 애플 컴퓨터, 인코포레이티드 Method and system for network-based purchase and distribution of media
CA2525587C (en) 2003-05-15 2015-08-11 Comcast Cable Holdings, Llc Method and system for playing video
US7176902B2 (en) 2003-10-10 2007-02-13 3M Innovative Properties Company Wake-on-touch for vibration sensing touch input devices
GB2417107A (en) 2004-06-08 2006-02-15 Pranil Ram Computer apparatus with added functionality
US7509593B2 (en) 2005-05-12 2009-03-24 Microsoft Corporation Mouse sound volume control


Cited By (59)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US10360945B2 (en) 2011-08-09 2019-07-23 Gopro, Inc. User interface for editing digital media objects
US10084961B2 (en) 2014-03-04 2018-09-25 Gopro, Inc. Automatic generation of video from spherical content using audio/visual analysis
US9754159B2 (en) 2014-03-04 2017-09-05 Gopro, Inc. Automatic generation of video from spherical content using location-based metadata
US9760768B2 (en) 2014-03-04 2017-09-12 Gopro, Inc. Generation of video from spherical content using edit maps
US10776629B2 (en) 2014-07-23 2020-09-15 Gopro, Inc. Scene and activity identification in video summary generation
US11069380B2 (en) 2014-07-23 2021-07-20 Gopro, Inc. Scene and activity identification in video summary generation
US9984293B2 (en) 2014-07-23 2018-05-29 Gopro, Inc. Video scene classification by activity
US10074013B2 (en) 2014-07-23 2018-09-11 Gopro, Inc. Scene and activity identification in video summary generation
US11776579B2 (en) 2014-07-23 2023-10-03 Gopro, Inc. Scene and activity identification in video summary generation
US10339975B2 (en) 2014-07-23 2019-07-02 Gopro, Inc. Voice-based video tagging
US10262695B2 (en) 2014-08-20 2019-04-16 Gopro, Inc. Scene and activity identification in video summary generation
US10192585B1 (en) 2014-08-20 2019-01-29 Gopro, Inc. Scene and activity identification in video summary generation based on motion detected in a video
US10643663B2 (en) 2014-08-20 2020-05-05 Gopro, Inc. Scene and activity identification in video summary generation based on motion detected in a video
US10559324B2 (en) 2015-01-05 2020-02-11 Gopro, Inc. Media identifier generation for camera-captured media
US10096341B2 (en) 2015-01-05 2018-10-09 Gopro, Inc. Media identifier generation for camera-captured media
US9966108B1 (en) 2015-01-29 2018-05-08 Gopro, Inc. Variable playback speed template for video editing application
US11688034B2 (en) 2015-05-20 2023-06-27 Gopro, Inc. Virtual lens simulation for video and photo cropping
US10186012B2 (en) 2015-05-20 2019-01-22 Gopro, Inc. Virtual lens simulation for video and photo cropping
US10535115B2 (en) 2015-05-20 2020-01-14 Gopro, Inc. Virtual lens simulation for video and photo cropping
US10529052B2 (en) 2015-05-20 2020-01-07 Gopro, Inc. Virtual lens simulation for video and photo cropping
US10529051B2 (en) 2015-05-20 2020-01-07 Gopro, Inc. Virtual lens simulation for video and photo cropping
US10679323B2 (en) 2015-05-20 2020-06-09 Gopro, Inc. Virtual lens simulation for video and photo cropping
US10395338B2 (en) 2015-05-20 2019-08-27 Gopro, Inc. Virtual lens simulation for video and photo cropping
US10817977B2 (en) 2015-05-20 2020-10-27 Gopro, Inc. Virtual lens simulation for video and photo cropping
US11164282B2 (en) 2015-05-20 2021-11-02 Gopro, Inc. Virtual lens simulation for video and photo cropping
US10748577B2 (en) 2015-10-20 2020-08-18 Gopro, Inc. System and method of generating video from video clips based on moments of interest within the video clips
US10789478B2 (en) 2015-10-20 2020-09-29 Gopro, Inc. System and method of providing recommendations of moments of interest within video clips post capture
US10204273B2 (en) 2015-10-20 2019-02-12 Gopro, Inc. System and method of providing recommendations of moments of interest within video clips post capture
US9721611B2 (en) 2015-10-20 2017-08-01 Gopro, Inc. System and method of generating video from video clips based on moments of interest within the video clips
US10186298B1 (en) 2015-10-20 2019-01-22 Gopro, Inc. System and method of generating video from video clips based on moments of interest within the video clips
US11468914B2 (en) 2015-10-20 2022-10-11 Gopro, Inc. System and method of generating video from video clips based on moments of interest within the video clips
US10607651B2 (en) 2016-01-08 2020-03-31 Gopro, Inc. Digital media editing
US11049522B2 (en) 2016-01-08 2021-06-29 Gopro, Inc. Digital media editing
US10109319B2 (en) 2016-01-08 2018-10-23 Gopro, Inc. Digital media editing
US10769834B2 (en) 2016-02-04 2020-09-08 Gopro, Inc. Digital media editing
US10424102B2 (en) 2016-02-04 2019-09-24 Gopro, Inc. Digital media editing
US10083537B1 (en) 2016-02-04 2018-09-25 Gopro, Inc. Systems and methods for adding a moving visual element to a video
US11238635B2 (en) 2016-02-04 2022-02-01 Gopro, Inc. Digital media editing
US10565769B2 (en) 2016-02-04 2020-02-18 Gopro, Inc. Systems and methods for adding visual elements to video content
US9812175B2 (en) 2016-02-04 2017-11-07 Gopro, Inc. Systems and methods for annotating a video
US9794632B1 (en) 2016-04-07 2017-10-17 Gopro, Inc. Systems and methods for synchronization based on audio track changes in video editing
US10341712B2 (en) 2016-04-07 2019-07-02 Gopro, Inc. Systems and methods for audio track selection in video editing
US9838731B1 (en) * 2016-04-07 2017-12-05 Gopro, Inc. Systems and methods for audio track selection in video editing with audio mixing option
US10185891B1 (en) 2016-07-08 2019-01-22 Gopro, Inc. Systems and methods for compact convolutional neural networks
US9836853B1 (en) 2016-09-06 2017-12-05 Gopro, Inc. Three-dimensional convolutional neural networks for video highlight detection
US10560657B2 (en) 2016-11-07 2020-02-11 Gopro, Inc. Systems and methods for intelligently synchronizing events in visual content with musical features in audio content
US10284809B1 (en) 2016-11-07 2019-05-07 Gopro, Inc. Systems and methods for intelligently synchronizing events in visual content with musical features in audio content
US10546566B2 (en) 2016-11-08 2020-01-28 Gopro, Inc. Systems and methods for detecting musical features in audio content
US10262639B1 (en) 2016-11-08 2019-04-16 Gopro, Inc. Systems and methods for detecting musical features in audio content
US10534966B1 (en) 2017-02-02 2020-01-14 Gopro, Inc. Systems and methods for identifying activities and/or events represented in a video
US10679670B2 (en) 2017-03-02 2020-06-09 Gopro, Inc. Systems and methods for modifying videos based on music
US10991396B2 (en) 2017-03-02 2021-04-27 Gopro, Inc. Systems and methods for modifying videos based on music
US11443771B2 (en) 2017-03-02 2022-09-13 Gopro, Inc. Systems and methods for modifying videos based on music
US10127943B1 (en) 2017-03-02 2018-11-13 Gopro, Inc. Systems and methods for modifying videos based on music
US10185895B1 (en) 2017-03-23 2019-01-22 Gopro, Inc. Systems and methods for classifying activities captured within images
US10789985B2 (en) 2017-03-24 2020-09-29 Gopro, Inc. Systems and methods for editing videos based on motion
US10083718B1 (en) 2017-03-24 2018-09-25 Gopro, Inc. Systems and methods for editing videos based on motion
US11282544B2 (en) 2017-03-24 2022-03-22 Gopro, Inc. Systems and methods for editing videos based on motion
US10187690B1 (en) 2017-04-24 2019-01-22 Gopro, Inc. Systems and methods to detect and correlate user responses to media content

Also Published As

Publication number Publication date
US7774706B2 (en) 2010-08-10
US20070226607A1 (en) 2007-09-27

Similar Documents

Publication Publication Date Title
US7774706B2 (en) System and method for mixing media content
US11334217B2 (en) Method for providing graphical user interface (GUI), and multimedia apparatus applying the same
US11474666B2 (en) Content presentation and interaction across multiple displays
US7669126B2 (en) Playback device, and method of displaying manipulation menu in playback device
CN100587763C (en) On-vehicle device and content providing method
CN101583919B (en) Graphical user interface for audio-visual browsing
EP1748439A1 (en) Playback apparatus, menu display method, and recording medium recording program implementing menu display method
CN110291787B (en) Storage medium, terminal device control method, terminal device, and server
Jago Adobe Premiere Pro CC classroom in a book
JP2012175281A (en) Video recording apparatus and television receiver
WO2013096701A1 (en) Systems and methods involving features of creation/viewing/utilization of information modules
Brenneis et al. Final Cut Pro X: Visual QuickStart Guide
CN101937303A (en) Method for controlling multimedia play interface
Harrington et al. An Editor's Guide to Adobe Premiere Pro

Legal Events

Date Code Title Description
STCB Information on status: application discontinuation

Free format text: ABANDONED -- FAILURE TO RESPOND TO AN OFFICE ACTION