|Publication number||USRE46481 E1|
|Publication type||Grant|
|Application number||US 14/539,887|
|Publication date||Jul. 18, 2017|
|Filing date||Nov. 12, 2014|
|Priority date||Feb. 17, 2006|
|Also published as||CN101028562A, CN101028562B, EP1821309A1, US8311654, US20070204744|
|Inventors||Yoichiro Sako, Kenichi Makino, Akane Sano, Katsuya Shirai, Motoyuki Takai, Makoto Inoue|
|Original Assignee||Sony Corporation|
The present invention contains subject matter related to Japanese Patent Application JP 2006-040052 filed in the Japanese Patent Office on Feb. 17, 2006, the entire contents of which are incorporated herein by reference.

This is a reissue of U.S. application Ser. No. 11/702,483, filed Feb. 5, 2007, now U.S. Pat. No. 8,311,654, titled “CONTENT REPRODUCING APPARATUS, AUDIO REPRODUCING APPARATUS AND CONTENT REPRODUCING METHOD,” which claims priority to Japanese Patent Application No. 2006-040052, filed Feb. 17, 2006, which is hereby incorporated by reference in its entirety.
1. Field of the Invention
The present invention relates to a content reproducing apparatus, an audio reproducing apparatus, and a content reproducing method.
2. Description of the Related Art
In recent years, growing numbers of people, increasingly conscious of their health, have taken up walking, jogging, or running as a preferred way to maintain and improve their health or to stay generally in shape. To obtain a certain level of salutary effect from such activities, people generally need to spend suitably prolonged periods of time on their athletic pursuits.
There have been proposed a number of audio reproducing apparatuses designed to support people in walking or running. Some of the proposed apparatuses are disclosed illustratively in Japanese Patent Laid-Open Nos. 2001-299980, 2003-177749, and 2005-156641. One such apparatus is structured to be easy to carry by a user and stores songs of variable tempos. When the user takes a walk, for example, the apparatus detects the tempo of the walking and lets the user listen to songs of the tempo fit for the detected pace of walking. The tempo of walking is represented illustratively by the number of steps per unit time (e.g., per minute) and the tempo of songs by the number of beats per minute.
For example, if the walking tempo is 120 bpm (beats per minute), then the apparatus reproduces songs at a tempo of 120 bpm, such as marches. This type of audio reproducing apparatus allows the user to walk rhythmically in keeping with the tempo of the songs being played. The apparatus is thus supposed to afford the user a pleasing walking experience.
In this specification, the terms “walking” and “running” will be used separately only if these two activities need to be distinguished from each other. If there is no specific need to separate these activities, they may be simply referred to as walking or walk/run.
Walking-support audio reproducing apparatuses of the above-outlined type generally utilize acceleration sensors to detect the user's bodily movements in terms of acceleration. The acceleration thus detected and output by the sensor is analyzed so as to determine the tempo of the user's walking.
In these waveforms, the peaks indicated with small circles represent changes in acceleration caused by the impact of the user's foot hitting the ground. The periodicity of these peaks thus corresponds to the tempo of walking. The peaks with no circles attached stand for changes in acceleration caused by the audio reproducing apparatus swaying by itself or hitting the user's body during swing motion. As such, the latter peaks may be regarded as noise. With these characteristics taken into consideration, analyses of the waveforms in
In practice, however, most apparatuses of the above type do not take into account the noise experienced in analyzing the walking tempo based on the detection output as shown in
The present invention has been made in view of the above circumstances and provides arrangements for overcoming the above and other deficiencies of the related art.
In carrying out the invention and according to one embodiment thereof, there is provided a content reproducing apparatus including: a sensor; a discrimination circuit configured to discriminate whether a movement of a user is a first movement or a second movement based on a detection output from the sensor; a storage configured to store contents; a reproduction circuit configured to reproduce the contents; and a control circuit configured to supply the reproduction circuit with contents retrieved from the storage in accordance with a discrimination output from the discrimination circuit.
Preferably, the content reproducing apparatus may further include an analysis circuit configured to analyze tempos of the first movement or the second movement of the user in accordance with the detection output from the sensor. The analysis circuit may change analysis algorithms for analyzing the tempos based on the discrimination output from the discrimination circuit and the control circuit may retrieve contents from the storage in accordance with the tempo analyzed by the analysis circuit.
Preferably, the first movement and the second movement of the user may be walking and running respectively.
According to another embodiment of the present invention, there is provided a content reproducing method including the steps of: discriminating whether a movement of a user is a first movement or a second movement based on a detection output from a sensor; and supplying a reproduction circuit with contents retrieved from a storage storing the contents in accordance with a discrimination output from the discriminating step.
According to a further embodiment of the present invention, there is provided a storage medium which stores a computer-readable program for causing a computer to execute a procedure including the steps of: discriminating whether a movement of a user is a first movement or a second movement based on a detection output from a sensor; and supplying a reproduction circuit with contents retrieved from a storage storing the contents in accordance with a discrimination output from the discriminating step.
According to an embodiment of the present invention, as outlined above, the analysis algorithms in use are changed between walking and running. That means an optimal algorithm can be selected to analyze the tempos of walking or running. The selective algorithm usage translates into appreciably fewer errors in the result of the analysis than before.
In the past, analyzing the detection output from the acceleration sensor often led to errors as mentioned above. That was because the tempos of the user's walk/run were obtained using the same analysis algorithm regardless of the difference between walking and running in terms of waveforms derived from the acceleration sensor detection output, as illustrated in
In view of such circumstances, the present invention envisages bringing about the following four major phases:
(A) The detection output from the acceleration sensor is analyzed to discriminate whether the user's movement is walking or running.
(B) The detection output from the acceleration sensor is analyzed to obtain the tempos of the user's walking or running.
(C) Upon analysis in phase (B) above, analysis algorithms are changed between walking and running.
(D) The changing of the analysis algorithms in phase (C) above is based on a discrimination output from phase (A) above.
As shown in
(2-1) Difference in Peak Periodicity
Generally, the speed of walking is 50 to 100 m/min and the speed of running is 140 m/min or higher. The average human step is 70 cm for men and 65 cm for women.
It is therefore determined that the average man is walking if the number of steps taken is fewer than 143 per minute and is running if the step count is 200 per minute or larger. Likewise it is determined that the average woman is walking if the number of steps taken is fewer than 153 per minute and is running if the step count is 215 per minute or larger.
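These cutoffs follow directly from the speeds and step lengths above, as a quick sketch shows (the constant and function names are ours):

```python
# Step-rate cutoffs derived from the speeds and step lengths in the text.
WALK_SPEED_MAX = 100.0  # m/min: upper bound of typical walking speed
RUN_SPEED_MIN = 140.0   # m/min: lower bound of typical running speed
STEP_LENGTH = {"men": 0.70, "women": 0.65}  # average step length in metres

def step_rate_cutoffs(step_m):
    """Return (walk_below, run_at_or_above) in steps per minute."""
    return WALK_SPEED_MAX / step_m, RUN_SPEED_MIN / step_m

walk_below, run_at_or_above = step_rate_cutoffs(STEP_LENGTH["men"])
# roughly 143 and 200 steps/min for the average man, matching the text
```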
(2-2) Difference in Waveform Amplitude
The magnitude of the impact on the user's body from physical activity is about 1.1 to 1.2 times the user's body weight during walking and about three to four times the body weight during running. The difference in impact between the two modes of activity is attributable to the fact that at least one of the user's feet is always on the ground during walking, whereas both feet can be momentarily off the ground during running. It follows that walking and running can be distinguished from each other by detecting the varying amplitude in waveforms derived from the acceleration sensor detection output.
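A minimal amplitude-based classifier, assuming the peak impact has been normalized to multiples of body weight (the 2.0 cutoff between the two bands is our illustrative choice):

```python
def classify_by_impact(peak_weight_ratio):
    """Classify a step from its peak impact expressed as a multiple of
    body weight: roughly 1.1-1.2x when walking, 3-4x when running.
    The 2.0 cutoff between those bands is an illustrative choice."""
    return "running" if peak_weight_ratio >= 2.0 else "walking"

classify_by_impact(1.15)  # → "walking"
classify_by_impact(3.5)   # → "running"
```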
(2-3) Difference in Waveform Pattern
The periodic waveform patterns derived from the acceleration sensor detection output prove to be distinctly different between walking and running when subjected to autocorrelation calculations. Performing autocorrelation calculations on the waveforms stemming from the acceleration sensor detection output allows noise and fluctuations to be removed from the waveforms.
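The effect can be sketched with a normalized autocorrelation over a synthetic, noisy step waveform (all names and parameters here are ours):

```python
import numpy as np

def autocorrelation(x):
    """Normalized autocorrelation of a 1-D signal (mean removed)."""
    x = np.asarray(x, dtype=float) - np.mean(x)
    r = np.correlate(x, x, mode="full")[len(x) - 1:]
    return r / r[0]

# Synthetic "step" waveform: a periodic component plus noise. The
# autocorrelation peak recovers the step period despite the noise.
fs = 100                               # sample rate in Hz (assumed)
t = np.arange(0, 4, 1 / fs)
signal = np.sin(2 * np.pi * 2.0 * t)   # 2 steps/s = 120 steps/min
noisy = signal + 0.3 * np.random.default_rng(0).normal(size=t.size)
r = autocorrelation(noisy)
lag = int(np.argmax(r[fs // 4:fs])) + fs // 4  # search lags 0.25 s .. 1 s
period_s = lag / fs                            # close to 0.5 s per step
```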
(2-4) How to Discriminate Between Walking and Running
According to an embodiment of the present invention, the techniques outlined in paragraphs (2-1) through (2-3) above are used to discriminate between walking and running. The result from using each of the techniques is evaluated for further discrimination between walking and running. Given the result of such discrimination, it is possible to determine an optimal algorithm for acquiring the tempos of walking or running through analysis of the acceleration sensor detection output.
One preferred embodiment of the present invention is a walking-support audio reproducing apparatus furnished with play lists. With the tempo of the user's walking detected, the audio reproducing apparatus may reproduce songs from the play list that corresponds to the detected walking tempo.
(3-1) Typical Structure of the Audio Reproducing Apparatus
The audio reproducing apparatus 100 has a system control circuit 10 composed of a microcomputer. The control circuit 10 includes a CPU 11 for executing programs, a ROM (read only memory) 12 that holds various data, a RAM (random access memory) 13 that provides a work area, and a nonvolatile memory 14. The memories 12, 13 and 14 are connected to the CPU 11 via a system bus 19.
In the above setup, the nonvolatile memory 14 serves to retain diverse information about the audio reproducing apparatus 100 and its user. The nonvolatile memory 14 is illustratively made up of a flash memory and contains a conversion table such as one (CNVTBL) shown in
The conversion table CNVTBL is used illustratively to convert the tempos of the user's walking and of songs into tempo numbers TN. In the conversion table CNVTBL, the tempos of the user's walking and of songs are classified into seven categories represented by serial tempo numbers TN (=1 to 7), as shown in
With the conversion table CNVTBL of
The nonvolatile memory 14 also contains play lists PL(1) through PL(7) as shown in
More specifically, songs A1 through Aa with their tempos falling between zero and 69 bpm (TN=1) are registered in the play list PL(1); songs B1 through Bb with their tempos between 70 and 119 bpm (TN=2) are registered in the play list PL(2); and so on. Songs G1 through Gg with their tempos at or higher than 210 bpm (TN=7) are registered in the play list PL(7).
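The table lookup can be sketched as follows. Only the 70, 120 and 210 bpm boundaries are stated in this passage; the remaining bin edges below are illustrative placeholders, and the names are ours:

```python
import bisect

# Lower edges of the tempo-number bins TN=2..TN=7, in bpm. The 70, 120
# and 210 edges follow the text; the others are illustrative guesses.
TN_LOWER_EDGES = [70, 120, 140, 160, 185, 210]

def tempo_number(bpm):
    """Map a walking or song tempo in bpm to a tempo number TN in 1..7."""
    return bisect.bisect_right(TN_LOWER_EDGES, bpm) + 1

tempo_number(80)   # → 2, so such a song is registered in play list PL(2)
tempo_number(215)  # → 7
```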
The audio reproducing apparatus 100 also has a storage 21. The storage 21 accumulates or stores music data and digital audio data to be reproduced as songs. For that purpose, the storage 21 is constituted by a large-capacity flash memory or by a small hard disk drive. Illustratively, the music data held in the storage 21 is digital audio data compressed in MP3 (MPEG-1 Audio Layer 3; MPEG stands for Moving Picture Experts Group) format.
The storage 21 is connected to the system bus 19. A reproduction circuit 22 is also connected to the system bus 19. The reproduction circuit 22 is made up of a decoder circuit, a D/A (digital to analog) converter circuit, and an output amplifier. The decoder circuit decompresses compressed music data back to the original audio data. The D/A converter circuit converts the digital audio data into an analog audio signal.
Music data retrieved from the storage 21 is supplied to the reproduction circuit 22. The reproduction circuit 22 decompresses the supplied music data and converts the decompressed data to an analog audio signal. Following the D/A conversion, the analog audio signal is output to a headphone jack 23 that is connected with headphones 60.
An interface circuit 24 is also connected to the system bus 19. Music data is fed into the control circuit 10 from an externally furnished personal computer 70 through an input connector 25 and the interface circuit 24 to be stored into the storage 21.
This embodiment of the invention is furnished with a three-dimensional acceleration sensor 31 as a detection device that detects the walking tempo of the user carrying the audio reproducing apparatus 100 around. The acceleration sensor 31 detects the motions, acceleration, vibrations, and swaying of the audio reproducing apparatus 100 representative of the user's bodily movements (i.e., in terms of acceleration). A detection output S31 from the acceleration sensor 31 is fed to a discrimination/analysis circuit 32.
The discrimination/analysis circuit 32, as will be discussed later in more detail, analyzes the detection output S31 coming from the acceleration sensor 31 so as to detect the user's walk/run tempo. Upon analysis, the discrimination/analysis circuit 32 discriminates between walking and running using the procedure discussed in the paragraph (2) above in order to effect switchover to an optimal algorithm for analyzing the walking or running.
Various operation keys 41 are connected to the system bus 19. The system bus 19 is further connected with a display device such as an LCD (liquid crystal display) 43 by way of a display control circuit 42. In this setup, the operation keys 41 are used illustratively to accomplish the following: selecting the audio reproducing apparatus 100 either as a general-purpose portable music player or as a walking-support apparatus; selecting any one of different operation modes; selecting songs to play; and making other settings. The LCD 43 serves to display the results of operating the operation keys 41 and information about the song being reproduced.
(3-2-1) Storing the Songs
The music data of a song desired to be stored into the audio reproducing apparatus 100 is prepared beforehand in compressed format on the personal computer 70. With the personal computer 70 connected to the audio reproducing apparatus 100, a suitable transfer program is carried out on the PC to designate transfer of the music data in question.
The music data prepared on the personal computer 70 is then supplied to the audio reproducing apparatus 100 through the connector 25. The supplied music data is admitted into the audio reproducing apparatus 100 through the interface circuit 24 under control of the CPU (central processing unit) 11. The data is stored into the storage 21.
(3-2-2) Creating the Play Lists PL(1) Through PL(7)
Giving a command to create play lists causes the audio reproducing apparatus 100 to create skeleton play lists PL(1) through PL(7) (i.e., play lists with no contents inside). The tempo of the song placed into the storage 21 is analyzed using the procedure discussed in the paragraph (3-2-1) above. The analyzed tempo is converted to a tempo number TN by use of the conversion table CNVTBL. The analyzed song is registered in the play list PL(TN) corresponding to the tempo number resulting from the conversion from among the play lists PL(1) through PL(7).
Illustratively, if an analysis of a given song reveals that it has a tempo of 80 bpm, the tempo is converted by the conversion table CNVTBL into TN=2. The song having that tempo is then registered in the play list PL(2).
The tempo of a given song is acquired by performing a spectrum analysis of its music data and by obtaining an autocorrelation function of the data. When the music data of a song is prepared on the personal computer 70, information indicative of the tempo of that song may be added to the music data as meta information that may later be used to identify the tempo. When the song is to be registered into any one of the play lists PL(1) through PL(7), the registration may be carried out using a file name of the corresponding music data together with the song title and the name of the artist involved.
(3-2-3) Using the Embodiment as a General-Purpose Portable Music Player, for Music Reproduction
In this case, giving a command to reproduce a stored song causes the audio reproducing apparatus 100 to retrieve the applicable music data from the storage 21. The retrieved music data is supplied to the reproduction circuit 22 for data decompression and digital-to-analog conversion.
The reproduction circuit 22 thus outputs an analog audio signal derived from the retrieved music data. The analog audio signal is fed to the headphones 60 allowing the user to listen to the reproduced song. The title of the song being reproduced is displayed on the LCD 43.
Retrieval of music data from the storage 21 is controlled in accordance with a currently established reproduction mode. That is, the retrieved music data may be subjected illustratively to single-song reproduction, all-song continuous reproduction, random reproduction, or repeat reproduction. In this manner, the audio reproducing apparatus 100 can be utilized as a general-purpose portable music player.
A command may also be given to designate one of the play lists PL(1) through PL(7) for reproduction. In such a case, only the songs registered in the designated play list are reproduced selectively. Illustratively, when going to bed, the user might want to designate the play list PL(1) to reproduce songs of slow tempos.
(3-2-4) Using the Embodiment as a Walking-Support Apparatus for Music Reproduction
In this case, the audio reproducing apparatus 100 is used to reproduce songs having tempos commensurate with the user's walking speed. Giving a command to reproduce such songs causes the acceleration sensor 31 and discrimination/analysis circuit 32 to detect the tempo of the user's walking. The walking tempo thus detected is converted by the conversion table CNVTBL into a corresponding tempo number TN. Of the play lists PL(1) through PL(7), the play list PL(TN) corresponding to the tempo number TN derived from the conversion is selected. Then one of the songs registered in the selected play list PL(TN) is selected.
The music data of the selected song is retrieved from the storage 21 and sent to the reproduction circuit 22 for data decompression and digital-to-analog conversion. By the same procedure as that discussed in the paragraph (3-2-3) above, the selected song is reproduced and listened to by use of the headphones 60. Because the tempo of the song being reproduced is commensurate with the user's walking speed, the user can walk rhythmically and pleasantly in time with the song.
During the walking, the current tempo number TN is compared with the preceding tempo number TN. A difference detected in the comparison between the two numbers indicates a change in the walking tempo. In that case, another play list PL(TN) corresponding to the current walking tempo TN is selected and songs are reproduced selectively from the newly selected play list PL(TN).
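The play-list switching logic above can be sketched in a few lines (a minimal illustration; the function, state dictionary, and all names are ours):

```python
import random

def pick_song(tn, state, playlists, rng=random):
    """Select a song for tempo number tn. A new play list PL(tn) is
    consulted only when tn differs from the previously observed tempo
    number (sketch; the state dict and names are hypothetical)."""
    if tn != state.get("tn"):        # walking tempo moved into another bin
        state["tn"] = tn             # switch to the play list for tn
        state["queue"] = list(playlists[tn])
        rng.shuffle(state["queue"])
    return state["queue"][0]

playlists = {2: ["B1", "B2"], 3: ["C1"]}
state = {}
song = pick_song(2, state, playlists)  # a song from PL(2)
song = pick_song(3, state, playlists)  # tempo changed → a song from PL(3)
```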
As will be discussed later, the analysis of the user's walking tempo by the discrimination/analysis circuit 32 is supplemented by the determination of whether the user's current activity is walking or running. That is, when play lists or songs are to be selected by the above-described procedure, the result of the determination of whether the user's motion comes from walking or running may be additionally taken into consideration.
The detection output S31 from the acceleration sensor 31 is also supplied to the discrimination circuit 32B. In this setup, the discrimination circuit 32B is constituted by a period detection circuit 321, an amplitude detection circuit 322, an autocorrelation circuit 323, and a determination circuit 324. The circuits 321 through 323 are each designed to process the detection output S31 by a different method when detecting the probability of the user's movement being either walking or running. The determination circuit 324 evaluates outputs S21 through S23 coming from the circuits 321 through 323, thereby determining whether the user is walking or running.
Illustratively, the period detection circuit 321 subjects the detection output S31 from the acceleration sensor 31 to spectrum analysis in order to detect periodicity of peaks (marked by small circles in
The amplitude detection circuit 322 illustratively demodulates the detection output S31 from the acceleration sensor 31 to detect the amplitude of the peaks (marked by small circles in
The autocorrelation circuit 323 performs autocorrelation calculations on the detection output S31 from the acceleration sensor 31 to obtain the magnitude of autocorrelation in the output S31. On the basis of the magnitude of autocorrelation thus acquired, the autocorrelation circuit 323 detects the probability of the user either walking or running by use of the procedure discussed in the paragraph (2-3) above. The resulting detection output S23 from the autocorrelation circuit 323 is sent to the determination circuit 324.
The determination circuit 324 evaluates the detection outputs S21 through S23 coming from the circuits 321 through 323 respectively in order to determine whether the user's activity is walking or running. The result of the determination is output as a discrimination output S24 of the discrimination circuit 32B. Illustratively, if the detection outputs S21 through S23 each indicate the probability of the user's walking or running in percentage points, these values are weighted before they are added up. The addition allows the determination circuit 324 to determine whether the user is walking or running. If the detection outputs S21 through S23 each indicate the probability of walking or running in binary form, the determination circuit 324 may determine whether the user is walking or running by a majority decision derived from the outputs S21 through S23.
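Both evaluation schemes can be sketched as follows (hypothetical function names and weights; the 0.5 decision threshold is our assumption):

```python
def decide_weighted(p_running, weights):
    """Weighted sum of per-detector probabilities (each 0..1) that the
    user is running; the weights reflect trust in each detector."""
    score = sum(w * p for w, p in zip(weights, p_running)) / sum(weights)
    return "running" if score >= 0.5 else "walking"

def decide_majority(votes_running):
    """Majority decision over binary detector outputs (True = running)."""
    running = sum(votes_running)
    return "running" if 2 * running > len(votes_running) else "walking"

decide_weighted([0.8, 0.6, 0.4], [2, 1, 1])  # → "running" (score 0.65)
decide_majority([True, False, True])         # → "running" (2 of 3 votes)
```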
The discrimination output S24 from the discrimination circuit 32B is supplied as a control parameter to the analysis circuit 32A. Given the discrimination output S24, the analysis circuit 32A switches accordingly to a suitable algorithm for analyzing the detection output S31 from the acceleration sensor 31. The analysis algorithm derived from the switchover is an optimal algorithm for analyzing the tempo of the user's walking or running. In this setup, the discrimination output S24 is also supplied to the control circuit 10.
The user's walking or running is analyzed specifically by different methods as follows: in the waveform of the detection output S31 of
In the waveform of the detection output S31 of
Many people are unconsciously in the habit of exerting more force on one foot than the other during walking or running. For that reason, it is preferred that the length of time for analysis using the above-mentioned autocorrelation calculations cover the peak period of not one step but two. When a difference is observed between the right and left feet in terms of the force exerted on them, this characteristic may also be taken into consideration in the search for waveform peaks representative of the tempo of walking.
The waveforms in
Where the discrimination/analysis circuit 32 of
In the paragraphs that follow, the tempo of walking in general and a typical method for creating the conversion table CNVTBL will be discussed.
Fourteen test subjects (eight adult males and six adult females) were observed in their walking habits in daily life. The observations revealed that their walking movements could be roughly classified into four groups: low-speed walking, normal walking, jogging, and dash as shown in
The test subjects were also measured for their walking tempos. The resulting measurements are shown graphically in
The measurements above reveal that the walking tempos in daily life are not uniformly distributed; they tend to be included in one of the groups. It is also revealed that the walking tempos of less than 69 bpm, 140 to 159 bpm, and 240 bpm and higher rarely occur in everyday life. For each group, it is possible to obtain the mean value, standard deviation, and coefficient of variation of the tempos involved and to estimate their ranges.
People are thought to select automatically an optimally efficient state of transport energy consumption when walking or running. The walking tempos in the range of 140 to 159 bpm come between walking and jogging and fall into the state generally known as race walking. In daily life, people rarely, if ever, walk in the state of race walking. Hence the resulting measurements obtained as described above.
Each user has a particular pattern of walking as mentioned earlier. The audio reproducing apparatus 100 is arranged to learn its user's pattern of walking. The results of such learning are then turned into the conversion table such as one (CNVTBL) shown in
(5-2) Learning of Walking Tempos
For the purpose of learning, the user carries the audio reproducing apparatus 100 and takes a walk. During the walking, as shown in
The walking tempos m_MT(t) thus calculated are accumulated in the storage 21 of the audio reproducing apparatus 100. This is how the reproducing apparatus 100 learns the user's walking tempos m_MT(t).
Once the walking tempos are learned, the audio reproducing apparatus 100 is connected to the personal computer 70 as shown in
(5-3) Division of Walking Tempos into Groups
The personal computer 70 creates a histogram of walking tempo appearances based on the transferred walking tempos m_MT(t) and timestamp information. From the histogram, maximum values MD(i)max (i=1, 2, 3, . . .) are detected and taken as vertexes representing the walking tempos classified into groups MD(i).
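The histogram-and-vertex step might look like this (a sketch; the function name, bin width, and the synthetic tempo data are ours):

```python
import numpy as np

def tempo_histogram_peaks(m_mt, bin_width=5, lo=40, hi=260):
    """Histogram of walking-tempo appearances (bpm); its local maxima
    serve as the group vertexes MD(i)max (sketch of the procedure)."""
    bins = np.arange(lo, hi + bin_width, bin_width)
    counts, edges = np.histogram(m_mt, bins=bins)
    centers = (edges[:-1] + edges[1:]) / 2
    peaks = [centers[i] for i in range(1, len(counts) - 1)
             if counts[i] > counts[i - 1] and counts[i] >= counts[i + 1]]
    return counts, centers, peaks

rng = np.random.default_rng(1)
tempos = np.concatenate([rng.normal(100, 4, 300),    # normal walking
                         rng.normal(170, 4, 300)])   # jogging
_, _, peaks = tempo_histogram_peaks(tempos)
# peaks near 100 and 170 bpm mark the two walking-tempo groups
```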
For each of the groups MD(n), a lower limit value MD(n)lower and an upper limit value MD(n)upper are obtained. If a given group MD(n) does not overlap with any other group, attention is paid to both ends of the group MD(n); the value on the horizontal axis at which the number of appearances is zero is taken either as the lower limit value MD(n)lower or as the upper limit value MD(n)upper.
If two groups MD(n−1) and MD(n) overlap with each other, a median value between the maximum value MD(n−1)max of the group MD(n−1) and the maximum value MD(n)max of the group MD(n) is regarded both as the upper limit value MD(n−1)upper of the group MD(n−1) and as the lower limit value MD(n)lower of the group MD(n).
If the maximum value is positioned at the top or bottom end of the histogram as in the case of the maximum value MD(5)max of the group MD(5) in
When the groups MD(n) are reorganized using the above-described procedure, it is possible to obtain four pairs of the lower limit value MD(n)lower and upper limit value MD(n)upper (n=1 to 4) from the histogram of
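The midpoint rule for overlapping neighbours can be sketched as follows (hypothetical function name; None marks an end that falls back to the zero-count edge of the histogram):

```python
def limits_from_vertexes(vertexes):
    """Given sorted histogram vertexes MD(i)max of mutually overlapping
    groups (bpm), return (lower, upper) limit pairs: each shared boundary
    is the midpoint between two adjacent vertexes, used as the upper
    limit of one group and the lower limit of the next."""
    mids = [(a + b) / 2.0 for a, b in zip(vertexes, vertexes[1:])]
    return [((mids[i - 1] if i > 0 else None),
             (mids[i] if i < len(mids) else None))
            for i in range(len(vertexes))]

limits_from_vertexes([95, 125, 175, 200])
# groups peaking at 95 and 125 bpm share a 110 bpm boundary, and so on
```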
The values n=1, 2, 3, 4 are associated with the tempo numbers TN=2, 3, 5, 6 respectively, as shown in the right-hand side column of
Where the walking tempos m_MT(t) are less than 69 bpm (too slow), between 140 and 159 bpm (race walking), and higher than 210 bpm (too fast) in
The values n=5, 6, 7 are associated with the tempo numbers TN=1, 4, 7 respectively, as shown in the right-hand side column of
The conversion table CNVTBL is thus created by the procedure discussed above. The created conversion table is transferred from the personal computer 70 to the audio reproducing apparatus 100 wherein the transferred table is retained illustratively in the memory 14.
The above-described reproducing apparatus 100 analyzes the detection output S31 from the acceleration sensor 31 to discriminate whether the user's movement is walking or running, and changes analysis algorithms for detecting the tempos of the user's walking or running determined on the basis of the discrimination output. This makes it possible to use an optimal algorithm for analyzing the walking or running tempos and thereby to reduce errors significantly in the analysis.
Illustratively, the audio reproducing apparatus 100 creates the play lists PL(1) through PL(7) by walking tempo as shown in
Because the play lists PL(1) through PL(7) have been acquired through learning, there is no need for the user to fine-tune the listed choices or make additional adjustments to the lists. Furthermore, the listings are affected very little by the user's physical conditions, variations among individual users, or fluctuations in a given user's walking.
In the foregoing description, the embodiment of the invention was shown using the seven play lists PL(1) through PL(7). Alternatively, there may be prepared one play list of songs at tempos of 69 bpm or lower, 14 play lists covering songs at tempos between 70 and 209 bpm in increments of 10 bpm, and one play list of songs at tempos of 210 bpm or higher. Any of these play lists may be selected for reproduction of the songs contained inside in keeping with the detected tempo of walking or running. As another alternative, there may be prepared two play lists, one covering songs at tempos of 139 bpm or lower and the other containing songs at tempos of 140 bpm or higher. Either of the two play lists may then be selected for reproduction of the songs contained inside in keeping with the detected tempo of walking or running.
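The alternative sixteen-list layout maps a tempo to a list index with simple integer arithmetic (a sketch; the function name is ours):

```python
def playlist_index(bpm):
    """Index into the alternative sixteen-list layout: one list below
    70 bpm (index 0), fourteen 10-bpm lists covering 70-209 bpm
    (indices 1-14), and one list for 210 bpm and up (index 15)."""
    if bpm < 70:
        return 0
    if bpm >= 210:
        return 15
    return 1 + (int(bpm) - 70) // 10

playlist_index(65)   # → 0
playlist_index(125)  # → 6
playlist_index(215)  # → 15
```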
The personal computer 70 may create the play lists PL(1) through PL(7) and transfer the created lists to the audio reproducing apparatus 100 together with the digital audio data constituting the songs held inside the lists. It is also possible for the audio reproducing apparatus 100 to have a standard conversion table CNVTBL installed therein beforehand. Every time the user takes a walk, the standard conversion table CNVTBL may be corrected or adjusted to reflect the user's own walking pattern. In this case, the longer the audio reproducing apparatus 100 is used, the more accurate the selection of songs to be reproduced in accordance with the user's unique walking pattern.
In the above-described example, the audio reproducing apparatus 100 was described as being hung from the user's neck by a neck strap. Alternatively, the user may want to carry the apparatus around in a pocket of the clothes he or she wears or in a bag he or she carries. In such cases, appropriate analysis algorithms may be devised to detect the walking or running tempos while the apparatus is kept in the pocket or bag.
The discrimination/analysis circuit 32 may be implemented either by hardware such as a DSP (digital signal processor) or by software made up of programs performed by the CPU 11. Whenever any song to be reproduced is changed from the initial category, that change may be evaluated in terms of how the operation keys 41 are operated. The evaluations may then be used as the basis for subsequently selecting songs more to the user's taste. It is also possible for the user of the audio reproducing apparatus 100 to establish conditions for changing songs or to set or vary the lower limit value MD(n)lower and upper limit value MD(n)upper by himself or herself by taking a look at the histogram of
The acceleration sensor 31 may be separated from the audio reproducing apparatus 100 and attached to, say, the headphones 60. In this case, the detection signal from the acceleration sensor 31 may be sent to the discrimination/analysis circuit 32 in wired or wireless fashion. The acceleration sensor 31 may also be replaced by a speed sensor or a gyro sensor. Furthermore, the music data may be integrated with digital video data.
It should be understood by those skilled in the art that various modifications, combinations, sub-combinations and alterations may occur depending on design requirements and other factors insofar as they are within the scope of the appended claims or the equivalents thereof.
|International Classification||G06F17/00, G11B27/10, G11B27/32|
|Cooperative Classification||G11B27/105, G11B27/329|
|Apr. 3, 2017||AS||Assignment|
Owner name: SONY CORPORATION, JAPAN
Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNORS:SAKO, YOICHIRO;MAKINO, KENICHI;SANO, AKANE;AND OTHERS;SIGNING DATES FROM 20070404 TO 20070418;REEL/FRAME:041821/0034