WO2009157339A1 - Content topicality determination system, method thereof, and program - Google Patents
- Publication number
- WO2009157339A1 (PCT application PCT/JP2009/060908)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- content
- topicality
- contents
- viewing
- derived
- Prior art date
Classifications
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06Q—INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES; SYSTEMS OR METHODS SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES, NOT OTHERWISE PROVIDED FOR
- G06Q30/00—Commerce
- G06Q30/02—Marketing; Price estimation or determination; Fundraising
- G—PHYSICS
- G10—MUSICAL INSTRUMENTS; ACOUSTICS
- G10L—SPEECH ANALYSIS OR SYNTHESIS; SPEECH RECOGNITION; SPEECH OR VOICE PROCESSING; SPEECH OR AUDIO CODING OR DECODING
- G10L15/00—Speech recognition
- G10L15/02—Feature extraction for speech recognition; Selection of recognition unit
Definitions
- The present invention relates to a content topicality determination system, a method thereof, and a program.
- An example of a system that measures viewing frequency while content is distributed over a network is described in Patent Document 1.
- In this system, the receiving terminal has a means for measuring viewing data.
- The measured viewing data is combined with the user's attribute information and sent to the content server.
- The content server aggregates the viewing data sent from the users and calculates viewing frequency data for each content.
- The first problem is that topical content cannot be correctly extracted using the viewing frequency disclosed in Patent Document 1.
- In that system, the viewing frequency is calculated for each ID (identification information) that identifies the content. Consequently, when content IDs are not managed consistently, for example when the same content is distributed from a plurality of video sites with different ID assignment policies and is therefore assigned a plurality of IDs, the viewing frequency can be calculated only for each individual ID.
- In that case, users' views are spread across the IDs, the viewing frequency per ID decreases, and the viewing frequency no longer accurately reflects the topicality of the content.
- A similar problem arises with derived content. When there is content that can be derived from a certain content (such as when part of the video of a news item is used in another program), the derived content includes the original content, either included as-is or with some processing such as text superimposition or changes to image size and color. Viewing the derived content is therefore effectively also viewing the original content, but per-ID counting does not reflect this.
- The present invention has been made in view of the above problems, and its object is to provide a content topicality determination system, method, and program capable of appropriately determining the topicality of content.
- The present invention, which solves the above problems, is a content topicality determination system having feature amount extraction means for extracting content feature amounts from a plurality of contents, content grouping means for collating the feature amounts of the plurality of contents extracted by the feature amount extraction means, finding the identical contents and the derived contents produced using those contents, grouping them, and calculating identical/derived content grouping information, and topicality determination means for summing, from the viewing history information of the plurality of contents and the identical/derived content grouping information, the viewing frequencies of the contents determined to be identical/derived, calculating the total viewing frequency for each identical/derived content, and determining the topicality of the identical/derived content based on the total viewing frequency.
- The present invention, which solves the above problems, is also a content topicality determination method having a feature amount extraction step of extracting content feature amounts from a plurality of contents, a content grouping step of collating the extracted feature amounts of the plurality of contents, finding the identical contents contained in the plurality of contents and the derived contents produced using those contents, grouping them, and calculating identical/derived content grouping information, and a topicality determination step of summing, from the viewing history information of the plurality of contents and the identical/derived content grouping information, the viewing frequencies of the contents determined to be identical/derived, calculating the total viewing frequency for each identical/derived content, and determining the topicality of the identical/derived content based on the total viewing frequency.
- The present invention, which solves the above problems, is further a content topicality determination program that causes an information processing device to execute a feature amount extraction process of extracting content feature amounts from a plurality of contents, a content grouping process of collating the extracted feature amounts of the plurality of contents, finding the identical contents contained in the plurality of contents and the derived contents produced using those contents, grouping them, and calculating identical/derived content grouping information, and a topicality determination process of summing, from the viewing history information of the plurality of contents and the identical/derived content grouping information, the viewing frequencies of the contents determined to be identical/derived, calculating the total viewing frequency for each identical/derived content, and determining the topicality of the identical/derived content based on the total viewing frequency.
- FIG. 1 is a block diagram showing a configuration of a content topicality determination system according to an embodiment of the present invention.
- FIG. 2 is a diagram illustrating an example of grouping between contents having a time axis.
- FIG. 3 is a block diagram showing an embodiment of the topicality determining means 102 of FIG.
- FIG. 4 is a block diagram showing another embodiment of the topicality determining means 102 of FIG.
- FIG. 5 is a flowchart showing the overall operation of the content topicality determination system in the present embodiment.
- FIG. 6 is a flowchart showing the operation of the topicality determination unit 102 shown in FIG. 3.
- FIG. 7 is a flowchart showing the operation of the topicality determination unit 102 shown in FIG. 4.
- The content topicality determination system of the present embodiment comprises feature amount extraction means (100 in FIG. 1) that extracts content feature amounts from a plurality of contents,
- content grouping means (101 in FIG. 1) that collates the feature amounts extracted by the feature amount extraction means, finds the identical contents contained in the plurality of contents and the derived contents produced using those contents, groups them, and calculates identical/derived content grouping information, and
- topicality determination means (102 in FIG. 1) that, based on the viewing history information of the plurality of contents and the identical/derived content grouping information, sums the viewing frequencies of the contents determined to be identical/derived, calculates the total viewing frequency for each identical/derived content, and determines the topicality of the identical/derived content based on the total viewing frequency.
- With the content topicality determination system configured as described above, even when a plurality of different IDs are assigned to the same content, or when a content is used as part of another content, the contents are grouped and the viewing frequency is calculated for the group as a whole. By determining topicality based on this viewing frequency, the topicality of content can be determined accurately even in these cases, and the object of the present invention is achieved.
- FIG. 1 is a diagram showing a content topicality determination system according to an embodiment of the present invention.
- The content topicality determination system according to the present exemplary embodiment includes a feature amount extraction unit 100, an identical/derived content grouping unit 101, a topicality determination unit 102, a content viewing unit 104, a content storage unit 105, and a content viewing history storage unit 106.
- The content storage unit 105 stores a plurality of contents and is connected to the feature amount extraction unit 100 and the content viewing unit 104.
- The feature amount extraction unit 100 receives content from the content storage unit 105, obtains a feature amount for the content, and outputs the feature amount to the identical/derived content grouping unit 101.
- The identical/derived content grouping unit 101 receives the content feature amounts output from the feature amount extraction unit 100, obtains content link information representing the link relationships between the feature amounts, and outputs it to the topicality determination unit 102 as grouping information.
- The topicality determination means 102 receives the grouping information from the identical/derived content grouping means 101, receives the content viewing history information from the content viewing history storage means 106, and generates and outputs topical content information.
- The content viewing unit 104 receives content from the content storage unit 105 and outputs viewing history information to the content viewing history storage unit 106.
- The content is stored in the content storage means 105.
- Content here refers to, for example, digitized multimedia content, such as digitized photos, videos, music, or combinations thereof.
- The content is not limited to content produced by professionals, such as broadcast programs; it may also be so-called CGM (Consumer Generated Media) produced by consumers.
- In the following, moving image content is described as an example, but the same applies to music, photos, and the like.
- Although the content storage means 105 is described as if the content were stored in one place for convenience, the content may be distributed across a plurality of storages. For example, on a plurality of video posting sites on the Internet, video content may be accumulated per site, and within each site the content may be stored separately in a plurality of storages. The content stored in the content storage unit 105 is input to the feature amount extraction unit 100.
- Feature amount extraction means 100 performs feature amount extraction for each input content.
- In the case of photos, the feature amount is a visual feature amount such as color, pattern, or shape.
- For example, a feature amount standardized by ISO/IEC 15938-3 can be used.
- In the case of music, it is an acoustic feature amount such as sound power or frequency components; for example, a feature amount standardized by ISO/IEC 15938-4 can be used.
- In the case of video, a visual feature amount that expresses movement can be used.
- Here too, a feature amount standardized by ISO/IEC 15938-3 can be used.
- Alternatively, the acoustic feature amounts described above may be used, or both acoustic and visual feature amounts may be used.
- The extracted feature amount of each content is output to the identical/derived content grouping means 101.
- The identical/derived content grouping means 101 collates the feature amounts of the input contents, and if the similarity between feature amounts is sufficiently large, regards the contents as identical and groups them. Specifically, the similarity (or distance) between the feature amounts of two contents is calculated, and if the similarity is at or above a threshold (or, in the case of distance, at or below a threshold), the two contents are grouped together.
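As a concrete illustration, the threshold-based grouping just described can be sketched as follows. The cosine similarity, the toy feature vectors, and the threshold value are assumptions for illustration only; the patent requires only some similarity (or distance) measure and a threshold. Union-find is used so that transitively similar contents land in one group.

```python
# Hypothetical sketch of identical/derived content grouping (means 101).
import itertools

def cosine_similarity(a, b):
    # Similarity between two feature vectors in [0, 1] for non-negative inputs.
    dot = sum(x * y for x, y in zip(a, b))
    na = sum(x * x for x in a) ** 0.5
    nb = sum(x * x for x in b) ** 0.5
    return dot / (na * nb) if na and nb else 0.0

def group_contents(features, threshold=0.95):
    """Group content IDs whose pairwise feature similarity meets the
    threshold, using union-find so transitive matches share one group."""
    parent = {cid: cid for cid in features}

    def find(x):
        while parent[x] != x:
            parent[x] = parent[parent[x]]  # path compression
            x = parent[x]
        return x

    for a, b in itertools.combinations(features, 2):
        if cosine_similarity(features[a], features[b]) >= threshold:
            parent[find(a)] = find(b)  # union the two groups

    groups = {}
    for cid in features:
        groups.setdefault(find(cid), set()).add(cid)
    return list(groups.values())
```

For example, two contents whose feature vectors are nearly parallel fall into one group, while a dissimilar content stays in its own group.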
- In the case of photo content, identical photos can be grouped by collating the feature amounts of the whole photos and calculating their similarity. The similarity may also be calculated by collating partial regions of the photos; in that case, another image created using a photo (for example, an image created by putting a frame around the photo or pasting it into another photo), that is, derived content, can also be grouped.
- For content having a time axis, such as video, grouping is performed for each time section (the section length is arbitrary). For example, if contents A, B, C, and D are collated as shown in FIG. 2, the time sections indicated by diagonal stripes and those indicated by vertical stripes are each grouped. The grouping information obtained in this way is output to the topicality determination means 102.
- The content stored in the content storage unit 105 is also input to the content viewing unit 104 when a user selects and views it.
- With the content viewing means 104, the user plays back and views the content.
- At this time, the content viewing history is recorded.
- The viewing history may simply record whether or not the content was played, or, if the user did not watch from the beginning but skipped ahead, it may record only the information of the portions actually played. If there is a fast-forwarded section, that section information is also recorded.
- The viewing history information for each content is output to the content viewing history storage means 106.
- The content viewing history storage means 106 stores the input viewing history information. Like the content storage unit 105, this viewing history information may be distributed across a plurality of storages. The viewing history information is input to the topicality determination unit 102.
- The topicality determination means 102 calculates the viewing frequency of each content from the grouping information and the viewing history information.
- Specifically, the viewing frequency is calculated for each content, and the total viewing frequency is calculated by summing within each group based on the grouping information. The topicality of the content is then determined based on the total viewing frequency and output as topicality information. Details of the operation of the topicality determination unit 102 are described later.
- FIG. 5 is a flowchart showing the overall processing flow of the content topicality determination system in the present embodiment shown in FIG.
- In step S500, feature amount extraction is performed for each content. Details of the extraction are as described for the feature amount extraction means 100.
- In step S501, the extracted feature amounts are collated between contents, the contents are grouped, and grouping information is obtained. Details of the grouping are as described for the identical/derived content grouping means 101.
- In step S502, the topicality of the content is determined using the grouping information and the viewing history information, and topicality information is calculated.
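The flow of steps S500 to S502 can be sketched end to end under strong simplifying assumptions: scalar stand-in features, exact-match grouping instead of threshold collation, and a flat list of viewed content IDs as the viewing history. None of these simplifications come from the patent; they only make the overall pipeline concrete.

```python
# Hypothetical end-to-end sketch of steps S500-S502.
from collections import Counter, defaultdict

def determine_topicality(features, viewing_history):
    # S501: group contents whose features match (here: equal scalar values).
    by_feature = defaultdict(list)
    for cid, f in features.items():
        by_feature[f].append(cid)
    groups = list(by_feature.values())

    # S502: per-content viewing frequencies, then totals per group; the
    # total viewing frequency serves directly as the topicality index.
    freq = Counter(viewing_history)
    return sorted(
        ((tuple(sorted(g)), sum(freq[c] for c in g)) for g in groups),
        key=lambda pair: pair[1],
        reverse=True,
    )
```

Running this on a history where the same content appears under two IDs ranks the merged group first, which is exactly the behavior the grouping step is meant to achieve.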
- FIG. 3 is a diagram illustrating an embodiment of the topicality determination unit 102.
- The topicality determination unit 102 in FIG. 3 includes a viewing frequency calculation unit 200, a total viewing frequency calculation unit 201, and a topicality index calculation unit 202.
- The viewing frequency calculation means 200 receives the viewing history information and outputs viewing frequency information for each content to the total viewing frequency calculation means 201.
- The total viewing frequency calculation means 201 receives the grouping information and the viewing frequency information output from the viewing frequency calculation means 200 and outputs the total viewing frequency to the topicality index calculation means 202.
- The topicality index calculation means 202 receives the total viewing frequency, calculates topicality information, and outputs it.
- The viewing frequency calculation means 200 calculates the viewing frequency of each content from the viewing history information.
- The obtained viewing frequency information for each content is output to the total viewing frequency calculation means 201.
- The total viewing frequency calculating means 201 calculates the total viewing frequency, that is, the viewing frequency of the grouped contents as a whole, by summing the viewing frequencies of the individual contents.
- For content having a time axis, the total viewing frequency is calculated for each time section (the section length is arbitrary).
- For example, suppose the viewing frequencies of the contents obtained by the viewing frequency calculation means 200 are N_A, N_B, N_C, and N_D, independent of time. Then the total viewing frequency of the portion indicated by diagonal stripes in FIG. 2 is
  N_A + N_B + N_C (Equation 1)
- and the total viewing frequency of the portion indicated by vertical stripes is
  N_A + N_B + N_C + N_D (Equation 2).
- On the other hand, when the viewing frequency of each content varies with time because of partial playback, let the viewing frequencies with respect to media time t (the relative time from the beginning of the content) of contents A, B, C, and D be N_A(t), N_B(t), N_C(t), and N_D(t). Then the total viewing frequency of the diagonally striped portion is
  N_A(t) + N_B(t) + N_C(t + t_1) (Equation 3)
  and that of the vertically striped portion is
  N_A(t) + N_B(t) + N_C(t + t_1) + N_D(t - t_3 + t_2) (Equation 4).
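Equations 3 and 4 amount to summing per-content viewing-frequency functions over a grouped time section, after shifting each content by its alignment offset. A minimal sketch, where the frequency functions and the offsets (the t_1, t_2, t_3 of FIG. 2) are assumed inputs supplied by the grouping step:

```python
# Hypothetical sketch of the time-dependent total viewing frequency
# (Equations 3 and 4). freq_funcs maps a content ID to its viewing
# frequency as a function of media time; offsets maps a content ID to
# its time shift relative to the reference content of the group.
def total_viewing_frequency(freq_funcs, offsets, t):
    """Total viewing frequency of the group at reference media time t."""
    return sum(n(t + offsets[cid]) for cid, n in freq_funcs.items())
```

With constant frequency functions this reduces to the time-independent sums of Equations 1 and 2.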
- The total viewing frequency thus calculated is output to the topicality index calculation means 202.
- Alternatively, each content may be weighted in the summation according to whether it is the original content or a derived content.
- Alternatively, the summation may use site-dependent weights that take into account the reliability of the site where each content resides. In this case, the total viewing frequency of the diagonally striped portion is given by Equation 5 when it does not depend on time and by Equation 6 when it does, where W_A, W_B, and W_C are weighting coefficients:
  W_A N_A + W_B N_B + W_C N_C (Equation 5)
  W_A N_A(t) + W_B N_B(t) + W_C N_C(t + t_1) (Equation 6)
- The viewing frequency may also be calculated with weights that favor views closer to the present time. For example, a view today may be given weight 1, a view k days ago weight 1 - k/N, and a view N or more days ago weight 0 so that it is not counted at all. By emphasizing the most recent views in this way, more timely topical content can be extracted.
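The recency weighting above can be sketched directly; the horizon N is a tuning parameter, and treating the weighted sum itself as the viewing frequency is an assumption consistent with the description:

```python
# Sketch of the recency weighting: weight 1 for a view today,
# 1 - k/N for a view k days ago, and 0 for views N or more days ago.
def recency_weight(days_ago, horizon_days):
    if days_ago >= horizon_days:
        return 0.0  # too old: not counted in the viewing frequency
    return 1.0 - days_ago / horizon_days

def weighted_viewing_frequency(view_ages_days, horizon_days=30):
    """Recency-weighted viewing frequency over a list of view ages (days)."""
    return sum(recency_weight(k, horizon_days) for k in view_ages_days)
```

For instance, with a 30-day horizon, views that occurred 0, 15, and 30 days ago contribute weights 1, 0.5, and 0 respectively.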
- The topicality index calculation means 202 determines topicality for each content, or for each time section of a content, based on the total viewing frequency. The simplest approach is to judge that the larger the total, the more topical the content; that is, the total viewing frequency can be used directly as the topicality index.
- FIG. 6 is a flowchart of the overall processing of the topicality determination unit 102 shown in FIG. 3.
- In step S550, the viewing frequency for each content is calculated.
- In step S551, the per-content viewing frequencies are summed within each group of contents to calculate the total viewing frequency.
- In step S552, a topicality index is calculated from the total viewing frequency and output.
- FIG. 4 is a diagram showing another embodiment of the topicality determination unit 102.
- The topicality determination unit 102 in FIG. 4 includes a viewing frequency calculation unit 200, a total viewing frequency calculation unit 201, a topicality index calculation unit 202, and an effective viewing section determination unit 250.
- The effective viewing section determination unit 250 receives viewing history information as input, and its output is fed to the viewing frequency calculation means 200.
- The rest of the configuration is the same as that of the topicality determination unit 102 in FIG. 3.
- The operations of the viewing frequency calculation means 200, the total viewing frequency calculation means 201, and the topicality index calculation means 202 are the same as in FIG. 3. Here, the operation of the effective viewing section determination unit 250 is described.
- First, viewing history information is input.
- The effective viewing section determination unit 250 identifies history entries that do not correspond to genuine viewing and deletes them. For example, when the playback time is very short, it is highly likely that the user did not actually watch the content but merely touched on it while zapping through contents. Special playback such as fast-forwarding also differs from normal viewing. These logs are therefore excluded, and the remaining history is output to the viewing frequency calculation means 200.
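The filtering done by the effective viewing section determination can be sketched as follows. The record fields (`played_seconds`, `playback_speed`) and the minimum-duration threshold are assumptions for illustration; the patent specifies only the two conditions, very short playback and special playback such as fast-forward.

```python
# Hypothetical sketch of the effective viewing section determination 250.
def filter_effective_views(history, min_seconds=10.0):
    """Keep only history records that look like genuine viewing."""
    effective = []
    for record in history:
        if record["played_seconds"] < min_seconds:
            continue  # too short: likely touched while zapping, not watched
        if record.get("playback_speed", 1.0) != 1.0:
            continue  # fast-forward or other special playback
        effective.append(record)
    return effective
```

The surviving records would then be passed to the viewing frequency calculation, as in FIG. 4.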
- FIG. 7 is a flowchart of the overall processing of the topicality determination unit 102 shown in FIG. 4.
- First, the effective viewing section determination identifies, from the input viewing history, history entries that do not correspond to genuine viewing and deletes them.
- The subsequent processing is the same as in the flowchart of FIG. 6.
- As described above, in the present embodiment, topicality is determined by summing the viewing frequency within each group for each section, so the topicality of a given section can be determined accurately.
- In the embodiment above, each unit is configured as hardware, but the units may instead be realized by an information processing apparatus, such as a CPU, that operates under a program.
- In that case, the program causes the information processing apparatus, such as a CPU, to execute the operations described above.
- According to one aspect, the present invention is a content topicality determination system having feature amount extraction means for extracting content feature amounts from a plurality of contents, content grouping means for collating the feature amounts of the plurality of contents extracted by the feature amount extraction means, finding the identical contents contained in the plurality of contents and the derived contents produced using those contents, grouping them, and calculating identical/derived content grouping information, and topicality determination means for summing, based on the viewing history information of the plurality of contents and the identical/derived content grouping information, the viewing frequencies of the contents determined to be identical/derived, calculating the total viewing frequency for each identical/derived content, and determining the topicality of the identical/derived content based on the total viewing frequency.
- In another aspect, the content is content having a time axis,
- the content grouping means performs grouping of identical/derived content for each time section through the collation and calculates the identical/derived content grouping information,
- and the topicality determination means calculates the total viewing frequency for each time section and determines topicality for each time section.
- In another aspect, the content is music or video.
- In another aspect, the feature amount of the content includes at least one of a visual feature amount and an acoustic feature amount.
- In another aspect, the topicality determination means determines that content with a high total viewing frequency is topical content.
- In another aspect, the topicality determination means judges the topicality of content in time sections.
- In another aspect, the system has content viewing means that selects content from the plurality of contents for viewing and outputs the identification information and viewing section of the viewed content as the content viewing history information.
- In another aspect, the topicality determination means includes effective viewing section determination means that determines, from the content viewing history information, that only history satisfying a certain viewing condition is valid, and the total viewing frequency is calculated using the sections determined to be valid by the effective viewing section determination means.
- According to another aspect, the present invention is a content topicality determination method having a feature amount extraction step of extracting content feature amounts from a plurality of contents, a content grouping step of collating the extracted feature amounts of the plurality of contents, finding the identical contents contained in the plurality of contents and the derived contents produced using those contents, grouping them, and calculating identical/derived content grouping information, and a topicality determination step of summing, from the viewing history information of the plurality of contents and the identical/derived content grouping information, the viewing frequencies of the contents determined to be identical/derived, calculating the total viewing frequency for each identical/derived content, and determining the topicality of the identical/derived content based on the total viewing frequency.
- In another aspect, the content is content having a time axis,
- the content grouping step performs grouping of identical/derived content for each time section through the collation and calculates the identical/derived content grouping information,
- and the topicality determination step calculates the total viewing frequency for each time section and determines topicality for each time section.
- In another aspect, the content is music or video.
- In another aspect, the content feature amount includes at least one of a visual feature amount and an acoustic feature amount.
- In another aspect, the topicality determination step determines that content with a high total viewing frequency is topical content.
- In another aspect, the topicality determination step judges the topicality of content in time sections.
- In another aspect, the method has a content viewing step of selecting content from the plurality of contents for viewing and outputting the identification information and viewing section of the viewed content as the content viewing history information.
- In another aspect, the topicality determination step determines, from the content viewing history information, that only history satisfying a certain viewing condition is valid, and calculates the total viewing frequency using the sections determined to be valid.
- According to another aspect, the present invention is a content topicality determination program that causes an information processing device to execute a feature amount extraction process of extracting content feature amounts from a plurality of contents, a content grouping process of collating the extracted feature amounts of the plurality of contents, finding the identical contents contained in the plurality of contents and the derived contents produced using those contents, grouping them, and calculating identical/derived content grouping information, and a topicality determination process of summing, from the viewing history information of the plurality of contents and the identical/derived content grouping information, the viewing frequencies of the contents determined to be identical/derived, calculating the total viewing frequency for each identical/derived content, and determining the topicality of the identical/derived content based on the total viewing frequency.
- The present invention can be applied to determining, for each section, the topicality of content posted on a network. The application is not limited to networks; it can be applied in the same way when identical or derived content exists among contents stored on the same hard disk recorder.
101 Identical/derived content grouping means
102 Topicality determination means
104 Content viewing means
105 Content storage means
106 Content viewing history storage means
200 Viewing frequency calculation means
201 Total viewing frequency calculation means
202 Topicality index calculation means
250 Effective viewing section determination means
Claims (17)
- 1. A content topicality determination system comprising: feature amount extraction means for extracting content feature amounts from a plurality of contents; content grouping means for collating the feature amounts of the plurality of contents extracted by the feature amount extraction means, finding the identical contents contained in the plurality of contents and the derived contents produced using those contents, grouping them, and calculating identical/derived content grouping information; and topicality determination means for summing, from viewing history information of the plurality of contents and the identical/derived content grouping information, the viewing frequencies of the contents determined to be identical/derived, calculating the total viewing frequency for each identical/derived content, and determining the topicality of the identical/derived content based on the total viewing frequency.
- 2. The content topicality determination system according to claim 1, wherein the content is content having a time axis, the content grouping means performs grouping of identical/derived content for each time section through the collation and calculates the identical/derived content grouping information, and the topicality determination means calculates the total viewing frequency for each time section and determines topicality for each time section.
- 3. The content topicality determination system according to claim 2, wherein the content is music or video.
- 4. The content topicality determination system according to claim 3, wherein the feature amount of the content includes at least one of a visual feature amount and an acoustic feature amount.
- 5. The topicality determination system according to any one of claims 1 to 4, wherein the topicality determination means determines content having a high total viewing frequency to be topical content.
- 6. The topicality determination system according to claim 5, wherein the topicality determination means judges the topicality of content in time sections.
- 7. The topicality determination system according to any one of claims 1 to 6, further comprising content viewing means for selecting content from the plurality of contents for viewing and outputting identification information and a viewing section of the viewed content as the content viewing history information.
- 8. The topicality determination system according to any one of claims 1 to 7, wherein the topicality determination means has effective viewing section determination means for determining, from the content viewing history information, that only history satisfying a certain viewing condition is valid, and calculates the total viewing frequency using the sections determined to be valid by the effective viewing section determination means.
- 9. A content topicality determination method comprising: a feature amount extraction step of extracting content feature amounts from a plurality of contents; a content grouping step of collating the extracted feature amounts of the plurality of contents, finding the identical contents contained in the plurality of contents and the derived contents produced using those contents, grouping them, and calculating identical/derived content grouping information; and a topicality determination step of summing, from viewing history information of the plurality of contents and the identical/derived content grouping information, the viewing frequencies of the contents determined to be identical/derived, calculating the total viewing frequency for each identical/derived content, and determining the topicality of the identical/derived content based on the total viewing frequency.
- 10. The content topicality determination method according to claim 9, wherein the content is content having a time axis, the content grouping step performs grouping of identical/derived content for each time section through the collation and calculates the identical/derived content grouping information, and the topicality determination step calculates the total viewing frequency for each time section and determines topicality for each time section.
- 11. The content topicality determination method according to claim 10, wherein the content is music or video.
- 12. The content topicality determination method according to claim 11, wherein the feature amount of the content includes at least one of a visual feature amount and an acoustic feature amount.
- 13. The topicality determination method according to any one of claims 9 to 12, wherein the topicality determination step determines content having a high total viewing frequency to be topical content.
- 14. The topicality determination method according to claim 13, wherein the topicality determination step judges the topicality of content in time sections.
- 15. The topicality determination method according to any one of claims 9 to 14, further comprising a content viewing step of selecting content from the plurality of contents for viewing and outputting identification information and a viewing section of the viewed content as the content viewing history information.
- 16. The topicality determination method according to any one of claims 9 to 15, wherein the topicality determination step determines, from the content viewing history information, that only history satisfying a certain viewing condition is valid, and calculates the total viewing frequency using the sections determined to be valid.
- 17. A content topicality determination program that causes an information processing device to execute: feature amount extraction processing of extracting content feature amounts from a plurality of contents; content grouping processing of collating the extracted feature amounts of the plurality of contents, finding the identical contents contained in the plurality of contents and the derived contents produced using those contents, grouping them, and calculating identical/derived content grouping information; and topicality determination processing of summing, from viewing history information of the plurality of contents and the identical/derived content grouping information, the viewing frequencies of the contents determined to be identical/derived, calculating the total viewing frequency for each identical/derived content, and determining the topicality of the identical/derived content based on the total viewing frequency.
Priority Applications (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US13/001,369 US8306992B2 (en) | 2008-06-26 | 2009-06-16 | System for determining content topicality, and method and program thereof |
JP2010517912A JP5387860B2 (ja) | Content topicality determination system, method thereof, and program
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
JP2008167344 | 2008-06-26 | ||
JP2008-167344 | 2008-06-26 |
Publications (1)
Publication Number | Publication Date |
---|---|
WO2009157339A1 (ja) | 2009-12-30 |
Family
ID=41444407
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/JP2009/060908 WO2009157339A1 (ja) | 2008-06-26 | 2009-06-16 | コンテンツ話題性判定システム、その方法及びプログラム |
Country Status (3)
Country | Link |
---|---|
US (1) | US8306992B2 (ja) |
JP (1) | JP5387860B2 (ja) |
WO (1) | WO2009157339A1 (ja) |
Cited By (2)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2020042332A (ja) * | 2018-09-06 | 2020-03-19 | 株式会社ビデオリサーチ | Content contact status survey system, content contact status survey method, and program |
JP6964367B1 (ja) * | 2021-01-05 | 2021-11-10 | 株式会社Rilarc | Information processing apparatus, information processing method, and information processing program |
Families Citing this family (2)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US9912973B2 (en) * | 2014-08-07 | 2018-03-06 | Echostar Technologies L.L.C. | Systems and methods for facilitating content discovery based on viewer ratings |
US10380633B2 (en) * | 2015-07-02 | 2019-08-13 | The Nielsen Company (Us), Llc | Methods and apparatus to generate corrected online audience measurement data |
Citations (6)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2002279026A (ja) * | 2001-03-19 | 2002-09-27 | Nec Corp | Program interest level presentation server, program interest level presentation method, and program interest level presentation program |
JP2004206679A (ja) * | 2002-12-12 | 2004-07-22 | Sony Corp | Information processing device and method, recording medium, and program |
JP2005236646A (ja) * | 2004-02-19 | 2005-09-02 | Fuji Xerox Co Ltd | Image display device, method, and program |
JP2005274991A (ja) * | 2004-03-25 | 2005-10-06 | Sony Corp | Music data storage device and duplicate music deletion method |
JP2005333453A (ja) * | 2004-05-20 | 2005-12-02 | Matsushita Electric Ind Co Ltd | Digital information content distribution system |
JP2007272651A (ja) * | 2006-03-31 | 2007-10-18 | Research Organization Of Information & Systems | Video providing apparatus and video providing method |
Family Cites Families (5)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US7720723B2 (en) * | 1998-09-18 | 2010-05-18 | Amazon Technologies, Inc. | User interface and methods for recommending items to users |
US20020198882A1 (en) * | 2001-03-29 | 2002-12-26 | Linden Gregory D. | Content personalization based on actions performed during a current browsing session |
JP3713043B2 (ja) * | 2002-11-14 | 2005-11-02 | Matsushita Electric Industrial Co., Ltd. | Viewing history recording method and viewing history utilization method |
US7260568B2 (en) * | 2004-04-15 | 2007-08-21 | Microsoft Corporation | Verifying relevance between keywords and web site contents |
JP4892993B2 (ja) | 2006-01-30 | 2012-03-07 | Dai Nippon Printing Co., Ltd. | Portable terminal, content distribution system, UIM card, program, and recording medium |
-
2009
- 2009-06-16 JP JP2010517912A patent/JP5387860B2/ja active Active
- 2009-06-16 US US13/001,369 patent/US8306992B2/en active Active
- 2009-06-16 WO PCT/JP2009/060908 patent/WO2009157339A1/ja active Application Filing
Cited By (3)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2020042332A (ja) * | 2018-09-06 | 2020-03-19 | Video Research Ltd. | Content contact status survey system, content contact status survey method, and program |
JP6964367B1 (ja) * | 2021-01-05 | 2021-11-10 | Rilarc Inc. | Information processing device, information processing method, and information processing program |
JP2022105890A (ja) * | 2021-01-05 | 2022-07-15 | Rilarc Inc. | Information processing device, information processing method, and information processing program |
Also Published As
Publication number | Publication date |
---|---|
JPWO2009157339A1 (ja) | 2011-12-08 |
JP5387860B2 (ja) | 2014-01-15 |
US8306992B2 (en) | 2012-11-06 |
US20110153609A1 (en) | 2011-06-23 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
JP5546246B2 (ja) | Content management system | |
US11030462B2 (en) | Systems and methods for storing content | |
JP6903751B2 (ja) | Systems and methods for identifying matching content | |
US9251406B2 (en) | Method and system for detecting users' emotions when experiencing a media program | |
JP6316787B2 (ja) | Content syndication in web-based media via advertising tags | |
CN104768082B (zh) | Audio/video playback information processing method and server | |
CA2758121C (en) | Policy-based media syndication and monetization | |
US20090100456A1 (en) | Method and apparatus for monitoring online video | |
CN110290400B (zh) | Method for identifying videos with suspicious inflated view counts, and method and apparatus for estimating real view counts | |
EP2417553A1 (en) | Policy-based video content syndication | |
US9684907B2 (en) | Networking with media fingerprints | |
JP5387860B2 (ja) | Content topicality determination system, method and program | |
EP3346396A1 (en) | Multimedia resource quality assessment method and apparatus | |
CN107733874A (zh) | Information processing method and apparatus, computer device, and storage medium | |
US20090049390A1 (en) | Methods and apparatuses for distributing content based on profile information and rating the content | |
CN112218118A (zh) | Audio/video clipping method and apparatus | |
JP2009130529A (ja) | Advertisement video playback method, apparatus, and program | |
AU2016253600A1 (en) | Content management system | |
AU2013201930A1 (en) | Content Management System |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
121 | Ep: the epo has been informed by wipo that ep was designated in this application |
Ref document number: 09770049 Country of ref document: EP Kind code of ref document: A1 |
|
WWE | Wipo information: entry into national phase |
Ref document number: 2010517912 Country of ref document: JP |
|
NENP | Non-entry into the national phase |
Ref country code: DE |
|
WWE | Wipo information: entry into national phase |
Ref document number: 13001369 Country of ref document: US |
|
122 | Ep: pct application non-entry in european phase |
Ref document number: 09770049 Country of ref document: EP Kind code of ref document: A1 |